Beginning the Text Comparison Application

Having been given the task of implementing a new text comparison feature for the Versioning Machine 5, it is clear that many questions will need to be answered before I can even begin thinking about writing the code and embedding it in the software.

In a VM survey carried out a few months ago, 17 out of 21 survey participants noted that they used the VM for the comparison of prose texts, 10 for poetry and 3 for drama.

The VM, however, currently lacks any real comparison capabilities: in its present iteration it can only highlight the same line across each of the different witnesses. Its inability to show difference is a notable deficit, given that users expressly stated in the survey that they would like to use the VM for longer texts in the future, and would therefore find a comparison feature that highlights the differences within each witness more useful – as with Juxta’s heat map view.

Only in the last couple of weeks have I been set the task, by my mentor, of beginning development of the text comparison application. The lifecycle of such a task doesn’t begin, however, with coding. There is a lengthy incubation process in which I must break the task down into a list of steps. These steps can be thought of as detailed descriptions of each function that will be required for a successful implementation of the app.

In order to begin my development task, I needed to start by writing what is commonly referred to in IT as pseudo-code. Pseudo-code is not functioning code in and of itself, but it is an important aid in web development and software engineering, and helps the human developer to better understand the complexity and semantics of their software.

Being a novice at best in JavaScript, it has been extremely tricky for me to identify how I should begin drafting my code. For instance, in order to even begin creating a text comparison diff. (difference) utility, I will need to be able to grab the correct DOM elements from the Versioning Machine’s HTML files, as this is the only way I will be able to get the code to parse each line that has the same line number in each different witness. The pseudo-code, for the time being anyway, begins like this (a rough JavaScript sketch follows the list):

1. Loop over all lines of all witnesses

2. Get all witnesses for one line
a. get the class or line ID needed to find all readings of the same line across different witnesses
b. using that class or line ID, get the different readings from the HTML DOM
c. store the readings in either a JS array or JS object
d. return the JS array or object
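
As promised, here is a rough JavaScript sketch of how the steps above might be realised. It is only a sketch: the markup it assumes (a shared data-line-id attribute on every reading of a line, and a known list of line IDs) is my own guess, not the VM’s actual HTML, which I still need to inspect properly.

// step 2: get all witnesses for one line
function getReadingsForLine(lineId) {
    // 2a/2b: use the shared line ID to find all readings of this line
    // across the different witnesses in the HTML DOM
    var nodes = document.querySelectorAll('[data-line-id="' + lineId + '"]');
    var readings = [];

    // 2c: store each witness's reading in a JS array
    for (var i = 0; i < nodes.length; i++) {
        readings.push(nodes[i].textContent);
    }

    // 2d: return the JS array
    return readings;
}

// step 1: loop over all lines of all witnesses
function getAllReadings(lineIds) {
    var all = {};
    for (var i = 0; i < lineIds.length; i++) {
        all[lineIds[i]] = getReadingsForLine(lineIds[i]);
    }
    return all;
}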

The pseudo-code above, however, is only one possible way to begin writing a diff. utility tool. There are already many text comparison tools and APIs available on the internet, like prettydiff <prettydiff.com> or <stringjs.com>, each with its own idiosyncrasies and differing capabilities. Another issue I will have to chew over is whether it would be easier to embed one of these open-source APIs into the VM instead of building the utility from scratch.

The project team will need to discuss whether these APIs are capable of delivering what is needed for the VM. They may not, for instance, be able to compare four or five different texts side by side (usually they go as far as two). It may be better in the long run to develop a bespoke differentiation application, since it would be easier to debug should the VM pass through later iterations down the line.

Other ontological problems at hand are questions such as: should the text comparison app deal with mark-up-based characters such as punctuation, or should it just highlight different words? Should the diff. comparison be viewed as a heat map, in which the colour gradient highlighting each word becomes darker, say, to indicate where there is more differentiation across each of the separate witness documents? Or should there be a side-by-side view where one text is offset against another? If it were decided to highlight difference across three, four, or five different witnesses side by side at the same time, how would this work? Would the user need to be able to toggle on and off a base text of choice against which all the other witnesses are compared?

As I continue to revise and refine the pseudo-code I can begin to test it out, incrementally, by writing small pieces of code (functions and methods) through which I can learn how to implement my ideas in JavaScript syntax.

The meetings I have had with my mentor have been incredibly useful so far – he has shown me how to create a workable JavaScript environment on my computer (which involves creating an HTML file through which JavaScript code can be tried and tested). My mentor has suggested that I write a small sample webpage consisting of four strings, something like:

<div class='apparatus v1'>The is a test string</div>
<div class='apparatus v2'>This is a test string.</div>
<div class='apparatus v3'>This is a string test</div>
<div class='apparatus v4'>This is, a string test.</div>

and through this I can learn how to write a simple import or ‘get strings’ function, or cycle through the <div> tags with for or do… while loops. Writing these functions will be the first step in learning how to get, grab and manipulate the HTML code for the Versioning Machine.
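
As a first attempt, a ‘get strings’ function for the sample page might look something like the sketch below. The class name 'apparatus' comes from the sample divs above; everything else (the function name, the use of getElementsByClassName) is just my own first guess at how to do it.

// grab the text of every <div class="apparatus ..."> on the sample page
function getStrings() {
    var divs = document.getElementsByClassName('apparatus');
    var strings = [];

    // cycle through the <div> tags with a for loop
    for (var i = 0; i < divs.length; i++) {
        strings.push(divs[i].textContent);
    }
    return strings;
}

console.log(getStrings());
// e.g. ["The is a test string", "This is a test string.", ...]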

JavaScript Module Pattern: Implied Globals and Module Augmentation

 

Global Import:

Another important aspect of JS that makes the Module Pattern possible is the implied globals feature. There is a good video describing the process of implied globals here: <https://www.youtube.com/watch?v=6VxkOC65Msk>. Basically, if a variable is declared with var within the scope of a given function, then that variable is inaccessible to any code outside of that function. If, however, var is not used, but just the name of the variable, then the interpreter assumes that this is a global variable, and so the variable and its value can be accessed outside of the function.
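
A small illustration of the idea (my own example, and assuming the code is not running in strict mode, where the second assignment would throw an error instead):

function setValues() {
    var localValue = 1;  // declared with var: local to this function only
    impliedGlobal = 2;   // no var: the interpreter treats this as a global
}

setValues();
console.log(impliedGlobal);   // 2 -- still accessible out here
// console.log(localValue);   // would throw a ReferenceError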

Cherry suggests that passing globals in as parameters is a clearer and faster way of importing globals into the code than relying on implied globals:

(function ($, YAHOO) {
    // now have access to globals jQuery (as $) and YAHOO in this code
}(jQuery, YAHOO));

Cherry’s next step in expounding the ins and outs of the module pattern in JS is declaring globals within the anonymous function, rather than simply importing them as parameter values. Using the return value means that only what we want to expose from the function as a global object is passed out of it. Everything else remains well hidden within the wrapper.

Cherry’s example:

var MODULE = (function () {
    var my = {},
        privateVariable = 1;

    function privateMethod() {
        // ...
    }

    my.moduleProperty = 1;
    my.moduleMethod = function () {
        // ...
    };

    return my;
}());

 

In the above code, Cherry has created a namespace so that the methods within it can be accessed once returned.

In order to gain a more comprehensive understanding of Cherry’s code, it may be worth looking at this shorter segment of code by Todd Motto <http://toddmotto.com/mastering-the-module-pattern/>:

 

var Module = (function () {
    var privateMethod = function () {
        // do something
    };
})();

 

In the above code a function is declared: privateMethod. This is locally contained within the new scope of the anonymous function.

Using return within a module’s scope will then hand the object containing its methods back to the declaration var Module (the namespace), which essentially means that the returned object, and the methods attached to it, become globally accessible:

 

var Module = (function () {
    return {
        publicMethod: function () {
            // code
        }
    };
})();

 

As it is an object literal being returned, we can then call its methods anywhere in the code, as if they were globals, in this way:

Module.publicMethod();

Motto gives us a fuller example of how an object literal can be returned, so that its methods can then be called globally, for example as Module.publicMethodOne:

 

var Module = (function () {
    var privateMethod = function () {};

    return {
        publicMethodOne: function () {
            // I can call `privateMethod()` you know...
        },
        publicMethodTwo: function () {
        },
        publicMethodThree: function () {
        }
    };
})();

 

Motto also demonstrates how to access private methods if we want to do so. All we need to do is call the private methods from within public ones exposed by the module:

 

var Module = (function () {
    var privateMethod = function (message) {
        console.log(message);
    };

    var publicMethod = function (text) {
        privateMethod(text);
    };

    return {
        publicMethod: publicMethod
    };
})();

// Example of passing data into a private method:
// the private method will then `console.log()` 'Hello!'

Module.publicMethod('Hello!');

 

Note that returning publicMethod: publicMethod in this way – an object literal whose properties simply point at locally defined functions – is an example of what is known as the JS Revealing Module Pattern.
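
To illustrate what that means, here is a minimal sketch of the Revealing Module Pattern (my own example, not Motto’s or Cherry’s): everything is defined privately inside the closure, and the returned object literal simply ‘reveals’ the members we want to be public.

var Counter = (function () {
    var total = 0;                  // private state

    var increment = function () {   // private function
        total = total + 1;
        return total;
    };

    var reset = function () {       // private function
        total = 0;
    };

    // the returned object literal "reveals" the private members
    return {
        increment: increment,
        reset: reset
    };
})();

Counter.increment(); // 1
Counter.increment(); // 2
Counter.reset();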

Another interesting feature of the module pattern is the ability to augment modules, which basically means importing other modules from different files into our current module. We can then do things with the imported module before passing it back out of the current module via the return keyword:

 

var MODULE = (function (my) {
    my.anotherMethod = function () {
        // added method...
    };
    return my;
}(MODULE));

We could then call this method globally with the statement:

MODULE.anotherMethod();

 

Loose Augmentation:

One particular pattern we can use when augmenting a module is Loose Augmentation. Cherry describes how, with this method, scripts can be loaded asynchronously, allowing for “flexible multi-part modules that can load themselves in any order“:

 

var MODULE = (function (my) {
    // add capabilities...

    return my;
}(MODULE || {}));

 

So, what is happening in the above code? If we resolve it into its component parts we see that the code is an anonymous self-invoking function that takes the “my” object as a parameter. Changes are made to “my” within the function and then returned to MODULE.

The “my” parameter is passed in as (MODULE || {}).

This expression means: if MODULE is already defined, use it; otherwise, pass in a new empty object.
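
A hypothetical sketch of what this allows, assuming the module is split across two files that might load in either order (the file names and method names are my own inventions):

// file1.js
var MODULE = (function (my) {
    my.methodFromFileOne = function () {
        // ...
    };
    return my;
}(MODULE || {}));

// file2.js -- works even if it happens to load before file1.js,
// because (MODULE || {}) falls back to a fresh empty object
var MODULE = (function (my) {
    my.methodFromFileTwo = function () {
        // ...
    };
    return my;
}(MODULE || {}));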

JavaScript Module Pattern: Anonymous Functions and Implied Globals

Recently, I have been assigned the task of looking into the JavaScript Module Pattern. Restructuring the VM JS code in this way may make it easier to delineate different components of the code into discrete functions, so as to obviate the possibility of variable duplication and to control the scope of variables so that they remain local. A very good introductory article on the JS module pattern can be found here <http://www.adequatelygood.com/JavaScript-Module-Pattern-In-Depth.html>.

To quote Ben Cherry, “the fundamental construct that makes it all possible” is the JavaScript anonymous function. Cherry describes this code as a closure, that is: everything that runs inside the anonymous function is discrete and is isolated from the rest of the surrounding JS code.

Explanatory Youtube video about anonymous functions: <https://www.youtube.com/watch?v=JRCJ0zmooJE>.

None of the code inside an anonymous function has global scope, and Cherry uses the term ‘privacy’ to describe this. Likewise, the code within the anonymous function is afforded ‘state’. What I understand ‘state’ to mean is that if any code is altered or changed within the anonymous function, this will not affect any of the code extraneous to it.

Click here to find out more about JavaScript state: <http://www.dofactory.com/javascript/state-design-pattern>.

Below is an example Cherry gives of an anonymous function, which shows that globals can still be accessed within the wrapped element, but any variables or functions declared within the anonymous function are contained within its scope:

(function () {
    // ... all vars and functions are in this scope only
    // still maintains access to all globals
}());

Cherry adds that the () around the anonymous function is required, since statements that begin with the token function are considered to be function declarations. The () creates a function expression instead.

A function expression is different to a function declaration in that, with a function expression, the function can be assigned to a variable. When a function expression has been stored in a variable, the variable can then be used as a function.

Here is an example from W3schools: <http://www.w3schools.com/js/tryit.asp?filename=tryjs_function_expression_variable>
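
In case that link changes, a minimal sketch of the same idea (my own variable names, not necessarily the ones W3schools uses):

// a function expression assigned to a variable -- the function itself has no name
var multiply = function (a, b) {
    return a * b;
};

var result = multiply(4, 3); // the variable is now used as a function
console.log(result);         // 12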

The example above is in fact an anonymous function, as the function has not been assigned a name.

The example Cherry gives of an anonymous function above can also be more accurately described as an anonymous self-invoking function. These functions do not have to be called, since they invoke themselves. You cannot self-invoke a function declaration, which is why a self-invoking function needs to be a function expression. Example of self-invoking expressions from W3schools: <http://www.w3schools.com/js/tryit.asp?filename=tryjs_function_expression_self>.

Of course, one of the main differences between a self-invoking function and a function declaration is the fact that a function declaration needs to be called in order to be executed (i.e. the function is stored and saved for later use).
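
A short illustration of that difference (again my own example rather than W3schools’):

// function declaration: stored and saved for later use, runs only when called
function declared() {
    console.log('I run only when I am called');
}
declared();

// anonymous self-invoking function expression: runs as soon as it is defined
(function () {
    console.log('I invoke myself');
}());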

Annotated Bibliography: Cultural-Criticism in the Digital Humanities: Race, Gender and Identity in the Technological Age

The following is a selection of essays written by and for digital humanists that cover the somewhat neglected area of cultural criticism, race, gender and identity politics in DH. Some reprehend DH practitioners for their seeming lack of concern for the wider social and cultural implications of the technologies they employ. Others, through methodology and praxis, attest to just how important it is to solidify research through a constant critique of identity representation, as technological development is coming to be seen as a monopoly of those in positions of monetary privilege and power. Overall, these essays bear witness to the growing landscape of DH — one that seeks to broaden its horizons through interdisciplinary methods and make its mark as a polemically useful field of study for addressing or redressing contemporary issues of technology and culture in today’s society.

Argamon, Shlomo et al. ‘Gender, Race, and Nationality in Black Drama, 1950-2006: Mining Differences in Language Use in Authors and their Characters’. Digital Humanities Quarterly 3.2 (2009): n. pag. Web. 25 November 2014. 

This article describes a data-mining project using a database developed by Alexander Street Press (ASP), in collaboration with the ARTFL project. Argamon et al. discuss their use of a bespoke machine-reading and text-mining tool, PhiloMine, with a view to finding idiosyncratic patterns in the works of black authors from the Black Drama collection. The results reveal that while the PhiloMine tool could be calibrated to sort texts by author or speaker race and gender, the onus was on the human to identify where the binary extremes of the yes/no computer outputs overlook the subtle, but salient, differences within the computer-sorted texts.

Bailey, Moya Z. ‘All the Digital Humanists Are White, All the Nerds Are Men, but Some of Us Are Brave’. Journal of Digital Humanities 1.1 (Winter 2011): n.pag. Web. 25 November 2014.

Bailey argues that the way in which identities inform ‘both theory and practice in digital humanities have been largely overlooked’. The tendency within digital humanities to balk at the moment of identity politics and cultural criticism is tantamount to cutting the still-burgeoning field off from unexplored avenues, and ultimately exposes inherent ‘structural limitations’ within its methodological practices. Bailey invokes the writings of Lisa Nakamura, informing us of her tendentious criticisms encompassing such topics as ‘the alienated labour of people of colour in the production of technology that advances digital scholarship practices that they will not be able to access or directly benefit from’. Bailey adds that the interdisciplinary nature of the digital humanities makes it ‘uniquely poised to apply academic research to itself and its products’.

Earhart, Amy E. ‘Can Information Be Unfettered? Race and the New Digital Humanities Canon’. Debates in the Digital Humanities. Ed. Matthew K. Gold. University of Minnesota Press, 2012. 309-318. Print.

Earhart bemoans what she descries as a dearth of digitised collections and materials apropos of race online, despite the opportunities afforded by the internet as a potential platform for minority groups to be heard. Bending her thoughts back to the incipient years of the internet and its virtues for the democratisation of knowledge, Earhart points out the designs of iconoclast scholars who apperceived its potential as a conduit through which to bring into public property lesser-known works traditionally shirked by the literary canon. Earhart asserts that digital humanists must theorise technological approaches with a mind towards cultural construction, for fear that certain materials will otherwise be excluded from digitisation. Earhart finally demonstrates how the TEI tagging system, which includes tags for linguistic and cultural identifiers, follows contemporary cultural-critical theories in its nomenclature — ultimately serving as a symbol of future overtures between cultural-critical engagement and technological appropriation.

Finn, Ed. ‘Revenge of the Nerd: Junot Díaz and the Networks of American Literary Imagination’. Digital Humanities Quarterly 7.1 (2013): n. pag. Web. 24 November 2014.

This article employs Junot Díaz’s The Brief Wondrous Life of Oscar Wao as a case study for the growing trend of the mixed-cultural narrative. Díaz’s novel is a mash-up of mainstream American ‘pop culture’, the ‘hyperwhite’ technoscience and science-fiction-fantasy-fuelled domain of ‘nerddom’, and cultural-historical accounts of ethnicity and ethnic identity as outliers from the Anglo-American cultural locus. Finn makes use of computational algorithms and scripts written in Perl to scrape information from Amazon in order to analyse the readership patterns of Díaz’s novel. The diversity of genres, titles and styles connected to Oscar Wao underwrites the multiplicity of Díaz’s style and content. Díaz’s novel can thus be seen as an attempt to interrogate stereotypically defined identities of culture, ethnicity, gender and ‘nerdiness’ in the way it remains undefinably elusive.

Fiormonte, Domenico. ‘Towards a Cultural Critique of Digital Humanities’. Historical Social Research.  37.3 (2012): 59-76. Web. 24 November 2014.

Fiormonte reprises Alan Liu’s proposal for a cultural-critical model in DH scholarship (‘Where is Cultural Criticism in the Digital Humanities’) in his assertion that while digital humanists do tend to analyse the repercussions of methodologies involved with the adoption of technological tools, the wider sociological and global questions of technology are not being broached. The article assumes that the digital humanities are dominated by an Anglo-American elite with a ‘mono-cultural view’ in consequence of a perceived unwillingness to partake in cultural-critical debate. Technology, Fiormonte argues, is ‘subject to the influence of its environment [and] culture’ and thus he draws the conclusion that technology is part of culture, not a cause or effect of it. Organisations such as Unicode and standards like ASCII are by their very nature biased towards western viewpoints of the world. The digital humanities, Fiormonte concludes, need to leverage the possibilities for communications technologies to invite digital humanists working outside of the western world, or what is often referred to as the periphery, into discussion.

Gailey, Amanda. ‘A Case for Heavy Editing: The Example of Race and Children’s Literature of the Gilded Age’. The American Literature Scholar in the Digital Age. Eds. Amy Earhart and Andrew Jewell. University of Michigan Press, 2010. 125-144. Print.

The author adumbrates her intention to cut across the heretofore selectively narrow scope of the digital literary canon. Gailey plans to create a digital archive for Joel Chandler Harris, a children’s author and folklorist who has incurred disfavour in literary studies over the course of the twentieth century due to his controversial depiction of people of colour. Despite the controversy, there has long been a running battle between those who promote his works as offering overtures across the black/white divide in turn-of-the-century America and those who denigrate his demotic speech-patterns of the African slave as racially pejorative. Nonetheless, Gailey feels his stories, derived from the mythopoeia of black slaves, warrant exposure for their keen insights into racial depictions of the past.

Liu, Alan. ‘Where Is Cultural Criticism in the Digital Humanities?’ Debates in the Digital Humanities. Ed. Matthew K. Gold. University of Minnesota Press, 2012. 490-509. Print.

Liu argues that while digital humanists develop ‘tools, data, and metadata critically […] rarely do they extend their critique to the full register of society, economics, politics, or culture’. Liu believes that this lack of cultural criticism in the digital humanities will ultimately serve as a deficit to further development in the field, using Casanova’s The World Republic of Letters and Moretti’s Graphs, Maps, Trees as proxies for the success of cultural-critical cannibalisation for quantitative analysis. It is incumbent upon digital humanists to co-opt tools for the purpose of public advocacy and public communication — not just as a plank for themselves but for the humanities in toto, especially in a straitened time of fiscal retrenchment and lack of funding.

McPherson, Tara. ‘Why Are the Digital Humanities So White? or Thinking the Histories of Race and Computation’. Debates in the Digital Humanities. Ed. Matthew K. Gold. University of Minnesota Press, 2012. 139-160. Print.

McPherson, like Fiormonte, descries a causal relationship between the lack of discussion about race, minority and difference in the digital humanities and technology’s matrix in ‘post-World War II computational culture’; or, to put it less synecdochically, western culture. McPherson asks why technological and cultural debate so rarely overlap, and looks at how, for no perceptible reason, those interested in the cultural upheavals of the 1960s, such as the rise of the American Indian Movement or the assassination of Martin Luther King Jr., do not include technological innovations (like the development of UNIX in 1969) in their sphere of knowledge. McPherson likewise concludes that the new media theory that developed in the 1990s failed to acknowledge technology as culturally imbued rather than neutrally disinterested. McPherson examines how changes in technology, such as the development of more modular systems like UNIX in computing, can afford insights into the changing genius of the society from which they were born, a stance that harks back to Fiormonte’s argument that the ontological make-up of much technology often mirrors the societal impulses and beliefs from which it came.

Moretti, Franco. ‘Conjectures on World Literature’. NLR 1 (Jan/Feb 2000): n.pag. Web. 25 November 2014.

Moretti conceives of world literature as a marketplace in which the core and periphery are in constant exchange, with the upshot being that the periphery is consistently affected by the ascendant position of the core, while the core completely ignores the periphery. With the potential for a distant reading of world literature through the analysis of sweeping trends and patterns over extended periods of space and time, Moretti expounds a new breed of reading that can analyse world literature as a singular, albeit incommensurate, system. Moretti’s method of distant reading incorporates cultural criticism inasmuch as his exegetical approach to graphs and figures feeds off the wider social, cultural and economic context of the global versus the local, orient versus occident.

Wernimont, Jacqueline. ‘Whence Feminism? Assessing Feminist Interventions in Digital Literary Archives’. Digital Humanities Quarterly 7.1 (2013): n. pag. Web. 25 November 2014.

This essay explores how, in the wake of the world wide web’s engendering, feminist interests were piqued by the untold possibilities for the dissemination of women’s writing, largely marginalised within print culture. Inaccessibility and scarcity had long been prominent bugbears for women’s writing, and so scholarly recovery and the provision of visibility for archives were invariable concerns for those working in the field. The author notes the outcrop of epistemological concerns regarding the expectations of a digital archive seeking to relive the dream, and scale, of the Library of Alexandria. The additive approach, for example, of size over content seems to model itself more proximately on traditionally patriarchal models of power and imposition. Wernimont then skips to more recent years, mapping out the gender divide in the computer sciences and highlighting the unquestioning and dangerously passive approach of much interface design today. The growing trend of the passive consumer is dangerous inasmuch as, just as the limitations and conventions of a computer system remain unquestioned, so too do the corporate systems of our society today. The digital feminist archive might be seen as a “testbed” for the digital humanities in total, as the appropriation and even development of tools for purposes outside of the locus of power may offer insights into alternative methods, solutions and theoretical models of and about technology.

Interpretations not Truths: The Necessity of Play

                                                               and as I said I am not ready
To line phrases with the costly stuff of explanation, and shall not,
Will not do so for the moment. Except to say that the carnivorous
Way of these lines is to devour their own nature, leaving
Nothing but a bitter impression of absence, which as we know involves presence, but still.

John Ashbery ~ The Skaters

 

In Charles Bernstein’s somewhat involved essay-verse ‘Artifice of Absorption’ the poet-critic seems to invoke Benjamin’s ‘absorption’ effect of the ‘aura’ in the mind of the perceiver when he posits that the idea of ‘absorption’ we’ve all come to understand as a fundamental prerequisite when reading poetry (namely, the edifying effect of meaning & the concomitant/coterminous emotive valences semantic meaning brings in tow) is a moment in which the reader/perceiver in an abstract sense outstrips, or better, overlooks the material constructions of the text or object before them (failing to recognise the typography or bibliographical factors interplaying with the reception processes of a codex, or the jutting-out spikes and rounded contours of paint in an impasto portrait). To ignore the present truth of such objects (texts included) is to be carried away or tricked by the prestidigitation of the magician; it is to fail to witness the ‘truth’ in the intricacies of the material in front of you, or in the careful, morphological construction of the material components whose intention is to deceive.

Benjamin’s finding pedigree with aura, absorption and religious idolatry (Benjamin, 223) bolsters Bernstein’s argument that ‘absorption’ is part of a largely theological, even atavistic paradigm & is ultimately distorting of reality, as it posits that one thing is more true, or closer to the truth, than another. The aura & its absorptive reaction are entirely fictitious. The supposed import of one singular object over another is engendered in the individual per the inculcations of a wider societal imputation & assertion of what is important & what is not, & ultimately may serve as a beeline towards consolidation & control. For Bernstein, then, ‘absorption’ is an atavistic way of reading or perceiving art, and this mode of reception must be stemmed – and can be, by investing meaning in what have always been considered the ‘nonsemantic’ components of a text:

I would say
that such elements as line breaks, acoustic
patterns, syntax, etc., are meaningful rather than,
as she [Veronica Forrest-Thomson] has it, that they contribute to the meaning
of the poem. For instance, there is no fixed
threshold at which noise becomes phonically
significant; the further back this threshold is
pushed, the greater the resonance at the cutting
edge. The semantic strata of a poem should not be
understood as only those elements to which a
relatively fixed connotative or denotative meaning
can be ascribed, for this would restrict meaning to
the exclusively recuperable elements of language—a
restriction that if literally applied would make
meaning impossible.

Bernstein is writing here as one of the standard bearers of the Language ‘school’ of poetry; a study in poetics predicated to the nth degree on form. Meaning in poetry, to put it succinctly, is the form. What we traditionally come to perceive as meaning as readers is, according to this poetics, a fallacy, as this semantic valence of language is completely & entirely constituted by the graphemic & phonemic morphology of a poem’s structure. There’s no getting around or getting past this. To index a poem’s meaning as being extensible beyond this form-barrier is to force your own singular truth upon it. It is the selfsame process witnessed in religious idolatry (the aura’s subsumption of pointing to something ‘greater’): pressing your singular interpretation of the truth, or of what the object represents, upon it. In reaction to this, Bernstein argues that by focusing on the form-morphology of any given object or text, the perceiver/reader is not closing in on meaning but keeping open the possibilities of what the object might mean.

Bernstein then continues his argument by pointing out the checks & limitations of such traditional ways of reading, the contradistinction of form & content for the purpose of accommodating orthodox exegeses in which both are compared & contrasted, form feeding into content and vice versa:

The obvious problem is that the poem said in any
other way is not the poem. This may account for
why writers revealing their intentions or
references (“close readings”), just as readers
inventorying devices, often say so little: why
a sober attempt to document or describe runs so
high a risk of falling flat. In contrast, why not
a criticism intoxicated with its own metaphoricity,
or tropicality: one in which the limits of
positive criticism are made more audibly
artificial; in which the inadequacy of our
explanatory paradigms is neither ignored
nor regretted but brought into fruitful play.

In ‘Mind Your P’s and B’s: The Digital Humanities and Interpretation’, Stanley Fish rejects what he might consider the latest trends in literary reading (in particular those used by digital humanists, such as algorithmic criticism), while advocating the potential for insight yet to be gleaned by a continuation of ‘close reading’. He handily gives a demonstration of just what ‘close reading’ entails by positing how it could be argued that, in his polemical work Areopagitica, Milton employs the fortuitous circumstance of the letters ‘b’ & ‘p’ being approximate in sound (both bilabial plosives) to formally signify & ultimately underpin his message that, despite the complaining of the Presbyterian ministers about how they were censored by the Episcopalian bishops, the Presbyters were really quite the same in carrying out their own acts of censorship. Fish can then extrapolate from the phonetic conflation of the consonants ‘p’ & ‘b’, of Presbyter & bishop, from a formal device in Milton’s prose apparatus, the content-meaning of the piece. He has drawn an interpretive solution per the formal properties.

What Fish fails to recognise is that this form-content continuum, this method of ‘understanding’ literature via ‘close reading’ (which can, as Franco Moretti rightly describes it, be seen as a school of thought with a trajectory from New Criticism to Post-structuralism [Conjectures on World Literature, 57]), has been consistently undermined, argued against & fudged by poets and writers alike in the attempt to obfuscate the supposition of finding meaning, or truth, in a text. The poet John Ashbery could be considered in many respects to be of a similar stripe to Bernstein inasmuch as his poetry doesn’t pretend to offer up any meaning, & so he obviates traditional semantic overtures through a heavy interplay of experimentation with register, parataxis & more traditional devices like rhyme (examples of Ashbery poems: ‘The Skaters’, ‘Daffy Duck in Hollywood’).

I would like to see Fish defend the use of ‘close reading’ for the purpose of extrapolating any meaning from an Ashbery poem, where Bernstein’s model of the ‘nonsemantic’ actually being misinterpreted as the true semantic would fit so much better. In ‘Toward an Algorithmic Criticism’, Stephen Ramsay asserts that the problem with methods of reading grounded in philological, philosophical or ultimately academic principles is that they purport to find empirical truths in their ‘findings’, without regard for the fact that all hypotheses of textual hermeneutics are nothing more than that: hypotheses. To look at a text through a particular (and pre-chosen) lens is to reinvent a text, to re-create it. Yet in the way traditional scholarship has come about, the ladder (as Ramsay allegorises it), by which he means the experimental process of cherry-picking, is thrown out (like the baby with the bathwater) at the end, in the hope of disguising the fundamentally ‘ludic’ makeup of all research:

Throwing away the ladder has, in fact, been the consistent method of literary criticism, which, as a rhetorical practice, is indeed often concerned with finding ways to conceal these steps by making it seem as if the author went from open possibilities of signification in Lear to the hidden significance of the Fool in a single bound. The computational substrate upon which algorithmic criticism rests, however, demands that one pay attention to the hidden details of pattern formation. Algorithmic criticism might indeed be conceived as an activity that seeks to scrutinise the discarded ladder.

(171)

In scrutinising “the discarded ladder”, then, academic research will finally be able to come abreast of the philological and philosophical ideas of much writing today, which could be regarded as an enterprise in eluding the oppressive attempts at pinning down meaning. To come back to Ramsay, the question is not what does a text mean, but rather, how do we ensure that it keeps on meaning? (170). Where an object or text like an Ashbery poem is never supposed to be ‘right’ (Ramsay again, 173), why preclude an engagement with its inchoateness, its refusal of anything other than its contingency, by overlooking its interplay of form, just to meet an end? (To strengthen your academic vita maybe — to make ends meet?) New methods of computerisation in the humanities can underwrite the ludic in the post-modern poem:

Computer-enabled ‘play’ can accomplish the same type of alteration which these writers have pursued in their works. Such poetic play, beyond the poetic products themselves, serves as a tool to increase readers’ awareness of poetry as a unique blend of word, structure, and pattern. By imbuing the poetic text with a new dimension, on-screen manipulation of what has been called ‘electric poetry’ (Silverstein) evokes the reader’s participation in the poetic process. The interactive modality offered by the electronic medium destabilizes the text, allowing the reader to explore it more thoroughly than is possible in the fixed printed medium and to both appreciate and experience poetry as ‘play’.

(Irizarry, Tampering with the Text to Increase Awareness in the Poetry’s Art, 155)

With the prospects of what algorithmic criticism can do finally comes the possibility of a criticism semblable to the poetry it purports to understand. Reader engagement will change in a manner more akin to what was intended with the multimedia-based interactivity of re-imaginings of data-mined works of art. We can finally unfetter ourselves from the trammelling prescriptions of traditional readings that, as Wilkins asserts, are predicated on ‘a deep set of ingrained cultural biases that are largely invisible to us’ & ultimately make us ignorant ‘of their alternatives’ (250). Which brings us back to the aura once more, & to the artifice of the aura, & of course, the artifice of absorption, which we need to constantly hack at in order to see the wood for the trees, to fend off the cultural & historical programmes/distortions through which we’ve been told to read; for the interpretation of interpretations, not the truth.

Works Cited:

Bernstein, Charles. ‘Artifice of Absorption’. EPC Digital Library. 2014. Web.

Fish, Stanley. ‘Mind Your P’s and B’s: The Digital Humanities and Interpretation’. The New York Times. 2012. Web.

Irizarry, Estelle. ‘Tampering with the Text to Increase Awareness in the Poetry’s Art’. Literary and Linguistic Computing 11.4 (1996): 155-162.

Moretti, Franco. ‘Conjectures on World Literature’. NLR 1 (Jan/Feb 2000).

Ramsay, Stephen. ‘Toward an Algorithmic Criticism’. Literary and Linguistic Computing 18.2 (2003): 167-174.