Presentation at ESTS 2015

I recently attended the European Society for Textual Scholarship’s 2015 conference, held at De Montfort University in Leicester, UK, where I gave a presentation entitled Beyond Google Search: Editions as Dynamic Sites of Interaction. The presentation focused on some of the common UI tropes and metaphors we rely upon in Digital Scholarly Editions and examined how these elements are applied. It consisted of a discussion of interaction design, a breakdown of the common tropes and metaphors alongside a comparison of 14 different scholarly editions and which of the metaphors each utilised, and a brief case study of the Letters of 1916, a project at Maynooth University with which I have had the pleasure to be involved.

While my plan is to turn this presentation into a paper for the ESTS 2015 Variant, I have had some requests for the presentation slides from a few people interested in the content. As such, I’ve included a link to a Google Slides version of the presentation, which can be found here.

This presentation just begins to scratch the surface of my research, and I am more than happy to discuss any questions or comments you may have.  Please feel free to utilise the contact form on this blog to get in touch with me.

Happy reading!

Modeling Maynooth Castle Part 1

For my final project for AFF621 – Remaking the Physical, I was tasked with creating a 3D model of a cultural heritage object.  After considerable deliberation, I selected Maynooth Castle, which once stood in the heart of Maynooth and was a major seat of power in Ireland from its construction in the latter part of the 12th century until its siege and destruction in 1534.  I selected it for two primary reasons: my love of ruined castles and the lack of information showing what the castle likely looked like at the height of its power, prior to its destruction.

Over the course of several blog posts, I will recount the steps I have taken to produce the model and the trials and tribulations I have endured.  This is my first foray into creating 3D models, and while I’m excited to see the final output, I must admit to some trepidation regarding the scope and ambition of my project.  But I have always risen to a challenge, and this time is no different.  Hopefully my efforts will be met with success, and by documenting my process and trials as I go, not only will I learn something from the experience, but so, too, will my readers.

Researching the Castle

The castle itself was originally built in the latter part of the 12th century and was a seat of power for the Fitzgerald family.  While I will save much of the formal history of the castle for my official report, I will note that the castle stood as a whole for more than 300 years.  It fell in 1534 after a 10-day siege by English forces, following the rebellion of Thomas Fitzgerald, 10th Earl of Kildare.  While the castle was restored in 1630, it was destroyed again in the 1640s during the Irish Confederate Wars.  Since then, the castle has remained in ruins and is now run as a cultural heritage site by the State. It is open during the summer months for tours.

Unfortunately, due to the age of the castle, not much information remains regarding what it may have looked like at the height of its power, so much of my model will be based on the speculation of architects and other historians far more qualified than I am.  I will be building my model based on their work, and my report will detail those sources in depth.

Some of my information came from older documents stored in the special collections area of the Maynooth University library.  These included a map of County Kildare (which had a lovely write-up on the history of the castle) as well as some hand-drawn images of the castle ruins at the time of publication in 1783.  The library was also able to provide me with a development plan created for the Office of Public Works in the mid-1990s, which included architectural plans drawn up during the castle’s first reconstruction in 1630. This document also contained some speculation from the architects who created the development plan as to what the castle may have looked like in the 15th-16th centuries, prior to its fall.

Looking at the Models

Another source of information was the model housed within the castle.  When Maynooth Castle was converted into a cultural heritage site, a scale replica of what the castle likely looked like was created.  This scale model is housed within the keep (which serves as a type of museum for the castle itself).  Catherine O’Connor, the supervisor of the site, was gracious enough to provide me with early access to the castle keep (which is currently closed for the season) so that I could photograph the model and take measurements of each of the buildings.  This will allow me to make my 3D model as faithful to the scale model as possible.  Ms. O’Connor was also able to provide me with some additional documents detailing an archaeological excavation of Maynooth Castle conducted in June of 2000. Finally, she provided me with contact information for one of the architects who worked on the development plan.  I will be reaching out to him over the next few days in the hope that he can provide any further information that may be of use.

Next Steps

The next thing I will tackle is the creation of the model itself.  I will likely begin by trying to create the outer walls, as well as the keep.  I will be doing this from the ground up in 3DS Max.  My next post will be written after I have begun work on some of these aspects and will detail what struggles (and hopefully triumphs) have resulted from this endeavour. Stay tuned . . .

Perils of Project Planning

For my contribution to the Woodman Diary, the project we are creating for Digital Scholarly Editing, I took on the role of Project Manager.  I thought I would take a few moments to discuss something that is often talked about yet frequently overlooked in software projects: project planning.

What is Project Planning?

Project Planning, as defined by Rouse, is “a discipline for stating how to complete a project within a certain timeframe, usually with defined stages, and with designated resources.” [1] The three components—scope, schedule, and resources—mentioned by Rouse are often referred to as the “Scope Triangle” or “Triple Constraint”. The notion of the “Scope Triangle” dictates that scope, schedule, and resources form the three sides of a triangle.  In the “Scope Triangle”, one of these constraints is always fixed, and the other two, while flexible, move in proportion to each other.[2] For example, if the schedule is fixed—meaning the delivery date of the project cannot be changed—then if additional scope (new features) is added, more resources (also sometimes referred to as budget) must be added to accommodate the increase in scope. The “Scope Triangle” is used to ensure both the stability and the overall quality of the project and its deliverables.  After all, one cannot logically assume that a project, which was originally stated to take x number of months with y number of features given a budget of z, can still launch at the same time if the budget is suddenly reduced or new features are added.

Consider this analogy. You decide to build a new home, and so you hire a company to do the work.  You agree with the company that they will build a 1,000 square foot home in 6 months for €100,000. Three months into the project, you decide 1,000 square feet isn’t big enough, and you wish to add another 500 square feet to the home.  Certainly you would expect it to cost more—and quite possibly take longer—than what was originally agreed to. However, for some reason, this notion often flies out the window with regard to software projects. Thus project managers are often brought in to ensure the “Scope Triangle” is adhered to, and the project remains on track with a high level of quality.
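
For the more quantitatively minded, here is a minimal sketch of that trade-off, assuming (purely for illustration) that budget and schedule scale linearly with scope. The figures simply echo the house-building analogy above and are not drawn from any real project or formal estimation method:

```python
# Toy illustration of the "Scope Triangle": if scope grows, budget and
# schedule must grow roughly in proportion (all figures hypothetical).

def rescale_plan(scope, budget, schedule_months, new_scope):
    """Scale budget and schedule linearly with a change in scope."""
    factor = new_scope / scope
    return budget * factor, schedule_months * factor

budget, months = rescale_plan(scope=1000, budget=100_000,
                              schedule_months=6, new_scope=1500)
print(f"New budget: €{budget:,.0f}, new schedule: {months:.1f} months")
# -> New budget: €150,000, new schedule: 9.0 months
```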

Perils & Pitfalls

Most people think of project planning as creating Gantt charts and putting dates on deliverables. While that is certainly a component, it is far from the only aspect. Below, I’ve listed some of the most common mistakes made in project planning:

  1. Thinking Too Small – project managers need to think big, and I don’t mean in regard to scope.  The biggest mistake that can happen while project planning is not considering all of the possible avenues. What if we lose some of our resources due to illness or vacations? What if the server blows up, and we need to buy a new one? What if some feature we really like isn’t technically feasible? All possible avenues need to be explored during the planning phase.  There is no scenario too far-fetched.
  2. Making Assumptions – often, we make assumptions about the projects we are working on. “The computer centre will set up that server for us.”  “That feature is very easy to implement—I’ve seen it done before.” “That software is easy to customise.” The list of examples is endless. But what if the computer centre is unable to set up the server due to its own time or resource constraints? What if the software isn’t so easy to customise, or is restricted due to licensing constraints? What if that feature seen elsewhere took months to build and isn’t distributed, and thus must be recreated? All of these items can have a significant impact on a project and cause it to derail.  Therefore, it is important to identify assumptions early on and plan accordingly.  Making assumptions is not necessarily a bad thing, but failing to identify them is a major problem.  If they aren’t identified, then contingency plans cannot be created.
  3. Failing to Identify Risks – every project has risks.  Some are obvious: loss of resources due to illness, scope creep (the subtle addition of new features that, while individually small, cumulatively add considerable scope to a project), scheduling constraints, etc.  Every project, however, also has risks that are unique to the project itself.  For example, while planning for the Woodman Diary, we identified a risk regarding our software implementation.  At the time, we had yet to choose a software package for the diary, so there was a risk that the package we chose could have an impact on our schedule, as it could prove more difficult to implement than we assumed (also, for further emphasis, see the item above regarding assumptions). Identifying risks early on allows the team to research mitigation tactics.  In fact, not only should every risk be documented, but a mitigation plan should also be created for each risk, identifying how likely the risk is, what its impact on the project overall could be, and how it will be mitigated (a minimal sketch of such a risk register follows this list). By doing so, the team reduces the number of surprises that could arise during implementation.  The fewer surprises, the smoother the implementation.
  4. Forgetting the Goal – every software project has a sense of excitement about it.  The team is creating something new and many participants want to innovate or make something that has that “wow” factor.  Thus, it’s easy to get caught up in the “glitz and glamour” and forget about the goal. Whenever the team is considering adding a new feature or changing an already defined feature, the first question that should be asked is: does this change bring us closer to accomplishing the goal of the project? If the answer is “no”, then the feature should be scrapped.  It doesn’t matter how “neat” the feature might be; if it doesn’t serve the goal of the project, the feature is ultimately a distraction.  Of course, if the team answers that question with “What is the goal?”, then a much bigger problem exists.  Before project planning even begins, a goal must be clearly set out and communicated to—and agreed on by—the team.
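
As promised in the third item above, here is a minimal sketch of what a documented risk with a mitigation plan might look like. The structure, field names, and scoring are entirely my own invention for illustration; they are not a formal standard, nor the exact register we kept for the Woodman Diary:

```python
# A toy risk register: each risk records its likelihood, impact, and a
# mitigation plan, so potential surprises can be ranked before they happen.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    likelihood: int   # 1 (unlikely) to 5 (almost certain)
    impact: int       # 1 (negligible) to 5 (project-threatening)
    mitigation: str

    @property
    def exposure(self) -> int:
        """Simple likelihood x impact score used to rank risks."""
        return self.likelihood * self.impact

register = [
    Risk("Chosen software package is harder to customise than assumed",
         likelihood=3, impact=4,
         mitigation="Prototype the trickiest feature before committing"),
    Risk("Loss of team members to illness or exams",
         likelihood=2, impact=3,
         mitigation="Document tasks so work can be handed over"),
]

# Highest-exposure risks first
for risk in sorted(register, key=lambda r: r.exposure, reverse=True):
    print(f"[{risk.exposure:>2}] {risk.description} -> {risk.mitigation}")
```

Even this toy version makes the ranking explicit: the highest-exposure risk floats to the top of the list, which is exactly where the team’s attention should go.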

Conclusion

Project planning is a vital part of any endeavour, especially when creating or implementing software (and ultimately, every digital scholarly edition is, at its heart, a software project).  It should never be ignored, lest the project fall into chaos and disarray. That said, it is important to remember that it is about more than just marking down due dates next to features and holding the project team to a schedule.  Project planning is also about seeing the big picture and knowing how to respond to unexpected situations as they arise.  Project planning is much like warfare—considering all the various angles and developing strategies for dealing with the enemy. However, in the case of project planning, the enemy is often ourselves and our own failure to look ahead.

References

[1] Rouse, Margaret. “What is project planning?”. Whatis.com. March 2007. Web. 19 April 2015.
[2] Jenkins, Nick. “A Project Management Primer: Basic Principles – Scope Triangle”. ProjectSmart.co.uk. n.d. Web. 19 April 2015.

Further Reading

Haughey, Duncan. “Project Planning: A Step-by-Step Guide”. ProjectSmart.co.uk. n.d. Web.
Kerzner, Harold R. Project Management: A Systems Approach to Planning, Scheduling, and Controlling, 11th Edition. Hoboken, NJ: Wiley & Sons. 2013. Print.
Project Management Institute. “The Project Management Office: Aligning Strategy and Implementation”. PMI.org. April 2014. Web.
– – –. “The Value of Project Management”. PMI.org. 2010. Web.
Sylvie, George, Jan LeBlanc Wicks, C. Ann Hollifield, Stephen Lacy, and Ardyth Broadrick Sohn. Media Management: A Casebook Approach. New York, NY: Taylor & Francis. 2009. Print.

Temporal Visualisations in Digital Humanities

Visualisations have played a very important role in our understanding of large sets of data.  Contrary to what you might think, they aren’t a recent phenomenon; they’ve existed for hundreds of years.  After all, the very first maps were a type of data visualisation – a way to visualise area (otherwise known as spatial visualisation).  Today, however, I want to talk a bit about temporal visualisations – visual representations of time as it relates to data. In the annotated bibliography below, I provide a number of sources for further reading on the subject, for anyone wanting to gain a broader understanding.
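
Before getting to the bibliography, a quick illustration of what I mean by a temporal visualisation: at its simplest, it is nothing more than a value plotted against time. The sketch below uses invented data purely for demonstration:

```python
# The simplest temporal visualisation: a value plotted against time.
# The data here are invented purely for illustration.
import matplotlib.pyplot as plt

years = list(range(1900, 1960, 10))
letters_per_year = [12, 30, 85, 140, 95, 60]   # hypothetical counts

plt.plot(years, letters_per_year, marker="o")
plt.xlabel("Year")
plt.ylabel("Letters written")
plt.title("A minimal temporal visualisation (invented data)")
plt.show()
```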

“11 Ways to Visualize Changes Over Time – A Guide”. Flowingdata.com. 07 January 2011. Web. 24 November 2014. http://flowingdata.com/2010/01/07/11-ways-to-visualize-changes-over-time-a-guide/

This article from Flowingdata.com discusses some of the standard types of visualisations used when dealing with time-related data. The article describes each visualisation and its standard usage, and provides a linked example of an implementation of each.

While relatively short compared with some of the other articles I’ve posted here, I think this article does a great job of summing up the main types of time-based visualisations. I love the use of examples to illustrate an implementation as well as an explanation regarding when it is appropriate to use a certain type of visualisation.

Aris, Aleks et al. “Representing Unevenly-Spaced Time Series Data for Visualization and Interactive Exploration”. Human-Computer Interaction – INTERACT 2005. Springer Berlin Heidelberg, September 2005: 835-846. Web. 25 November 2014. http://hcil2.cs.umd.edu/trs/2005-01/2005-01.pdf

Aris et al discuss time series data and its use in visualisation. Specifically, they focus on unevenly spaced time data and propose 4 different visualisation techniques: sampled events, aggregated sampled events, event index and interleaved event index. Each is discussed in depth, and an example is provided showing its implementation.

The methods here are presented in a more coherent manner than in some of the other entries I’ve listed. The visualisations offered as examples are easy to follow and interpret, if somewhat lacking in imagination. Shiroi, Misue, and Tanaka (see entry) based much of their work on the work presented here, and I can see the relationship between the two (which is why I called this out as an additional resource). The paper provides a really solid understanding of time series data, though it leaves room for growth in terms of creativity in the actual implementation of a visualisation method.

Buono, Paolo et al. “Interactive Pattern Search in Time Series”. Proc. SPIE 5669, Visualization and Data Analysis 2005. 17 March 2005. Web. 26 November 2014. http://hcil2.cs.umd.edu/trs/2004-25/2004-25.pdf

This white paper opens by discussing some of the methods used for visualising time-related data, specifically data in a time series. In addition, the paper discusses query techniques that can be used for searching time series data. The paper then examines TimeSearcher2, the next iteration of the TimeSearcher software, which allows a user to load a large dataset and then visualise it using a number of built-in analysis tools. The paper mainly focuses on a few of the features new to TimeSearcher2, such as a view that allows the user to look at multiple variables when visualising the data, improvements to the interactivity of the search, and improvements to the search algorithms. The paper closes with a discussion of shortfalls within the software and improvements that could be made in future versions.

The visualisations used in the software are somewhat primitive, but given that the paper is nearly a decade old, this is not wholly surprising. Buono et al are quite candid in their evaluation, specifically in the conclusion where they discuss the shortfalls of the tool. In addition, they are also quite open with the methods used, particularly in their discussion regarding improvements to the search algorithm. The paper serves as an interesting insight into the history of time-based visualisations over the last decade.

Capozzi, Mae. A Post Colonial ‘Distant Reading’ of Eighteenth and Nineteenth Century Anglophone Literature. PostColonial Digital Humanities. 08 April 2014. Web. 27 November 2014. http://dhpoco.org/blog/2014/04/08/a-postcolonial-distant-reading-of-eighteenth-and-nineteenth-century-anglophone-literature

Capozzi looks at 19th-century British literature that deals specifically with India as its primary subject. In her presentation, she attempts to provide data to support her hypothesis that not only did Britain have an impact on Indian culture but, more importantly, India had an impact on British culture (via literature) as a result of British colonialism. She looks at a random sampling of literature and uses topic modeling, via a programme known as “Mallet”, to plot various topics over time. Using line graphs (simple time visualisations), Capozzi presents this data as support for her hypothesis.

While Capozzi’s presentation is not a temporal visualisation in the sense that I am using the term throughout this post, I include it here as a cautionary tale of what not to do with a visualisation. Capozzi presents some very simple line graphs which seem to support her hypothesis. However, upon closer inspection, it is clear she relies on correlations between upticks in topic clusters at certain times and events in Indian history (such as the rise of the Raj or political unrest during World War I). She provides no empirical data to prove the correlation, instead merely asserting a cause-and-effect relationship. Furthermore, Capozzi offers no methodology behind her topic model (and, by extension, her visualisations). Without a thorough understanding of how her data was derived, we cannot form informed opinions about the data she is attempting to visualise. When working with visualisations, it is imperative not only to use a visualisation that will be intuitive to your audience (as I note in later entries) but also to remain transparent about the methodologies used to derive the data.

Day, Shawn. “Visualising Space and Time: Seeing the World in New and Dynamic Dimensions”. University College Cork. 11 February 2013. Web. 25 November 2014. http://www.slideshare.net/shawnday/dah-s-institute-uccrs

Day presents a number of interesting ideas around spatial and temporal visualisations. In his presentation, he discusses how we typically use time and space data, as well as common methods of plotting this data in graphical form. He then examines time and space data, separately and together, in greater depth, and also highlights some great tools for visualising this type of data.

What I love most about this presentation are slides 14, 15, and 16. Here, Day discusses how we are used to seeing time plotted in a linear format. But he delves deeper by plotting out actual time data (using the movie Back to the Future as an example) to illustrate other, non-linear visualisations of time.

De Keyser, V. “Temporal Decision Making in Complex Environments.” Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, 327.1241 (1990): 569-576. Web. 20 November 2014. http://www.jstor.org/stable/55328.

De Keyser’s essay delves into the importance of time and the role it plays when making decisions. De Keyser begins by discussing how technology has changed our perception of time in regards to planning, due in large part to the increased availability of data and the ability to control the minutiae of outputs. He then discusses strategies behind temporal decision making, such as anticipation, assessment, and adjustment. He concludes with a discussion of errors that can (and most likely will) evolve from temporal decision making and their effects on a given process.

The article deals largely with the effects of time on technology projects and how the private sector constantly evaluates the success of such projects based on metrics involving time. These metrics often stem from datasets and statistics compiled into visualisations in order to express success as a function of time and resources. While more esoteric than many of the other entries listed here, this article does provide a theoretical understanding of time and its role in decision making — a factor that plays heavily into the importance of temporal visualisations.

Friendly, Michael. “A Brief History of Data Visualisation”. Handbook of Computational Statistics: Data Visualization. 2006. Web. 17 November 2014. http://www.datavis.ca/papers/hbook.pdf.

Friendly discusses the history of data visualisation, noting that the first temporal visualisation appears in a 10th-century graph of star and planetary movements over a period of time (p. 3).  The article continues to trace the history of visualisations and their developments throughout the 17th, 18th, and 19th centuries, noting the dramatic shift in approach during the latter part of the 19th century as a result of the increased usage of statistics among government agencies as well as innovations in the field of graphics and design. Following this, Friendly discusses visualisations in the 20th century, noting the dramatic changes between the earlier and latter parts of the century thanks to innovations such as interactive visualisations, dynamic data, and high-dimensional visualisations. Friendly concludes with a look at “history as data” (p. 26) and his evaluation of the “Milestones Project” — a project on which he based much of his review (p. 25).

Overall, Friendly provides an interesting and thorough analysis of the history of data visualisations. His essay provides the reader with the background necessary to understand the context behind visualisations and how the methods have evolved over the course of the last few centuries. This is an excellent starting point for anyone wanting to dive deeper into the theoretical realm of the subject matter.

Mayr, Albert. “Temporal Notations: Four Lessons in the Visualization of Time”. Leonardo, 23.2/3, 1990: 281-286. Web. 24 November 2014. http://www.jstor.org/stable/1578624

While visual representations of space tend to enjoy a fairly uniform acceptance of standard visualisations even across disciplines, Mayr argues that time-based visualisations are much less standardised and tend to be specific to the individual discipline in which they are used. To address this, Mayr discusses several exercises performed with students in an effort to visualise time-based data according to guidelines he lays out in the article.

While the article itself is quite interesting, I don’t think Mayr actually manages to create any kind of coherence around the visualisation and notation of time, nor do I agree that consistent visualisation practices do not exist. He opens by discussing how notations vary from discipline to discipline but then proceeds to focus on techniques that rely rather heavily on the field of music to inform his guidelines (Mayr mentions in the article that he has a background in music). However, the exercises he sets for the classroom, and the corresponding results, lead to some takes on time visualisations that I think most will find very interesting.

Moore, David M. and Francis M. Dwyer. Visual Literacy: A Spectrum of Visual Learning. Educational Technology Publications. 1994. Web. 20 November 2014. http://books.google.ie/books?id=icMsdAGHQpEC.

While Friendly’s article is an excellent take on the history of the field, Moore and Dwyer discuss the importance of visualisations and their relation to learning and cognitive development. While their entire book contains a plethora of interesting and important information, of particular note are sections 5 and 6, which discuss the role of visualisations in schools and business, as well as the cultural and socio-political impact of the field of semiotics and its intersection with technology. Semiotics, the study of signs and symbols, plays a major role in the understanding of visualisations and the data they convey.

Moore and Dwyer’s work is an excellent companion to Friendly’s article in providing a strong basis for understanding the overall realm of data visualisations. Both are a necessary first step towards a deeper understanding of the theory behind visualisations and the reasons they are both important and widely used.

Shiroi, Satoko, Kazuo Misue, and Jiro Tanaka. “ChronoView: Visualization Technique for Many Temporal Data”. 2012 16th International Conference on Information Visualisation. IEEE, July 2012: 112-117. Web. 25 November 2014. http://ieeexplore.ieee.org/xpls/icp.jsp?arnumber=6295801&tag=1

Shiroi et al. present a visualisation technique they have developed known as “ChronoView”. They begin by discussing one of the problems of temporal visualisations: each time interval is treated as discrete, which makes it difficult to cluster a single event around multiple time entries. To combat this problem, they developed a circular view of the data.

While I’m not entirely sold on the visualisation used here, it is an interesting approach to visualising time-related data. The paper itself is well thought out, and the methods used for plotting the data are clearly and concisely disclosed — something I feel is incredibly important in the field of visualisation work. This, however, is the type of visualisation that I feel doesn’t lend itself well to the average reader. I would posit that to understand the data presented in this type of format, one would need not only a solid understanding of the particular field or data being discussed but also a strong background in statistics or visualisation theory. However, as a whole, I think it’s a solid take on time visualisations.
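
As an aside, the basic idea of laying time out on a circle is easy to demonstrate. The sketch below simply maps some invented event times onto a 24-hour clock face; it is far cruder than the technique Shiroi et al. actually describe, but it conveys the general notion of a circular view of time:

```python
# A crude circular view of time: hypothetical event times mapped onto a
# 24-hour clock face. This only gestures at the idea behind ChronoView;
# the actual technique described by Shiroi et al. is more sophisticated.
import math
import matplotlib.pyplot as plt

event_hours = [0.5, 6, 6.25, 9, 13, 13.5, 13.75, 21]   # invented timestamps

angles = [2 * math.pi * h / 24 for h in event_hours]    # hour -> angle
radii = [1] * len(event_hours)                          # fixed radius

ax = plt.subplot(projection="polar")
ax.scatter(angles, radii)
ax.set_theta_zero_location("N")   # midnight at the top
ax.set_theta_direction(-1)        # clockwise, like a clock
ax.set_xticks([2 * math.pi * h / 24 for h in range(0, 24, 6)])
ax.set_xticklabels(["00:00", "06:00", "12:00", "18:00"])
ax.set_yticks([])
plt.show()
```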

Turker, Uraz C. and Selim Balcisoy. “A visualisation technique for large temporal social network datasets in Hyperbolic space”. Journal of Visual Languages & Computing 25.3 (2014): p. 227-242. Web. 27 November 2014. http://dx.doi.org.jproxy.nuim.ie/10.1016/j.jvlc.2013.10.008

Turker and Balcisoy discuss the visualisation of temporal data drawn from large social network datasets. As a result of their research, they have created a new visualisation technique they have dubbed the “Hyperbolic Temporal Layout Method” (HTLM). HTLM uses geometry and spatial placement to visualise actors and relationships in a spiral layout. The paper describes how HTLM was developed, the algorithms used, and gives examples of the actual visualisation.

Turker and Balcisoy have done an excellent job of researching and proposing a new visualisation technique. They have taken great care to remain transparent in their approach, fully disclosing the algorithms used and discussing the background that led them to the creation of HTLM. That said, I feel the visualisation itself falls somewhat flat. While an interesting take on a temporal visualisation, I feel that without a significant understanding of the data and the field, most users would be unable to parse the data being presented — the visualisation remains almost unreadable to the casual observer. Perhaps Turker & Balcisoy are positioning HTLM towards a specific audience, but there is no indication within the paper itself that this is the case. Thus, while the visualisation offers a new and creative technique for visualising data, its difficult readability makes it a less than ideal visualisation.

Aura in the Digital Realm

In a recent class entitled Transformations In Digital Humanities, we discussed the notion of aura as it relates to an object, and how the individual’s perception of the aura can affect the object’s value.  We then discussed the notion of digital auras and, more specifically, whether digital objects have an aura. We also discussed whether that aura is lost once an object moves from the analogue to the digital (or, if the object is born digital, whether it has an aura to begin with). This got me thinking about the notion of auras in general and what impact, if any, the digital realm has on an object’s aura.

Defining Aura

First, let’s start by defining what exactly an aura is as it relates to an object (in this case, we are dealing specifically with objects in the Arts & Humanities realm). The Free Dictionary defines aura as “a distinctive but intangible quality that seems to surround a person or thing; atmosphere”. In his book, Presence of Play, Power takes this definition a bit further by describing aura (or “auratic presence”) as the presence an object has beyond what its physical appearance might suggest (p. 47). The easiest way to describe aura, in my mind, is that feeling one gets from looking at a favourite painting.  What feelings are evoked by Da Vinci’s Mona Lisa or Van Gogh’s Starry Night? If you’ve seen the original, how were those feelings changed? Was there an air of magnitude about the object? Did it change your sense of the object? If you answered yes, then you understand aura.  It’s that presence certain objects have that draws you to them. But where does it come from?

Why Aura?

As Walter Benjamin states in his article, aura stems directly from the originality of the piece: “its presence in time and space, its unique existence at the place where it happens to be” (Benjamin). I feel, however, that Benjamin doesn’t quite get to the core of why we derive aura from presence, which is, quite simply, the collective human assumption that “original = better”. Let us consider a work of art. It has become commonplace to understand that the original Mona Lisa is near-priceless, while a reproduction of the Mona Lisa has vastly diminished value. But why? Is it because we associate the original with the great Leonardo da Vinci and, since he no longer lives, his original creations are thus worth more than reproductions? This only leads to further questions in my mind. If one can take a piece of art and reproduce it exactly, brush stroke for brush stroke, so that it resembles the original down to the finest detail, why is the original any more valuable than the reproduction? What is it about “originality” that causes us as a society to assign so much value to it?

Money & Fame

I will probably ruffle quite a few feathers with this statement, but I believe the answer to the above question, like so many things, boils down to a single commonality: money. By assigning mystique (or in this case “auratic presence”) to an item, you place value on the item. It is different. It is unique. Therefore, it must be valuable. And the higher the value, the greater the monetary gain should the item be sold. Even if money isn’t the currency in question, the value will be dispensed in that other great currency: fame. The artist or object in question gains notoriety and thus, in the case of the artist, the value of future works increases. In the case of the object itself, as notoriety grows, value becomes a function of time: the more time that passes, the greater the value, until it eventually plateaus.

But does the digitisation or mass reproduction of an object actually detract from its value? Furzsi certainly doesn’t seem to think so, stating, “Instead of destroying the cult status of artworks then, such printed fabrics reinforced the aura of the artist genius and played an important role in familiarising a wide audience with the modernist canon”. One could extrapolate from Furzsi’s argument, then, that mass reproduction of these prints raised the auratic presence, and thus the value, of the object by making it more readily available. And therein lies the crux of my argument.

Availing the ‘Common Folk’

It is my belief that digitisation of an object does not diminish the analogue object’s auratic presence, nor does the digital object lack aura. In fact, quite the opposite. As the internet continues to grow and connect the world, those who might not otherwise have access to analogue Arts & Humanities objects, due to a lack of means for travel or opportunity, can experience these objects in a digital format. The digital object can still evoke the same sense of wonder or mystery. After all, I’ve never seen the original of Dali’s The Persistence of Memory, yet it remains one of my favourite paintings and evokes a strong feeling of connection to the subject matter. And by exposing a wider audience to the object or artist via the digital medium, the auratic presence of the analogue increases as well, due to the increase in notoriety. It all ties back together.

Dr. Power also contends that aura emanates not just from the object itself but also from the unexpected encounter with the object and the emotions that encounter evokes (p. 48). Here again, the digitisation of the object supports my supposition: by increasing the dissemination of the object to a wider audience, the aura of the object is increased, as more individuals experience it through unexpected digital encounters (via internet searches, online galleries, etc.).

Unintended Consequence

The digitisation of objects has also had some unintended, yet beneficial, consequences. In his journal article, Rodríguez-Ferrándiz discusses the case of Edvard Munch’s The Scream. The painting was damaged during its theft and recovery, and the curators were able to use the vast collection of reproductions of the work to assist in their restoration of the object. Thus, the auratic presence of the object, which could have suffered irreparable harm from the damage to the painting, was preserved directly as a result of the mass reproduction and digitisation of the object (p. 399).  Furthermore, had the object not been digitised and mass produced, the curators might never have been able to restore the painting, and the aura, along with the value, of the object could have been greatly reduced.

Conclusion

While I won’t go so far as to say that a digital representation can match the aura of the analogue object (after all, seeing the Book of Kells in physical form does evoke something that the digital version doesn’t quite capture), I will emphatically state that digitisation in no way shreds an object’s auratic presence; rather, it adds to the aura of the analogue object. In addition, the digital object itself retains a sense of aura through the “unexpected encounter” with the object (thus allowing objects that are born digital to possess aura as well). Aura is an entirely intrinsic and subjective characteristic, unique to the individual experience, yet it is also built upon by the collective unconscious. As we move ever further into the digital age, aura will continue to be built upon through both the analogue and the digital experience.

Bibliography