(From the discussion list Humanist. This will give you some idea of the projects underway in the Digital Humanities in Europe.)
This is to announce the forthcoming events of the
London Seminar in Digital Text and Scholarship
for 2006-7, a description of which follows. All events
take place at 5.30 pm in Senate House, Malet Street,
unless otherwise noted.

[2 November]
Dr Peter Garrard (Royal Free and University College Medical School,
London), “Textual Pathology”. Room NG15.

As we humans age, physical and functional changes are detectable in all
organs of the body, yet it is the physical structure and performance
characteristics of the brain that excite more interest than any other.
The reasons for this cognitive bias are diverse, but a major factor is
undoubtedly the devastating and widespread phenomenon of senile
dementia. Alzheimer’s disease is now recognised as a major (though by no
means the only) cause of dementia, and the changes that take place
within the brain are easily recognised when the brain is examined at
post mortem. By destroying the dense network of neuronal connectivity
with which the brain achieves the highest levels of intellectual
activity, Alzheimer’s pathology disrupts the operation of a profoundly
complex system. Moreover, because of the predilection of this pathology
for some lobes of the brain rather than others, characteristic patterns
of abnormal performance are observed in the early stages of the disease.
These include a typical pattern of linguistic difficulty characterised
by a shrinking vocabulary in the presence of apparently normal sentence
structure. Using well-established techniques of digital textual
analysis, Garrard and colleagues were able to demonstrate similar
changes in the late work of Iris Murdoch, who began to exhibit signs of
cognitive failure soon after publication of her final novel, Jackson’s
Dilemma (1995).

Arising from the findings of this seminal work is a series of further
questions concerning the relationship between the complex structure of a
text and that of the brain in which it originated. Specifically, is
ageing reflected in progressive changes to a text’s higher-order
structure, which – just like physical ageing – may follow either a
normal or a pathological trajectory? Similarly, might the
presymptomatic phases of
different cerebral pathologies give rise to distinct patterns of textual
change in the same way that Alzheimer’s disease, Pick’s disease, and
vascular dementia are recognisable to the experienced clinician?

Results of an approach to plotting such a trajectory through the final
two decades of Murdoch’s life will be presented, as will similar
analyses using serially sampled bodies of spoken rather than written
language output (a modality that is arguably more sensitive than the
written word).
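
Though the abstract does not spell out the measures involved, the
shrinking vocabulary it describes is the kind of change often tracked
with simple lexical-diversity statistics such as a windowed type-token
ratio. The sketch below is only a generic illustration of that idea,
not Garrard’s published method; the file names and the window size are
assumptions.

    # Minimal sketch: lexical diversity of a series of texts.
    # A windowed type-token ratio (distinct words per 1,000-word window)
    # makes texts of different lengths roughly comparable.
    import re
    from statistics import mean

    def lexical_diversity(text: str, window: int = 1000) -> float:
        """Mean type-token ratio over consecutive windows of `window` words."""
        words = re.findall(r"[a-z']+", text.lower())
        ratios = [
            len(set(words[i:i + window])) / window
            for i in range(0, len(words) - window + 1, window)
        ]
        return mean(ratios) if ratios else len(set(words)) / max(len(words), 1)

    # Hypothetical corpus: one plain-text file per novel, in publication order.
    for title in ["under_the_net.txt", "the_sea_the_sea.txt", "jacksons_dilemma.txt"]:
        with open(title, encoding="utf-8") as f:
            print(title, round(lexical_diversity(f.read()), 3))

A falling score across successive works would be consistent with the
narrowing vocabulary described above, though on its own it says nothing
about the cause.
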
[6 December]
Professor Ian Lancashire (English, Toronto),
“Cybertextuality by the numbers”. Room NG15.

Cybertextuality theorizes that idiosyncratic verbal patterns in
documents such as poems — the repetition and the variation of segments
— are constrained by working-memory capacity (and, it may be, by an
equivalent long-term-memory capacity); and that the unselfconsciousness that
characterizes the cognitive uttering of such segments gives rise to a
cybernetic phenomenon, an author’s conscious mental feedback to hearing
his own uttered segments. Self-reflection, anecdotes, and certain
observed quantities give some support to this theory. The last include
George Miller’s “magical number” for working-memory capacity (7 ± 2
chunks; since revised by Nelson Cowan to 4 ± 2) and the still uncertain
capacity of the working-memory chunk (perhaps also 3 – 4). Literary text
analysis has neglected John B. Lord’s observation, in 1979, that
Miller’s “magical number” ideally constrains line-length in verse. If
the Miller-Cowan information limit is valid universally in humans, its
numbers have unrealized explanatory power for our literary works and
offer a literary and linguistic measurement that can be detected by what
John Sinclair calls “concordance and collocation.”
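
Sinclair’s “concordance and collocation” can be illustrated in a few
lines: a keyword-in-context (KWIC) listing shows each occurrence of a
word with its neighbours, and counting those neighbours yields a crude
collocation profile. The sketch below is a generic illustration of
those two operations, not Lancashire’s own tooling; the input file and
the four-word context window (echoing the Cowan estimate) are
assumptions.

    # Minimal sketch: KWIC concordance lines plus a crude collocate count.
    import re
    from collections import Counter

    def kwic(words, keyword, span=4):
        """Yield (left context, keyword, right context) for each occurrence."""
        for i, w in enumerate(words):
            if w == keyword:
                yield words[max(0, i - span):i], w, words[i + 1:i + 1 + span]

    text = open("poem.txt", encoding="utf-8").read()   # hypothetical input file
    words = re.findall(r"[a-z']+", text.lower())

    collocates = Counter()
    for left, hit, right in kwic(words, "time"):
        print(" ".join(left).rjust(40), hit.upper(), " ".join(right))
        collocates.update(left + right)

    print(collocates.most_common(10))   # the keyword's most frequent neighbours
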
[10 January]
Dr John Lavagnino (Centre for Computing in the
Humanities, King’s College London), “Metaphors of
digital and analogue”. Room NG14.

In the early twenty-first century, the view is commonplace that most of
the things around us in everyday life are analogue but the class of
digital things is rapidly growing. In fact, most things are neither:
“digital” and “analogue” are not names for two categories that between
them encompass the world, but refer to two types of system that we
deliberately engineer. Most things aren’t systematized rigorously enough
to be either digital or analogue. But looser or metaphorical uses of the
terms do have their uses: so Nelson Goodman and others have talked about
the digital nature of writing systems, though such systems as actually
used are not limited to digital coding, as writers and readers commonly
extend them by expressive use of handwriting or other visual features.
It is only in this looser sense that the everyday world can be seen as
analogue, and that the categories of the digital and the artificial can
be conflated — a point of view particularly prevalent in recent films.

In today’s world, “digital” is the privileged term: it’s the one that
has an independent definition (since on this view analogue just reduces
to “not digital”), and even if we argue against its dominance and
assert our difference, we assume things are heading that way. This
extends to very learned discourse: in cognitive science, for example,
the view that the brain is actually digital, deep down, is widespread.
In the mid-twentieth century, when the digital-and-analogue pair first
became established in discussion of computing, it was by no means clear
that digital computing would become the dominant form. It was a more
difficult and expensive way to achieve comparable results at first; it
became dominant only because of eventual economies of scale that
analogue computing did not offer. The thinking about digital and
analogue in the world of cybernetics up through the 1970s was influenced
by a sense of the remarkable things done with information by the living
body, not by computers, and it assumed the view of John von Neumann and
others that the brain worked with a combination of analogue and digital
representations. Each had its own distinctive logic, and neither could
be fully replaced by the other. Just as with today’s casual talk of
digital and analogue, these discussions went beyond the engineering
sense of the terms to metaphorical extensions that did not preserve the
features that are actually required for systems to work. But in making
metaphorical extensions of the terms, cybernetics was more sophisticated
than we are today.

[8 February]
Professor Peter Shillingsburg (Centre for Textual
Studies, De Montfort), “The Work Implied, the
Work Represented, and the Work Interpreted”. Room ST275, Stewart House.
(Stewart House adjoins Senate House at the rear; see
http://www.london.ac.uk/stewart.html.)

This three-part paper begins by (1) describing what might be the nature
of literary works and texts (the “thing” that might be transported into the
electronic medium from a material one), (2) examining what is entailed in
representing or re-representing a work in ways that might be more or
less misleading (preferably less), and (3) embracing the subjectivity of
editing and exposing the chimera of objectivity in scholarly editing
regardless of medium. The purpose is to emphasize the complexity of the
task and suggest a collaborative way to address the idea of electronic
scholarly editing.

[22 March]
Dr Mary Hammond (Literature and Book History,
Open University), “The Reading Experience
Database 1800-1945: New Directions”. Room NG15.

The Reading Experience Database (RED) is ten years old. Currently
holding around 6,000 records of the reading experiences and practices of
British subjects – including perhaps the largest single collection of
experiences from the ‘long’ eighteenth century – it has recently been
awarded a major AHRC grant which will speed up its growth and enable it
to be placed live on the web for the first time. This paper explores the
ways in which electronically-available data on reading drawn from a wide
range of sources might augment studies of literature and the material
book.

[12 April]
Drs Barbara Bordalejo and Peter Robinson
(Institute for Textual Scholarship and Electronic
Editing, Birmingham), “Electronic editions for everyone”. Room NG14.

Ten years of experience in the actual making and publication of
electronic editions (ranging from the Bayeux Tapestry Digital Edition,
through the Canterbury Tales Project publications, the Parliament Rolls
of Medieval England, to Dante’s Monarchia) has given us some sense of
what electronic editions can do and how they can be made. But so far, we
have no clear answers to two crucial questions: who can use these
editions and how can they use them? Traditionally, critical editions
have had a very narrow readership (usually, advanced scholars) and have
been used in rather limited ways (typically, by scholars themselves
making editions). The promise of electronic editions is that they might
reach a far wider range of readers, who might use them for a far wider
range of purposes, in many more contexts. As yet, this promise has not
been achieved. We may ask: is this because of the limitations of the
technology to this point, or the time it takes to break down scholarly
conservatism? Or are we simply mistaken in the belief that scholarly
editions in electronic form might actually reach far wider audiences
than their print predecessors? Or do we need a quite different model of
what a scholarly edition in digital form might be? In the course of this
talk, we will draw upon the example of two digital editions which aim
(in different ways, from different starting points) to be editions “for
everyone”: the Codex Sinaiticus project, and the publications of the
Canterbury Tales Project.

-----
The London Seminar in Digital Text & Scholarship focuses on the ways in
which the digital medium remakes the relationship of readers, writers,
scholars, technical practitioners and designers to the manuscript and
printed book. Its discussions are intended to inform public debate and
policy as well as to stimulate research and provide a broad forum in
which to present its results. Although the forum is primarily for those
working in textual and literary studies, history of the book, humanities
computing and related fields, its mandate is to address and involve an
audience of non-specialists. Wherever possible the issues it raises are
meant to engage all those who are interested in a digital future for the
book.

The primary form of discussion is a yearly series of seminars by leading
scholars and practitioners involved in the making of digital editions
and scholarly textual resources, in reflecting on these productions and
in examining the historical and material culture of written language as
these inform practice. Running through and uniting the seminars is the
single question, “What is to be done?” They are in that sense all meant
to be practical investigations from which guiding theory will emerge,
feed back into a revised practice and so help us to progress.

The Seminar is deeply rooted in the history of textual production and
its scholarship but is preoccupied with the future. It takes as its
starting point Alan Turing’s principle of computing as a scheme for
constructing indefinitely many machines — from which we derive the
practice of constructing indefinitely many varieties of the digital
book. Its question is not how to arrive at the best successors to this
or that existing form or the best configuration of libraries to house
and manage the products, rather how continuously to remake the digital
book and its environment so that they serve “the living condition of the
human mind” (Peirce). The Seminar explores through practical experiment
the changing ways in which this continuous remaking is to be done and
both the challenges it poses and the opportunities it offers to our
institutions.

The Seminar is sponsored by the Centre for Computing in the Humanities,
King’s College London, and the Institute of English Studies, University
of London. Its convenor is Dr Willard McCarty (KCL).