Comments on: Who am I computing?
http://nationalhumanitiescenter.org/on-the-human/2009/05/who-am-i-computing/
A project of the National Humanities Center

By: Gary Comstock (Tue, 26 May 2009, 14:33)

Well done indeed, and thanks to everyone who participated by writing in or reading along. The conversation continues in our Facebook group, http://www.facebook.com/group.php?gid=52472677549

By: mccarty (Sat, 23 May 2009, 10:29)

To summarize, that is the problem. Reading through the text that has accumulated around my original posting these last two weeks produced the unsurprising impression of variations on the theme. But with Peter Batke’s commentary on our social conformities in mind, I applied, in a very rough and ready fashion, the standard approach to the analysis of text that I teach my students: gather it all together into an unstructured file, generate a list of word-frequencies, scan that list, and group the words morphologically and, as seems to befit the text, conceptually. I grouped the words around those terms that ranked high among those occurring 10 times or more.
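In rough Python, that classroom recipe looks something like the minimal sketch below. The file name and the hand-chosen groups are hypothetical stand-ins; the real morphological and conceptual grouping was done by eye.

```python
import re
from collections import Counter

# Pour the accumulated comments into one unstructured file and tokenize crudely.
with open("comments.txt", encoding="utf-8") as f:
    tokens = re.findall(r"[a-z]+", f.read().lower())

freq = Counter(tokens)

# Keep the words occurring 10 times or more, ranked by frequency.
frequent = [(word, n) for word, n in freq.most_common() if n >= 10]
print(frequent[:15])

# Grouping is done by hand; a dictionary of chosen groups (hypothetical
# members shown) lets the machine tally each group's total.
groups = {
    "human":    {"human", "humans", "humanist", "humanists", "humanities", "life"},
    "computer": {"computer", "computers", "computing", "machine", "machines", "tool", "tools"},
}
for label, members in groups.items():
    print(label, sum(freq[w] for w in members))
```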

As from reading, the result here is also unsurprising – but suggestive. One thing it suggests is that Peter was right in detecting social conformity.

Words morphologically related to “human” (“human”, “humanist”, etc., including “life”) occurred 209 times. Words related morphologically and conceptually to “computer” (including “machine”, “tool”) occurred 250 times. “Human” was the highest-ranked open-class (content) word; “think” and “computing” tied for second place. From this one could draw the conclusion that we talked about the human and the computer. Unsurprising, indeed – but confirmation that we stayed on topic. (Confirmation is very important in text-analysis; if you cannot confirm that what you already know is true, then your method is suspect!)

Words relating to what we do (actions and products of action), any one of which occurred 10 times or more, I lemmatized. In order of frequency, most to least, they are as follows: say, think, question, work, know, ask, read, make, see, understand, history, research, idea, problem, point. The objects of attention, in frequency order from most to least, were: text, language, literary, and words. Again, utterly unsurprising, but perhaps useful as a display analogous to those emblem books of the trades, such as Jost Amman and Hans Sachs’s Eygentliche Beschreibung Aller Stände auff Erden [Exact Description of All Ranks on Earth], popularly known as the Ständebuch (1568). Such a display provides inter alia a starting point for explanations of what exactly it is that we as humanists do. The ability to provide such an explanation on the spot can be quite handy. It has become, I’d argue, an urgent necessity.

Decades of work in text-analysis have shown, however, that the most significant words for prying into the less obvious qualities of a text are the very ones we as readers consciously ignore, those which (I am told) linguists call the “closed-class” words, or more accurately the class of words that a language tends very seldom to coin. These are the articles, particles, pronouns and so forth. In my analysis I went particularly for the pronouns, as they are a good indicator of perspective and audience – and they yield their secrets at least in part by methods short of the stylometric tests that require much specialized training properly to apply and considerable experience to explain.

Of the pronouns, the most frequent, and very high on the frequency list (9th ranked, occurring 233 times, or 1.234% of the corpus), was “I”, followed immediately by “we” (218 times), then next but one, “it” (181). All together, lemmatized, the first-person pronouns occurred 597 times, the second-person 41, the third-person 337 (his, 37; her, 35; it, 212; they, 63). One would be justified in concluding that here is a text in which people write mostly about themselves, what they do and so forth, and address others in the group.
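For what it is worth, the pronoun tally above can be reproduced with the same machinery. A sketch only: the person buckets below are my stand-in for proper lemmatization, not McCarty's actual word lists.

```python
from collections import Counter

# Bucket pronouns by grammatical person (a stand-in for real lemmatization).
PERSON = {
    "first":  {"i", "me", "my", "mine", "we", "us", "our", "ours"},
    "second": {"you", "your", "yours"},
    "third":  {"he", "him", "his", "she", "her", "hers",
               "it", "its", "they", "them", "their", "theirs"},
}

def pronoun_counts(tokens):
    """Count first-, second-, and third-person pronouns in a token list."""
    freq = Counter(tokens)
    return {p: sum(freq[w] for w in words) for p, words in PERSON.items()}

# e.g. pronoun_counts(["i", "we", "it"]) -> {"first": 2, "second": 0, "third": 1}
```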

Going well beyond the numbers, I also carry away from many discussions concerning the humanities – including the one in which I’m about to participate, at Reed College, “Are the Humanities Still Relevant?” (www.reed.edu/alumni/reunions/alumni_college.html) – a strong sense of a beleaguered minority hunkered down against a coming invasion. Evidence for a society against us, or indifferent to us, is not hard to find. Reasons why the collective behaviour of the humanities should merit such a reaction are likewise not hard to find. But the very real threat from within our institutions by those sympathetic to or in fear of extramural antipathies needs to be met by a response that at base is rooted in the ideals of the humanities – as long as, that is, our institutions remain educational. Questioning the human is what we do, what we have always done. To do that we need to stand apart (no wonder the appearance of solipsism) while regarding critically what is going on in the world (so that solipsism remains only a countable appearance).

The great ethno-historian Greg Dening used to say that one should be able to walk away from a book one has read with a single sentence or two in one’s head. These two weeks have not produced a whole book (though more than one is latent here). But for this my sentence is, “Now that we have made this device, who are we?”

My thanks to all who participated. Well done!

By: mccarty (Fri, 22 May 2009, 15:51)

I have a very distant relationship to baseball, so I’m at a bit of a loss to know what game now to play, or how to respond to all the balls flying about. Secret handshakes? I wasn’t aware that any are being given, but perhaps I’m too well trained even to know that I am signalling membership of a self-obsessed group. Solipsism? Perhaps I’m hallucinating so badly, or so well, that I’ve only imagined real exchanges among real people who, like me, are strongly motivated to communicate, even willing to take the sort of risks we can take to get a conversation going and to keep it going. Among the many things I learned from Northrop Frye was that in Paradise, at least as Milton imagined it, conversing is playful. So here we play in the delight of language, as best we can, to manifest, as best we can, what it means to be human. And if that’s normative, then I say that to imagine a life worth living is to imagine a world in which the usual is the normal.

I think I’ve said here a number of times that what people fear really interests me, especially in the context of my current research into the history of literary computing. There are many expressions of fear running through that history and through this discussion, from the keynote about theft of the humanities to various statements of exhaustion – the humanities being all played out, etc. What does all this tell us? It’s certainly clear from the history of computing in its intersections with the humanities that we have felt and continue to feel our identity (or, less flatteringly, our ego) being challenged fundamentally, as it was by Darwin, Freud et al. Now that we have this (discovery, device), who are we? Is this all there really is? What astonishes me about our history with the machine, again and again, is that we think its purpose is, as Blake said, to put the light of knowledge out, and we get so alarmed by the slivers of light coming through the cracks left by whatever latest ham-fisted solution, version 2.0, we’ve just tried. But don’t worry, our advertising masters (who look suspiciously like our line-managers) say, real soon now version 3.0 will put us to sleep for good. In other words, I say back, the failures are the point. Those fears are signs of walls coming down, telling us how to hasten their destruction.

But enough for now. Thank you, Peter.

By: Peter Batke (Fri, 22 May 2009, 00:53)

Let me respond to Willard’s response. I had purposefully bowled a googly in hopes of hijacking the discussion away from secret handshakes and obscure references as humanists congratulate themselves for checking their e-mail at Starbucks. I’ll play umpire and declare lbw. For the Americans, I threw Willard a screwball and I got a long, long fly ball that was caught at the track, or let’s say it curved foul to encourage another swing. I understand the purpose of this forum is to encourage discussion, and the topic is an exploration of solipsism, something my computers and I have never indulged. They understand that they are valued tools; their only humanity is to be an extension of my mind for work and play. I like to concentrate on the work now.

Despite the human in humanities, humanists are not so much about examining the “human” as about examining texts. Scientists examine the human as well. The novelist may be the postmaster general, as in the case of Trollope, but his work is examined in detail by humanists and read by everyone else. Trollope is concentrating on the human; the humanists are comparing him to Thackeray. That does not mean that a given humanist would not like to be something else and may even get away with it. I have been working on Leunclavius, currently my favorite humanist, and I am continually amazed how carefully he worked and how he saw the tasks to be done much as we would see them today. And information was hard to get back then, as was a good doctor.

Let’s say we all started way back when, with the first sentence of Aristotle’s Metaphysics, and we were happily expanding the human innate impulse for knowledge when dogmatists tried to monopolize the discussion, mostly by burning people with good ideas. Though the dogmatists were successful initially, the empirical perspective would not be denied – except in what became the humanities in the 20th century. I think in the 19th century there was considerable “science” going on in the collection of texts, understanding languages, compiling biographies and chronologies, certainly at German universities. Of course there were also bogus theories, but no one would confuse belletristic writing with humanistic scholarship. I think it was my generation, BA ’68, that was allowed, for the first time, to write dissertations on the living or recently deceased. This, and a host of other factors having to do with an experiment in mass education, including the barely educable, led to a softening of the humanities, and a vast inclusion – all in all a worthy effort. Hats off to American universities. That is not to say that hard scholarship was not going on. But there were just too many people involved, both to be trained as scholars and to be taught as novices in all the rigorous standards of textual scholarship.

And then came computers. Computers have been very good to us, and I am speaking for the legion of unemployable recent PhDs (back then) who found rent money and much more in the computer centers of research universities. The spirit of inclusion allowed computing projects to flourish in various orbits because everybody had to learn word processing in a hurry. I think back then the computer was a shadow that crept into happy lives, an exacting demon that would have its way unless it could be tamed. People were having nightmares.

Computers were also causing scientists nightmares. But the computer was tamed, became the ubiquitous tool that allows us to engage in this forum, both technically in posting our response and belletristically in that we can hang ideas on the concept.

The question is how we deal with its history. I maintain that the present has brought us to such a pass with computers that we must rethink the work of the humanities. This may not be possible in an environment of budget constraints, when one has to be grateful for every student. This is not really interesting, even if true. But information is piling up. This discussion will add a good 20 pages; all of it will have to be indexed by Google so that random wanderers can read what we have written. Another 20 pages next week. The point is that the quantity of information is such that some serious work needs to be done. First, the information generated exceeds the global storage capacity. Second, a third of the information stored is duplicates. Third, non-textual data can be found only through its metadata. And finally, with text data it is possible to find text even when there is no metadata. I hope I am inspiring some sense of dread, especially by the last point. It is only an expedient to speak of info-glut and to try to build a wall around our disciplines where we can be secure with our real knowledge.
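Of the four points, the duplication claim is the most directly machine-checkable. A minimal sketch, covering exact duplicates only (the docs mapping is hypothetical; near-duplicate detection would need shingling or MinHash):

```python
import hashlib
from collections import defaultdict

def find_duplicates(docs):
    """Group documents whose whitespace-normalized text is identical.

    docs: mapping of name -> text. Exact hashing is only the base case;
    near-duplicates require fuzzier techniques.
    """
    by_digest = defaultdict(list)
    for name, text in docs.items():
        digest = hashlib.sha256(" ".join(text.split()).encode("utf-8")).hexdigest()
        by_digest[digest].append(name)
    return [names for names in by_digest.values() if len(names) > 1]
```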

As an experienced scholar picks up a book in her field, she is aware of the metadata. I have books on my shelf that represent a pretty complete set of information or knowledge in a field, and some less complete. I can log on to Hollis et al. and get some sense of what I may be missing. I can download PDFs from Google Books and complete my collection. But not everyone is playing by my rules. Google has scanned some millions of books and is indexing the questionable OCR to rank pages. That means that books are retrieved not by the metadata that the humanist has internalized in her work, but by snatches of words in an algorithmic indexing scheme. I have thought long and hard about this till I have warmed to the idea. Let us forget about our “secure” knowledge, which will prove not to have been that secure by the next generation anyway, and let’s just think about pages. Let us NOT take the metadata and then internalize the book and then put forth some descriptive analysis based on what we know that we know; let us INSTEAD let the search bring us the pages, without the privileging by the profession. This may be solid post-modern ground.
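The contrast Batke draws, catalogue metadata versus “snatches of words in an algorithmic indexing scheme”, is at bottom the inverted index. A toy sketch (certainly not Google’s actual method; the page texts are hypothetical):

```python
import re
from collections import defaultdict

index = defaultdict(set)  # word -> ids of the pages containing it

def add_page(page_id, text):
    """Index a page by every distinct word it contains."""
    for word in set(re.findall(r"[a-z]+", text.lower())):
        index[word].add(page_id)

def search(query):
    """Return the pages containing every word of the query."""
    words = re.findall(r"[a-z]+", query.lower())
    return set.intersection(*(index[w] for w in words)) if words else set()

# Hypothetical usage: pages come back by their words, not their catalogue entry.
add_page("p1", "Leunclavius worked carefully with Ottoman sources")
add_page("p2", "Trollope and Thackeray compared")
print(search("Leunclavius sources"))  # -> {'p1'}
```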

I’ll leave this thought and fly off on some tangents. It would be best if humanists patched things up with science in their own heads (scientists may not be that concerned). Especially when it comes to computing and humanities we cannot ignore the methods of science. Statistics must be taught to computing humanists as it was taught to social workers in the ’60s and ’70s. If someone does not want to learn statistics, they should not do humanities computing; let them do philosophy, or communication.

We cannot cast aspersions and innuendo at science, and feel good about our focus on the human (which we would deny them, the heartless er.. er.. researchers), yet dig around in the history of science for inspiration. We are all in the same race that began with the thought that humans could make sense of their world. If we would look for inspiration, let us look at the group that was at the 1964 Yorktown Heights meeting on Literary Data Processing. My own favorite is Stephen Parrish, a working scholar who found a valued tool in the computer. To read some of his work go to: http://www.princeton.edu/~batke/lbs/parrish.htm (pardon the copyright infringement for a good cause, and pardon the site; it is a piece of computing anthropology that I have not touched in years). Let us NOT look at V. Bush, although I used to be a fan. Here is a man who could not grasp digital. Let us instead rediscover John B. Smith, not his old work on Joyce, but his new work. Let me not go on; in any case, the list of citations should weigh towards the current present, and there are many candidates with important work.

I should also add that I realize that the only real advancement in Computing in the Humanities is to be in a real department (English, History, Computer Science …), and congratulations to all those who have made that leap. But for the rest, let us keep an eye on the technical issues, which include theoretical issues, and not imitate our non-computing colleagues, who do need us even if the dean does not know it.

In conclusion I would plead for a focus on the task of text. We are overwhelmed with text. Let us not insulate ourselves in the values we carry from the past and traipse off into the land of belles lettres to craft enigmatic sentences, but let us imagine a world where the applied-math data-mining people actually can come up with the answers, or some answers. Weirder things have happened, I think. And that is my swing for the fence; let’s hope it was not a worm-burner.

By: mccarty (Wed, 20 May 2009, 09:54)

Francois has a gift for almost reading a text, that is, for remaining aware of the ways in which surface-features of a text condition the reading while it is happening. Most of us attend from the many voices and signs and signals to the argument unfolding as we go. He simultaneously attends to them. Processed by all the textual machinery after being processed through the digital media by which the text is presented, are we then closer together than we were before, if not unified? I’d suppose the answer is yes, and I agree that it is uncanny. It is one form of the head-breaking problem of context: how, for example, the first few words of Ovid’s Metamorphoses set limits to what can happen in the following 12,000 lines and open up worlds within them. How does that happen? “Once – upon – a – time…” and already we know where we are. That is for the humanities to explore, the quite mysterious “alternativeness of human possibility” (again Bruner’s wonderful phrase).

By: mccarty (Wed, 20 May 2009, 09:31)

In Peter’s response to my original posting I am particularly interested from an historical perspective in the fact of feeling both left behind by progressive disciplines and overwhelmed by what we sometimes call knowledge (as he does) but more often information. Let me take the latter first, though I think the two are intimately related.

From my own experience over the last few decades of involvement in communicating such knowledge/information, I know that the expression of “infoglut” is based as much or more on a qualitative reaction as on a quantitative datum of experience. I’ve come to the conclusion that the problem lies in figuring out how to relate to measurably changed volumes of stuff, as when the number of books in a library reaches the level at which a catalogue is required to find the books quickly and reliably. (This seems to have happened quite early in the history of libraries, as the finding aids in Mesopotamian libraries suggest.) The adjustment to new orders of complexity has happened many times in the past (the beginning of Vannevar Bush’s “As We May Think”, July 1945, is but one recent example out of many), but we’re still at it. This is not to say that we’re slow or stupid – we’re facing a reconstruction of our ways of relating to the world. So we find ourselves repeatedly at the point of inventing new equivalents of that library catalogue and concomitant attitudes and behaviours. Some of us are old enough to have been raised with the notion (outmoded then, but still taught) that in going at a research project one should as a matter of course read everything that had been written about the subject. I doubt anyone even pretends to do that now. But what standard of sufficiency do we have now? What do we teach our doctoral students to do? And looking at what we actually do, how do we square this with the ultimate goals of scholarship? Faced by JSTOR et al. (which is where many students now begin), and so with the spread of a topic across many disciplines, and given the limited time mortality imposes, isn’t there a choice between going wide and going deep? Richard Rorty, discussing Gadamer in “Being that can be understood is language”, has an argument for this situation we’re in that seems to me very important indeed – and one which we haven’t yet taken in. The basic question is, I think, how do we humans already navigate a world in which things smaller than a grain of sand could absorb lifetimes of study? And (to echo both Warren McCulloch and Gregory Bateson), what is a human that he or she does this every day in every way?

As medicine for the feeling of being left behind by the sciences zooming ahead, I recommend neuroscientist Semir Zeki’s note at the beginning of A Vision of the Brain (1993), quoted by Philip Davis, “Syntax and pathways”, Interdisciplinary Science Reviews 33.4 (2008):

The study of the brain is still in its infancy and many exciting ideas about it remain to be generated and to be tested. I hope that no one will be deterred from asking new questions and suggesting new experiments simply because they are not specialists in brain studies. Leaving it to the specialist is about the greatest disservice one can render to brain science in its present state…. Perhaps what is needed most in brain studies is the courage to ask questions that may even seem trivial and may therefore inhibit their being asked. . . . You may find that you are making a fool of the specialist, not because he does not have an answer to your question, but because he may not have even realised that there is a question to answer. (ix)

In the humanities we have many such questions (Davis, an English professor, asks some of them). But rather than think of a race (to what finish-line?) in which the winner leaves the loser behind, how about the old story of the blind men and the elephant from the Pali Buddhist Canon – but, since we’re all blind, without the clear-sighted Buddha? Being a humanist I am inclined to think that we have the hardest problems of all, but I have to admit I’m doing well even to grasp in well-diluted terms what my scientific colleagues are working on these days. And from my experience of being a young proto-physicist as well as a reader of intelligent popularizations, I’d say their problems are both very hard and very, very deep.

I wonder too about this sense of the humanities losing out to the sciences in epistemological terms. I can understand that we’re losing out socially and culturally, that we’re being squeezed by the janitocracy of senior administration, whose arm is being painfully twisted by government agencies et al., who are in constant fear of being turfed out by a disgruntled public, who are rightly disgusted and angry at the behaviour of the bankers ad nauseam. But I’d not be at all surprised to learn that genuine scientific research is as threatened as genuine research in the humanities by the persistent wave of anti-intellectualism that constantly erodes our shoreline.

The humanities are concerned with envisioning a life worth living, within which computing now has a role to play, directly or indirectly, for us all. If, as I believe, “in designing tools we are designing ways of being” (Winograd and Flores again), then where else but to the historical disciplines can one look for guidance?

So let me make a strong statement: there can be no more important question for computing’s overall direction than the question of the human. Computing helps us work on it by giving us a powerful way to model our responses. Work enough for all the world’s humanists for a long time to come – if we but have the wit and the courage to take it up.

By: Francois Lachance (Tue, 19 May 2009, 16:03)

Willard,

The title offers a first person narrative. How does the piece navigate its way to a first person plural?

The agent of the insertion of a plural “we” is a citation from Northrop Frye.

As civilization develops, we become more preoccupied with human life, and less conscious of our relation to non-human nature. […] We have to look at the figures of speech a writer uses, his images and symbols, to realize that underneath the complexity of human life that uneasy stare at an alien nature is still haunting us, and the problem of surmounting it is still with us.

How does one surmount an uneasy stare? The answer may be embedded in some of the textual features that are located in what is traditionally considered the paratext.

The bibliographic items in the apparatus of the Works Cited, if read by a machine looking for plurals, yield Bruner’s possible castles to begin the list and Vickers’s keepers and players to conclude it. The educated imagination is located between them. Such are the vagaries of alphabetical lists that such positionings can be read off of them (with a little jump from processing strings to identifying sememes).
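Lachance’s “machine looking for plurals” is easy enough to sketch. The entries below are hypothetical reconstructions, and naive “-s” matching is only a string-level stand-in for identifying sememes:

```python
import re

def plural_looking(entry):
    """Naively flag words ending in '-s' (a crude string-level heuristic)."""
    return [w for w in re.findall(r"[A-Za-z]+", entry)
            if w.lower().endswith("s") and not w.lower().endswith("ss")]

# Hypothetical Works Cited, alphabetized as in the posting's apparatus.
works_cited = sorted([
    "Bruner, Jerome. Possible Castles.",
    "Frye, Northrop. The Educated Imagination.",
    "Vickers, Brian. Keepers and Players.",
])
for entry in works_cited:
    print(entry, "->", plural_looking(entry))
```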

If the stare at the alien is uncovered by looking underneath, can surmounting that stare be achieved by a looking again for a between hovering on the surface, a scanning?

The title, the citation, the bibliography – the textual mechanics can be regarded as a set of interlocking machines. Reading through a machine “we” become singular. Uncanny if not alien.

By: Peter Batke (Mon, 18 May 2009, 11:17)

Those of us in Willard’s ever-young generation came to computing after our training was essentially complete. Thus we have had to take our old kit, lots of canvas and wooden spars and clanging tin pots, and go camping in a new world of Gore-Tex, titanium and plastic. We can still contribute generally, since we can dig out wonderfully ironic nuggets from the history of thought, and besides, things have not changed that much at our end of campus. I always love to follow one of Willard’s expeditions into the 18th or 19th century, when the future could be intimated but our current present was still far off. It reminds me of a more comfortable time.

It is really our current present that is pressing upon us. The sciences are running with it – they are gone; they have long since left the question of moral and physical sciences behind. I mean, who is going to put the chair of the Department of Nano Technology on the rack today for denying that the earth is flat. In their rush to reorganize their research every 14 months, the scientists have, however, dropped things out of their all-carbon-fiber kits, things that we humanists, essentially incapable of reorganizing our own research every half-century, can use to get on with our tedious plodding.

There are humanists who are more than worried by the current state of the knowledge thing and working frantically to understand it, or at least find some words to describe it. Of course, frantic, worried humanists are nothing new; we have worried about everything – in the most recent distant past, about the knowledge explosion, that little pop that went off in the ’60s. The most recent electronic issue of D-Lib has an article on the dimensions of the most recent exponential knowledge tsunami [“Time Challenges – Challenging Times for Future Information Search”, Mestl et al.]. The ideas offered there to deal with this deluge may well have been intimated earlier, but they push us out of our horizon, literally. Just the mere notion of two stacks of books from Earth to Pluto next year should make us think. I will not summarize the piece; it is an easy read, but not without things to challenge, and within easy Google reach.

So I am left wondering: can we look to the past for guidance? Yes, emphatically, we can get guidance about love, about death, about raising children, dealing with siblings, parents. We can learn about power, about tragedy, about joy, about illness and death. We can even learn about life after death. We may even be able to learn about being overtaken by events. But where should we look for guidance on computing? In the ’40s? Earlier? With Turing and von Neumann? The answer is emphatically NO! We have to come to grips with the issue that, failing a disastrous crash (we should be so lucky), the system of information will expand at such a tremendous pace, and the tools to present information will morph so dramatically, that the past and even the visionaries in the past have nothing to tell us, except good luck and God bless. Certainly the things our teachers taught us may give us solace as we gaze at our green pasture; they will not inform the information growth in the present. And as a final point: fantastic global information growth may be a problem for those of us forever young, but for the actually young it is the opportunity of a universe.

By: mccarty (Sat, 16 May 2009, 08:21)

I think of Joseph Tabbi typing on his notebook, “accessing the Internet through a wifi connection at a cafe several thousand miles from where I live, with a limited time before my battery runs out, for reading and response”. Thank you, Joseph!

Such happenings are now so commonplace as to be unremarkable – except for the fact that, however commonplace, they are what is happening and so constitute part of what we need to understand of how human discourse, here academic human discourse, is reforming around new circumstances. One consequence is the greater informality of this particular interchange and many, many others like it – in particular its conversational qualities. I’ve been writing in an evolving academic mode in this medium for the last ca. 23 years, making what I write public (etymologically, publishing it) across the Internet to an audience so diverse as to be verging on Everyman. Well, ok, academic Everyman. What I’ve observed and pushed for are the greatly increased opportunities for using language much more riskily to venture ideas and see what happens to them rather than to state suitably bullet-proofed arguments. In other words, to engage in scholarly interchange that is not only less formal (rhetorically more malleable) but also at a pace verging on the interactive. It is not much of an exaggeration to say that commonly I turn from my writing to my favourite Internet discussion group to ask, in effect, what do you think of this? Sometimes I say something I know is not exactly true but suspect will act as provocation to unearth something I cannot quite get to. To what extent do others do this? I don’t know. But I observe that it is happening, that it works marvelously well, and so mention it here.

I mention this conversationalizing of academic discourse in the humanities (imitating somewhat the social sciences and e.g. computer science) because of the dynamic of engagement and the rapidly increased pace at which we are in this Forum working on the question of the human, meanwhile helping to change it. But what about the tradeoffs, i.e. what we are giving up for what I see as the benefits of being thus?

By: mccarty (Sat, 16 May 2009, 07:47)

Thanks to Yorick for “the power that writers have”. In that book I keep rattling on about, The Counterfeiters: An Historical Comedy, Hugh Kenner speaks of a language which theory has separated from its speakers — the belief that Language is “an intricate, self-sufficient machine with which mere speakers should not be allowed to monkey, unless they have first mastered the instruction book” (p. 84). As defense against this belief (which at the time must have been all the rage in linguistics) an English teacher in my ninth-grade class read out some poetry by a Korean teenager who had just learned English. It was ungrammatical to say the least — and wonderful. Now, half a century later, I realise how very hard it is to summon the power that Yorick speaks of, especially against our “close, naked, natural way of speaking”, and how marvellous when, e.g. in Seamus Heaney’s poetry, it takes over.

In that light it is interesting that, in the early years of computing’s encounter with language, attempts to use the former to produce the latter were greeted with such howls of protest, especially when poetry-writing was the object of the computational exercise. One of the louder ones was F. R. Leavis’, in the pages of the Times Literary Supplement (23 April 1970), in a front-page article entitled “’Literarism’ versus ‘Scientism’: The misconception and the menace” (later reprinted in Nor Shall My Sword, 1972). He tells of encountering “point-blank… the preposterous and ominous claim” of computer-generated poetry. It’s not difficult to imagine a truly preposterous claim, which this claimant may have made; it’s also not difficult to agree with many of the prescient remarks Leavis makes about the steep decline then beginning in British higher education. (O tempora, o mores!) But what interests me in the present context is the fear which such claims stirred up and the deaf ear on which highly intelligent and critically cautious proposals (such as Margaret Masterman’s, also articulated in the pages of the TLS) fell. I wonder – with, I must say, insufficient evidence and hope for argument – if the fear was stirred by a barely hidden suspicion that the 17C Royal Society’s English was about to find its enforcer, so that the human as then conceived would thereafter be struck dumb?

Again, Leavis, here referring to that claimant as the philosopher she was: “That any cultivated person should want to believe that a computer can write a poem – the significance of the episode, it seemed to me, lay there; for the intention had been naïve and unqualified. It could be that because of the confusion between different forces of the word ‘poem’. And yet the difference is an essential one; the computerial force of ‘poem’ eliminates the essentially human – eliminates human creativity. My philosopher’s assertion, that is, taken seriously, is reductive; it denies that a poem is possible – without actually saying, or recognizing, that. If the word ‘poem’ can be used plausibly in this way – and by ‘plausibly’ I mean so as to be accepted as doing respectable work – so equally can a good many other of the most important words, the basically human words. Asked how a trained philosophic mind in a cultivated person could lend itself to such irresponsibility, I can only reply that the world we live in, the climate, makes it very possible.”

Leavis reacted to Snow in exactly the same way, as a dark omen (and, in his infamous Richmond Lecture, even jokingly asserted that Lord Snow’s novels had been written by “an electronic brain called Charlie”). But it is abundantly clear from much of the rest of what I have read from the period that the computer had occasioned great waves of anxiety over human identity. We might say the question was, what is the human now? Isn’t that now our question as well?
