Paleography at KCL

Over the last week there has been a groundswell of action in opposition to the decision to eliminate the paleography program at King’s College London – most significantly, the Chair of Paleography held by Professor David Ganz, which is the only such position in the UK and perhaps in the English-speaking world. Paleography, the science of manuscripts and handwriting, lacks the direct economic and political impact of other fields but has enormous influence on work throughout the historical disciplines. My new book relied significantly on Professor Ganz’s co-edition/translation of Bischoff’s Latin Paleography. More broadly, the notion that any scholar’s research should be narrowly dictated by budgetary considerations – that evaluations of scholarly merit ought to be conducted on the grounds of immediate financial impact – is anathema to the principles of academic freedom.

A Facebook group and an online petition have already been organized to oppose this misguided bureaucratic decision. I encourage any of you who may be concerned about the impact of this decision to become involved through these or other means. A parallel effort has been organized opposing the firing of several KCL philosophers.

Citation anxiety

I am always very careful to indicate, in guidelines for essays and papers, that I don’t care what bibliographic or citation format my students use. APA, MLA, AAA, NWA … I always say that as long as they pick one format and use it consistently, they’ll be just fine. I have a soft spot for Chicago style (author-date) but I certainly don’t ask anyone to use it. Yet every term, I get at least one student who speaks to me or emails me, concerned about bibliographic or citation format. Even after I insist that I have no preference, they just can’t quite be convinced that I won’t deduct marks for failure to conform to an arbitrary set of guidelines covering things like whether to capitalize every word of a book title, or whether to put parentheses around dates. They can’t quite believe me, either, when I tell them that many journals and presses use minute variations of the major styles, so that whatever I do as an author will eventually require professional attention.

Everywhere I’ve taught, I’ve seen this phenomenon, again and again. I also see, again and again, students who are apparently indifferent to serious writing or analytical problems but still get stuck on fine points of some style guide. What gives? Is it really the case that most professors are such sticklers for formatting issues that it is rational for students to be so concerned? Maybe, but I’m not convinced. Alternatively, maybe citation style is something that seems more objective than other, more significant aspects of paper-writing. When you’re unsure of other issues, or know you have problems with them, hanging on to the one thing that you know you can get just right is a security blanket. Whatever else may be wrong with your paper, at least you got the citations right. I don’t know about this either, though – if it were really true, wouldn’t more students actually use a single style correctly and consistently, even after inquiring?

So, colleagues and students, what do you think? Is citation anxiety ubiquitous? If so, is it reasonable? And what can be done about it?

A feisty embuggerance

When I grade my students’ paper proposals, I make a point of doing a brief Google Scholar search on each student’s topic, which a) helps me evaluate how thorough they have been; and b) helps me help them find additional material (I then give them not only the sources I found, but also the keywords I used to find them). One of the students in my introductory linguistic anthropology course this term is doing a paper on linguistic aspects of laughter and humor. During my search, I encountered the following citation (direct from Google Scholar to you):

Embuggerance, E., and H. Feisty. 2008. The linguistics of laughter. English Today 1, no. 04: 47-47.

After I stopped laughing, I set to figuring out what was going on.

1) I quickly discarded the theory that an unlikely duo of scholars actually had this pair of names – although that would have been too awesome for words. In fact, no other article listed in Google Scholar has an author named ‘Embuggerance’ (although there are a couple other Feistys).

2) I also considered the possibility that this was one of the many metadata errors in Google Scholar; for instance, there are thousands of articles whose purported authors are named Citations or Introduction or Methods, due to errors in which a heading like “IV. Methods” is interpreted as the name “Dr. I.V. Methods”. But that sort of error seemed unlikely in the extreme here.

3) This left the possibility that these were pseudonyms adopted by particularly amusing authors as part of a parody article.

In this case the article is in fact a book review (which I could tell because it’s all on one page), so I didn’t recommend it to the student, but I did request it for my own edification. Lo and behold, it arrived today as a PDF.

‘The linguistics of laughter’ is a book review of The Language of Humour by Walter Nash. It’s perfectly ordinary and non-satirical, and it does not contain the words Embuggerance or Feisty. But next to it is another book review, entitled ‘Concise and human’, which contains the following passage (emphasis added):

Silverlight’s concise and human reports cover a surprising range of curious items, from Acid Rain through Bottom Line, Catch 22, Dinner/Supper, Embuggerance, Escalate, Feisty, Holistic, Krasis, Ms, Naff, Quorate, Shambles and Viable to Yomping.

The four bolded words appear on a single line, and the fact that the Google Scholar metadata lists the ‘authors’ as E. Embuggerance and H. Feisty – initials included – seals the deal. This is the source, and so something like option 2 above is correct. But this is really weird. Not only do the pseudo-authors appear in the middle of a contextualized sentence (not in headings), but the sentence is in the wrong review – a review that itself is found (mostly correctly) in Google Scholar!
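It’s easy to see how a naive citation parser could produce exactly this parse. Below is a minimal sketch – purely hypothetical, and emphatically not Google Scholar’s actual code – of the kind of heuristic that treats a line of comma-separated capitalized words as a ‘Surname, Forename’ author list, abbreviating each supposed forename to an initial:

```python
def parse_author_list(line):
    """Naively read a comma-separated line as 'Surname, Forename' pairs,
    reducing each supposed forename to an initial -- the sort of heuristic
    an automated citation extractor might apply to a scanned page."""
    words = [w.strip() for w in line.split(",") if w.strip()]
    authors = []
    # Pair the words off two at a time: 'Embuggerance, Escalate' becomes
    # ('Embuggerance', 'E.') and 'Feisty, Holistic' becomes ('Feisty', 'H.').
    for surname, forename in zip(words[::2], words[1::2]):
        authors.append(f"{surname}, {forename[0]}.")
    return authors

# The four bolded words, as they appear on a single line of the scanned page:
print(parse_author_list("Embuggerance, Escalate, Feisty, Holistic,"))
# -> ['Embuggerance, E.', 'Feisty, H.']
```

Feed one unlucky line of Silverlight’s word list to a heuristic like this, and Drs. E. Embuggerance and H. Feisty are born.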

To make matters even worse, at the end of the reviews section the phrase ‘Reviews by Tom McArthur’ appears – an attribution which is found in the metadata for ‘Concise and human’ but not for ‘The linguistics of laughter’. And, as if this were not bad enough, even though both reviews are listed as being from 2008, the PDF clearly shows them as being from 1985. If I were a gambling man, I’d wager that 2008 is the year when the metadata was added and/or the file was scanned.

Now, mostly this is just a humorous anecdote; I don’t mean this as an indictment of Google Scholar, which I consider to be the most useful way for most scholars to find academic literature, and which I use virtually every day. But one has to wonder at the process (automated or otherwise) that leads to this comedy of errors. A great deal of virtual ink has been spilled over at Language Log (here and here, for instance) on the metadata problems with Google Books / Google Scholar and their implications for linguistic research, for tenure cases that rest on faulty citation records, and for other potential problems. Until there is a way for these sorts of errors to be corrected by end users, we may all be well and truly embuggered.

Ig Nobel 2009

The annual Ig Nobel awards “for achievements that first make people laugh, then make them think” were given out last night, and once again, anthropology has been well represented. Catherine Bertenshaw Douglas and Peter Rowlinson won the award for veterinary medicine for their demonstration that cows that are humanized by being given names produce more milk than those that remain, uh, anonymous. Although they are veterinary scientists, their work appears in the interdisciplinary anthropological journal Anthrozoös. Meanwhile, the Ig Nobel for physics went to the biological anthropologists Katherine Whitcome, Liza Shapiro and Daniel Lieberman for their work (which appeared in Nature a couple of years ago) explaining why pregnant women don’t tip over. This is extremely important work, as it bears directly on the evolutionary costs and benefits of bipedalism, among other issues.

See the full list of winners here.

Bertenshaw, Catherine and Peter Rowlinson. 2009. Exploring Stock Managers’ Perceptions of the Human-Animal Relationship on Dairy Farms and an Association with Milk Production. Anthrozoös, vol. 22, no. 1, pp. 59-69.
Whitcome, Katherine, Liza J. Shapiro and Daniel E. Lieberman. 2007. Fetal Load and the Evolution of Lumbar Lordosis in Bipedal Hominins. Nature, vol. 450, pp. 1075-1078.

What we do

Julien at A Very Remote Period Indeed (which I am so glad to see has been revived after a hiatus!) has just offered up a scathing and brilliant response to some clod from Fox News who obviously hasn’t the least idea what university professors do and thinks that the reason postsecondary education is so expensive is that university administrators aren’t whipping faculty to work hard enough. Julien is writing from the perspective of a field archaeologist, but as someone who got one (yes, one) week of actual vacation this summer, I have to add my “Right on, brother!” Also, most of us don’t actually get *paid* for the work we do in the summer (but if we didn’t do it, we wouldn’t be able to do the research needed for tenure and promotion). And we also serve on any number of committees (I’m on three this year, and I’m considered ‘protected’ from a heavier load as junior faculty), advise and mentor grad students, read scholarly literature (shocking, I know!), write scholarly literature (doubly shocking!), apply for grants, respond to emails (constantly), actually prepare and write our lectures/seminars, grade student work, etc., etc., etc. So, although I’m sure no one reading this blog thinks this, if there is anyone who thinks that professors are only working when we’re standing in front of an undergraduate classroom, uhh, I hate to bring it up, but your ignorance is showing.

And with that said, it’s time for me (at 9:45 on a Monday evening) to go work on the index for my book.

No science like Snow’s science

I had the privilege this afternoon of attending a fascinating panel discussion, run by several of my colleagues here at Wayne State, entitled ‘“The Two Cultures” by C.P. Snow – 50 Years Later’ and sponsored by WSU’s Humanities Center and the Institute for Information Technology and Culture. Snow was a physicist, bureaucrat, and novelist whose Two Cultures lecture in 1959 postulated a grave gap in knowledge and interest between the sciences and the humanities. Specifically, he claimed that while the intellectual milieu of the time required all scientists to be at least somewhat familiar with some history and literature, humanists viewed scientific knowledge as the purview of scientists alone, something of which they could remain blissfully ignorant. This, Snow saw as a grave problem in a society whose technological complexity had grown to the point where policy-makers required scientific literacy to make informed decisions.

The debate today was lively and generally productive. As a linguistic anthropologist who studies the history of mathematics, I was struck by the fact that all three of my disciplines are betwixt and between Snow’s two cultures (anthropology quite proudly so), and moreover, that the panelists today were two mathematicians, an anthropologist, a linguist, and a historian of science! My remarks below expand on an issue I raised at the end of the discussion, one which I feel deserves more of my attention now.

To the point: I see no grounds at all for believing that there is such a thing as ‘science’ to be made clearly distinct from ‘the humanities’ – at best, these terms designate semi-useful collocations of perspectives, and at worst, they are self-serving labels used to isolate oneself and to denigrate others. True, I have elsewhere posted here on linguistic anthropology as an integrated science, and I don’t retract anything I said there. Perhaps I would rephrase what I wrote to say that too much linguistic anthropology self-identifies as purely or largely humanistic and thus defines itself out of some really interesting subject matter, e.g. relating to the evolution of language.

Snow lived and worked at the height of modernism in the academy: for the social scientist, behaviorism, functionalism, and structuralism were all in full bloom. What he did not foresee, and could not possibly have foreseen, is the emergence of the ‘Science Wars’ or ‘Culture Wars’, in which two camps defined themselves in opposition to one another. Starting in the 1970s (or earlier or later, depending on whom you ask), ‘science’ was severely criticized from various angles that we might generally label postmodern or poststructuralist. The response from ‘scientists’ (do people really call themselves ‘scientists’ unironically any more?) ranged from ignoring the new trend, to bafflement, to outright hostility. Certainly the scientific community’s response came only after the humanists’ initial criticisms.

In fact, however, the label ‘war’ is quite inappropriate, since very little of the academic discussion that we might now place under one of these terms actually involved debate between the two camps. Rather, each side served as a useful straw man to be marshalled in front of one’s fellow-travelers, an emblem of clan identity (a shibboleth). Moreover, drawing these boundaries allowed one to safely ignore that which lay beyond them as unnecessary, irrelevant, or just plain wrong. Just as we recognize that you can’t draw a line around ‘a culture’ without asking who is doing the defining, for what reason, and in whose interest, I believe that there is ultimately very little behind the distinction between Science and Humanities that cannot be explained in terms of a rather narrow set of interests, both internal and external.

In certain scientific circles, to deride social construction was to be seen as supporting empiricism against know-nothing relativism, and with it came suspicion of the humanities in their totality. They simply were not seen as relevant to anything most practitioners of “serious, empirical” disciplines would do. Conversely, Science was seen by some humanists as the enemy, or at least as a subject about which they needed to know nothing in order to comment usefully. Few ethnographers, for instance, could admit publicly that they detested the people among whom they worked, but I would have no difficulty finding ethnographers and historians of science who have contempt for the people they study and the work those people do. Still, many in the humanities, probably most, had nothing to do with these debates and continued to do sound empirical research using methodologies not so different from those of Snow’s day. And some disciplines, like anthropology and linguistics, have always been part of multiple traditions and have defied easy definition.

My question ultimately rests on how distinct the humanities and the sciences were as concepts prior to World War II, and what explanation we might give if they have become increasingly distinct over time. I proposed, only half-jokingly, that to define the humanities as a bounded group of disciplines allows Science to define ‘those whom we do not have to fund’, and to define Science allows the humanities to define ‘the object of our newfound ire’. Snow shows that the labels were cognitively relevant to academics even in the late 50s; they remain so in a vastly different social context and funding environment, one in which both humanists and scientists are beholden to crass interests that emphasize applicability over insight and wealth over wisdom. But rather than suggesting that ‘sciences’ and ‘humanities’ are essential and timeless categories, I suggest that they remain useful to scholars because these definitions allow one to exclude huge swaths of knowledge from one’s purview. They tell us what we do not need to know to do our jobs, much to our ultimate loss.

For my own part, my research involves reading in fields such as semiotics, communication, epigraphy, history of science, historical sociology, anthropology, archaeology, linguistics, mathematics, evolutionary theory, developmental psychology, and cognitive neuroscience. No, really. I admit that I read in these cognate fields only to the extent necessary for my own work on number systems, but they are not just pastimes – they are ancillary reading required for a fuller grasp of the subject I’m most closely engaged with. I can’t possibly imagine how I could neglect the detailed historical studies that form the core of my book, and at the same time I can’t imagine what my work would look like if that were all I did. But I don’t think that I have particularly broad training or that I am a particularly energetic reader – far from it! I’ve simply defined my areas of specialty along lines other than the grand dichotomy between Science and Humanities.

For me, the critical study of race and gender is impossible without some familiarity with human physiology and the variation in bodies that is really real. Similarly, however, it is inane to study religion solely from a cognitive neuroscience perspective without understanding the moral and metaphysical models that are honestly perceived as valid by believers. My point is that the study of academic topics will frequently – though not inevitably – require one to be familiar with the methods, concepts, and practitioners of several disciplines across the socially-defined ‘spectrum’ of humanities and sciences. This will be particularly true for any topic that involves human beings.

I do not intend the foregoing to be read as a sort of facile constructivism of the “oh, nothing is real, let’s all fall into a nihilist heap and abandon any interest in objectivity” school of thought – which has, in any case, always been more real ideologically than in practice. If someone has a definition of science that excludes all of the disciplines of history or philosophy or linguistics, and really wants to insist that this is Science, eternal and unchanging, they’re welcome to it. For my part, I’ve often found that when the stakes are highest, definitions are little more than post facto rationalizations of what one wants to believe one is doing, and of what one believes one’s rivals by definition could not have done.

Warning: long and meta

A recent post by Alun Salt at Archaeoastronomy entitled ‘Blogging and Honesty’ has re-inspired me to think about a subject that had its genesis at the IMC at Kalamazoo a few weeks ago. There I had the pleasure of attending a session on weblogs and the academy – specifically on the topic of anonymity / pseudonymity and issues of identity – organized by Shana Worthen and Elisabeth Carnell, which led me to a set of not-new-but-new-to-me insights about this crazy medium. I urge you to read the liveblog post that gives a very good sense of what was said at the panel.

I decided a long time ago that I would make no effort to conceal my identity online. Basically, I don’t trust that anything I say anonymously will remain anonymous, so rather than say something that could later come back to bite me, I’d rather self-censor ‘at the source’, and just not say anything. Obviously this has the disadvantage that there are certain things I just can’t or won’t talk about publicly, which means that I have to find other ways to get them off my chest. Fortunately I have the Growlery, which is also public, but which I can filter and lock away from eyes not meant to see certain things. It’s just what works for me. When I started this blog, the distinction was not between ‘pseudonymous’ and ‘non-pseudonymous’ but between ‘academic’ and ‘non-academic’.

But this does raise some significant issues of identity, because what exactly constitutes my ‘academic’ work? When I look out my office window and see a funny-looking sign, why is this ‘academic’? Or when I write extensively researched posts about particular aspects of English etymology, is this ‘non-academic’?

I may not have mentioned this before, but this blog was a present to myself upon attaining my present academic position. I had always had an interest in academic blogging, but I never felt I had the time or the energy while I was still working contingently. I was very fortunate that in my previous workplace I had enormous academic freedom that not all of my peers enjoyed. And I don’t think that the fear others have – of losing a job, or of saying something that affects a tenure decision – played a significant role, because by last year I’d been blogging for six years already, and had lots of practice in not saying stupid things that I would later regret.

I think one of the most important reasons why I started an academic blog under my own name, though, was to serve as a public voice for my discipline, my sub-disciplines, and my own crazy point of view. As Kristen Burkholder pointed out in the panel, medievalist bloggers can’t rely on pseudonyms to preserve their anonymity, because the field is so specialized. Well, let me tell you, anthropologists who study numerals are significantly more specialized than that, so unless I wanted to avoid any discussion of my actual research, pseudonymity wasn’t an option. I also see this place as a venue for sorting out ideas before they’re ready to publish, and for announcing things that I actually do publish. So there’s that.

But as Julie Hofmann pointed out very rightly in her paper, there may well be issues of privilege involved here: because I am a white male, I have less to worry about in terms of colleagues disparaging my blogwork as less than professional. And from what I can see, that’s true for medievalists – but perhaps less so for other disciplines. In linguistics, for instance, the existence of the big collaborative Language Log, which has been running since 2003 and whose non-pseudonymous authors are practically a Who’s Who of the discipline, almost certainly helps budding linguists and linguist-oids such as myself to make the leap without fear of repercussions. LL is authoritative, clearly at the top of the hierarchy of linguistics blogs, and sets the gold standard. I don’t know of any linguistics blogger who’s unfamiliar with it.

By contrast, anthropology bloggers are more fragmented, partly due to the fragmentary nature of the discipline, and less prominent within the field. Yet there are also fairly few anonymous anthropology blogs. I’m not sure exactly why that should be. I wonder whether part of it is that anthropologists spend a lot of time in the field, in situations where their research is unique or nearly so. Weirdly enough, while being one of a small coterie of medievalists may encourage pseudonymity as a form of protecting one’s identity, being unique may encourage one to simply be ‘out there’ or risk not being able to say anything professional at all.

So, yeah, identity. Probably the paper that hit me the hardest in the panel was Janice Liedl’s on issues of identity. Because I spend a lot of time and mental energy thinking about how I’m going to come across to others. My dissertation supervisor once phoned me up during my first year of my PhD to tell me what a large superego I had (thanks, I think?). Of course my own persona here is constructed, and is different in subtle (and not-so-subtle) ways from my persona over at my other public place on Livejournal – even though both are public. But I also think you could figure out a lot about my personality from hanging around here long enough, even though this is a professional blog meant for an audience interested in the sort of work I do.

And part of all of this being ‘out’ is about a public display of my identity as a scholar and as a professional – hopefully not narcissistically so, but rather as a model for other scholars, and also for my students. A couple of the panelists pointed out the usefulness of blogging for showing, for instance, what it’s like to be a professional, or what kinds of challenges academics face, and I think that’s right. And one of the only regrets I have about not being pseudonymous is that there are things I just can’t blog about here, because it would be grossly inappropriate on a professional level. Even though I agree most of the time with Dr. Crazy over at Reassigned Time, for instance, regarding how to relate socially with excellent students or unexcellent colleagues, she can say publicly things that I can’t.

And this is perhaps another difference: there aren’t too many linguistics OR anthropology blogs that take a strongly personal approach. Hofmann argues, rightly, that ‘academic life’ blogs that focus on the day-to-day goings-on in a professional’s life tend to be pseudonymous and written by women. But it seems to me that there are whole blogging cultures (roughly disciplinary in nature) that tend towards one voice or another. And so yeah, I think we need more anthropology blogs, period, but we also need more blogs that deal with the daily realities of anthropological life, not just research findings. We owe it to one another to engage in the sorts of discussions that serve not only as a model to our peers, but also to our junior colleagues – the students who will form the next generation of bloggers. Without wanting to imply any direct causation, I find it noteworthy that several of my former students from McGill have started their own blogs over the past six months – good ones, too! By blogging (pseudonymously or otherwise), we are engaging in a process of cultural transmission, akin to yet different from the face-to-face mentorship we take on in the classroom and the office (and, let’s be honest here, the bar).

Okay, maybe that sounds pretentious, and maybe it is pretentious. Hell, while we’re on the subject of identity, you don’t think my pretensions only exist here in the blogosphere, do you?

Medieval anthropology

At the International Medieval Congress this past weekend, I had the very unusual experience of being (as far as I am aware) the only representative of my discipline at a conference with over 3000 attendees (although there were plenty of linguists around, and at least some medieval archaeologists). A good time was had by all – well, at least by my wife (an actual medievalist), me, and our friends. My paper (longer discussion to follow) was very well attended and well received, and initiated some interesting discussions.

My work on numerals is hyper-specialized, to say the least. There are maybe five or eight living anthropologists, worldwide, whose work is centrally about numerals, and perhaps a couple of dozen linguists on top of that. Of course there are people other than these who have written about numerals, but they are not specialists in the topic. But while my core research is hyper-specialized, I think of myself, by contrast, as a polymath. I have graduate-level training in cultural anthropology, history (both ethnohistory and history of science), archaeology, cognitive science, and linguistics. And it makes me very, very happy to have this breadth. So there really aren’t too many situations where I feel extraordinarily out of place at humanities and social science-type conferences, and this one was no exception. If the opportunity arose to present there again, I would jump at it – despite the fact that by all rights, I don’t really belong there.

On Saturday I attended an excellent roundtable entitled ‘Medievalism across Time and Space’, hosted by my friend Julie Hofmann, which dealt with the definition of the medieval both in scholarly discussions of different time periods and regions and in how the public perceives and understands ‘the medieval’. Ambitious, no? Also really fascinating stuff. In the middle of this discussion, someone made a comment that made me realize that, of all times and places, North American anthropology really excludes ONLY the medieval from its purview. There are dozens of archaeologists trained jointly in anthropology and classics departments, and/or who teach interdisciplinarily in both fields. Hundreds of classical archaeology students every year get their archaeological training primarily in anthropology departments. Similarly, there are hundreds of early modernists in anthropology: people who focus on Spanish colonialism in the New World, for instance, or world-systems theorists, or people interested in Atlantic World / diasporic studies. But the medieval is almost entirely out of our grasp.

This is an odd gap, to say the least, for a discipline that purports to be a holistic comparative study of human behaviour. At the panel it was noted that in many small (and not-so-small) history departments, medievalists get the honour (???) of teaching Western civilization courses that start with Sumer (on which anthropological archaeology has numerous specialists) and end with the twentieth century (in which the vast majority of cultural anthropologists have their expertise). And it’s not at all that I think this means that I, or any other anthropologist, would do a better job of teaching such a course than a medievalist would, nor that I would want to do so. But if I were going to construct a ‘world survey’ anthropology course, it would be very challenging to come up with relevant material written by anthropologists or anthropologically-trained archaeologists that focuses on the millennium in which medievalists specialize. Yet I can’t think of any valid conceptual or methodological reason to exclude the medieval from the anthropological.

But it also occurs to me that, despite it not being a formal part of my training, I do have a lot of experiences, skills, and knowledge pertinent to medieval studies. I have two years of Latin. My wife is a medievalist, and through osmosis I know almost as much about her area of specialty as she does about mine. The first paper I ever wrote in grad school was on the historical ecology of the early Icelandic state, and I even once thought seriously about taking the post-Roman archaeological record in Britain, viewed from the perspective of postcolonialism, as my dissertation topic (a little-known secret, until now!). About 10% of my book (at least) deals with medieval Europe and the Middle East, and if you count the rest of Asia in there, more than that. And of course, I spout off about the late medieval transition from Roman to Western numerals on virtually any possible occasion, because so many people think they know why it happened, and are so very, very wrong.

One of the central ideas tossed around in the roundtable was that ‘the medieval’ is a nebulous object, subject to imposed definitions – both scholarly and popular – that satisfy no one. It was noted by many that ‘medieval India’, ‘medieval Japan’, ‘medieval Islam’, etc. all refer to very different social configurations and chronological periods, and I chimed in that of course if the definition were chronological then one really needed to include New World societies as well – recognizing fully that no one is happy with such a definition in any case.

So in my copious (ha!) spare time over the next couple of weeks, I’m going to put together a working bibliography of ‘medieval anthropology’: scholarly publications in social/cultural, linguistic, or archaeological anthropology that have as one of their central objects the Old World between roughly 500 and 1500 CE. Because I do know they’re out there, at least in limited quantities, and it seems like a real gap. The only things I want to exclude are a) physical anthropology; and b) medieval archaeology as written by non-anthropological archaeologists, i.e. almost anyone trained in Europe.

So how about it: any anthropologists in the audience know of any material I should be including?

Edit (2010/01/31): I have now created this bibliography, which can be found here.

To grad or not to grad

There’s been a lot of handwringing lately among academics interested in the academic job market over the question of whether one should advise anyone to go to grad school in humanistic disciplines (presumably including the humanistic social sciences like anthropology). To be sure, the job market has been terrible for the past 40 years and this year’s offerings have been mediocre at best, as the financial crunch has (at minimum) decimated endowments and produced an extremely wary attitude among state legislators responsible for public funding. There have been a couple of rather mediocre, although not entirely wrong, articles in the New York Times of late, bemoaning the academic job market and the prospects for graduate students.

But I want to talk about a couple of articles entitled ‘Just Don’t Go’ by Thomas Benton, a regular columnist in the fantastic Chronicle of Higher Education Careers section, which are of interest to me because they are written by a working, tenured academic in the humanities (links here and here). Benton argues that the only honest thing for faculty to tell prospective graduate students in the humanities is not to go to grad school; the prospects are simply too dim, and the waste of effort and of human productivity so immense, that the best thing to do is to give blanket advice not to go. He knows, of course, that many (most?) students who receive this advice will still apply to grad school, because they are drawn to intellectual life.

It’s unsurprising that such advice would become more pertinent in a down job market, where grad school might seem like a safe haven to ride out the economic maelstrom. Benton nevertheless notes that this is not just about this year, but about a general trend in academic employment over many decades. He’s absolutely right: academic employment is sparse and not about to improve dramatically anytime soon, possibly ever. Many people who complete a PhD will never work full-time as tenure-track faculty. And so he is also quite right that discouragement is a rational strategy for faculty confronted with multiple students interested in pursuing the doctorate.

Nevertheless, I find it intellectually dishonest and generally unwise to advise interested students, ‘Don’t go to grad school’. First, because to do so would be the height of hypocrisy, and would be perceived by many as a statement that I (as someone who ‘made it’) don’t think my students have what it takes to make it. Second, because I think that if you gave every student that advice, some students would take it who shouldn’t, and others wouldn’t who should, with the potential result that the next generation of scholars would consist of those too foolish to listen to their professor’s good advice. Third, because telling someone what to do is less desirable than giving them good information and letting them make their own decision.

I do think that there are far too many PhDs in anthropology, and indeed in most of the humanities and social sciences. At the very least, there are too many degree-holders on the job market in comparison to the number of jobs available. The number of jobs available in academia fluctuates, but in most disciplines it has increased only modestly over the past 20 years, while the number of candidates has increased dramatically. This Malthusian logic dictates that one’s chances of getting a job are not that great.

The reality is that approximately 50% of PhD graduates in anthropology will eventually end up on the tenure track somewhere (most often within five years of obtaining the degree, after which one’s chances decrease as one is perceived, rightly or wrongly, as ‘damaged goods’). Another 25% will end up in professionally-appropriate positions in the public or private sector (this is particularly relevant for anthropologists, for whom there are well-defined non-academic yet professional jobs), while the rest end up somewhere else – but very few end up unemployed altogether. Are these chances good enough for you?

They might be. Of course, up to 50% of people who start the degree do not finish, so saying that 50% of PhDs will eventually hold a tenure-track job is incomplete: the figure applies only to those who actually complete the degree. Now, virtually no one admitted to a doctoral program ‘fails out’ in the sense that they lack the intellect to complete the degree. By far the most common reasons, in my experience, for people not finishing the PhD are a lack of money or a lack of motivation.

Motivation is not just about ‘having the will to persist’, although that’s important. It’s about developing a network of social relationships, especially with mentors but also with peers, that allows you to feel good about continuing in the program, to be intellectually rewarded and validated, and to remain on track. I was particularly blessed, as a student, to have some top-notch mentorship, but I regret to this day that my peer group was neither as large nor as close as I would have liked it to be. And I know plenty of people who had or have situations less congenial than mine, and who found themselves stranded without any meaningful support. This problem only gets worse if you are underfunded. The reality is that while persistence is the key, persistence is only realistic when you have a lot of support. So I think it’s worth telling students to research programs as thoroughly as possible, and to find schools where they can plausibly work with multiple people.

Similarly, money really is central, and is one of the reasons why, even though I don’t tell students ‘Don’t go to grad school’ outright, I do advise them not to go if it means taking on substantial debt – and realistically, to go only where they have funding. Students who can gain admission to a top PhD program right out of a BA, or who can find a funded MA program, are in good shape to move forward. For others, though, an MA means taking on tens of thousands of dollars of debt, only to go into a PhD that may be poorly or partially funded. To finish the PhD you may need to take on a lot of extra non-professional work or go even deeper into debt, possibly taking longer than your peers because of the demands on your time. If you have sources of income that allow you to do an unfunded MA, more power to you.

And what happens when you’re done? For an indeterminate period, you will likely be underpaid and underemployed, while paying back student loans and trying to find a job. If I had had any substantial student debt at all, I simply could not have afforded to work in academia for the four years following the completion of my degree. I could not have supported my family, and I would have left the discipline, not out of a lack of ability, but simply out of a lack of funds to continue the search in a tight job market. And the market is ALWAYS tight.

Another factor influencing post-degree success on the market is institutional prestige. It is a sad fact that academic disciplines are, and always have been, hierarchical in a way that is rarely recognized by most undergraduates. To demonstrate this, you need only go to a faculty list from a department and see where the faculty got their PhDs. You will find that the vast majority of tenured and tenure-track faculty got their degrees from the top 50 or so institutions in the world (for that discipline), with the top 10-20 schools being very well represented indeed. Not coincidentally, these institutions have the highest degree of student financial support for PhDs (although not always the highest degree of emotional, psychological, and other forms of support). It may be true that only 50% of PhDs in humanistic disciplines ever hold TT jobs, but nevertheless, if you attended a high-ranked institution, your individual odds may be much better.

So, if you have the good fortune and ability to attend one of those programs, then you will find that your chances of employment after completion are very good. If not, well, your chances will be lower. What’s more, you should prepare for the fact that even if you do get a PhD, you will probably work at a less prestigious institution than the one you graduated from. Everyone can think of exceptions, but that’s just what they are – exceptions to an overwhelming statistical probability. And because many smaller and less prestigious institutions don’t even have anthropology departments (as opposed to, say, biology or psychology departments), attending a less well-known institution can harm your opportunities for finding academic employment at all. Unfortunately, few departments provide detailed information about where their graduates end up after completion, and those with poor records have the least incentive to do so.

Now Benton wants to argue that the solution to this is to develop/train/find/invent/construct graduate students who do a PhD with no expectation of an academic career. And I think at some level it is good advice to tell students to prepare for the possibility of a non-academic career, not only psychologically but also in terms of the skills they obtain. This is particularly true in disciplines like anthropology, which do have significant (although not always obvious) professional outlets where PhDs earn a decent living outside of academia.

But more to the point, I think that the sorts of people who should be considering graduate school are those for whom the actual process of going to grad school is enjoyable and rewarding for its own sake (despite its struggles). One thing I do tell my students is to ask themselves, “If I spend six years in grad school, even if I never get a job, will it still have been worth it?” If they can honestly answer yes, that the process of learning and intellectual exploration is worth it for its own sake, then they should do it; if not, then they shouldn’t. And again the money comes into play – if one has to go into massive debt to do it, then it’s certainly less likely to be worth it.

And even further, I worry that while Benton is right about the job market, and right about the need to inform students of the realities of the market, he’s asking more of academics than anyone would ask of other professionals. We don’t tell artists not to make art, even though the chances of financial success as an artist are far, far dimmer than the prospects for an academic. We don’t tell baseball players not to try out for the minor leagues just because the chances of their ever playing major league ball are minuscule. (The baseball analogy is one that a friend of mine mentioned to me some years ago and that I have been using ever since to explain to non-academics the model under which academic employment works.)

And finally, I despair that Benton, while laudably promoting the vision of the grad student who doesn’t have the least expectation of a tenure-track job and expects to work outside academia, unrealistically imagines a world where anybody outside academia cares about the PhD. It is a sad reality that PhDs who work completely outside their fields (not just outside academia, but outside any profession where their disciplinary training is relevant) often have to conceal the fact that they hold an advanced degree in order to find work – the PhD actually serves as a deterrent to employers. Without denying that there can be a role for a ‘public intellectual’, I do deny that there is room for public intellectuals who are divorced entirely from the academic world and its own peculiar economy. Benton is imagining a world that simply does not exist, never has existed, and that we have no plausible means of bringing into existence.

My feeling is that of course we should make sure our undergraduates know that their chances of success are not 100%, or perhaps not even 50%, and then we should take every step necessary to ensure that the best and brightest students – who are going to go to grad school anyway, regardless of what we say – have the best possible chance of the sort of productive career that we ourselves enjoy.

Reference letters: a letter-writer’s views

I’m now nearing the end of what has been a very busy reference-letter-writing season for graduate and medical schools. I’ve been writing letters both for McGill students (some of whom I know extremely well) and for Wayne students (whom I’ve known for four months, tops), and also serving on my department’s graduate committee, reading admissions applications and the reference letters that come with them. For me, writing reference letters is, if not actually enjoyable, more than just a duty: it is something I care about doing well, and the hoped-for result is something I care about a hell of a lot. I also (humbly?) think I’m damn good at it.

A recent conversation with a friend has got me thinking about different practices and traditions of letter-writing. We don’t talk enough about the nuts and bolts of the process, particularly not in public venues. Obviously, I am not at liberty to discuss the specific contents of any of these letters, but I think a general discussion of some of the issues might be helpful to students who may find this post, and to my colleagues, for whom this should be a fairly enjoyable part of the job: helping their students move forward in their chosen professions. Please bear in mind that everything below reflects my own practice; others may well behave differently.

The Talk: Firstly, everyone who wants me to write them a reference letter for grad school gets to hear from me about the harsh realities of grad school. I even have a workshop that I give on the subject. The one thing I don’t do is give a blanket ‘Don’t go to grad school’, although some of my colleagues follow that plan. I can’t fault their intentions, but my view is that ‘Do as I say, not as I do’ is pretty much always hypocritical, and that in any case these students are adults and are more likely to be influenced by information than by blanket moral pronouncements. But even so, no one gets away without hearing that academia is a tough gig. I also do try to mention that they aren’t crazy for wanting to go, and that intellectual life can be very rewarding – I personally found grad school to be a very relaxing period of my life – but I don’t hide the downsides of penury, anxiety, and uncertainty.

Saying No: I sometimes say no to a student who asks for a letter, usually under one of three conditions:
a) Students who come to me very shortly before the due date. I usually do make an effort for these students, but realistically I want a minimum of a week’s notice, with two to four weeks preferred. If I have a letter already written for you and you know that, then obviously a short turnaround is possible, but otherwise, don’t expect me to be happy.
b) Students I just don’t remember at all. Usually the student knows I may not know them well and mentions that in their email or in person, giving me a way out. I don’t even know why they bother, although most commonly it’s students who major in some other, larger field and for whom my anthro seminar was one of the only classes where they interacted with a faculty member.
c) The third category is the hardest: students who I don’t think are cut out for grad school. Usually rather than saying no outright, I will say something to the effect that I wouldn’t be able to write them a strong letter, or suggest programs that may be more suitable for the student.

Information: I want as much information as the student is able to give me to allow me to write the best letter possible. At minimum I want a transcript, letter of intent, and CV, with a writing sample if I don’t already know the student’s writing well, and GRE/test scores if relevant. If some aspect of a student’s record is substandard, I can (and will) ‘write around’ that issue: not ignore it, but frame it in its context. Also, a student may have experiences that I may not know about, but which can be used to really beef up my letter. I can’t think of any case where a student has provided information that has made my letter worse than it otherwise would be.

Content: I always try to be as specific as possible in every aspect of the letter. I believe that the genre of reference letters (and that’s what it is – a literary genre) is tricky because so many letters are nearly-identically praise-filled, as everyone tries to get their students into the best places possible. Specificity is my solution to this problem: the more facts I can include, the better. Basically, I write on the assumption that the reader needs something memorable which could be used in the student’s favour in a committee meeting, to separate the student from the pack of mediocrities out there.

Superlatives: Although I despise the casual superlative, superlatives combined with specificity can be used to great effect. Here’s where my anal-retentiveness comes into play. While there may only be one ‘best student in her class’ or ‘best student I’ve ever taught’, there are more ‘best in X course’ and even more ‘best essay out of X essays’. Or, if a student works full-time while also performing at a high intellectual level, ‘most industrious’ might be brought into play (with supporting evidence, of course). Here we get back to the information issue.

Length: Most of my letters are one single-spaced page for undergraduates. There have been a couple of exceptions for students I know especially well or whose records demand a fuller accounting, but realistically, one page is usually going to be enough. Given the limits of a committee’s time, long letters can conceal relevant information rather than highlighting it.

Customization: I would be lying if I said that I wrote a different letter for each student for each school, but I customize the school name and program that the person is applying to. Also, if the student is applying to two very different programs (e.g. in different disciplines), I will have two variants to encompass that fact. If a student has given me specific information as to the people they might want to work with, it only takes me a moment to add a sentence with specific names to my letter.

Writing My Own Letters: The other day, a friend remarked to me that it was customary in her department for referees to ask students to write their own letters and bring them to be signed. What the bloody hell kind of practice is that? While I understand that letter-writing can be a chore, I consider it to be profoundly unethical for students to be asked to write their own letters (not that I blame students who agree to do this – they are obviously not free to say no). Am I really strange to find this practice so disturbing?

Student Input: I will say, however, that I welcome student input into letters, particularly with regard to correcting errors or mentioning specific facts they would like me to have in the letter. After all, I don’t know every aspect of every student’s life, or why exactly this program is important to them. While I won’t censor my opinions or feel bound by a particular request, I do take suggestions very seriously.

Seeing Letters: I have a standing and explicit policy that before I send any letter (or after, or whenever), I will show any student the letter I have written for them. This not only ensures that I don’t make any stupid errors, but also (I hope) gives people peace of mind as to what is being said. So much of the grad admissions process is opaque, and my policy is intended to alleviate any fears – and of course, if the student doesn’t care to use my letter, they know exactly on what basis they make that decision.

Online Forms: I love the fact that many institutions now allow me to upload a PDF of my letter directly to the institution, bypassing the old ‘signed over the seal’ snail-mail method. I even have an electronic letterhead and signature to make it all fancy and official-looking. And they mostly send confirmation emails afterwards so I know it got there. And I certainly don’t mind having to fill out online check-boxes of the ‘top 5% – top 10% – top 25% – top 50% – oh god don’t admit this one’ variety. What I do object to is institutions that force you to rewrite your comments to fit a set of online questions (often with character limits). The process is annoying enough already without making faculty spend more time dividing up a perfectly good letter into little chunks, then getting anxious about whether we did it right.

Results: I was commenting to a friend the other day that if students knew how anxious and nervous we, as faculty, get about our students getting into grad school, they would either react with disbelief or laughter. And I suppose maybe some faculty just don’t care, but I’m young and idealistic and dammit, I do care a hell of a lot. So when I write a letter and then I don’t even hear where a student has been accepted, much less where they’re going, I get a little peeved. I don’t need a hand-written thank you note – email or Facebook or whatever will do quite fine. But I do think it’s rude not to let me know what the results of my efforts (and theirs) were.

So that’s it! Any issues I have missed? Any pet peeves or quirks? Am I way off base on some of these points?