Today’s Saturday Morning Breakfast Cereal is more or less true. This is how I behave when faced with more or less any unusual behaviour, or, you know, anything. Ask me about all the stop signs I’ve run over the past year because I was paying too close attention to their wear and vandalism. Or ask my extremely patient wife. I’m going to ignore all the various ethnocentrisms (though the skin tone of the characters is noteworthy), and instead wonder whether the anthropologist is writing with his left hand because he’s actually left-handed or because of the spear through his right shoulder.
To grad or not to grad
There’s been a lot of handwringing lately among academics interested in the academic job market over the question of whether one should advise anyone to go to grad school in humanistic disciplines (presumably including the humanistic social sciences like anthropology). To be sure, the job market has been terrible for the past 40 years, and this year’s offerings have been especially thin, as the financial crunch has (at minimum) decimated endowments and made the state legislators responsible for public funding extremely wary. There have been a couple of rather mediocre, although not entirely wrong, articles in the New York Times of late bemoaning the academic job market and the prospects for graduate students.
But I want to talk about a couple of articles entitled ‘Just Don’t Go’ by Thomas Benton, a regular columnist in the fantastic Chronicle of Higher Education Careers section, which are of interest to me because they are written by a working, tenured academic in the humanities (links here and here). Benton argues that the only honest thing for faculty to tell prospective graduate students in the humanities is not to go to grad school; the prospects are simply too dim, and the waste of effort and of human productivity so immense, that the best thing to do is to give blanket advice not to go. He knows, of course, that many (most?) students who receive this advice will still apply to grad school, because they are drawn to intellectual life.
It’s unsurprising that such advice would become more pertinent in a down job market, where grad school might seem like a safe haven to ride out the economic maelstrom. Benton nevertheless notes that this is not just about this year, but about a general trend in academic employment over many decades. He’s absolutely right: academic employment is sparse and not about to improve dramatically anytime soon, possibly ever. Many people who complete a PhD will never work full-time as tenure-track faculty. And so he is also quite right that discouragement is a rational strategy for faculty confronted with multiple students interested in pursuing the doctorate.
Nevertheless, I find it intellectually dishonest and generally unwise to advise interested students, ‘Don’t go to grad school’. First, because to do so would be the height of hypocrisy, and would be perceived by many as a statement that I (as someone who ‘made it’) don’t think that my students have what it takes to make it. Second, because I think that if you gave every student that advice, some students would take it who shouldn’t, and others wouldn’t who should, with the potential result that the next generation of scholars would consist of those too foolish to listen to their professor’s good advice. Third, because telling someone what to do is less desirable than giving them good information and allowing them to make their own decision.
I do think that there are far too many PhDs in anthropology, and indeed in most of the humanities and social sciences. At the very least, there are too many degree-holders on the job market in comparison to the number of jobs available. The number of academic jobs fluctuates, and in most disciplines it has increased modestly over the past 20 years, while the number of candidates has increased dramatically. This Malthusian logic dictates that one’s chances of getting a job are not that great.
The reality is that approximately 50% of PhD graduates in anthropology will eventually end up on the tenure track somewhere (most often within five years of obtaining the degree, after which your chances decrease as you are perceived, rightly or wrongly, as ‘damaged goods’). Another 25% will end up in professionally appropriate positions in the public or private sector (this is particularly relevant for anthropologists, for whom there are well-defined non-academic yet professional jobs), while the rest end up somewhere else – but very few end up unemployed altogether. Are these chances good enough for you?
They might be. Of course, up to 50% of people who start the degree never finish it, so saying that 50% of PhDs will eventually hold a tenure-track job is incomplete: it describes only those who complete the degree, not everyone who starts one. Now, virtually no one admitted to a doctoral program ‘fails out’ in the sense that they lack the intellect to complete the degree. By far the most common reasons, in my experience, for people not finishing the PhD are a lack of money or a lack of motivation.
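To make the arithmetic explicit, here is a back-of-envelope illustration using the rough figures above (purely illustrative, not actual placement data): if roughly half of those who start a doctorate finish it, and roughly half of finishers eventually land a tenure-track job, then

$$P(\text{tenure track} \mid \text{start}) \approx P(\text{finish} \mid \text{start}) \times P(\text{tenure track} \mid \text{finish}) \approx 0.5 \times 0.5 = 0.25,$$

or about one in four of the people who enter a doctoral program.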
Motivation is not just about ‘having the will to persist’, although that’s important. It’s about developing a network of social relationships, especially with mentors but also with peers, that allows you to feel good about continuing in the program, to be intellectually rewarded and validated, and to remain on track. I was particularly blessed, as a student, to have some top-notch mentorship, but I regret to this day that my peer group was neither as large nor as close as I would have liked it to be. And I know plenty of people who had or have situations less congenial than mine, and who found themselves stranded without any meaningful support. This problem only gets worse if you are underfunded. The reality is that while persistence is the key, persistence is only realistic when you have a lot of support. So I think it’s worth telling students to research programs as thoroughly as possible, and to find schools where they can plausibly work with multiple people.
Similarly, money really is central, and is one of the reasons why, even though I don’t tell students ‘Don’t go to grad school’ outright, I do advise them not to go if it means taking on substantial debt, and realistically, to go only where they have funding. Students who can gain admission to a top PhD program right out of a BA, or who can find a funded MA program, are in good shape to move forward. For others, though, an MA means taking on tens of thousands of dollars of debt only to enter a PhD program that may be poorly or only partially funded. To finish the PhD you may then need to take on a lot of extra non-professional work or go even deeper into debt, possibly taking longer than your peers because of the demands on your time. If you have sources of income that allow you to do an unfunded MA, more power to you.
And what happens when you’re done? For an indeterminate period, you will likely be underpaid and underemployed, while paying back student loans and trying to find a job. If I had had any substantial student debt at all, I simply could not have afforded to work in academia for the four years following the completion of my degree. I could not have supported my family, and I would have left the discipline, not out of a lack of ability, but simply out of a lack of funds to continue the search in a tight job market. And the market is ALWAYS tight.
Another factor influencing post-degree success on the market is institutional prestige. It is a sad fact that academic disciplines are, and always have been, hierarchical in a way that is rarely recognized by most undergraduates. To demonstrate this, you need only look at the faculty list of almost any department and see where the faculty got their PhDs. You will find that the vast majority of tenured and tenure-track faculty got their degrees from the top 50 or so institutions in the world (for that discipline), with the top 10-20 schools being very well represented indeed. Not coincidentally, these institutions offer the most generous financial support for PhD students (although not always the best emotional, psychological, and other forms of support). It may be true that only 50% of PhDs in humanistic disciplines ever hold tenure-track jobs, but if you attended a highly ranked institution, your individual odds may be much better.
So, if you have the good fortune and ability to attend one of those programs, you will find that your chances of employment after completion are quite good. If not, well, your chances will be lower. What’s more, you should prepare for the fact that even if you do get a PhD, you will probably work at a less prestigious institution than the one you graduated from. Everyone can think of exceptions, but that’s just what they are – exceptions to an overwhelming statistical probability. And because many smaller and less prestigious institutions don’t even have anthropology departments (as opposed to, say, biology or psychology departments), attending a less well-known institution can hurt your chances of finding academic employment at all. Unfortunately, few departments provide detailed information about where their graduates end up after completion, and those with poor records have the least incentive to do so.
Now Benton wants to argue that the solution to this is to develop/train/find/invent/construct graduate students who do a PhD with no expectation of an academic career. And I think at some level it’s good advice to tell students that they need to prepare for the possibility of a non-academic career, not only psychologically but also in terms of the skills they acquire. This is particularly true in disciplines like anthropology, which do have significant (although not always obvious) professional outlets where PhDs earn a decent living outside of academia.
But more to the point, I think that the sorts of people who should be considering graduate school are those for whom the actual process of going to grad school is enjoyable and rewarding for its own sake (despite its struggles). One thing I do tell my students is to ask themselves, “If I spend six years in grad school, even if I never get a job, will it still have been worth it?” If they can honestly answer yes, that the process of learning and intellectual exploration is worth it for its own sake, then they should do it; if not, then they shouldn’t. And again the money comes into play – if one has to go into massive debt to do it, then it’s certainly less likely to be worth it.
And even further, I worry that while Benton is right about the job market, and right about the need to inform students of the realities of the market, he’s asking more of academics than anyone would ask of other professionals. We don’t tell artists not to do art, and the chances of financial success as an artist are far, far dimmer than the prospects for an academic. We don’t tell baseball players not to try out for the minor leagues just because the chances of them ever playing major league ball are minuscule. (The baseball analogy is one that a friend of mine mentioned to me some years ago and that I have been using ever since to talk to non-academics about the model under which academic employment works.)
And finally, I despair that Benton, while laudably promoting the vision of the grad student who doesn’t have the least expectation of a tenure-track job and expects to work outside academia, unrealistically imagines a world where anybody cares about the PhD outside of academia. It is a sad reality that PhDs who work entirely outside their fields (not just outside academia, but outside any profession where their disciplinary training is relevant) often have to conceal the fact that they hold an advanced degree in order to find work – the PhD actually serves as a deterrent to employers. Without denying that there can be a role for a ‘public intellectual’, I do deny that there is room for public intellectuals who are divorced entirely from the academic world and its own peculiar economy. Benton is imagining a world that simply does not exist, has never existed, and that we have no plausible means of bringing into existence.
My feeling is that of course we should apprise our undergraduates of the fact that their chances of success are not 100%, or perhaps not even 50%, and then take every step necessary to ensure that the best and brightest students who are going to go to grad school anyway, regardless of what we say, have the best possible chance at the sort of productive career that we ourselves enjoy.
Why is archaeology anthropology?
A recent post over at The Blogaeological Record, a new archaeology blog run by my former student Lars Anderson, has got me thinking about this crazy discipline of which I am a part. Lars has strong opinions, and is not afraid to state them, and is in the process of formulating his thoughts on anthropological archaeology in a public forum. So you should all head over there and welcome him to the community of anthropology bloggers.
In a recent set of posts, Lars has been talking about Kent Flannery’s now classic allegorical article, ‘The Golden Marshalltown’ (Flannery 1982) (1). Rereading this remarkable article for the first time in over a decade has set me thinking about some general issues in anthropology: the interaction of methods and theory, and the ‘proper’ relationship between archaeology and anthropology. In ‘Marshalltown’, Flannery, a renowned Mesoamerican archaeologist, invokes both empiricism and disciplinary holism as central to the survival of anthropology as a discipline, and of archaeological anthropology as a part of it.
The collection of more data (regardless of the source) is always a fundamental part of what we do as scholars. Flannery was writing against the tendency, always present in social science and sporadically in archaeology, to give pride of place to theoretical formulations ahead of basic day-to-day science. It’s not that he is anti-theory, but rather that he recognizes that theory without data is empty twaddle. In Flannery’s allegory, the gold-plating of the archaeologist’s Marshalltown trowel is the equivalent of an athlete hanging up his sneakers. While there is nothing quite so symbolic for the rest of us, the idea that what we are doing as scholars is constantly asking new questions and finding data to help us answer them is persuasive. The notion that there can be such a thing as ‘just a theorist’ is abhorrent to me, and should be to any social scientist, regardless of field.
The second criterion, disciplinary holism, is trickier to negotiate. Archaeology is a set of methods as well as an academic discipline, and those methods (survey and excavation foremost among them) can be employed in the service of many disciplines other than anthropology: medieval history, or classics, or Egyptology, etc. A well-known proverb among North American archaeologists is that “archaeology is anthropology or it is nothing”. But in fact the original quotation from Philip Phillips was that “New World archaeology is anthropology or it is nothing” (Phillips 1955: 246-7), later revised to “American archaeology is anthropology or it is nothing” (Willey and Phillips 1958: 2).
In either form, this is an odd statement that just gets odder the more you think about it. It argues that there is something fundamentally different about the New World that makes its study anthropological, whereas presumably some aspects of Old World archaeology can be anthropological, or not. But the criteria on which this is to be decided seem to me entirely arbitrary. In the latter form, it gives a nod to different disciplinary practices in Europe, where cultural anthropology stands apart from archaeology. But to define a regional tradition of archaeological practice in this way is hopelessly parochial and essentialistic. It also raises all sorts of problems when anthropological concepts and units are used uncritically to analyze phenomena whose temporal or spatial scale does not permit such facile analogies. In a now-famous article, Martin Wobst (1978) notes that the ‘tyranny of the ethnographic record’ has led some archaeologists to misinterpret aspects of the record of hunter-forager prehistory precisely because the units defined by ethnography have no direct relationship with material recovered archaeologically.
In ‘Marshalltown’, Flannery is, I think, not really concerned with this division – rather, he is concerned with the alternative perspective that ‘archaeology is archaeology is archaeology’ (Clarke 1968): that archaeological theories should not be dependent on insights from other disciplines. Flannery instead wants to insist on the robustness and utility of the anthropologically-derived culture concept for a vigorous anthropological archaeology. And I certainly have no beef with that (although if you talk to 100 anthropologists you will get at least 110 definitions of culture). But Flannery’s formulation is that of a New World and American archaeologist, and I think it is far too narrow.
I do not want to deny that the link between archaeology and anthropology is fundamental, and that the link must go both ways: social (and linguistic, and any other sort of) anthropology must learn from archaeology, and vice versa. The problem as I see it is that anthropology is not ambitious enough, and that both archaeology and cultural anthropology must conceptualize themselves as part of a broader human science if they are to remain useful. And in place of pronouncements about where archaeology fits within the Great Chain of Disciplinary Being, we ought to ask why certain formulations might (or might not) be useful.
Throughout his career, my mentor Bruce Trigger worked tirelessly to bridge the gaps between Egyptology and anthropological archaeology, with some success. But ultimately most Egyptologists even today have little anthropological training, and when a few of them do make efforts to expose their work to anthropologists, they are received with some skepticism. Even though fundamental techniques like seriation and stratigraphy were developed in Egyptological contexts, primarily through the work of scholars like Flinders Petrie, Egyptology remains distinct from archaeological anthropology. To this day it is part of ‘Near Eastern studies’, a historical/archaeological/literary discipline defined regionally, whereas Maya, Aztec, and Inka archaeology are linked to anthropology (as is the prehistoric archaeology of both the New and Old Worlds). This is methodologically unjustified, potentially ethnocentric, and theoretically timid (2).
An example: One of my favourite Egyptological papers is John Baines’ ‘Color terminology and color classification’ (Baines 1985), which is an attempt to integrate cognitive-anthropological work on colour terminology (e.g., Berlin and Kay 1969) with Egyptian art history. Published in American Anthropologist, it is also an effort to expose anthropologists to Egyptological work and to demonstrate that Egyptology is capable of being theoretically highly sophisticated. Baines points out that while the ancient Egyptian language has a paucity of colour words, the colour palette used in art has a greater variety of basic colours, and one that increases over time. Baines uses this to support the Berlin/Kay theory of a patterned development of colour terms along a universal framework while pointing out that there may not be a simple correspondence between the linguistic ‘palette’ and the artistic one. Because Egyptology has access to both linguistic (textual) and archaeological (art) evidence throughout several thousand years, it is possible to directly verify (and to complicate) an evolutionary sequence that can only be inferentially reconstructed using ethnography.
I should be clear that I don’t really blame archaeologists for any of this; for archaeology to be treated (as it is by many cultural anthropologists) as a ‘kid brother’ subdiscipline that can at best borrow from other fields is a gross injustice. In North America, virtually every archaeologist is expected to be at least moderately familiar with the techniques, theories, and concepts of cultural anthropology, while the converse is not even remotely true except at a very few institutions. I am one of a small minority of non-archaeologists who have read and taught widely on archaeological subjects. I’m certainly not saying that everyone should have done what I did – for instance, it clearly hurt my career to be ‘hard to define’ subdisciplinarily. But I think that having people who are trained as generalists, as polymaths, and as interdisciplinary scholars, even while maintaining a core disciplinary allegiance, can only be to the benefit of the human sciences, which are (or ought to be) hard to delineate in such clear ways.
I’m a synthesist by nature; I love finding hidden connections between fields of study that otherwise seem unrelated, like evolutionary anthropology and the history of mathematics, or Assyriology and developmental psychology, or (as with Baines) Egyptology and cognitive anthropology. I worry that by defining anthropology too narrowly as ‘ethnography’ or ‘ethnology’, archaeologists miss real opportunities to contribute to a broader framework of social and historical theory. No one is arguing that archaeologists should gild their Marshalltowns, but for them to define themselves methodologically rather than conceptually would be an even greater mistake. But even more importantly, anthropologists of all sorts are missing an opportunity to frame themselves as the holistic core of an integrated mosaic of human sciences.
Notes
(1) For those of you who may not know, Marshalltown is the largest and most prominent manufacturer of archaeological trowels, and is iconic among American archaeologists.
(2) The same is true to a greater or lesser extent of Assyriology, classics, Sinology, medieval history, and Indology, which conceptualize archaeology as part of history rather than as part of the cross-cultural enterprise currently exemplified by anthropological research.
Works Cited
Baines, J. 1985. Color terminology and color classification: Ancient Egyptian color terminology and polychromy. American Anthropologist 87: 282-297.
Berlin, B., and P. Kay. 1969. Basic color terms. Berkeley: University of California Press.
Flannery, K. V. 1982. The golden Marshalltown: A parable for the archeology of the 1980s. American Anthropologist 84: 265-278.
Phillips, P. 1955. American archaeology and general anthropological theory. Southwestern Journal of Anthropology 11: 246-250.
Willey, G.R. and P. Phillips. 1958. Method and Theory in American Archaeology. Chicago: University of Chicago Press.
Wobst, H. M. 1978. The archaeo-ethnology of hunter-gatherers or the tyranny of the ethnographic record in archaeology. American Antiquity 43: 303-309.
Sciencing up the place
I got back late Saturday from the SASci/SCCR conference in Las Vegas, to be greeted in Detroit by several inches of new-fallen snow … oh joy! Although I hardly had the time or inclination to do any serious gambling while away, I did win modestly at the airport slots thanks to my flight being delayed for half an hour. My talk was sparsely attended but nonetheless well received, and it looks like, as a result of these discussions, I’ll be presenting next year at the same conference as part of a session on anthropology and numerical cognition (in other words, exactly my field).

In general, discussions about methodology in cognitive anthropology have led me to think quite a bit about my upcoming work this summer with Detroit middle school students, learning about mathematical concept formation. A real challenge in the anthropology of mathematics is that there aren’t very many anthropologists working on mathematics, and because mathematics is a weird sort of domain where referents are often abstract, our methodologies aren’t especially well developed, as opposed to, say, those for the study of kinship terms or ethnobotanical knowledge. So I have been spending the past few days thinking a lot more seriously about elicitation tasks, about what exactly a mathematics-oriented ethnographic interview ought to look like, and about how on earth I can (or should) apply any of the highly theoretical knowledge I have acquired to this very grounded situation. Of course, I won’t really have the slightest clue what I’m doing until I actually start doing it, and possibly not even then.
But more generally, and despite receiving other, unrelated good news while away, it’s hard to come back from this particular conference feeling unmitigatedly positive about my discipline and my particular orientation within it. I’ve always been an oddball (and usually proud of it) in that I refuse to define myself within the usual four-field subdisciplinary taxonomy (physical, archaeological, cultural, and linguistic anthropology) common for the past century. I just don’t see the point, insofar as most of what ought to distinguish archaeologists from cultural anthropologists (for example) is methodological rather than conceptual. But then inevitably we get caught up in what is versus what ought to be, and in the ways in which methodologies affect all other aspects of our work, and we end up yelling at one another instead of being productive.
On top of that, you add the division between anthropology-as-humanism and anthropology-as-science, where I lean rather heavily towards the latter perspective even though, as a ‘labelled’ linguistic anthropologist, most of my attributed subfield leans the other way. The Science Wars had enormous fissioning effects on anthropology, such that some departments actually split administratively into humanistic and scientific wings, but some of that fissioning exists at a subdisciplinary level as well: you would be hard-pressed to find a physical anthropologist who rejects the label ‘scientist’, for instance. The Society for Anthropological Sciences is both a symptom of and a potential solution to these issues: it reflects a profound dissatisfaction with the humanistic bent of most cultural and linguistic anthropology, but at the same time, by organizing itself in opposition to those trends, it does little to convince non-scientific anthropologists of the merits of the perspective.
For my part, I’m quite happy to use humanistic approaches when relevant, which is often. A lot of the empirical work underlying my forthcoming book, Numerical Notation: A Comparative History, examines the social, cultural, and political contexts in which particular numerical systems arose, spread, and declined. Much of the work is essentially epigraphy as applied to numbers, and the scholars I engage with are linguists, historians, classicists, etc. Historians would surely recognize much of my analysis as akin to what they do, even if, by the nature of the subject, it tends to underemphasize the individual personalities involved.
But I can’t escape the feeling that all this humanistic analysis acquires greater relevance when embedded in the broader search for patterns, and, within anthropology, in the analysis of social processes and the comparison of social systems. I am thrilled that the book retains the basic structure of my dissertation, which has two separate analytical chapters, one cognitive, the other social, neither of which stands alone. But ultimately it is a comparative history, one that seeks to transcend the particular and get at something pan-human underlying it all. It sometimes seems that for an anthropologist today to admit to being a comparativist, outside of a very small number of venues, is like admitting to being a cannibal. I do think I see some glimmers of hope that the field is becoming methodologically and theoretically more inclusive than it was when I was a grad student. I guess we’ll see, when the book is out, whether the reviewers agree.
What happens in Vegas
In a couple of hours I’m off to Las Vegas for the 2009 Society for Anthropological Sciences conference, where I’m presenting a paper entitled “Frequency dependent biases in the transmission of communication technologies”. If any of my readers are going to be there (unlikely though that may be), it’ll be … well, it will be more compelling than the abstract below makes it seem:
—
Frequency dependent biases in the transmission of communication technologies
Frequency dependent bias is a form of horizontal cultural transmission bias in which the frequency of a cultural trait influences the likelihood that others will adopt it. Previously seen as a unitary phenomenon, frequency dependence in fact consists of three separate types, each involving distinct decision-making processes and having different patterns of acceptance, retention, and abandonment. In particular, communication technologies, whose popularity determines their utility, exhibit unusual characteristics of cultural transmission. A brief case study from the phylogenetic history of written numerals demonstrates the usefulness of considering the different effects of frequency for the adoption of new communication technologies. More broadly, the prevalence of frequency dependent phenomena in various cultural evolutionary contexts suggests the need to evaluate decision-making processes more rigorously when evaluating the adoption and retention of cultural traits.
—
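As a rough illustration for non-specialists (and emphatically not the model in the paper, which distinguishes three separate types of frequency dependence), here is a minimal sketch of one familiar form of the phenomenon: conformist-style positive frequency dependence, in which a trait’s probability of adoption rises nonlinearly with its current frequency. Every name and parameter below is my own illustrative choice, not anything from the paper:

```python
# Toy model of conformist (positive frequency dependent) transmission.
# Individuals adopt trait 'A' with a probability that exaggerates A's
# current frequency in the population; conformity = 1 is unbiased copying.
import random

def adoption_probability(freq_a, conformity=3.0):
    """Probability of adopting trait A given its current frequency (0..1)."""
    return freq_a ** conformity / (freq_a ** conformity + (1 - freq_a) ** conformity)

def simulate(pop_size=1000, initial_freq=0.4, generations=50, conformity=3.0):
    """Return the frequency of trait A over successive rounds of biased copying."""
    population = ['A' if random.random() < initial_freq else 'B' for _ in range(pop_size)]
    history = []
    for _ in range(generations):
        freq_a = population.count('A') / pop_size
        history.append(freq_a)
        p = adoption_probability(freq_a, conformity)
        # Everyone re-decides each round based on the population-level frequency.
        population = ['A' if random.random() < p else 'B' for _ in range(pop_size)]
    return history

if __name__ == '__main__':
    print(simulate()[-1])  # a 40% minority trait is typically driven toward extinction
```

With the conformity parameter above 1, a minority trait tends to be eliminated; at 1, copying is unbiased and the frequency simply drifts. Communication technologies, whose utility depends on their popularity, are a natural home for dynamics of this general sort.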
I’ll try to put together something interesting in the way of a blog post while I’m away, provided I don’t get sucked in by the charms of the city. Catch you on the flipside!
