Handbookery

Here are a couple of new publications of which I am very proud and which may be of interest to you. I’ve included them both lest the publishers involved think I’m playing favourites!

Chrisomalis, Stephen. 2008. The cognitive and cultural foundations of numbers. In The Oxford Handbook of the History of Mathematics, Eleanor Robson and Jacqueline Stedall, eds., pp. 495-517. Oxford: Oxford University Press.

Numbers are represented and manipulated through three distinct but interrelated techniques: numeral words, computational technologies, and numerical notation systems. Each of these has potential consequences for its users’ numerical cognition, but these consequences must be understood in terms of the functions and uses of each technique, not merely their formal structure. Most societies use numerical notation only to represent numbers, and have a variety of other techniques for performing arithmetic. The current Western practice of pen-and-paper arithmetic is historically anomalous. The transmission, adoption, and extinction of numerical systems thus depend primarily on the social and economic context in which cultural contacts occur, and only minimally on their perceived efficiency for arithmetic.

Chrisomalis, Stephen. 2009. The origins and co-evolution of literacy and numeracy. In The Cambridge Handbook of Literacy, David R. Olson and Nancy Torrance, eds., pp. 59-74. Cambridge: Cambridge University Press.

While number concepts are panhuman, numerical notation emerged independently only in state societies with significant social inequality and social needs beyond those of face-to-face interaction, and in particular with the development of written texts. This survey of seven ancient civilizations demonstrates that, although written numerals tend to develop alongside the first writing, the specific functions for which writing and numerals co-evolve are cross-culturally variable. A narrowly functionalist approach that generalizes the Mesopotamian case to all early civilizations and proposes that numerals always emerge for accounting and bookkeeping is empirically inadequate. An alternative theory is proposed that regards the emergence of writing and numerical notation as an outgrowth of elite interests relating to social control, but leaves unspecified the particular domains of social life through which those elites control non-elites. Numerical notation is a special-purpose representational system that, in its simplest form, unstructured tallying, is a precursor to written communication, and which persists and expands as a parallel notation in literate contexts.

Sciencing up the place

I got back late Saturday from the SASci/SCCR conference in Las Vegas, to be greeted in Detroit by several inches of new-fallen snow … oh joy! Although I hardly had the time or inclination to do any serious gambling while away, I did win modestly at the airport slots thanks to my flight being delayed for half an hour. My talk was sparsely attended but nonetheless well-received, and it looks like, as a result of these discussions, I’ll be presenting next year at the same conference as part of a session on anthropology and numerical cognition (in other words, exactly my field). In general, discussions about methodology in cognitive anthropology have led me to think quite a bit about my work this summer with Detroit middle school students, learning about mathematical concept formation. A real challenge in the anthropology of mathematics is that there aren’t very many anthropologists working on mathematics, and because mathematics is a weird sort of domain where referents are often abstract, our methodologies aren’t very well developed, as opposed to, say, the study of kinship terms or ethnobotanical knowledge. So I have been spending the past few days thinking a lot more seriously about elicitation tasks, what exactly a mathematics-oriented ethnographic interview ought to look like, and how on earth I can or should apply any of the highly theoretical knowledge I have acquired to this very grounded situation. Of course, I won’t really have the slightest clue what I’m doing until I actually start doing it, and possibly not even then.

But more generally, and despite receiving other, unrelated good news while away, it’s hard to be back from this particular conference feeling unmitigatedly positive about my discipline and my particular orientation within it. I’ve always been an oddball (and usually proud of it) in that I refuse to define myself within the usual four-field subdisciplinary taxonomy (physical, archaeological, cultural, and linguistic anthropology) common for the past century. I just don’t see any point, insofar as most of what ought to distinguish archaeologists from cultural anthropologists (e.g.) is methodological rather than conceptual. But then inevitably we get caught up in what is versus what ought to be, and the ways in which methodologies affect all other aspects of our work, and then we end up yelling at one another instead of being productive.

On top of that, you add the division between anthropology-as-humanism and anthropology-as-science, where I lean rather heavily towards the latter perspective, even though the subfield to which I’m usually attributed – linguistic anthropology – mostly leans the other way. The Science Wars had enormous fissioning effects on anthropology, such that some departments actually split administratively between humanistic and scientific wings, but some of that fissioning exists at a subdisciplinary level as well: you would be hard-pressed to find a physical anthropologist who rejects the label ‘scientist’, for instance. The Society for Anthropological Sciences is both a symptom of and a potential solution to these issues: it reflects a profound dissatisfaction with the humanistic bent of most cultural and linguistic anthropology, but at the same time, by organizing itself in opposition to those trends, it does little to convince any non-scientific anthropologists of the merits of the perspective.

For my part, I’m quite happy to use humanistic approaches when relevant, which is often. A lot of the empirical work underlying my forthcoming book, Numerical Notation: A Comparative History, examines the social, cultural, and political contexts in which particular numerical systems arose, spread, and declined. Much of the work is essentially epigraphy as applied to numbers, and the scholars I engage with are linguists, historians, classicists, and so on. Historians would surely recognize much of my analysis as akin to what they do, even if, by the nature of the subject, it tends to underemphasize the individual personalities involved.

But I can’t escape the feeling that all this humanistic analysis acquires greater relevance when embedded in the broader search for patterns, and within anthropology the analysis of social processes and the comparison of social systems. I am thrilled that the book retains the basic structure of my dissertation, with two separate analytical chapters, one cognitive, the other social, neither of which stands alone. But ultimately it is a comparative history, one which seeks to transcend the particular and get at something pan-human underlying it all. For an anthropologist today to admit to being a comparativist, outside of a very small number of venues, sometimes seems like admitting to being a cannibal. I do think I see some glimmers of hope that the field is becoming methodologically and theoretically more inclusive than when I was a grad student. I guess we’ll see, when the book is out, whether the reviewers agree.

What happens in Vegas

In a couple of hours I’m off to Las Vegas for the 2009 Society for Anthropological Sciences conference, where I’m presenting a paper entitled “Frequency dependent biases in the transmission of communication technologies”. If any of my readers are going to be there (unlikely though that may be), it’ll be … well, it will be more compelling than the abstract below makes it seem:


Frequency dependent biases in the transmission of communication technologies

Frequency dependent bias is a form of horizontal cultural transmission bias in which the frequency of a cultural trait influences the likelihood that others will adopt it. Previously seen as a unitary phenomenon, frequency dependence in fact consists of three separate types, each involving distinct decision-making processes and having different patterns of acceptance, retention, and abandonment. In particular, communication technologies, whose popularity determines their utility, exhibit unusual characteristics of cultural transmission. A brief case study from the phylogenetic history of written numerals demonstrates the usefulness of considering the different effects of frequency for the adoption of new communication technologies. More broadly, the prevalence of frequency dependent phenomena in various cultural evolutionary contexts suggests the need to examine decision-making processes more rigorously when evaluating the adoption and retention of cultural traits.
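
For readers who want a concrete sense of what frequency dependence means, here is a toy sketch of my own (emphatically not the model in the paper), in which the probability of adopting a trait is a nonlinear function of its current frequency; the exponent, which I’m calling strength purely for illustration, controls the type of bias:

import random

def adopt_probability(freq, strength):
    # Probability that an individual adopts the trait, given its current
    # frequency. strength > 1 gives conformist bias (common traits are
    # over-adopted relative to their frequency), strength < 1 gives
    # anti-conformist bias, and strength == 1 is unbiased copying.
    return freq ** strength / (freq ** strength + (1 - freq) ** strength)

def simulate(pop_size=1000, initial_freq=0.4, strength=2.0, generations=50):
    # Track the frequency of a single trait as each member of the
    # population independently decides whether to adopt it.
    freq = initial_freq
    history = [freq]
    for _ in range(generations):
        adopters = sum(1 for _ in range(pop_size)
                       if random.random() < adopt_probability(freq, strength))
        freq = adopters / pop_size
        history.append(freq)
    return history

# With strength > 1, a trait that starts below 50% tends to vanish and one
# that starts above 50% tends to fixate: the signature of conformist bias.
print(simulate(strength=2.0)[-1])

Part of the paper’s point is that communication technologies, whose utility depends on who else uses them, complicate this simple picture, but the sketch conveys the flavour of the decision-making processes at stake.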

I’ll try to put together something interesting in the way of a blog post while I’m away, provided I don’t get sucked in by the charms of the city. Catch you on the flipside!

How to read phone numbers

Over at my personal blog, The Growlery, I am conducting a non-scientific poll in the name of Science, collecting preliminary data to help me formulate research questions for a new project. The topic: how to read phone numbers. While I’m not using these data directly, the more respondents I get, the better I’ll be able to think about issues relating to the lexical interpretation of non-lexical numerical symbols. The poll should take no more than a few minutes to complete. To respond, you need a LiveJournal account, which can be obtained for free here.

An unshort answer to an unsimple question

I have not been as diligent as I should have been in completing a post that I’ve been thinking about for well over a month now. As her prize for successfully deciphering the unusual Wayne StatE UniversitY public inscription I posted back in September, my colleague Katherine Tong earned the right to ask a question relating to the subjects of this blog. Katherine asked me a question that is seemingly simple and yet highly complex. She would like me to address the ways in which computers (or, by extension, other technologies) may have affected the way we use language. In particular, she would like to know whether the morpheme ‘un-‘ has become more common (and more linguistically productive) since the advent of information technologies that allow operations to be readily reversed. I’ll deal with the broad issue first, followed by the more specific one.

This topic is broadly part of media ecology, whose anthropological proponents include such luminaries as Edmund Carpenter and Jack Goody, but which is better known through the work of people like the Canadian public intellectual Marshall McLuhan (Carpenter 1973, Goody 1977, McLuhan 1962). I was first introduced to these ideas by my teacher Christopher Hallpike at McMaster in the mid-90s, expanded my knowledge of them during my Ph.D. under Bruce Trigger (Trigger 1976), who had been influenced by ‘Toronto School’ thinkers like Harold Innis in the 1950s, and have most recently been influenced by the work of the developmental social psychologist of literacy, David Olson (Olson 1994).

The Media Ecology Association website defines the field as ‘the idea that technology and techniques, modes of information and codes of communication play a leading role in human affairs’ (http://www.media-ecology.org). In this fairly broad conception, virtually every social scientist is a media ecologist. More narrowly construed, it is the idea that differences in the way that information is represented and communicated affect our perception and cognition of that information. It ranges from studies of Paleolithic art to text messaging – very broad, nonetheless.

Now, Katherine is asking about the effects of information technology and media on language, and this is a tricky issue. Perhaps the trickiest part of all is establishing any sort of causality. How do we know, for instance, that any particular linguistic change is the direct result of a change in medium? Beyond that, there is the question of what non-trivial effects media have on language. There are obvious changes, such as the introduction of new lexical items: blog, spam, blogspam, blogosphere, Internet, web, intarwebs … the list could of course be expanded virtually indefinitely, without telling us very much about how people categorize and perceive the world. But I’m a cognitive anthropologist, so establishing meaningful links between language and non-linguistic behaviour is what I’m really interested in. So what about it?

So let’s look at ‘un-‘. One of the fascinating things about this morpheme is that it was actually more heavily used in Old English (prior to the Norman Conquest) than after. The Oxford English Dictionary tells me that “the number of un- words recorded in OE [Old English] is about 1250, of which barely an eighth part survived beyond the OE period.” This reduction came about as many of the artificial constructions attested in Anglo-Saxon poetry ceased to be used, words which would never have occurred in everyday speech but which were coined for specific metrical purposes. This is media ecology par excellence: the medium (poetic oral presentation) influenced output, and when the medium disappeared, so did the linguistic forms.

One of the odd things about ‘un-‘ words is that a number of Anglo-Saxon negations survive even where the positive versions of the word have disappeared. Michael Quinion, author of the brilliant site / e-newsletter World Wide Words, has a fascinating article on ‘unpaired words’ such as unwieldy, unruly, and disgruntled, all of which formerly had positive counterparts that have now disappeared. But what’s important to note here is that the loss of these terms was not predictable from any sort of social or technological change, and that despite these gaps in our lexicon, we seem to get along quite fine with synonyms or with multi-word phrases.

Important for this discussion is the word uncleftish, which never existed until the publication of ‘Uncleftish Beholding‘, science fiction author Poul Anderson’s fascinating account of atomic theory using only words and morphemes of Anglo-Saxon origin. Despite the fact that chemical jargon is filled with Greek and Latin terminology, it is possible (though not simple) to construct an understandable discussion of atomic theory using words like ‘uncleftish’ for ‘atomic’ (both mean ‘indivisible’). I’ve used this essay to get students to think about how language affects thought (linguistic relativity), most recently on my devilishly fiendish Language and Culture take-home exam last term, but also in my Evolutionary Anthropology class at McGill. It’s worth noting, though, that while you don’t need the word atomic to express the concept of indivisibility, nor indeed any Greek or Latin roots whatsoever, Anderson does need to coin uncleftish out of three existing morphemes: un-, cleft, and -ish.

The most famous ‘un-‘ neologism is the Orwellian ‘ungood’, a classic example of Newspeak and of the strong form of linguistic relativity it presupposes. “If you have a word like ‘good’, what need is there for a word like ‘bad’? ‘Ungood’ will do just as well – better, because it’s an exact opposite, which the other is not.” (Orwell 1949: 53). Pace Orwell, ungood has a long history in English, going back to Old English and attested sporadically thereafter right up until Orwell’s writing, at which time, of course, the word took on a far more sinister connotation.

But despite the obvious media-ecological implications of the quotation, there is no reason to think that replacing ‘bad’ with ‘ungood’ creates a cognitive gap, or that the absence of the word ‘bad’ would have any cognitive implications whatsoever. I’m a humanist of generally left-ish political persuasion, and a great admirer of Orwell’s novels and short fiction, but his essay ‘Politics and the English Language‘ (Orwell 1950) is not one of his best pieces of work, and falls prey to just this sort of muddle-headed thinking, equating the products of thought (in this case, written language) with the thoughts themselves. This is a form of linguistic relativity to which few if any linguists or anthropologists subscribe. I criticize this view in my short humorous article, ‘The perils of pseudo-Orwellianism’ (Chrisomalis 2007); without denying that good writing is easier to understand than poor writing, it simply isn’t sustainable that the use of jargon, or buzzwords, or neologisms, or clumsy phrasing, inexorably leads to laxity of thought, or to particular political positions. The literature on metaphor in linguistics is less reductionist, and far more sophisticated, than Orwell’s pronouncements, and requires that we understand, cognitively, exactly how words are actually used by human beings (e.g., Lakoff 1987). Shocking, I know.

In fact, there’s pretty good evidence for non-linguistic concept formation, which means that we have access to cognitive resources other than language that allow us to sidestep or ignore the cognitive frameworks that our particular language(s) might encourage. From my own narrow research perspective, I’m fascinated by the differences between linguistic and non-linguistic representations of number, with the implication that there are structured patterns of thought which follow from the use of particular graphic numerical systems, regardless of the structure of the number words of their users’ languages. Numerical notation is a visual technology for communicating numerical information: does it matter that we write 238 instead of CCXXXVIII? And if so, how so? In a couple of weeks I’m going to be giving a talk here at Wayne where, in part, I discuss the effects of the Western (Hindu-Arabic) numerals on the grammar of English numeral words, using telephone numbers as an example domain. For instance, if your phone number is 639-4625, you most likely pronounce it ‘six-three-nine-four-six-two-five’, and certainly not ‘six hundred and thirty-nine, four thousand six hundred and twenty-five’. For a user of Roman numerals, the pronunciation of digits as distinct lexemes would be nonsensical, but for users of Western numerals, this is commonplace.
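
For the concretely minded, the contrast can be sketched in a few lines of Python (a toy illustration of my own, not anything from the talk; the names are just placeholders):

DIGIT_NAMES = {
    "0": "zero", "1": "one", "2": "two", "3": "three", "4": "four",
    "5": "five", "6": "six", "7": "seven", "8": "eight", "9": "nine",
}

def digit_by_digit(number_string):
    # Read each positional digit as its own lexeme, the way English speakers
    # read phone numbers: '639-4625' -> 'six-three-nine-four-six-two-five'.
    return "-".join(DIGIT_NAMES[ch] for ch in number_string if ch.isdigit())

ROMAN_VALUES = [
    (1000, "M"), (900, "CM"), (500, "D"), (400, "CD"), (100, "C"), (90, "XC"),
    (50, "L"), (40, "XL"), (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I"),
]

def to_roman(n):
    # Roman numerals are additive and subtractive rather than positional:
    # 238 -> 'CCXXXVIII', whose signs do not map one-to-one onto the digits
    # 2, 3, 8, so there is nothing to read off digit by digit.
    signs = []
    for value, sign in ROMAN_VALUES:
        while n >= value:
            signs.append(sign)
            n -= value
    return "".join(signs)

print(digit_by_digit("639-4625"))  # six-three-nine-four-six-two-five
print(to_roman(238))               # CCXXXVIII

The digit-by-digit reading is available only because the written form already segments the number into positional digits; the Roman form affords no such segmentation.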

But now we are back to the effects of technology on language. I do think there are effects, but specifying where and when those effects will occur is tremendously complex, domain-specific, and (unfortunately) not predictable in any obvious way. Some people do in fact say ‘LOL’, and the verb ‘to lol’ may actually be achieving some currency; this of course is an acronym derived from ‘laugh(ing) out loud’ and emerged from online communication. LOL exists as a social lubricant, mediating online text-based communication in a medium that denies its participants the ability to see each other’s expressions and other nonverbal cues. But could we have *predicted* that LOL would emerge? I don’t think so. (Incidentally, I just used asterisks to indicate emphasis on ‘predicted’ – another media-ecological effect on language. In a Facebook chat conversation last week, a friend inquired about this usage, which was non-standard for her, but to me it indicates stress WITHOUT QUITE RISING TO THE LEVEL OF YELLING, WHICH REQUIRES ALL CAPS.) Having both these tools in my repertoire of online communication techniques – as well as the emoticon :o – gives me choices that wouldn’t otherwise be available.

You may have noted my use of the term ‘intarweb’, which emerged out of Usenet newsgroups in the early 90s as a means of gently mocking the ‘noobs’ – the new users of the Internet whose mastery of online lingo was sub-par and indeed mock-worthy. Of course, people have probably been blending words for as long as there have been words, but this particular coinage reflected a specific moment in the history of electronic technology, in which terms like ‘internet’, ‘web’, ‘online’, ‘e-‘, ‘Information Superhighway’, and ‘Information Age’ (cue laughter from those of my readers in on a particular inside joke) were well-known in the public sphere but in which knowledge of how to deploy these terms was less well-developed. But again, we can explain this phenomenon only in historical and sociocultural terms, rather than as a known effect of the new technology itself.

This is why, in my opinion, media ecology is most profitably practiced today through linguistic anthropology, which has as its central goal the comparative study of patterns of relationships between communication and culture. If we ever hope to get beyond the recitation of media-ecological anecdotes, we need a comparative framework within which to examine similarities and differences among communicative situations. Of course, I’m talking about a linguistic anthropology informed by biological and cognitive constraints on human communicative capacities, and one which includes archaeological and historical as well as ethnographic data among its sources. But only if we undertake this endeavour will we truly be able to answer Katherine’s unassuming and unfoolish question.

Works cited
Carpenter, E. S. 1973. Oh, what a blow that phantom gave me! Holt, Rinehart and Winston.
Chrisomalis, S. 2007. The perils of pseudo-Orwellianism. Antiquity 81: 204-207.
Goody, J. 1977. The Domestication of the Savage Mind. Cambridge University Press.
Lakoff, G. 1987. Women, Fire, and Dangerous Things: What Categories Reveal About the Mind. University of Chicago Press.
McLuhan, M. 1962. The Gutenberg Galaxy: The Making of Typographic Man. University of Toronto Press.
Olson, D. R. 1994. The World on Paper: The Conceptual and Cognitive Implications of Writing and Reading. Cambridge University Press.
Orwell, G. 1949. Nineteen Eighty-Four. Harcourt, Brace & Co.
Orwell, G. 1950. Shooting an Elephant and Other Essays. Secker & Warburg.
Trigger, B. G. 1976. Inequality and Communication in Early Civilizations. Anthropologica 18.