A typology of quotation marks

I’ve been thinking about quotation marks lately (okay, now I’ve lost 99% of my readership already, way to go, Steve!) and the different ways we use them. Because I have a strong interest in literacy and culture and the way in which language gets turned into text, these sorts of things excite me in a way that is probably not entirely healthy – but then again, if they didn’t, you wouldn’t have a post to read. So without further delay, I give you…

“A” “typology” “of” “quotation” “marks”

Quotative: This is the common case in which quotation marks serve to distinguish matter spoken or written in another context, with the presumption that the quoted matter is being reproduced somewhat faithfully. The material may have been spoken or written originally, but there is a much higher expectation of word-for-word reproduction when quoting written material, for the obvious reason that the writer can copy from a written source. This was the original sense of quotation marks in printed matter, and it remains the most common. It helps us to distinguish sentences like

Martha said, “Canada is a fascist dictatorship.”


Martha said that Canada is a fascist dictatorship.

In the first case we are clearly meant to understand that Martha spoke those words, whereas in the second Martha might well have said, “Our government is heading towards fascism” or any number of other things.

Neologistic: Quotation marks are frequently used when an author coins a neologism, or coins a phrase using already existing words. One is not quoting some earlier source directly; one is seeking instead to indicate the novelty of the term being used. So, for instance, in this post, I write:

The effect of this ‘conspicuous computation’ was to impress the reader with the vastness of the quantity, serving as an indexical sign of Rome’s military might.

I’m not quoting myself here – I’m coining a new phrase and using quotation marks to alert the reader to this fact. We get into trickier ground when we put quotation marks around a single, existing word that we intend to use in a new sense, as in the following passage:

Let me explain first what I understand by “sociolinguistic”. I use the term in its adjectival form and speak of “sociolinguistic” kinds of research rather than “sociolinguistics”. (Hymes 1971: 42)

The context strongly suggests neologism, but another reading is that Dell Hymes (the author, a renowned sociolinguist / linguistic anthropologist) is seeking to dismantle the entire concept of sociolinguistics, or at least to shift its meaning substantially in this context. If so, we’re dealing with another sense entirely.

Distancing: The quotation marks serve to distance the author from the matter in quotes, but where that matter is not a faithful reproduction of other matter. One finds these very often in the titles of British newspaper articles, possibly because British libel laws are very strict and one could find oneself liable for making a statement that is not a direct quotation of another source but which is also not hard fact. They frequently have a quotative smell to them, insofar as they often relate to assertions or claims by another party, but in fact they are not quotative at all, and often appear to be paraphrases at best. I posted about this elsewhere a couple of years ago, and I still find this use jarring. An example:

‘Many killed’ in Yemen air raid

The BBC is not trying to say that someone wrote or said the words “many killed” in that order or even that the quote is an abbreviation of “many people were killed”. It is reporting on others’ claims, true, but the purpose is not quotative. We can think of the distancing quotes as being quotative minus the condition of (near-)faithfulness.

Ironic: Ironic quotation marks often also distance the author from the words written, but more importantly, they distance the meaning of the quoted matter from its standard or accepted one. These are often called “scare quotes” by academics, a term which I find bothersome because they aren’t meant to scare anyone. I am indebted to my colleague Jacalyn Harden, who came up with the metaphor of quotation marks as eyelashes – ironic quotes serve as a textual “wink” alerting the reader that some novel sense is intended. Wikipedia uses an example from my late mentor, Bruce Trigger:

Moctezuma II was reported to have had two wives and many concubines, by whom he had a total of 150 children. The king of Texcoco was said to have had more than two thousand “wives” by whom he had had 144 children, 11 born of his chief wife. (Trigger 2003: 178)

So we understand here that two thousand “wives” in the second sentence is not to be understood in the same sense as two wives in the first sentence. In both cases, a ruler is claimed (“reported” vs. “said”) to have some number of wives, so we can tell that the difference is not due to the quotative vs. non-quotative distinction. Because I knew the author of those words and worked on that very book, I do not think that Trigger meant them deconstructively (see below) or in any other sense. Rather, the marks are there because having two wives is not at all uncommon (even having two wives simultaneously is hardly a historical anomaly), but having two thousand wives strains credulity: the semantic associations we derive from the word wife could never be extended to the relationship between one man and two thousand women.

Deconstructive: There are scare quotes, and then there are “scare quotes”, and these are the latter. Where ironic quotes use a word in a sense other than its standard one, deconstructive quotes imply that the object being quote-marked does not in fact exist. So, for instance, when one talks about “race” as opposed to race, one is noting that there is no biological reality to the race concept. Perhaps the most fantastic and potentially incomprehensible example is the following, from the linguistic anthropologist Michael Silverstein:

The important fact, then, is that “I” am to a certain extent what “I” say about “what” “I” drink as much as what “I” say about “it” reflects what “I” can discern about “what” “it” is. (Silverstein 2006)

In Trigger’s example above, he cannot mean that wives do not exist at all – he explicitly rejects this by his use of the un-quote-marked word in the previous sentence, and the un-quote-marked word wife in the second sentence as well. There are wives, and then there are “wives”. But in Silverstein’s example, he is really saying that “I” and “what” and “it” (the latter two referring to ‘that which I drink’) do not exist as real entities – they are socially constructed, to use one well-understood if less-than-ideal term. In ironic quotation marks, “A” is not A, but B, while in deconstructive ones, “A” is not A and is not anything else either.

Emphatic: The quotation marks serve as visual emphasis alone, and are not meant in an ironic, distancing, or quotative function. Most writers, I suspect, would treat this usage as an error, but it is widespread enough to deserve our attention. It is most frequently found on mercantile and informational signs, especially handmade ones. I refer you to The “Blog” of “Unnecessary” Quotation Marks, which gives such great examples as:

Closed “Monday”

“Fire Exit”
Please Do Not Use
Alarm Is On

and my personal favourite:

Thank “God” For All the Troops

These can clearly be excluded from the five other categories. Instead, the quotation marks serve as a sort of typographic highlighter, a means of emphasizing some words in the text. This is confirmed by the contextual association of emphatic quotes with billboards, signs, placards, and other texts meant for wide public visibility, and by the fact that many of the quote-marked words are also emphasized in some other way: boldface, underlined, capitalized, or in larger letters than the rest of the text. Are they truly “unnecessary”? Yes, in the sense that there are other ways to emphasize text, and because this sense is non-standard, some humor derives from understanding emphatic quotes as meaning something else (usually ironic). For instance, take this discussion at the unnecessary quotations blog over the sign Sellersburg Welcomes “President” George W. Bush. Sly jab at perceived electoral fraud, or over-ebullient semantic extension of well-known punctuation? You decide.

It would be very interesting to expand this analysis to specify more clearly the “etymology” (ironic) of each of the six forms and then to examine the historical and semantic relations among them. For instance, I suspect that the quotative and neologistic usages are earliest but that the broad semantic aspect of distance is what unifies all the senses except the emphatic. I also think one could do some very interesting corpus linguistics using students to code instances of quotation reliably, both in terms of frequency in different texts and in terms of this semantic typology. Finally, I haven’t even discussed the use of single versus double quotes (which could have some interesting correlations with my typology), or talked about “embodied” (neologistic) quotation marks in the form of “air quotes” (quotative?). Well anyway, if I write the paper, it’ll give me something to “talk” (ironic) about.
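If one wanted to attempt the corpus-linguistic coding exercise suggested above, the first step is mechanical: pull every quote-marked span out of a text, along with enough surrounding context for a human coder to assign one of the six categories. Here is a minimal sketch in Python – the function and field names are my own invention for illustration, not an existing tool:

```python
# Extract quote-marked spans from a text for hand-coding into the
# six-category typology. Both curly and straight double quotes are
# matched; single quotes would need a (trickier) separate pattern.
import re

CATEGORIES = ["quotative", "neologistic", "distancing",
              "ironic", "deconstructive", "emphatic"]

QUOTE_RE = re.compile(r'[“"]([^”"]+)[”"]')

def extract_quoted(text, window=30):
    """Yield each quoted span with surrounding context for coders."""
    for m in QUOTE_RE.finditer(text):
        start, end = m.span()
        context = text[max(0, start - window):end + window]
        # "code" is left empty for the human coder to fill in.
        yield {"quoted": m.group(1), "context": context, "code": None}

sample = 'Martha said, “Canada is a fascist dictatorship.”'
for item in extract_quoted(sample):
    print(item["quoted"])  # Canada is a fascist dictatorship.
```

The interesting work, of course, is entirely in the human coding step – no regular expression can tell an ironic quote from a quotative one.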

Works cited
Hymes, Dell. 1971. The Contribution of Folklore to Sociolinguistic Research. The Journal of American Folklore 84(331): 42–50.
Silverstein, Michael. 2006. Old Wine, New Ethnographic Lexicography. Annual Review of Anthropology 35: 481–496.
Trigger, Bruce G. 2003. Understanding Early Civilizations: A Comparative Study. Cambridge: Cambridge University Press.


What we do

Julien at A Very Remote Period Indeed (which I am so glad to see has been revived after a hiatus!) has just offered up a scathing and brilliant response to some clod from Fox News who obviously hasn’t the least idea what university professors do and thinks that the reason postsecondary education is so expensive is that university administrators aren’t whipping faculty to work hard enough. Julien is writing from the perspective of a field archaeologist, but as someone who got one (yes, one) week of actual vacation this summer, I have to add my “Right on, brother!” Also, most of us don’t actually get *paid* for the work we do in the summer (but if we didn’t do it, we wouldn’t be able to do the research needed for tenure and promotion). We also serve on any number of committees (I’m on three this year, and I’m considered ‘protected’ from a heavier load as a junior faculty member), advise and mentor grad students, read scholarly literature (shocking, I know!), write scholarly literature (doubly shocking!), apply for grants, respond to emails (constantly), prepare and write our lectures and seminars, grade student work, etc., etc., etc. So, although I’m sure no one reading this blog thinks this, if there is anyone out there who believes that professors are only working when we’re standing in front of an undergraduate classroom – uhh, I hate to bring it up, but your ignorance is showing.

And with that said, it’s time for me (at 9:45 on a Monday evening) to go work on the index for my book.

Ysteriousmay esselvay

Archaeologists working at Mount Zion in Israel have uncovered a stone vessel (dated between 37 BCE and 70 CE by archaeological association) bearing a cryptic script. The vessel, which is around 13 cm high, bears ten lines of writing scratched into the stone; the inscription appears to be a mixture of Hebrew and Aramaic (we are not told on what grounds) but has not yet been deciphered.

Now that’s interesting. Let’s think about that for a minute. The article claims that “the cup’s script appears to be a secret code, written in a mixture of Hebrew and Aramaic, the two written languages used in Jerusalem at the time”. Both Hebrew and Aramaic are well-understood scripts that represent well-understood languages. So what on earth is going on here?

We don’t have a transcription at all, and I’ve been unable to find any further information yet, but what I suspect is going on is that some of the characters on the vessel are in Hebrew script, and others in Aramaic script, but that the translation of these doesn’t yet reveal a meaningful interpretation in either the Hebrew or Aramaic languages. It’s possible that the inscription records a third language (oh, let’s say … Phrygian, just to be obscure yet controversial), or that it is some sort of linguistic code (think ‘Pig Aramaic’, if you will) or even a substitution cipher. We just don’t know what the language is yet – and that’s really the most likely way we could get a text like this in two well-known scripts that we can’t read.
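To illustrate the difference between an unknown language and an encoded one, here is a toy substitution cipher in Python – purely hypothetical, and no claim whatsoever about the actual inscription – showing how text in a perfectly familiar script can still be unreadable: every character is recognizable, but the letter-to-letter mapping is not.

```python
# A toy shifted-alphabet substitution cipher: familiar letters,
# unreadable text until the mapping is recovered.
import string

def make_cipher(shift=5):
    """Build encode/decode translation tables for a shifted alphabet."""
    plain = string.ascii_lowercase
    secret = plain[shift:] + plain[:shift]
    return str.maketrans(plain, secret), str.maketrans(secret, plain)

enc, dec = make_cipher()
ciphertext = "mysterious vessel".translate(enc)
print(ciphertext)                 # rdxyjwntzx ajxxjq
print(ciphertext.translate(dec))  # mysterious vessel
```

Decipherers of a real text face a harder version of this problem, since they must also entertain the possibilities that the underlying language is unknown or that no cipher is involved at all.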

This is often a problem in media reporting on paleography and epigraphy – no distinction is made between the writing system (script) and the language of the inscription. The good news is that the research team is “sharing pictures of the cup with experts on the writing of the period. The researchers also plan to post detailed photos of the cup and its inscriptions online soon.” Now that’s going to make progress a lot faster.

No science like Snow’s science

I had the privilege this afternoon of attending a fascinating panel discussion run by several of my colleagues here at Wayne State, entitled “The Two Cultures” by C.P. Snow — 50 Years Later and sponsored by WSU’s Humanities Center and the Institute for Information Technology and Culture. Snow was a physicist, bureaucrat, and novelist whose Two Cultures lecture in 1959 postulated a grave gap in knowledge and interest between the sciences and the humanities. Specifically, he claimed that while the intellectual milieu of the time required all scientists to be at least somewhat familiar with some history and literature, humanists viewed scientific knowledge as the purview of scientists alone, something of which they could remain blissfully ignorant. Snow saw this as a grave problem in a society whose technological complexity had grown to a point where policy-makers required scientific literacy to make informed decisions.

The debate today was lively and generally productive. As a linguistic anthropologist who studies the history of mathematics, I was struck by the fact that all three of my disciplines are betwixt and between Snow’s two cultures (anthropology quite proudly so) – and, moreover, that the panelists today were two mathematicians, an anthropologist, a linguist, and a historian of science! My remarks below are based on an issue I raised at the end of the discussion, but which I feel deserves more of my attention now.

To the point: I do not believe there are any grounds at all to believe that there is such a thing as ‘science’ to be made clearly distinct from ‘the humanities’ – that at best these are used to designate semi-useful collocations of perspectives, and at worst, they are self-serving labels used to isolate oneself and to denigrate others. True, I have elsewhere here posted on linguistic anthropology as an integrated science, and I don’t retract anything I said there. Perhaps I would rephrase what I wrote to say that my belief is that too much linguistic anthropology self-identifies as purely or largely humanistic and thus defines itself out of some really interesting subject material, e.g. relating to the evolution of language.

Snow lived and worked at the height of modernism in the academy: for the social scientist, behaviorism, functionalism, and structuralism were all in full bloom. What he did not foresee, and could not possibly have foreseen, is the emergence of the ‘Science Wars’ or ‘Culture Wars’ in which two camps defined themselves in opposition to one another. Starting in the 1970s (or earlier or later, depending on who you ask), ‘science’ was severely criticized from various angles that we might generally label postmodern or poststructuralist. The response from ‘scientists’ (do people really call themselves ‘scientists’ unironically any more?) ranged from ignoring the new trend to bafflement to outright hostility. Certainly the scientific community’s response came only after the humanists’ initial criticisms.

In fact, however, the label ‘war’ is quite inappropriate since very little of the academic discussion that we might now define under one of these terms actually involved academic debate between the two camps. Rather, the sides served as useful straw men to be marshalled in front of one’s fellow-travelers, serving as an emblem of clan identity (as a shibboleth). Moreover, drawing these boundaries allowed one to safely ignore that which lay beyond them as unnecessary, irrelevant, or just plain wrong. Just as we recognize that you can’t draw a line around ‘a culture’ without asking who is doing the defining and for what reason (and in whose interest), I believe that there is ultimately very little behind the distinction between Science and Humanities that cannot be explained in terms of a rather narrow set of interests, both internal and external.

In certain circles among Science, to deride social construction was to be seen as supporting empiricism against know-nothing relativism, and with it came suspicion of the totality of the humanities. They simply were not seen as relevant for anything most practitioners of “serious, empirical” disciplines would do. Conversely, Science was seen as the enemy by some humanists, or at least a subject about which they needed to know nothing in order to comment usefully. Few ethnographers, for instance, could admit publicly that they detested the people among whom they worked, but I would have no difficulty finding ethnographers and historians of science who have contempt for the people they study and the work they do. Still, many in the humanities, probably most, had nothing to do with these debates and continued to do sound empirical research using methodologies not so different from those of Snow’s day. And some disciplines, like anthropology and linguistics, have always been part of multiple traditions and have defied easy definition.

My question ultimately rests on how distinct the humanities and the sciences were as concepts prior to World War II, and what explanation we might give if they have become increasingly distinct over time. I proposed, only half-jokingly, that to define the humanities as a bounded group of disciplines allows Science to define ‘those whom we do not have to fund’, and to define Science allows the humanities to define ‘the object of our newfound ire’. Snow shows that the labels were cognitively relevant to academics even in the late 1950s; they remain so in a vastly different social context and funding environment, one in which both humanists and scientists are beholden to crass interests that emphasize applicability over insight and wealth over wisdom. But rather than suggesting that ‘sciences’ and ‘humanities’ are essential and timeless categories, I suggest that they remain useful to scholars because these definitions allow one to exclude huge swaths of knowledge from one’s purview. They tell us what we do not need to know to do our jobs, much to our ultimate loss.

For my own part, my research involves reading in fields such as semiotics, communication, epigraphy, history of science, historical sociology, anthropology, archaeology, linguistics, mathematics, evolutionary theory, developmental psychology, and cognitive neuroscience. No, really. I admit that I’m only reading from cognate fields to the extent necessary to my own work on number systems, but these are not just pastimes – they are necessary ancillary reading in order for me to have a fuller grasp of the subject I’m most closely engaged with. I can’t possibly imagine how I could neglect the detailed historical studies that form the core of my book, and at the same time I can’t imagine what my work would look like if that was all I did. But I don’t think that I have a particularly broad training or that I am a particularly energetic reader – far from it! I’ve simply defined my areas of specialty in a way other than the grand dichotomy between Science and Humanities.

For me, the critical study of race and gender is impossible without some familiarity with human physiology and the variation in bodies that is really real. Similarly, however, it is inane to study religion solely from a cognitive neuroscience perspective without understanding the moral and metaphysical models that are honestly perceived as valid by believers. My point is that the study of academic topics will frequently – though not inevitably – require one to be familiar with the methods, concepts, and practitioners of several disciplines across the socially-defined ‘spectrum’ of humanities and sciences. This will be particularly true for any topic that involves human beings.

I do not intend the foregoing to be read as a sort of facile constructivism of the “oh, nothing is real, let’s all fall into a nihilist heap and abandon any interest in objectivity” school of thought which has always been more real ideologically than in practice, in any case. If someone has a definition of science that excludes all of the disciplines of history or philosophy or linguistics, and really wants to insist that this is Science, eternal and unchanging, they’re welcome to it. For my part I’ve often found that when the stakes are highest, definitions are little more than post facto rationalizations of what one wants to believe one is doing, and what one believes one’s rivals by definition could not have done.

Levantine hieroglyphs in the Early Bronze Age

A 4cm fragment of a carved stone plaque (photo here) has been found in northern Israel at the site of Tel Bet Yerah, depicting an arm bearing a scepter and an early form of the Egyptian ankh symbol. It appears to date to the First Dynasty of Egypt (ca. 3000 BCE), making it, by some centuries, the earliest evidence of Egyptian writing outside of Egypt proper – although it isn’t a text that could be understood linguistically, and in fact is very small. The press release from Tel Aviv University isn’t clear about how it was dated (whether contextually by association with other material, or paleographically/iconographically from the style of the inscription), but notes that it is “the first artifact of its type ever found in an archaeological context outside Egypt”, whatever ‘of its type’ means. Either way, this is strong evidence that Egyptian representational traditions were known in the Levant in the Early Bronze Age, 1500 years before the New Kingdom, when Egypt first exercised direct political authority in the region.

Ajami and Western numerals

Pardon me as I sort out my long list of posts that got shelved prematurely this summer during my fieldwork. There is a really neat little article entitled Lost Language in Bostonia, the Boston University alumni magazine. It’s a fascinating look at the research of Fallou Ngom, who specializes in Ajami writing. Ajami is the name given collectively to modified versions of the Arabic script used to write various West African (non-Arabic) languages. Once you set aside the ridiculous title of the article – Ajami is only ‘lost’ in the ethnocentric sense that most Western scholars don’t know about it, and it is most definitely not a language but rather a set of writing systems, each used to write a specific language – it’s an interesting look at a neglected subject relating to an area that is often misperceived as illiterate and having made no contributions to intellectual life.

But what interested me most about the article were the two photos of Ajami manuscripts – one right at the top of the article, another around two-thirds of the way down. And while, yes, I may be the only person to find this really striking, both of the pages are numbered using Western numerals (51 and 7, respectively). In virtually all handwritten and printed Arabic literature, the set of Arabic numerals ٠١٢٣٤٥٦٧٨٩ is used, not the Western numerals 0123456789. In fact, these numerals have been among the most resistant to replacement by Western numerals, even as other regions of the world, such as Japan and India, have partially or fully abandoned their traditional numbering systems. That these Ajami texts are paginated in Western numerals is thus notable, and raises the question of how widely this practice has spread. Is it just chance that these two texts happened to use Western numerals, or is this a systematic difference between Arabic and Ajami texts? And if it is a real difference, when and in what context(s) did it emerge? Sounds like a good project for a master’s thesis.
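For anyone comparing pagination practices across Arabic-script and Ajami manuscripts computationally, the two digit sets mentioned above map one-to-one, so conversion is trivial. A small Python sketch (the table names are mine; the digit mapping itself is the standard Unicode one):

```python
# The Arabic numerals (Arabic-Indic digits, U+0660–U+0669) and the
# Western numerals, with translation tables in both directions.
EASTERN = "٠١٢٣٤٥٦٧٨٩"
WESTERN = "0123456789"

TO_WESTERN = str.maketrans(EASTERN, WESTERN)
TO_EASTERN = str.maketrans(WESTERN, EASTERN)

# Page 51 as it would appear in a typical Arabic manuscript,
# converted to the Western form that actually appears in the photo:
print("٥١".translate(TO_WESTERN))  # 51
print("7".translate(TO_EASTERN))   # ٧
```

A script along these lines could tally which digit set each digitized page uses, which is exactly the frequency question the proposed thesis would need to answer at scale.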


In my internettic peregrinations yesterday, I came across a thoughtful personal essay, A Requiem for Philology, by Prof. William Harris of Middlebury College, who unfortunately passed away earlier this year at the age of 83. (I found it through this interesting reminiscence from Steve Cotler, linked yesterday by Mark Liberman on Language Log, and then all those articles were mentioned today on Language Hat. Thus ends my internetymology.) These extended reminiscences concern Joshua Whatmough (1897-1964), a prominent classical philologist of the mid-20th century and an expert on the non-Italic languages of Italy.

Philology is the academic discipline that focuses on the meaning and history of words and the comparative analysis of texts deriving from this analysis. I’m extremely sympathetic to philology as an academic discipline. It is a set of related practices and concepts exported from classics to a whole host of regionally and linguistically specific disciplines. Today, you have Egyptian philologists in Near Eastern studies/Egyptology, Anglo-Saxon philologists in English departments, and of course the classical philologists in classics … and so on. It was not always that way (Whatmough was a true comparative philologist, and indeed his department at Harvard was first classical philology, then comparative philology, and eventually linguistics), and indeed, in the past, philology had much more substance and more impact than it does today. Philology is a set of methods and concepts that hang together quite nicely regardless of regional specializations. It is also extraordinarily useful for the historical and comparative investigation of languages and cultures. So as a discipline, it makes a great deal of sense to me on those grounds. Philological research is of enormous interest and relevance to my own scholarship on pre-modern numerical systems.

It is indeed the case that there are virtually no departments identified specifically with philology in the English-speaking world (although there are plenty in Eastern Europe); the only North American one I was able to find is Columbia’s Department of French and Romance Philology. However, this itself doesn’t tell us very much. The Columbia department is not producing scholarship that is substantially different from that of other Romance languages / French / comparative lit departments. Conversely, any number of scholars in departments that don’t bear the name ‘philology’ are, effectively, philologists – I can think of a few at my own institution, for instance, and we have a regularly taught Romance philology graduate course. Moreover, much of what was once labelled philology now simply falls under the rubric of historical linguistics – and again, most major institutions have at least one historical linguist, often more.

Indeed, the essay is focused neither on the (true but trivial) fact that there are virtually no philology departments these days nor on the issue of whether some academics are philologists. It is a lament that the form of close analysis of words and language undertaken by philologists is not taught to undergraduates in the way that Prof. Whatmough and others once taught it. And that is all it is. No one is asserting that no one does the kind of work that Whatmough once did – that would be ridiculous and patently false. In fact, given the proliferation and expansion of institutions, I’d wager that there are more academics practicing philological research than ever before (even as they constitute an increasingly tiny percentage of the academy as a whole). But Harris is asserting that there has been a decline in the teaching of a set of rigorous methods for analyzing language, whose absence is detrimental to the cognition and character of students who would have profited from it.

And now let’s ask ourselves: why is this so? Surely it was not that every undergraduate was once required to take philology – at least not in the twentieth century! But equally I don’t think it is that we have become distracted at a macrosocietal level, as Harris suggests: “our public eye has become loose, accustomed to glancing at two second flash-shots on block-buster film and TV. We tend to get overall meanings, we think and buy on impulse and we don’t read the fine print on our personal and political contracts well.” At a pragmatic level, declining enrollments in a major over a period of time result in fewer classes being offered in that major, which in turn reduces its visibility – and thus fewer students hear about it. So there is a positive feedback effect going on here that can result in the demise of many a small department or specialty, usually through a merger with another discipline. Indeed, not only did this happen with philology, but it is ongoing with classics as a whole, as many departments merge into history or lit-languages departments and become allied programs with graduate degrees, and then potentially just a few courses.

But at a bigger level, a societal level, the university has become a very different sort of place than it was 50 years ago, at institutions big and small. In an age where postsecondary education has really reached a mass clientele, and where the role of the university has become to a large degree professionalizing and pragmatic, and where very few students come from a life of leisure, it is completely unsurprising that disciplines like philology have difficulty justifying their existence within the social, economic, and political framework of higher ed. Foreign languages and cultures – sure, that’s good for business. Archaeology’s business model works because CRM firms provide jobs to graduates without the PhD. Classics gets by (barely) by taking a ‘cultural turn’ towards the study of race, gender, and class. And linguistics has linked its fate to cognitive science, for better or for worse, and has thus hitched itself to a behavioral-science model of funding and scholarship. Attributing the decline of the discipline to students losing interest misses the point: structurally, the modern university is all but compelled to shed disciplines that do not conform to the needs of employment markets.

This is not the university that we have chosen – not academics (philologists or otherwise), and not students – not a free choice, at any rate, but one conditioned by an insatiable demand for relevance and applicability that philology simply lacks. And if we want to sit about and lament the loss of the university that once was, that’s all very well. But if we want to make the case for ‘irrelevant’ and ‘inapplicable’ disciplines – and I would insist that we can and must – we need to be cognizant that blaming students or faculty fails to address the larger issues in the contemporary academy.