The spiral of ‘anti-other rhetoric’

The spiral of ‘anti-other rhetoric’: Discourses of identity and the international media echo. By Elisabeth Le. (Discourse approaches to politics, society and culture 22.) Amsterdam: John Benjamins, 2006. Pp. xii, 280. ISBN 9789027227126. $173 (Hb).

Reviewed by Richard W. Hallett, Northeastern Illinois University

In Ch. 1, ‘Media, international relations, collective memories, and Critical Discourse Analysis’ (1–16), which essentially serves as the introduction to the book, Elisabeth Le presents a case study of elite newspapers in three nations: France, the United States, and Russia. Her study employs critical discourse analysis (CDA) of the editorials in these newspapers to determine the effect of the ‘international media echo’ (6). According to Howard Frederick (Global communication and international relations, Belmont, CA: Wadsworth Publishing, 1993), this effect can occur ‘when the content of one nation’s media becomes news in the media of another country’ (228).

Using the interacting cascading networks model (9), L presents overviews of Russian, French, and American societies in Ch. 2, ‘National and international contexts for the international media echo’ (17–52). Following these overviews is a discussion of the print elite media in these three countries, which includes histories of the French newspaper Le Monde, the American newspaper The New York Times, and the Russian newspapers Izvestija, Nezavisimaja Gazeta, and Segodnija.

The linguistic analysis in Ch. 3, ‘Russia in Le Monde and The New York Times’ (53–105), shows how seventy-four editorials about Russia in the French and American elite newspapers, which appeared from August 1999 to July 2001, differ in terms of argumentation, debate construction, and the presentation and positioning of ‘us’ and ‘them’. In Ch. 4, ‘Le Monde’s and The New York Times’ editorials in their national societies’ (107–28), these editorials are shown to be a reflection of the world conceptions of France and America that, in part, form their respective national identities.

Claiming to leave a linguistic analysis of Russian media to ‘those who possess more than [her] fluent reading abilities of Russian’ (129), L utilizes the interacting cascading networks model to analyze the Russian socio-political organization in Ch. 5, ‘Russian reactions to the West’ (129–60). To this end, she discusses the three Russian elite newspapers, the Russian official position, Russian public opinion, and Russia’s position in the world vis-à-vis the West from 1999 until 2001.

The conclusion, Ch. 6, ‘Crossing cultural and disciplinary boundaries’ (161–81), summarizes L’s analyses of the interactions among the national elite print media on the textual, ideational, and interactional levels. This chapter also discusses the ‘anti-other rhetoric’ spiral, which L declares ‘the most negative manifestation of the international media echo’ (162), and stresses the necessity of cross-cultural and cross-disciplinary research. Following this chapter are five extensive appendices that contain a wealth of information: ‘Editorials’ (183–85), ‘Chronology’ (187–96), ‘Coherence analysis’ (197–209), ‘Content coding’ (211–35), and ‘Negative representation of Russia’ (237–43).

This book is a welcome addition to the growing number of CDA studies. It could be used as supplemental reading material in discourse studies as well as international relations and media studies.

Dimensions of forensic linguistics

Dimensions of forensic linguistics. Ed. by John Gibbons and M. Teresa Turell. (AILA applied linguistics series 5.) Amsterdam: John Benjamins, 2008. Pp. vi, 316. ISBN 9789027205216. $149 (Hb).

Reviewed by Amy Gurvis, Northeastern Illinois University

According to its editors, the main purpose of this book is ‘to provide a guide to the multidisciplinary nature of Forensic Linguistics—understood in its broadest sense as the interface between language and the law—that could be of interest for scholars, graduate students and professionals working in Applied Linguistics’ (1).

John Gibbons and Teresa Turell summarize the collection of fourteen papers in the introduction (1–4). The remainder of this volume is organized into three sections: ‘The language of the law’ (7–111), ‘The language of the court’ (115–211), and ‘Forensic linguistic evidence’ (216–99). In Section 1, Peter Tiersma (‘The nature of legal language’) discusses the language of regulation and legislation and examines the growth and use of legal languages. In ‘Language education for law professionals’, Jill Northcott addresses the difficulties encountered in teaching English legal language to legal professionals learning English as a second language. Chris Heffer, in ‘The language and communication of jury instruction’, presents an examination of the language used in jury instructions and its comprehension. Then, Phil Hall examines ‘Policespeak’. Concluding the section is ‘Legal translation’, in which Enrique Alcaraz Varo examines the challenges and issues of English legal translation.

Section 2, ‘The language of the court’, begins with John Gibbons’s ‘Questioning in common law criminal courts’, which examines the differences between everyday questioning and courtroom questioning. Then, in ‘Bilingual courtrooms: In the interests of justice?’, Richard Powell discusses the variable of code choice in courtrooms and its potential effect on justice. Dennis Kurzon, in ‘The silent witness: Pragmatic and literal interpretations’, addresses how silence can be used in the courtroom and how different countries interpret it. In ‘Language and disadvantage before the law’, Diana Eades presents an argument about the imbalance of power in the courtroom that linguistically disadvantaged groups, such as the deaf, encounter throughout the justice system. ‘Interpreting for the minority, interpreting for the power’ by Ester S. M. Leung concludes the section by examining how interpreting and translating testimony can maintain the power of the status quo.

Section 3, ‘Forensic linguistic evidence’, starts with Tim Grant’s ‘Approaching questions in forensic authorship analysis’, which examines the analysis and challenges of the written authorship process, breaking it into four stages and discussing the inability of any one identification technique to address all of these stages. In ‘Trademarks and other proprietary terms’, Ronald R. Butters discusses the linguistic criteria that define trademarks. Then, William G. Eggington addresses ‘Deception and fraud’ by analyzing the text of a deceptive email. Finally, in ‘Plagiarism’, M. Teresa Turell tackles different types of plagiarism, methodologies used, and their subsequent challenges.

The book provides an extensive look into the field of forensic linguistics and could easily serve as a textbook for a course. It could also be used as an informative resource for anyone interested in language and the law in any capacity.

The representation and processing of compound words

The representation and processing of compound words. Ed. by Gary Libben and Gonia Jarema. Oxford: Oxford University Press, 2006. Pp. 242. ISBN 9780199228911. $55.

Reviewed by Marina Gorlach, Metropolitan State College of Denver

The eight chapters of this book imply that the ability to understand compound sequences is a reflection of the interplay between storage and computation processes in the mind. As noted by Ray Jackendoff (Foundations of language: Brain, meaning, grammar, evolution. Oxford: Oxford University Press, 2002), compounds ‘may be viewed as “protolinguistic fossils”, a structural type that has survived from the earliest forms of human language’ (50). Compounds represent one of the most universal types of derivation found in language and occupy a unique position at the crossroads between words and sentences in that they are easily segmentable into constituent morphemes, similar to the way sentences are segmentable into their constituent words.

Gary Libben’s introductory chapter, ‘Why study compound processing?’, emphasizes the opportunity compounding provides for understanding the fundamental aspects of mental architecture. His data show that compound processing is characterized by redundancy rather than efficiency: both whole words and their constituents are initially activated in the mental lexicon, and inappropriate activations are subsequently inhibited—for example, in the case of semantically opaque constituents.

Wolfgang U. Dressler provides a comprehensive classification of compounds on the basis of their structural, semantic, and syntactic properties in ‘Compound types’. In the next chapter, ‘Compound representation and processing: A cross-language perspective’, Gonia Jarema brings together cross-linguistic evidence for the representation of compounds in the mental lexicon and assesses the possibility of uncovering general principles of compound access and storage in natural languages. Studies conducted in twelve languages (eight Indo-European and four non-Indo-European) of unimpaired and impaired (e.g. aphasic) populations show that the constituents are indeed activated during compound processing, which underscores the privileged status of the first constituent.

In ‘The neuropsychology of compound words’, Carlo Semenza and Sara Mondini present the findings of several studies of the processing of compound words in aphasiology. The data from neuropsychology provide evidence that compound processing involves decomposition, even for opaque compounds, as well as simultaneous activation of the constituents in retrieval and activation of all meaningful representations, both whole words and isolated components.

In ‘Preschool children’s acquisition of compounds’, Elena Nicoladis discusses the factors that affect children’s ability to create and understand novel compounds. Research to date is limited to just noun-noun compounds or the English object-verb-er construction, which overlooks many structural types of compounds and excludes the semantically opaque ones.

Erika S. Levy, Mira Goral, and Loraine K. Obler explore the ways in which bilinguals process compounds in ‘Doghouse/chien-maison/niche: Approaches to the understanding of compound processing in bilinguals’. They include an overview of the psycholinguistic studies of the bilingual lexicon and a discussion of particular characteristics of compounds that vary cross-linguistically. Furthermore, they investigate hypotheses on how such differences might affect the way bilinguals process compounds. Analyzing reaction times and transfer errors in both unimpaired bilinguals and polyglot aphasics, the authors compare the role of factors such as the productivity of compounds within a language, semantic transparency, morphological headedness, the position of constituents in a string, and morphological constraints. They conclude that further documentation of bilinguals’ transfer errors in the on-line processing of compounds may reveal active links between first language and second language lexemes.

The relationship between research on conceptual combination in compounds and the structure of the mental lexicon is discussed in ‘Conceptual combination: Implications for the mental lexicon’ by Christina L. Gagné and Thomas L. Spalding. Comparing schema-based approaches to conceptual combination with the competition-among-relations-in-nominals (CARIN) approach, the authors bring strong empirical evidence that supports the CARIN theory by demonstrating that the modifier has a primary effect on relation selection in both familiar and novel compounds. This implies that the processing of familiar and novel compounds is more similar than most theories of the mental lexicon would predict.

In ‘Processing Chinese compounds’, James Myers presents an overview of compound processing in Chinese. Unlike in other languages, in Chinese visual stimuli (i.e. characters) strongly influence access to spoken words. As in other languages, the activation of a compound’s components in Chinese depends on semantic transparency, modality, and the mutual predictability of compounds.

The process of compounding and access to familiar compound words is critical to understanding the mental lexicon as a whole, and this volume makes an important contribution to this field.

Cognitive linguistics

Cognitive linguistics: Basic readings. Ed. by Dirk Geeraerts. Berlin: Mouton de Gruyter, 2006. Pp. 485. ISBN 9783110190854. $35.

Reviewed by Ioana-Rucsandra Dascalu, University of Craiova

Cognitive linguistics: Basic readings, edited by Dirk Geeraerts, is a collection of articles divided into twelve chapters, along with an introduction and an epilogue. Cognitive linguistics, as an autonomous domain of linguistic study, must be distinguished from generative grammar. The main focus of this volume is on meaning, on how it is processed and stored, and on the way it transforms the outside world (4). The twelve subthemes of this volume reflect the essential issues of this field. Moreover, the contributors are well-known authorities whose work provides the foundation of this theoretical collection.

In Ch. 1, ‘Introduction to concept, image, and symbol’ (29–68), Ronald W. Langacker takes a stand against traditional approaches to analyzing language, which he replaces with cognitive grammar: a description of language in terms of cognitive processes. Like the lexicon, grammar is imagic—for example, when using a particular construction or grammatical morpheme, we select an image that corresponds appropriately to our communicative purposes. As Langacker declares, this perspective is completely new and innovative, formulating a cognitively realistic and linguistically well-motivated framework.

In Ch. 2, ‘Grammatical construal: The relation of grammar to cognition’ (69–108), Leonard Talmy analyzes language as being made up of two parts: the grammatical and the lexical. These two parts are distinguished by the notion of open-class items—that is, classes of items that are ‘quite large and readily augmentable relative to other classes’ (70)—and closed-class items—that is, classes of items that are ‘relatively small and fixed in membership’ (70). Other categories such as dimension (e.g. space and time; 78), state of dividedness (83), and degree of extension (85) are also related to the grammar and lexicon.

In Ch. 3, ‘Radial network: Cognitive topology and lexical networks’ (109–40), to demonstrate the inadequacy of feature analyses and the necessity of new cognitive typologies, Claudia Brugman and George Lakoff discuss a polysemous word—the English preposition over—and all of its meanings (e.g. above-across, above, covering senses, and metaphorical senses). Brugman and Lakoff conclude that features are not enough for a proper linguistic description and that networks are needed to characterize the multiple senses of polysemous words.

In Ch. 4, ‘Prototype theory’ (141–66), Dirk Geeraerts provides an overview of prototype theory, a cognitive trend in linguistics with growing success since the early 1980s as a reaction to the componential model of linguistic analysis. The notion of a prototype is described in contrast to analyses based on necessary and sufficient attributes: prototypical categories exhibit family resemblance and degrees of category membership, and their boundaries are blurred at the edges.

Ch. 5, ‘Schematic network’ (167–84), deals with multi-sense words and the types of relations established between their forms and meanings. Words are ambiguous when a single phonological form has two or more meanings (e.g. bank—financial institution vs. bank—land at river edge). Words are vague when two or more meanings are united as nondistinguished subcases of a more general meaning (e.g. aunt—father’s sister vs. aunt—mother’s sister). The verb paint is analyzed as a case of polysemy.

One of the best-known issues in cognitive linguistics is metaphor and the contrast between its definitions. According to the contributors, metaphor has traditionally been defined as a figure of speech that represents an object through its similarity to another object, whereas it has more recently been reclassified as a matter of thought and conceptualization. On this view, everyday conventional language is far from being free of metaphor: one can commonly find many metaphors of time, space, and purpose. The use of personification and proverbs is also interpreted by reference to the metaphors they are made up of.

In their chapter entitled ‘The cognitive psychological reality of image schemas and their transformations’ (239–68), Raymond W. Gibbs and Herbert L. Colston analyze image schemas that arise from our daily experience (e.g. perceptual interactions, bodily actions, manipulations of objects), which involve auditory, kinesthetic, and bodily aspects. This exploration of image schemas combines criteria from cognitive linguistics and psycholinguistics as well as cognitive and developmental psychology.

Charles J. Fillmore’s chapter deals with ‘Frame semantics’ (373–400) whose central item—the frame—refers to a ‘system of concepts related in such a way that to understand any one of them you have to understand the whole structure into which it fits’ (373).

The twelve chapters of this book, along with the introduction and the epilogue, constitute a valuable guide to the most important themes of cognitive linguistics.

Key terms in semiotics

Key terms in semiotics. By Bronwen Martin and Felizitas Ringham. New York: Continuum, 2006. Pp. 275. ISBN 9780826484567. $27.95.

Reviewed by Richard W. Hallett, Northeastern Illinois University

Seeing the boundary between linguistics and semiotics as ‘extremely fluid’ (1), Bronwen Martin and Felizitas Ringham claim ‘the theory [of semiotics] purports to explore the generation of signification in all its forms’ and add ‘semiotics thus covers all disciplines and signifying systems as well as all social practices’ (2). The purpose of their book is, accordingly, to familiarize readers with the semiotic approach.

The introduction (1–16) presents some of the basics of semiotics. For example, the authors briefly discuss the differences and increasing convergences between the American and European branches of the discipline; mention the contributions of some scholars in the field such as Ferdinand de Saussure, Louis Hjelmslev, Claude Lévi-Strauss, and Algirdas Julien Greimas; and explain important concepts like canonical narrative schema and the semiotic square.

Section 2, ‘Key terms in semiotics’ (17–211), comprises the bulk of the text. Four hundred and eight entries for semiotic concepts are provided in alphabetical order, from absence (17) to zoosemiotics (211). All of the terms are defined and exemplified. Many of the examples in the entries make reference to a wide variety of literary works that range from current fiction to seemingly universal fairy tales.

Following the key terms is a section entitled ‘Key thinkers in semiotics’ (212–48). In this portion, short biographies are provided for Roland Barthes, Noam Chomsky, Umberto Eco, Algirdas Julien Greimas, Louis Hjelmslev, Roman Jakobson, Julia Kristeva, Claude Lévi-Strauss, Maurice Merleau-Ponty, Charles Sanders Peirce, Vladimir Propp, and Ferdinand de Saussure. The next section, ‘Key texts in semiotics’ (249–53), presents a bibliography containing 107 references. The appendix (254–75) contains two parts: the fairy tale ‘Sleeping beauty’ (254–56) and ‘A semiotic analysis of the fairy-tale Sleeping Beauty: An example of the Greimassian approach’ (257–75).

With its myriad examples and useful definitions, this volume will be a nice supplement not only for an introductory course in semiotics but also for courses on discourse studies. Linguists working on semiotic analyses will welcome this handy reference book.

Quantitative and experimental linguistics

Quantitative and experimental linguistics. Ed. by David Eddington. (LINCOM handbooks in linguistics 23.) Munich: LINCOM Europa, 2009. Pp. 428. ISBN 9783895867378. €134.

Reviewed by Haitao Liu, Communication University of China

In the eyes of many linguists, linguistics is moving from the arts or humanities to the cognitive or life sciences. Linguistic methods are also changing from introspection to experimentation, and modern techniques and equipment are being developed.

These new methods are helpful to quantitatively and experimentally explore questions of linguistic structure and processing. However, these techniques are relatively recent and have been developed mainly in fields separate from linguistics. Therefore, many researchers and students in linguistics may be unfamiliar with these techniques. This volume is designed to provide an introduction to this field.

The volume includes nine chapters. In Ch. 1, ‘Linguistics and the scientific method’, David Eddington compares empirical and nonempirical approaches to linguistics. He insists that valid explanations of real-world language processing depend on adherence to scientific methodology and that progress in the field requires the methods that are standard in scientific research.

Wendy Baker discusses the pros and cons of the most common quantitative methods used in sociolinguistic studies—namely, field, experimental, and corpus methods. Patrick Juola sketches some mathematical basics of common connectionist systems and presents how these systems can be applied to a variety of linguistic problems. This chapter illustrates how connectionism can be a useful linguistic method and warns about the pitfalls of blindly using this method.

Exemplar-based models of language imply that linguistic behavior is a matter of comparison between a current expression and memories for tokens of similar expressions. One or more of these tokens are chosen as the basis for deciding how to operate on, interpret, or compose the current expression. After investigating Royal Skousen’s analogical model and the memory-based models of Walter Daelemans and colleagues, Steve Chandler concludes that exemplar-based models can consistently simulate real language behavior more accurately than either the rule-based models of generative grammar theory or connectionist simulations.

Experimental phonetics uses the scientific approach of testing certain variables while controlling other potential sources of variation in order to study the many factors that influence how speech is produced or perceived. Caroline Smith outlines several methods used to understand what the properties of speech sounds are and how they work as part of a linguistic system.

In the chapter entitled ‘Chronometric psycholinguistic techniques: Timing the lexicon’, which focuses on lexical decision making and priming, Gary Libben discusses technical issues that arise in the implementation of response time experiments and presents the most popular techniques that employ response time as the main dependent variable.

Bruce L. Derwing and Roberto G. de Almeida review ‘Non-chronometric experiments in linguistics’. They argue that human language is fundamentally a psychological phenomenon and that experimental psychological methods are therefore necessary to uncover its essential nature and character. Several experimental procedures, which can be used to explore the psychological reality of linguistic units and structures across a wide range of language and language types, are also described in this chapter.

Martin Meyer introduces the problems of ‘Neuroimaging of speech and language’. His article includes three sections: (i) a sketch of theoretical and technical principles and limitations of the most prominent brain imaging techniques (PET and fMRI); (ii) an overview of neuroimaging research on comprehension and production of spoken, written, and signed languages in the past fifteen years; and (iii) a perspective on how to complement fMRI and PET studies to provide more precise information about functional and anatomical connectivity within and between remote and adjacent language-related brain sites.

Usage-based linguists present a wide range of evidence to prove that linguistic structure is largely a product of humans’ collective and accumulated experience. Gathering relevant research from synchronic and diachronic research on grammar, from cognitive psychology and cognitive modeling, and also from computational linguistics, Joyce Tang Boyland provides a cross-disciplinary explanation of why usage is so central in the mental processes and representations of human language.

This volume does not have an index; therefore, finding a particular topic is not an easy task. Additionally, quantitative linguistics as represented in the volume is not entirely consistent with quantitative linguistics as commonly understood by the International Quantitative Linguistics Association (IQLA, www.iqla.org/).

Syntax: A generative introduction

Syntax: A generative introduction. 2nd edn. By Andrew Carnie. Oxford: Blackwell, 2007. Pp. xviii, 489. ISBN 9781405133845. $57.95.

Reviewed by Dinha T. Gorgis, Jadara University

This volume discusses generative syntax as a cognitive science. Most of the seventeen chapters end with a summary followed by a glossary of the terms and technical expressions introduced in the chapter. Additionally, Andrew Carnie includes a section of further readings as well as exercises that range from general to challenging. Ch. 1 introduces generative syntax, which ‘assumes that a certain amount of grammar is built in and the rest is acquired’ (26).

Ch. 2 explores the parts of speech (also known as syntactic categories or word classes) whose functional identities are uncovered through morphological and syntactic distributional tests. Building upon this knowledge of words and their subcategorization, in Ch. 3, C moves to the technical workings of generative syntax. The concept of structure is demonstrated through constituency relations represented as trees. As with words, larger structures are tested for their constituency.

Ch. 4 is more technical in that it relies on mathematical reasoning and formal logic. Here, C explores domination, precedence, and c-command. He makes it clear that ‘grammatical relations are not structural relations’ (118). Building upon these structural relations, Ch. 5 offers a set of principles that ‘governs the distribution of [noun phrases] NPs’ (144).

Part 2 is exclusively devoted to X-bar theory, its extension, and the constraints imposed upon it. Ch. 6 deals with intermediate structures that earlier versions of the phrase structure rules (PSRs) could not account for. X-bar theory suggests options for speakers of different languages: parameterization is a key concept. More refinement of the already revised PSRs is introduced in Ch. 7, and a new category, determiner phrase (DP), is also proposed. Of interest is C’s observation that the main clause in Peter thinks that Cathy loves him is not Peter thinks, because this string is not a constituent. Rather, it is ‘everything under the root [tense phrase] TP node’ (202).

Despite the revisions and extensions, which could account nicely for constituency and cross-categorial issues across languages, X-bar theory is constrained in Ch. 8 by appealing to lexical information, preventing the generation of ungrammatical sentences.

Part 3 is devoted to movement: ‘Head-to-head movement’, ‘DP movement’, ‘Wh-movement’, and ‘A unified theory of movement’. Among the phenomena covered are subject-auxiliary inversion, word order differences among languages, do-support, case, movement locality, and universal semantics. In sum, ‘movement can both be overt (before SPELLOUT) and covert (after SPELLOUT)’ (368).

The ‘Advanced topics’ in Part 4 move away from the generally agreed upon principles and parameters framework. Ch. 13 tackles the problem of ditransitive verbs. Ch. 14 sketches control theory and the question of null subject categories, and Ch. 15 elaborates on binding theory, which was first presented in Ch. 5.

Part 5 is a brief overview of two formal alternatives to the Chomskyan principles and parameters approach (and its descendant, minimalism): lexical-functional grammar (Ch. 16) and head-driven phrase structure grammar (Ch. 17). C assumes that newcomers to syntactic theory will be able to further investigate these theories on their own elsewhere.

The problem sets are one of the best features of this book: they provide an excellent platform for peer work in class but can also be worked through individually. Those who use Radford’s Transformational grammar (Cambridge: Cambridge University Press, 1988)—an excellent introduction indeed—are advised to gain new insights from C’s equally excellent book.

Brave new digital classroom

Brave new digital classroom: Technology and foreign language learning. By Robert J. Blake. Washington, DC: Georgetown University Press, 2008. Pp. xv, 189. ISBN 9781589012127. $24.95.

Reviewed by Colette van Kerckvoorde, Bard College at Simon’s Rock

Although new technology is increasingly incorporated into the general curriculum at all levels of education, many foreign-language teachers are still resisting this trend in their own classrooms. In this book, Robert Blake makes a plea for the use of computer technology in the foreign-language curriculum, stressing that it offers the potential not only to improve the students’ skills in the target language but also to augment their opportunities to receive comprehended input. If applied responsibly and properly integrated, technology will indeed be beneficial to student learning.

B intends this work for the neophyte as well as for the teacher who has already incorporated some technology in the classroom. He recognizes that language acquisition is a social, face-to-face process but also reminds us that it takes several years to become fluent in the target language, that most American students are not able to study abroad, and that advanced-level courses are not always readily available. Technology, he argues, can thus enrich the overall language-learning experience and be used in addition to classroom instruction. B further points out that most students enjoy using computers and generally respond well to the use of technology in the classroom.

The book is well structured. Ch. 1 serves as a general introduction and briefly discusses the importance of exposure to comprehended input in the language learning process. Here, B also addresses some common misgivings that teachers frequently cite about the use of technology in the foreign-language classroom and argues that technology only provides a set of tools that some consider methodologically neutral. Such tools, he claims, can make a valuable contribution to a language program, but only if their use has been carefully thought out and is not limited to drills and/or exercises with little negotiating of meaning. The following three chapters deal, respectively, with the Web, with computer-assisted language learning (CALL), and with computer-mediated communication. B takes time to explain each component and indicates how each technology can be successfully used to promote active and meaningful learning. As a rule, he suggests that teachers map out a learning pathway for the students and provide ample guidance. They should always construct sound pedagogy around the materials and remember that the negotiation of meaning is at the center of the implementation. In the last chapter, B addresses distance learning and focuses on some recent studies that were done on this topic. He further provides some guidelines that can be used to enhance the online language instruction experience.

This is an accessible introduction to sound practices in the application of computer technology to language instruction. The book is not meant as a how-to manual for specific languages but rather as an invitation to create a more student-centered environment in which the student can be an active participant in a learning process that consists of several components.

Communication in medical care

Communication in medical care: Interaction between primary care physicians and patients. Ed. by John Heritage and Douglas W. Maynard. (Studies in interactional sociolinguistics 20.) Cambridge: Cambridge University Press, 2006. Pp. 488. ISBN 9780521621236. $48.

Reviewed by Sarah May Fauzi, The University of Texas at Arlington

Bringing together linguists and nonlinguists who study medical discourse, this book uses conversation analysis to examine interactions between patients and their primary care physicians. A foreword by Debra Roter explains some of the changes in the way medical discourse has been analyzed; she stresses the importance of incorporating both quantitative and qualitative analysis. Many of the authors use mixed methods to reach their conclusions, which adds clarity to the findings and implications.

The introduction, ‘Analyzing interaction between doctors and patients in primary care encounters’, explains how conversation analysis can be used to understand and improve medical communication. The following thirteen chapters fall into four themes: symptom taking, exam and diagnosis, treatment and closings, and telephone medicine.

The chapters on symptom and history taking analyze how and when patients choose to present their complaints to a physician and offer insight into how patients’ complaints, or lists of symptoms, often serve to justify their decision to visit a physician. Patients will often provide evidence, such as third-party accounts, to show that the illness was severe enough to warrant the office visit. This section ends with a chapter (Ch. 6) that establishes the dimensions of questioning and answering during history taking, addressing the agendas, presuppositions, and preferences of the participants.

The chapters about examination and diagnosis include an investigation of the body as an instrument of language during the physical examination. Ch. 7 analyzes physical movements, such as turns of the head and eyes, as means of communicating with the physician during the exam. Ch. 8 examines patients’ opinions and presuppositions about the diagnosis as well as physicians’ responses to such opinions. Finally, Ch. 9 discusses the delivery of different kinds of news to the patient.

Four chapters explore the treatment and closing phases of the medical interview, specifically how patients and physicians negotiate treatment plans and prescriptions. Perhaps the most disappointing chapter is Ch. 11, ‘Prescription and prescribing: Coordinating talk- and text-based activities’. Although this chapter attempts to address the complications of using a computer in the exam room to enter information and type prescriptions, it fails to fully investigate the effect of computers on interaction within the room, a fascinating new problem that technology has presented and whose effects remain to be fully explored. Ch. 12 provides an interesting look into lifestyle discussions between physician and patient that occur after diagnosis. Ch. 13 examines the closings of the medical interview and ends, appropriately, with the closing of a door.

The final chapter examines physicians’ negotiations with patients (mainly parents of pediatric patients) and the decision of whether or not a visit is warranted based on the information given over the phone. Communication in medical care is a much-needed addition to the field of medical discourse.

Do you make these mistakes in English?

Do you make these mistakes in English? The story of Sherwin Cody’s famous language school. By Edwin L. Battistella. Oxford: Oxford University Press, 2009. Pp. x, 214. ISBN 9780195367126. $29.95 (Hb).

Reviewed by Matthew J. Gordon, University of Missouri

This book tells the tale of Sherwin Cody (1868–1959), an educational entrepreneur who created a correspondence course in prescriptive grammar. Cody successfully marketed his 100% self-correcting course in the English language for over four decades, and Edwin Battistella offers a portrait of the man and his work that demonstrates how Cody’s success, while tapping into perennial linguistic anxieties, was very much a product of his times.

After setting the stage in an introductory chapter, B turns to a sketch of Cody’s biography. Cody was a remarkably prolific author whose early work focused on literary topics. He published books on various American authors as well as guides for writing fiction. He got his first taste of what would be his life’s work in 1896 when he became responsible for a home-study course in English offered by the Chicago Tribune. Eventually he reworked the material for that course into The art of writing and speaking the English language, and in a move that became a hallmark of his career, Cody marketed this book with print ads in prominent venues. Indeed, B argues that Cody deserves to be recognized among the pioneers of twentieth-century advertising. The title of the book comes from the most frequently used (and most successful, in terms of sales) headline in the ads for Cody’s 100% self-correcting course.

B’s attention to the rapidly evolving world of advertising is just one example of how he fleshes out Cody’s story by framing it within the broader social context of the times. He draws parallels between Cody’s work and other commercial campaigns for self-improvement, including the Harvard classics book series and even Charles Atlas’s bodybuilding course. B devotes an entire chapter to perhaps the best-known guru of self-improvement, Dale Carnegie, whose personality, message, and business model represent a strong counterpoint to Cody’s. Carnegie and Cody nevertheless promoted a shared goal; after all, the purpose of learning to win friends and influence people, like the purpose of training to correct your mistakes in English, is to achieve financial success. In this way Cody differs from many other prescriptivist authors. To be sure, he uncritically accepts the notion of ‘correct English’, but apparently he does not interpret the prevalence of certain ‘errors’ as a sign of declining morality, much less as an attack on the English language. The problem with speaking or writing incorrectly, according to one of Cody’s ads, is that mistakes ‘may cause others to lower their estimates of your education and refinement’ (7) and thus may hold you back in business and life. Cody’s emphasis was on practical strategies for developing correct habits to make a good impression. As B notes, Cody was equally sensitive to usages that might be viewed as pedantic or stuffy, and he advocated a ‘colloquial middle ground’ (9).

B avoids the polemical tone that typifies linguists’ writings about prescriptivism. Instead of directly challenging the assumptions surrounding correctness, he sketches the social context surrounding Cody’s 100% self-correcting course in such a way as to expose prescriptivist ideas for what they are: matters of etiquette. This strategy no doubt reflects the fact that the book is written for a general audience. Still, there is much here of note for specialist readers as well, particularly for linguists interested in the history of prescriptivism.