The Handbook of Terminology Management is a unique work designed to meet the practical needs of terminologists, translators, lexicographers, subject specialists (e.g., engineers, medical professionals, etc.), standardizers and others who have to solve terminological problems in their daily work. In more than 900 pages, the Handbook brings together contributions from approximately 50 expert authorities in the field. The Handbook covers a broad range of topics integrated from an international perspective and treats such fundamental issues as: practical methods of terminology management; creation and use of terminological tools (terminology databases, on-line dictionaries, etc.); and terminological applications. The high level of expertise provided by the contributors, combined with the wide range of perspectives they represent, results in thorough coverage of all facets of a burgeoning field. The layout of the Handbook is specially designed for quick reference and cross reference, with hypertext and an extensive index. See also the "Handbook of Terminology Management" set (volumes 1 and 2).
This volume presents the results of the international symposium Chunks in Corpus Linguistics and Cognitive Linguistics, held at the University of Erlangen-Nuremberg to honour John Sinclair's contribution to the development of linguistics in the second half of the twentieth century. The main theme of the book, highlighting important aspects of Sinclair's work, is the idiomatic character of language with a focus on chunks (in the sense of prefabricated items) as extended units of meaning. To pay tribute to Sinclair's enormous impact on research in this field, the volume contains two contributions which deal explicitly with his work, including material from unpublished manuscripts. Beyond that,...
In recent years, research on valency has led to important insights into the nature of language. Some of these findings are published in this volume for the first time with up-to-date accounts of language description and new reflections on language, above all for English and German. The volume also presents examples of contrastive analysis, which are of use for all those who deal professionally with these two languages. Furthermore, the articles in the psycholinguistic and computational linguistics section demonstrate the applicability and value of valency theory for these approaches and shed light on a fruitful cooperation between theoretical and descriptive linguistics and applied disciplin...
This volume constitutes the proceedings of the Third International Workshop of the European Association for Machine Translation, held in Heidelberg, Germany in April 1993. The EAMT Workshops traditionally aim at bringing together researchers, developers, users, and others interested in the field of machine or computer-assisted translation research, development and use. The volume presents thoroughly revised versions of the 15 best workshop contributions together with an introductory survey by the volume editor. The presentations are centered primarily on questions of acquiring, sharing, and managing lexical data, but also address aspects of lexical description.
What are the principles according to which lexical data should be represented in order to form a lexical database that can serve as a basis for the construction of several different monofunctional dictionaries? Starting from the notion of lexicographic functions as defined by Henning Bergenholtz and Sven Tarp, this question is approached by analysing how current electronic dictionaries and lexical resource models attempt to satisfy the needs of different types of users in different usage situations, in order to identify general requirements on the model for a lexical resource that aims to be “multifunctional” in the above sense. Based on this analysis, this book explores the use of formalisms developed in the context of the semantic web to approach both general and specific lexicographic questions, in particular the representation of multi-word expressions and their properties and relations. In doing so, this book not only addresses several topics which are of relevance to lexicographers and computational linguists alike, but also supports its claims by providing a prototypical implementation of a multifunctional lexical resource using semantic web formalisms.
What is community interpreting? What are the roles of the community interpreter? What are the standards, evaluation methods and accreditation procedures pertaining to community interpreting? What training is available or required in this field? What are the current issues and practices in community interpreting in different parts of the world? These key questions, discussed at the first international conference on community interpreting, are addressed in this collection of selected conference papers. The merit of this volume is that it presents the first comprehensive and global view of a rapidly growing profession, which has developed out of the need to provide services to those who do not speak the official language(s) of a country. Both the problems and the successes related to the challenge of providing adequate community interpreting services in different countries are covered in this volume.
Lexicography is one of the oldest linguistic sub-disciplines and began to compile extensive corpora early on as the basis for dictionary work. Surprisingly, these corpora and the dictionary articles have not been used very frequently for the study of language variation, although most dictionaries contain not only information about word meanings and grammar, but also about regional distribution and style level. This volume explores the value of lexicographical data in the study of language variation. The contributions focus on different types of dictionaries for different languages as well as on various linguistic research questions, ranging from the dictionaries' approach to loan words or morphology to practical issues regarding digital frameworks for lexicographic work.
The Stylistique comparée du français et de l’anglais has become a standard text in the French-speaking world for the study of comparative stylistics and the training of translators. This updated, first English edition makes Vinay & Darbelnet's classic methodology of translation available to a wider readership. The translation-oriented contrastive grammatical and stylistic analyses of the two languages are extensively exemplified by expressions, phrases and texts. Combining description with methodological guidelines for translation, this volume serves both as a course book and, through its detailed index and glossary, as a reference manual for specific translation problems.
This book deals with the oft-neglected tensions between perspicuity and fuzziness in specialised communication. It describes the manifestations, functions and implications of indeterminacy phenomena in a range of LSP specialisations where it has been customary to expect precision and consistency. The volume presents case studies and methodological frameworks that draw on theoretical, anthropological and cognitive linguistics, safety-critical translating, history and theory of terminology studies, development of ontologies, software localisation, jurisprudence, macroeconomics and interoperability of digital knowledge representation resources. With chapters by leading scholars drawn from eleven countries, this book contributes to the benchmarking of indeterminacy scholarship in LSP studies and is a fitting tribute to its dedicatee, Professor Heribert Picht who, even in retirement, remains a constant presence in LSP and terminology studies. The book should be of interest to scholars of the aforementioned areas.