Knowledge and Inference discusses an important problem for software systems: how do we treat knowledge and ideas on a computer, and how do we use inference to solve problems on a computer? The book addresses the problems of knowledge and inference with the aim of merging artificial intelligence and library science. It begins by clarifying the concept of "knowledge" from many points of view, followed by a chapter on the current state of library science and the place of artificial intelligence within it. Subsequent chapters cover central topics in artificial intelligence: search and problem solving, methods of making proofs, and the use of knowledge in looking for a proof. There is also a discussion of how to use the knowledge system. The final chapter describes a popular expert system and tools for building expert systems, known as "expert system shells," using an example based on Expert Systems—A Practical Introduction by P. Sell (Macmillan, 1985). The book was written as a textbook for undergraduate students, covering only the basics but explaining them in as much detail as possible.
This dictionary is intended for anyone who is interested in translation and translation technology, whether as an academic discipline, a language activity, a specialized profession, or a business undertaking. The book covers the theory and practice of translation and interpretation in a number of areas, addressing and explaining important concepts in computer translation, computer-aided translation, and translation tools. The most popular commercially available translation software packages are included, along with their website addresses for handy reference. The dictionary has 1,377 entries, alphabetized and defined in a simple and concise manner.
The field of machine translation (MT) - the automation of translation between human languages - has existed for more than 50 years. MT helped to usher in the field of computational linguistics and has influenced methods and applications in knowledge representation, information theory, and mathematical statistics.
CICLing 2001 is the second annual Conference on Intelligent Text Processing and Computational Linguistics (hence the name CICLing); see www.CICLing.org. It is intended to provide a balanced view of cutting-edge developments in both the theoretical foundations of computational linguistics and the practice of natural language text processing with its numerous applications. A feature of the CICLing conferences is their wide scope, covering nearly all areas of computational linguistics and all aspects of natural language processing applications. The conference is a forum for dialogue between specialists working in these two areas. This year our invited speakers were Graeme Hirst (U. Toronto, C...
Discourse anaphora is a challenging linguistic phenomenon that has given rise to research in fields as diverse as linguistics, computational linguistics, and cognitive science. Because of the diversity of approaches these fields bring to the anaphora problem, the editors of this volume argue that a synthesis is needed, or at least a principled attempt to draw the differing strands of anaphora research together. The papers in this volume all contribute to that aim of synthesis and were selected to represent the growing importance of corpus-based and computational approaches both to describing anaphora and to developing natural language systems for resolving it.
A concise, nontechnical overview of the development of machine translation, including the different approaches, evaluation issues, and major players in the industry. The dream of a universal translation device goes back many decades, long before Douglas Adams's fictional Babel fish provided this service in The Hitchhiker's Guide to the Galaxy. Since the advent of computers, research has focused on the design of digital machine translation tools—computer programs capable of automatically translating a text from a source language to a target language. This has become one of the most fundamental tasks of artificial intelligence. This volume in the MIT Press Essential Knowledge series offers a...
In Marcus (1980), deterministic parsers were introduced. These are parsers which satisfy the conditions of Marcus's determinism hypothesis, i.e., they are strongly deterministic in the sense that they do not simulate nondeterminism in any way. In later work (Marcus et al. 1983) these parsers were modified to construct descriptions of trees rather than the trees themselves. The resulting D-theory parsers, by working with these descriptions, are capable of capturing a certain amount of ambiguity in the structures they build. In this context, it is not clear what it means for a parser to meet the conditions of the determinism hypothesis. The object of this work is to clarify this and other issues pertaining to D-theory parsers and to provide a framework within which these issues can be examined formally. Thus we have a very narrow scope. We make no arguments about the linguistic issues D-theory parsers are meant to address, their relation to other parsing formalisms, or the notion of determinism in general. Rather we focus on issues internal to D-theory parsers themselves.
This classified and annotated research bibliography is meant to serve as an introduction to the rich field of Japanese psycholinguistics, by providing an exhaustive inventory of what has been done in or about Japanese in a psycholinguistic sense. Thus, this volume captures the tradition of psycholinguistic research currently being pursued in Japan, its history and development over the past thirty years, and its current directions and research themes, as well as international research in modern psycholinguistics which targets the Japanese language as the focal point of empirical procedures or deductive analysis in psychology, linguistics, psycholinguistics, and cognitive science. The bibliogr...
Information technology has created new challenges for translation. In this text, contributors from computational linguistics, machine translation, and translation studies discuss the effect of electronic tools on translation, and the conceptual gaps raised by the interface of human and machine.
The FGCS project was introduced at a conference in 1981 and commenced the following year. This volume contains the reports on the final phase of the project, showing how the research goals that were set have been achieved.