This book studies formal semantics in modern type theories (MTT-semantics). Compared with simple type theory, MTTs have much richer type structures and provide powerful means for adequate semantic constructions. This offers a serious alternative to the traditional set-theoretical foundation for linguistic semantics and opens up a new avenue for developing formal semantics that is both model-theoretic and proof-theoretic, which was not available before the development of MTT-semantics. This book provides a reader-friendly and precise description of MTTs and offers a comprehensive introduction to MTT-semantics. It develops several case studies, such as adjectival modification and copredication, to exemplify the attractiveness of using MTTs for the study of linguistic meaning. It also examines existing proof assistant technology based on MTT-semantics for the verification of semantic constructions and reasoning in natural language. Several advanced topics are also briefly studied, including dependent event types, an application of dependent typing to event semantics.
Head-Driven Phrase Structure Grammar (HPSG) is a constraint-based or declarative approach to linguistic knowledge, which analyses all descriptive levels (phonology, morphology, syntax, semantics, pragmatics) with feature-value pairs, structure sharing, and relational constraints. In syntax it assumes that expressions have a single, relatively simple constituent structure. This volume provides a state-of-the-art introduction to the framework. Various chapters discuss basic assumptions and formal foundations, describe the evolution of the framework, and go into the details of the main syntactic phenomena. Further chapters are devoted to non-syntactic levels of description. The book also considers related fields and research areas (gesture, sign languages, computational linguistics) and includes chapters comparing HPSG with other frameworks (Lexical Functional Grammar, Categorial Grammar, Construction Grammar, Dependency Grammar, and Minimalism).
The Constraint Solving and Language Processing (CSLP) workshop considers the role of constraints in the representation of language and the implementation of language processing applications. This theme should be interpreted inclusively: it includes contributions from linguistics, computer science, psycholinguistics and related areas, with a particular interest in interdisciplinary perspectives. Constraints are widely used in linguistics, computer science, and psychology. How they are used, however, varies widely according to the research domain: knowledge representation, cognitive modelling, problem solving mechanisms, etc. These different perspectives are complementary, each one adding a piece to the puzzle.
Edited under the auspices of the Association for Logic, Language and Information (FoLLI), this book constitutes the refereed proceedings of the 20th anniversary edition of the International Conference on Logical Aspects of Computational Linguistics, LACL 2016, held at LORIA Nancy, France, in December 2016. The 19 contributed papers, presented together with 4 invited papers and 6 abstracts, were carefully reviewed and selected from 38 submissions. The focus of the conference is the use of type-theoretic, proof-theoretic, and model-theoretic methods for describing and formalising natural language syntax, semantics, and pragmatics, as well as the implementation of the corresponding tools.
This is an open access title available under the terms of a CC BY-NC-ND 4.0 International licence. It is free to read at Oxford Scholarship Online and offered as a free PDF download from OUP and selected open access locations. This book characterizes a notion of type that covers both linguistic and non-linguistic action, and lays the foundations for a theory of action based on a Theory of Types with Records (TTR). Robin Cooper argues that a theory of language based on action allows the adoption of a perspective on linguistic content that is centred on interaction in dialogue; this approach is crucially different to the traditional view of natural languages as essentially similar to formal la...
In Enthymemes and Topoi in Dialogue, Ellen Breitholtz presents a novel and precise account of reasoning from an interactional perspective. The account draws on the concepts of enthymemes and topoi, originating in Aristotelian rhetoric and dialectic, and integrates these in a formal dialogue-semantic account using TTR, a type theory with records. Argumentation analysis and formal approaches to reasoning often focus on the logical validity of arguments, treating inferences made in discourse from a god's-eye perspective. In contrast, Breitholtz's account emphasises the individual perspectives of interlocutors and the function and acceptability of their reasoning in context. This provides an analysis of interactions where interlocutors have access to different topoi and therefore make different inferences.
Meaning is a fundamental concept in Natural Language Processing (NLP), in the tasks of both Natural Language Understanding (NLU) and Natural Language Generation (NLG). This is because the aims of these fields are to build systems that understand what people mean when they speak or write, and that can produce linguistic strings that successfully express to people the intended content. In order for NLP to scale beyond partial, task-specific solutions, researchers in these fields must be informed by what is known about how humans use language to express and understand communicative intents. The purpose of this book is to present a selection of useful information about semantics and pragmatics, as understood in linguistics, in a way that's accessible to and useful for NLP practitioners with minimal (or even no) prior training in linguistics.
Algebraic Structures in Natural Language addresses a central problem in cognitive science concerning the learning procedures through which humans acquire and represent natural language. Until recently, algebraic systems have dominated the study of natural language in formal and computational linguistics, AI, and the psychology of language, with linguistic knowledge seen as encoded in formal grammars, model theories, proof theories, and other rule-driven devices. Recent work on deep learning has produced an increasingly powerful set of general learning mechanisms which do not rely on rule-based algebraic models of representation. The success of deep learning in NLP has led some researchers to que...
Philosophy of Linguistics investigates the foundational concepts and methods of linguistics, the scientific study of human language. This groundbreaking collection, the most thorough treatment of the philosophy of linguistics ever published, brings together philosophers, scientists, and historians to map out both the foundational assumptions set during the second half of the last century and the unfolding shifts in perspective in which more functionalist perspectives are explored. The opening chapter lays out the philosophical background in preparation for the papers that follow, which demonstrate the shift in perspective in the study of linguistics through discussions of syntax, semantics, phon...
This book deals with the category of case and where to place it in grammar. Chapters explore a range of issues relating to the division between syntactic Case and morphological case, investigating the relevant phenomena, and drawing on data from a variety of typologically diverse languages.