The classical theory of computation has its origins in the work of Gödel, Turing, Church, and Kleene, and has been an extraordinarily successful framework for theoretical computer science. The thesis of this book, however, is that it provides an inadequate foundation for modern scientific computation, where most of the algorithms are real number algorithms. The goal of this book is to develop a formal theory of computation which integrates major themes of the classical theory and which is more directly applicable to problems in mathematics, numerical analysis, and scientific computing. Along the way, the authors consider such fundamental problems as: * Is the Mandelbrot set decidable? * For s...
Most works of art, whether illustrative, musical or literary, are created subject to a set of constraints. In many (but not all) cases, these constraints have a mathematical nature, for example, the geometric transformations governing the canons of J. S. Bach, the various projection systems used in classical painting, the catalog of symmetries found in Islamic art, or the rules concerning poetic structure. This fascinating book describes geometric frameworks underlying this constraint-based creation. The author provides both a development in geometry and a description of how these frameworks fit the creative process within several art practices. He furthermore discusses the perceptual effects derived from the presence of particular geometric characteristics. The book began life as a liberal arts course and it is certainly suitable as a textbook. However, anyone interested in the power and ubiquity of mathematics will enjoy this revealing insight into the relationship between mathematics and the arts.
The goal of learning theory is to approximate a function from sample values. To attain this goal, learning theory draws on a variety of diverse subjects, specifically statistics, approximation theory, and algorithmics. Ideas from all these areas have blended to form a subject whose many successful applications have triggered rapid growth during the last two decades. This is the first book to give a general overview of the theoretical foundations of the subject, emphasizing the approximation-theoretic viewpoint while keeping the treatment balanced. It is based on courses taught by the authors, and is reasonably self-contained, so it will appeal to a broad spectrum of researchers in learning theory and adjacent fields. It will also serve as an introduction for graduate students and others entering the field who wish to see how the problems raised in learning theory relate to other disciplines.
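As a point of reference for the setting just described (stated in generic notation that may differ from the book's), the prototypical problem is to approach the regression function of an unknown probability measure from a finite sample:

\[
  \mathcal{E}(f) \;=\; \int_{X\times Y} \bigl(f(x)-y\bigr)^2 \, d\rho(x,y),
  \qquad
  f_\rho(x) \;=\; \int_Y y \, d\rho(y \mid x),
\]

where the measure \(\rho\) on \(X \times Y\) is unknown and only a sample \((x_1,y_1),\dots,(x_m,y_m)\) drawn from \(\rho\) is available. Statistics, approximation theory, and algorithmics enter, respectively, through the sampling, through the choice of hypothesis space in which \(f\) is sought, and through the computation of the estimator.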
This book gathers threads that have evolved across different mathematical disciplines into a seamless narrative. It deals with condition as a central ingredient in understanding the performance, regarding both stability and complexity, of numerical algorithms. Although the role of condition took shape over the last half-century, so far there has been no monograph treating the subject in a uniform and systematic way. The book places special emphasis on the probabilistic analysis of numerical algorithms via the analysis of the corresponding condition numbers. The level of exposition rises as the book progresses, starting in the context of linear algebra at an undergraduate level and reaching, in its third part, recent developments and partial solutions of Smale's 17th problem that can be explained within a graduate course. The middle part contains a condition-based course on linear programming that fills a gap between the current elementary expositions of the subject based on the simplex method and those focusing on convex programming.
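For readers meeting the notion for the first time (again in generic notation, not necessarily the book's), the classical example is the condition number of a matrix, which governs the first-order sensitivity of the solution of a linear system to perturbations of the data:

\[
  \kappa(A) \;=\; \|A\|\,\|A^{-1}\|,
  \qquad
  \frac{\|\delta x\|}{\|x\|} \;\lesssim\; \kappa(A)
  \left( \frac{\|\delta A\|}{\|A\|} + \frac{\|\delta b\|}{\|b\|} \right)
  \quad \text{for } Ax = b.
\]

Condition-based analysis replaces worst-case bounds in terms of input size alone by bounds in terms of quantities such as \(\kappa\), and the probabilistic analysis mentioned above studies the distribution of these quantities for random inputs.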
Conference in honor of Stephen Smale's 70th birthday.
A diverse collection of articles by leading experts in computational mathematics, written to appeal to established researchers and non-experts.
A collection of essays celebrating the influence of Alan Turing's work in logic, computer science and related areas.
Sure to be influential, Watanabe's book lays the foundations for the use of algebraic geometry in statistical learning theory. Many widely used statistical models and learning machines are singular: mixture models, neural networks, hidden Markov models, Bayesian networks, and stochastic context-free grammars are major examples. The theory developed here underpins accurate estimation techniques in the presence of singularities.
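As a schematic pointer to the flavor of the results (not in the book's exact notation), Watanabe shows that the Bayesian learning behavior of a singular model is governed by a birational invariant of its parameter space, the real log canonical threshold \(\lambda\) with multiplicity \(m\), rather than by the raw parameter count \(d\):

\[
  \mathbb{E}[G_n] \;=\; \frac{\lambda}{n} + o\!\left(\frac{1}{n}\right),
  \qquad
  F_n \;=\; n S_n + \lambda \log n - (m-1)\log\log n + O_p(1),
\]

where \(G_n\) is the Bayes generalization error, \(F_n\) the stochastic complexity (free energy), and \(S_n\) the empirical entropy; for regular models \(\lambda = d/2\) and \(m = 1\), recovering the classical asymptotics.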
Metric algebraic geometry combines concepts from algebraic geometry and differential geometry. Building on classical foundations, it offers practical tools for the 21st century. Many applied problems center around metric questions, such as optimization with respect to distances. After a short dive into 19th-century geometry of plane curves, the book turns to problems expressed by polynomial equations over the real numbers; their solution sets are real algebraic varieties. Many of these metric problems arise in data science, optimization, and statistics. They include minimizing Wasserstein distances in machine learning, maximum likelihood estimation, computing curvature, and minimizing the Euclidean distance to a variety. The book addresses a wide audience of researchers and students and can be used for a one-semester course at the graduate level. The key prerequisite is a solid foundation in undergraduate mathematics, especially in algebra and geometry. This is an open access book.
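To illustrate one of the metric problems listed above (in generic notation, not necessarily the book's), the Euclidean distance problem asks for the point of a real variety closest to a given data point \(u\):

\[
  \min_{x \in V} \;\|u - x\|^2,
  \qquad
  V \;=\; \{\, x \in \mathbb{R}^n : f_1(x) = \dots = f_k(x) = 0 \,\},
\]

and, for generic \(u\), the number of complex critical points of this optimization problem is an invariant of \(V\), its Euclidean distance degree.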