This monograph presents many interesting results, old and new, about theta functions, Abelian integrals and kernel functions on closed Riemann surfaces. It begins with a review of classical kernel function theory for plane domains. Next there is a discussion of function theory on closed Riemann surfaces, leading to explicit formulas for Szegö kernels in terms of the Klein prime function and theta functions. Later sections develop explicit relations between the classical Szegö and Bergman kernels and between the Szegö and modified (semi-exact) Bergman kernels. The author's results allow him to solve an open problem mentioned by L. Sario and K. Oikawa in 1969.
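As a concrete instance of the Szegö–Bergman relation the monograph develops, one can record the standard unit-disk case (a classical computation, not taken from the monograph itself): for the disk D = {|z| < 1},

```latex
S(z,\bar w) = \frac{1}{2\pi\,(1 - z\bar w)}, \qquad
K(z,\bar w) = \frac{1}{\pi\,(1 - z\bar w)^{2}},
```

so on the disk the two kernels are related by \(K(z,\bar w) = 4\pi\, S(z,\bar w)^{2}\); the higher-genus results relate the analogous kernels on closed Riemann surfaces.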
Kernel Functions and Differential Equations
The Kernel Function and Conformal Mapping by Stefan Bergman is a revised edition of "The Kernel Function". The author has made extensive changes in the original volume. The present book will be of interest not only to mathematicians, but also to engineers, physicists, and computer scientists. The applications of orthogonal functions in solving boundary value problems and conformal mappings onto canonical domains are discussed; and publications are indicated where programs for carrying out numerical work using high-speed computers can be found. The unification of methods in the theory of functions of one and several complex variables is one of the purposes of introducing the kernel function and the domains with a distinguished boundary. This approach has been extensively developed during the last two decades. This second edition of Professor Bergman's book reviews this branch of the theory, including recent developments not dealt with in the first edition. The presentation of the topics is simple and presupposes only knowledge of an elementary course in the theory of analytic functions of one variable.
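The link between the kernel function and conformal mapping that the book develops can be summarized by a classical identity: if f maps a simply connected domain D onto a disk with f(t) = 0 and f'(t) > 0, then, in the standard normalization of the Bergman kernel K of D,

```latex
f'(z) \;=\; \sqrt{\frac{\pi}{K(t,\bar t)}}\; K(z,\bar t),
```

so the Riemann mapping function can be recovered by integrating the Bergman kernel, which is the starting point for the numerical conformal-mapping methods the book surveys.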
This monograph reviews different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and regularization methods.
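One widely used construction for valid multiple-output kernels of the kind the monograph surveys is the separable ("intrinsic coregionalization") form, in which a scalar kernel k is combined with a positive semidefinite coregionalization matrix B. The sketch below is a representative example in our own notation, not the monograph's:

```python
import numpy as np

def rbf(x, z, lengthscale=1.0):
    """Scalar RBF kernel between two one-dimensional inputs."""
    return np.exp(-0.5 * (x - z) ** 2 / lengthscale ** 2)

def multi_output_kernel(X, B, k=rbf):
    """Separable multi-output kernel: K = B kron k(X, X).

    B (D x D) must be positive semidefinite for the result to be a
    valid covariance over all (output, input) pairs.
    """
    Kx = np.array([[k(xi, xj) for xj in X] for xi in X])
    return np.kron(B, Kx)

# Two correlated outputs observed at three shared inputs.
B = np.array([[1.0, 0.8], [0.8, 1.0]])
X = np.array([0.0, 0.5, 1.0])
K = multi_output_kernel(X, B)  # (6, 6) symmetric PSD matrix
```

Because B and the scalar Gram matrix are both positive semidefinite, their Kronecker product is as well, which is what makes the construction a valid kernel.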
Covers the theory of boundary value problems in partial differential equations and discusses a portion of the theory from a unifying point of view while providing an introduction to each branch of its applications. 1953 edition.
Offering a fundamental basis in kernel-based learning theory, this book covers both statistical and algebraic principles. It provides over 30 major theorems for kernel-based supervised and unsupervised learning models. The first of the theorems establishes a condition, arguably necessary and sufficient, for the kernelization of learning models. In addition, several other theorems are devoted to proving mathematical equivalence between seemingly unrelated models. With over 25 closed-form and iterative algorithms, the book provides a step-by-step guide to algorithmic procedures and analysing which factors to consider in tackling a given problem, enabling readers to improve specifically designed learning algorithms, build models for new applications and develop efficient techniques suitable for green machine learning technologies. Numerous real-world examples and over 200 problems, several of which are Matlab-based simulation exercises, make this an essential resource for graduate students and professionals in computer science, electrical and biomedical engineering. Solutions to problems are provided online for instructors.
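The kernelization condition referred to above corresponds, in most treatments, to the requirement that every Gram matrix built from the kernel be positive semidefinite (Mercer's condition). A minimal numerical check, using an assumed RBF kernel rather than any specific model from the book:

```python
import numpy as np

def rbf_gram(X, lengthscale=1.0):
    """Gram matrix of the RBF kernel on a set of scalar inputs X."""
    D2 = (X[:, None] - X[None, :]) ** 2
    return np.exp(-0.5 * D2 / lengthscale ** 2)

X = np.linspace(-2, 2, 50)
G = rbf_gram(X)
min_eig = np.linalg.eigvalsh(G).min()
# A valid (Mercer) kernel yields no significantly negative eigenvalues.
assert min_eig > -1e-10
```

A kernel failing this check on some finite input set cannot arise from any inner product of feature maps, so no learning model can be kernelized with it.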
This book contains select chapters on support vector algorithms from different perspectives, including mathematical background, properties of various kernel functions, and several applications. The main focus of this book is on orthogonal kernel functions, and the properties of the classical kernel functions—Chebyshev, Legendre, Gegenbauer, and Jacobi—are reviewed in some chapters. Moreover, the fractional form of these kernel functions is introduced in the same chapters, and, for ease of use, a tutorial on a Python package named ORSVM is presented. The book also exhibits a variety of applications for support vector algorithms, and in addition to the classificat...
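As a small illustration of an orthogonal-polynomial kernel in the spirit described above (a generic finite Chebyshev expansion, not the book's fractional kernels or the ORSVM API):

```python
import numpy as np
from numpy.polynomial.chebyshev import chebval

def chebyshev_kernel(x, z, degree=5):
    """K(x, z) = sum_{i=0}^{degree} T_i(x) T_i(z) for x, z in [-1, 1].

    A finite sum of products of feature functions, hence always a
    valid (positive semidefinite) kernel.
    """
    ts_x = np.array([chebval(x, [0] * i + [1]) for i in range(degree + 1)])
    ts_z = np.array([chebval(z, [0] * i + [1]) for i in range(degree + 1)])
    return float(ts_x @ ts_z)

# Gram matrix on a few points in [-1, 1] stays positive semidefinite.
X = np.linspace(-1, 1, 7)
G = np.array([[chebyshev_kernel(a, b) for b in X] for a in X])
```

Since T_i(1) = 1 for every i, evaluating at x = z = 1 with degree 5 gives 6.0, a quick sanity check on the expansion.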
This memoir is a study of Ray-Singer analytic torsion for hermitian vector bundles on a compact Riemann surface C. The torsion is expressed through the trace of a modified resolvent. Thus, one can develop perturbation-curvature formulae for the Green-Szegö kernel and also for the torsion in terms of the Ahlfors-Bers complex structure of the Teichmüller space and the Mumford complex structure of the moduli space of stable bundles of degree zero on C.
The representer theorem from the reproducing kernel Hilbert space theory is the origin of many kernel-based machine learning and signal modelling techniques that are popular today. Most kernel functions used in practical applications behave in a homogeneous manner across the domain of the signal of interest, and they are called stationary kernels. One open problem in the literature is the specification of a non-stationary kernel that is computationally tractable. Some recent works solve large-scale optimization problems to obtain such kernels, and they often suffer from non-identifiability issues in their optimization problem formulation. Many practical problems can benefit from using applic...
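The stationary/non-stationary distinction drawn above can be made concrete: a stationary kernel depends only on the difference x − x′, while a non-stationary kernel also depends on where the inputs sit. A small illustrative sketch (the function names are ours, not the work's):

```python
import numpy as np

def rbf(x, xp, lengthscale=1.0):
    """Stationary: depends only on the lag x - xp."""
    return np.exp(-0.5 * (x - xp) ** 2 / lengthscale ** 2)

def linear(x, xp):
    """Non-stationary: depends on the absolute locations of x and xp."""
    return x * xp

# Shifting both inputs by the same amount leaves a stationary kernel
# unchanged but generally changes a non-stationary one.
shift = 3.0
stationary_invariant = np.isclose(rbf(0.2, 1.0), rbf(0.2 + shift, 1.0 + shift))
nonstationary_changes = not np.isclose(linear(0.2, 1.0),
                                       linear(0.2 + shift, 1.0 + shift))
```

Designing a non-stationary kernel amounts to letting quantities such as the lengthscale vary with location, which is precisely where the tractability and identifiability difficulties mentioned above arise.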
A comprehensive and self-contained introduction to Gaussian processes (GPs), which provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both ...
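A minimal sketch of the GP regression computation the book treats in depth (the standard posterior-mean formula with an RBF kernel; the data and settings here are illustrative, not from the book):

```python
import numpy as np

def rbf(A, B, lengthscale=1.0):
    """RBF covariance between two sets of scalar inputs."""
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / lengthscale ** 2)

# Noisy observations of sin(x).
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 20)
y = np.sin(X) + 0.05 * rng.standard_normal(20)
Xs = np.linspace(0, 2 * np.pi, 100)   # test inputs
noise = 0.05 ** 2

# GP posterior mean: K(X*, X) (K(X, X) + sigma^2 I)^{-1} y
K = rbf(X, X) + noise * np.eye(len(X))
mean = rbf(Xs, X) @ np.linalg.solve(K, y)
# On dense, low-noise data the posterior mean tracks sin closely.
```

Everything in GP regression reduces to linear algebra with the kernel's Gram matrix, which is why the choice of kernel, the book's central theme, determines the model's behavior.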