This book presents a selection of topics from probability theory. Essentially, the topics chosen are those likely to be most useful to someone planning to pursue research in the modern theory of stochastic processes. The prospective reader is assumed to have good mathematical maturity. In particular, readers should have prior exposure to basic probability theory at the level of, say, K.L. Chung's 'Elementary Probability Theory with Stochastic Processes' (Springer-Verlag, 1974), and to real and functional analysis at the level of Royden's 'Real Analysis' (Macmillan, 1968). The first chapter is a rapid overview of the basics. Each subsequent chapter deals with a separate topic in detail. T...
This research monograph summarizes a line of research that maps certain classical problems of discrete mathematics and operations research - such as the Hamiltonian Cycle and Travelling Salesman Problems - into convex domains where continuum analysis can be carried out. Arguably, the inherent difficulty of these now-classical problems stems precisely from the discrete nature of the domains in which they are posed. The convexification of domains underpinning these results is achieved by assigning a probabilistic interpretation to key elements of the original deterministic problems. In particular, the approaches summarized here build on a technique that embeds Hamiltonian Cycle and T...
The first comprehensive account of controlled diffusions with a focus on ergodic or 'long run average' control.
This is the only book on stochastic modelling of infinite dimensional dynamical systems.
Since the publication of the first edition of the present volume in 1980, the stochastic stability of differential equations has become a very popular subject of research in mathematics and engineering. To date exact formulas for the Lyapunov exponent, the criteria for the moment and almost sure stability, and for the existence of stationary and periodic solutions of stochastic differential equations have been widely used in the literature. In this updated volume readers will find important new results on the moment Lyapunov exponent, stability index and some other fields, obtained after publication of the first edition, and a significantly expanded bibliography. This volume provides a solid foundation for students in graduate courses in mathematics and its applications. It is also useful for those researchers who would like to learn more about this subject, to start their research in this area or to study the properties of concrete mechanical systems subjected to random perturbations.
This volume contains the proceedings of the workshop on Variational and Optimal Control Problems on Unbounded Domains, held in memory of Arie Leizarowitz on January 9-12, 2012, in Haifa, Israel. The workshop brought together a select group of worldwide experts in optimal control theory and the calculus of variations working on problems on unbounded domains. The papers in this volume cover many different areas of optimal control and its applications. Topics include needle variations in infinite-horizon optimal control, Lyapunov stability with some extensions, small noise large time asymptotics for the normalized Feynman-Kac semigroup, linear-quadratic optimal control problems with state delays, time-optimal control of wafer stage positioning, second order optimality conditions in optimal control, state and time transformations of infinite horizon problems, turnpike properties of dynamic zero-sum games, and an infinite-horizon variational problem on an infinite strip. This book is co-published with Bar-Ilan University (Ramat-Gan, Israel).
This book provides a unified approach to the study of constrained Markov decision processes with a finite state space and unbounded costs. Unlike the single-objective case considered in many other books, the author considers a single controller with several objectives, such as minimizing delay and loss probabilities while maximizing throughput. It is desirable to design a controller that minimizes one cost objective subject to inequality constraints on the other cost objectives. This framework describes dynamic decision problems arising frequently in many engineering fields. A thorough overview of these applications is presented in the introduction. The book is then divided into three sections that build upon each other.
Optical networks epitomize complex communication systems, and they comprise the Internet’s infrastructural backbone. The first of its kind, this book develops the mathematical framework needed from a control perspective to tackle various game-theoretical problems in optical networks. In doing so, it aims to help design control algorithms that optimally allocate the resources of these networks. With its fresh problem-solving approach, Game Theory in Optical Networks is a unique resource for researchers, practitioners, and graduate students in applied mathematics and systems/control engineering, as well as those in electrical and computer engineering.