Architecture of a Database System presents an architectural discussion of DBMS design principles, including process models, parallel architecture, storage system design, transaction system implementation, query processor and optimizer architectures, and typical shared components and utilities.
The problem of privacy-preserving data analysis has a long history spanning multiple disciplines. As electronic data about individuals becomes increasingly detailed, and as technology enables ever more powerful collection and curation of these data, the need increases for a robust, meaningful, and mathematically rigorous definition of privacy, together with a computationally rich class of algorithms that satisfy this definition. Differential Privacy is such a definition. The Algorithmic Foundations of Differential Privacy starts out by motivating and discussing the meaning of differential privacy, and proceeds to explore the fundamental techniques for achieving differential privacy, and the ...
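One of the fundamental techniques the monograph covers is the Laplace mechanism: release a query answer plus noise calibrated to the query's sensitivity divided by the privacy parameter epsilon. A minimal Python sketch, with the counting query, the epsilon value, and the function name chosen purely for illustration:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value plus Laplace noise with scale sensitivity / epsilon."""
    if rng is None:
        rng = np.random.default_rng()
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# A counting query ("how many records satisfy a predicate?") has sensitivity 1:
# adding or removing one individual changes the true answer by at most 1.
records = [True, False, True, True, False]
noisy_count = laplace_mechanism(sum(records), sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means more noise and stronger privacy; the monograph develops how such mechanisms compose and how their accuracy trades off against the privacy guarantee.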
Theoretical results suggest that in order to learn the kind of complicated functions that can represent high-level abstractions (e.g. in vision, language, and other AI-level tasks), one may need deep architectures. Deep architectures are composed of multiple levels of non-linear operations, such as in neural nets with many hidden layers or in complicated propositional formulae re-using many sub-formulae. Searching the parameter space of deep architectures is a difficult task, but learning algorithms such as those for Deep Belief Networks have recently been proposed to tackle this problem with notable success, beating the state-of-the-art in certain areas. This paper discusses the motivations and principles regarding learning algorithms for deep architectures, in particular those exploiting as building blocks unsupervised learning of single-layer models such as Restricted Boltzmann Machines, used to construct deeper models such as Deep Belief Networks.
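As a concrete illustration of the single-layer building block mentioned above, here is a minimal NumPy sketch of a binary Restricted Boltzmann Machine updated with one step of contrastive divergence (CD-1); the layer sizes, learning rate, and batch handling are illustrative rather than the paper's setup. A Deep Belief Network is then obtained by stacking such layers, training each RBM on the hidden activations of the one below.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative sizes: 784 visible units (e.g. 28x28 binary pixels), 128 hidden units.
n_visible, n_hidden, lr = 784, 128, 0.01
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

def cd1_update(v0):
    """One CD-1 update of (W, b_v, b_h) for a batch of binary visible vectors v0."""
    global W, b_v, b_h
    # Positive phase: hidden probabilities and samples given the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: one Gibbs step back to a reconstruction.
    p_v1 = sigmoid(h0 @ W.T + b_v)
    p_h1 = sigmoid(p_v1 @ W + b_h)
    # Gradient approximation: <v h>_data - <v h>_reconstruction.
    batch = v0.shape[0]
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / batch
    b_v += lr * (v0 - p_v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)
```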
The Efficient Market Hypothesis (EMH) asserts that, at all times, the price of a security reflects all available information about its fundamental value. The implication of the EMH for investors is that, to the extent that speculative trading is costly, speculation must be a loser's game. Hence, under the EMH, a passive strategy is bound eventually to beat a strategy that uses active management, where active management is characterized as trading that seeks to exploit mispriced assets relative to a risk-adjusted benchmark. The EMH has been refined over the past several decades to reflect the realities of the marketplace, including costly information, transactions costs, financing, agency costs...
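A standard textbook formalization of this claim (illustrative notation, not necessarily the monograph's): writing Omega_t for the information set at time t, r_{t+1} for the realized return of the security, and r*_{t+1} for the return required by a risk-adjusted benchmark, the EMH says that risk-adjusted excess returns are unforecastable,

```latex
\mathbb{E}\!\left[\, r_{t+1} - r^{*}_{t+1} \,\middle|\, \Omega_t \right] = 0 ,
```

so no trading rule that uses only the information in Omega_t can systematically earn more than the benchmark, which is what makes costly speculation a loser's game net of fees.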
Financial Markets and the Real Economy reviews the current academic literature on the macroeconomics of finance.
This survey summarizes almost 50 years of research and development in the field of Augmented Reality (AR). From early research in the 1960s until widespread availability by the 2010s, there has been steady progress towards the goal of being able to seamlessly combine real and virtual worlds. We provide an overview of the common definitions of AR, and show how AR fits into taxonomies of other related technologies. A history of important milestones in Augmented Reality is followed by sections on the key enabling technologies of tracking, display and input devices. We also review design guidelines and provide some examples of successful AR applications. Finally, we conclude with a summary of directions for future work and a review of some of the areas that are currently being researched.
This survey covers techniques and approaches that promise to directly enable opinion-oriented information-seeking systems.
The magic of search engines starts with crawling. While at first glance Web crawling may appear to be merely an application of breadth-first search, the truth is that there are many challenges ranging from systems concerns such as managing very large data structures to theoretical questions such as how often to revisit evolving content sources. Web Crawling outlines the key scientific and practical challenges, describes the state-of-the-art models and solutions, and highlights avenues for future work. Web Crawling is intended for anyone who wishes to understand or develop crawler software, or conduct research related to crawling.
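To make the "more than breadth-first search" point concrete, here is a minimal Python sketch of the crawl frontier; fetch_and_extract_links is a hypothetical placeholder rather than a real library call, and the comments mark where the systems and scheduling challenges described above actually live.

```python
from collections import deque

def fetch_and_extract_links(url):
    """Hypothetical helper: download `url` and return the URLs it links to.

    Placeholder only; a real crawler would plug in an HTTP client, an HTML
    link extractor, and robots.txt handling here.
    """
    return []

def crawl(seeds, max_pages=1000):
    frontier = deque(seeds)   # FIFO queue gives breadth-first order
    seen = set(seeds)         # at web scale this becomes a disk-backed structure
    fetched = []
    while frontier and len(fetched) < max_pages:
        url = frontier.popleft()
        # Real crawlers also enforce per-host politeness (rate limits) and
        # schedule revisits for sources whose content changes over time.
        for link in fetch_and_extract_links(url):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
        fetched.append(url)
    return fetched

pages = crawl(["https://example.com/"])
```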
The goal of Optimal Transport (OT) is to define geometric tools that are useful to compare probability distributions. Their use dates back to 1781. Recent years have witnessed a new revolution in the spread of OT, thanks to the emergence of approximate solvers that can scale to sizes and dimensions that are relevant to data sciences. Thanks to this newfound scalability, OT is being increasingly used to unlock various problems in imaging sciences (such as color or texture processing), computer vision and graphics (for shape manipulation) or machine learning (for regression, classification and density fitting). This monograph reviews OT with a bias toward numerical methods and their applications ...
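One family of such approximate solvers, and a central topic in computational OT, is entropic regularization solved with Sinkhorn iterations. A minimal NumPy sketch for two discrete histograms follows, with the regularization strength and iteration count chosen purely for illustration:

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iters=500):
    """Entropy-regularized OT between histograms a and b with cost matrix C.

    Returns an approximate transport plan P whose row sums are a and whose
    column sums are b.
    """
    K = np.exp(-C / eps)          # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)           # rescale to match row marginals
        v = b / (K.T @ u)         # rescale to match column marginals
    return u[:, None] * K * v[None, :]

# Example: transport between two 1-D histograms under a squared-distance cost.
x = np.linspace(0, 1, 50)
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2
P = sinkhorn(a, b, C)
cost = np.sum(P * C)              # approximate transport cost
```

Smaller eps gives a sharper approximation of exact OT at the price of slower convergence and numerical underflow in the kernel, which is one of the trade-offs numerical methods for OT must manage.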
An Introduction to Variational Autoencoders provides a quick summary of a topic that has become an important tool in modern-day deep learning.
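A minimal NumPy sketch of the two ingredients a variational autoencoder rests on, the reparameterization trick and the evidence lower bound (ELBO); the encoder and decoder networks are abstracted away, and the names here are illustrative rather than the monograph's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Draw z ~ N(mu, sigma^2) as a deterministic function of (mu, log_var) and noise."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def gaussian_kl(mu, log_var):
    """Closed-form KL( N(mu, sigma^2) || N(0, I) ), summed over latent dimensions."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

def elbo(x, mu, log_var, decoder_log_likelihood):
    """Single-sample estimate of E_q[log p(x|z)] - KL(q(z|x) || p(z))."""
    z = reparameterize(mu, log_var)
    return decoder_log_likelihood(x, z) - gaussian_kl(mu, log_var)
```

In a full VAE, mu and log_var are produced by an encoder network, decoder_log_likelihood is the decoder's reconstruction term, and the negative ELBO is minimized by gradient descent through the reparameterized sample.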