The availability of financial data recorded at high frequency has inspired a research area which, over the last decade, has emerged as a major field in econometrics and statistics. The growing popularity of high-frequency econometrics is driven by technological progress in trading systems and the increasing importance of intraday trading, liquidity risk, optimal order placement and high-frequency volatility. This book provides a state-of-the-art overview of the major approaches in high-frequency econometrics, including univariate and multivariate autoregressive conditional mean approaches for different types of high-frequency variables, intensity-based approaches for financial point processes and dynamic factor models. It discusses implementation details, provides insights into the properties of high-frequency data and institutional settings, and presents applications to volatility and liquidity estimation, order book modelling and market microstructure analysis.
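The volatility-estimation applications mentioned here typically start from the realized variance of intraday returns. The following is a minimal sketch of that baseline estimator, not code from the book; the 5-minute price path is simulated, hypothetical data.

```python
import numpy as np

def realized_variance(prices: np.ndarray) -> float:
    """Sum of squared intraday log returns: the baseline
    high-frequency volatility estimate for one trading day."""
    log_returns = np.diff(np.log(prices))
    return float(np.sum(log_returns ** 2))

# Hypothetical 5-minute prices for one trading day (78 bars).
rng = np.random.default_rng(0)
prices = 100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.001, size=78)))
print(f"realized variance: {realized_variance(prices):.6f}")
```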
Recent years have witnessed the growing importance of quantitative methods in both financial research and industry. This development requires the use of advanced techniques on both a theoretical and an applied level, especially for the quantification of risk and the valuation of modern financial products. Applied Quantitative Finance (2nd edition) provides a comprehensive, state-of-the-art treatment of cutting-edge topics and methods. It presents theoretical developments and practical solutions to problems such as risk management, the pricing of credit derivatives, the quantification of volatility and copula modelling. The synthesis of theory and practice supported by computat...
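Copula modelling, one of the topics listed above, separates dependence structure from marginal behaviour. The sketch below, assuming a simple Gaussian copula with hypothetical exponential and normal margins, illustrates the idea; it is an illustration only, not code from the book.

```python
import numpy as np
from scipy.stats import expon, norm

# Draw correlated normals, map them to uniforms through the normal
# CDF (the Gaussian copula), then apply arbitrary marginal quantiles.
rng = np.random.default_rng(3)
rho = 0.7
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)
u = norm.cdf(z)                                  # correlated uniforms
loss_a = expon.ppf(u[:, 0], scale=2.0)           # exponential margin
loss_b = norm.ppf(u[:, 1], loc=0.0, scale=2.0)   # normal margin
print(f"induced correlation: {np.corrcoef(loss_a, loss_b)[0, 1]:.2f}")
```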
This book provides a methodological framework for modelling univariate and multivariate irregularly spaced financial data. It gives a thorough review of recent developments in the econometric literature, surveys existing approaches and opens up new directions. The book presents alternative ways to model so-called financial point processes using dynamic duration as well as intensity models, and discusses their ability to account for specific features of point process data, such as time-varying covariates, censoring mechanisms and multivariate structures. Moreover, it illustrates the use of various types of financial point processes to model financial market activity from different viewpoints and to construct volatility and liquidity measures that explicitly account for the passage of trading time.
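A common entry point into the dynamic duration models discussed here is the autoregressive conditional duration (ACD) model, in which the expected time between market events depends on past durations. Below is a minimal simulation sketch of an ACD(1,1) process with exponential innovations; the parameter values are illustrative, not taken from the book.

```python
import numpy as np

def simulate_acd(n: int, omega: float = 0.1, alpha: float = 0.1,
                 beta: float = 0.8, seed: int = 0) -> np.ndarray:
    """ACD(1,1): durations x_i = psi_i * eps_i with eps_i ~ Exp(1) and
    psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    psi = omega / (1.0 - alpha - beta)  # unconditional mean duration
    x_prev = psi
    for i in range(n):
        psi = omega + alpha * x_prev + beta * psi
        x[i] = psi * rng.exponential()
        x_prev = x[i]
    return x

durations = simulate_acd(10_000)
print(f"mean duration: {durations.mean():.3f}")  # ~ omega/(1-alpha-beta) = 1
```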
This book is a collection of feature articles published in Risks in 2020, all written by experts in their respective fields. In these articles they develop and present new aspects and insights that can help us understand and cope with the different and ever-changing facets of risk. In some of the feature articles probabilistic risk modeling is the central focus, while its impact and innovation in the context of financial economics and actuarial science receive less attention and are left for future research. In other articles it is the other way around: ideas and perceptions in financial markets are the driving force of the research, but they do not necessarily rely on innov...
A straightforward guide to the mathematics of algorithmic trading that reflects cutting-edge research.
Generalized method of moments (GMM) estimation of nonlinear systems has two important advantages over conventional maximum likelihood (ML) estimation: GMM estimation usually requires less restrictive distributional assumptions, and it remains computationally attractive when ML estimation becomes burdensome or even impossible. This book presents an in-depth treatment of the conditional moment approach to GMM estimation of models frequently encountered in applied microeconometrics. It covers both the large-sample and the small-sample properties of conditional moment estimators and provides an application to empirical industrial organization. With its comprehensive and up-to-date coverage of the subject, including topics such as bootstrapping and empirical likelihood techniques, the book addresses scientists, graduate students and professionals in applied econometrics.
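As a toy illustration of the GMM machinery the book treats in depth (the example below is mine, not the book's): estimate the rate of an exponential distribution from two moment conditions, so the model is over-identified, and minimize the identity-weighted quadratic form of the sample moments.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)
x = rng.exponential(scale=1.0 / 2.5, size=5_000)  # true rate lam = 2.5

def gmm_objective(lam: float) -> float:
    # Moment conditions: E[x] - 1/lam = 0 and E[x^2] - 2/lam^2 = 0.
    g = np.array([x.mean() - 1.0 / lam,
                  (x ** 2).mean() - 2.0 / lam ** 2])
    return float(g @ g)  # identity weighting matrix

result = minimize_scalar(gmm_objective, bounds=(0.1, 10.0), method="bounded")
print(f"GMM estimate: {result.x:.3f}")  # close to 2.5
```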
Ole Martin extends well-established techniques for the analysis of high-frequency data based on regular observations to the more general setting of asynchronous and irregular observations. Such methods are much needed in practice, as real data usually come in irregular form. In the theoretical part he develops laws of large numbers and central limit theorems as well as a new bootstrap procedure to assess asymptotic laws. The author then applies the theoretical results to estimate the quadratic covariation and to construct tests for the presence of common jumps. The simulation results show that in finite samples his methods, despite the much more complex setting, perform comparably to methods based on regular data. About the Author: Dr. Ole Martin completed his PhD at Kiel University (CAU), Germany. His research focuses on high-frequency statistics for semimartingales, with the aim of developing methods based on irregularly observed data.
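For the quadratic covariation of asynchronously observed paths, a classical reference point is the Hayashi-Yoshida estimator, which sums products of returns over overlapping observation intervals. The sketch below is an illustration in that spirit, not the author's code; it checks the estimator on two simulated correlated Brownian paths observed at independent random times.

```python
import numpy as np

def hayashi_yoshida(tx, x, ty, y):
    """Sum dx_i * dy_j over all pairs whose observation intervals
    (tx[i], tx[i+1]] and (ty[j], ty[j+1]] overlap."""
    dx, dy = np.diff(x), np.diff(y)
    cov = 0.0
    for i in range(len(dx)):
        for j in range(len(dy)):
            if tx[i] < ty[j + 1] and ty[j] < tx[i + 1]:
                cov += dx[i] * dy[j]
    return cov

# Two correlated Brownian paths, each kept only at random observation
# times (simulated, hypothetical data); the true covariation is 0.5.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 2001)
dW = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=2000)
W = np.vstack([[0.0, 0.0], np.cumsum(dW, axis=0) * np.sqrt(1 / 2000)])
ix = np.unique(np.concatenate(([0], rng.choice(2001, 400), [2000])))
iy = np.unique(np.concatenate(([0], rng.choice(2001, 400), [2000])))
print(f"HY estimate: {hayashi_yoshida(t[ix], W[ix, 0], t[iy], W[iy, 1]):.3f}")
```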
First published in 1952, the International Bibliography of the Social Sciences (anthropology, economics, political science, and sociology) is well established as a major bibliographic reference for students, researchers and librarians in the social sciences worldwide. Key features: *Authority: rigorous standards are applied to make the IBSS the most authoritative selective bibliography ever produced; articles and books are selected on merit by some of the world's most expert librarians and academics. *Breadth: today the IBSS covers over 2,000 journals - more than any other comparable resource; the latest monograph publications are also included. *International coverage: the IBSS reviews schol...
This book analyzes how the choice of a particular disclosure limitation method, namely additive and multiplicative measurement error, affects the quality of the data and limits its usefulness for empirical research. Generally, a disclosure limitation method can be regarded as a data filter that transforms the true data generating process. This book focuses explicitly on the consequences of additive and multiplicative measurement error for the properties of nonlinear econometric estimators. It investigates the extent to which appropriate econometric techniques can yield consistent and unbiased estimates of the true data generating process in the case of disclosure limitation. Sandra Nolte received her PhD in Economics from the University of Konstanz, Germany in 2008 and has been a postdoctoral researcher at the Financial Econometric Research Centre at Warwick Business School, UK since 2009. Her research areas include microeconometrics and financial econometrics.
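The mechanism at work is easiest to see in the classical linear case (the book's focus is on nonlinear estimators, so this is only a simplified illustration of my own): additive measurement error in a regressor shrinks the OLS slope toward zero by the reliability ratio var(x) / (var(x) + var(u)).

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
x = rng.normal(0.0, 1.0, n)                # true regressor
y = 2.0 * x + rng.normal(0.0, 1.0, n)      # true slope = 2
x_masked = x + rng.normal(0.0, 1.0, n)     # additive error "data filter"

# OLS slope on the masked regressor is attenuated toward zero.
c = np.cov(x_masked, y)
slope = c[0, 1] / c[0, 0]
print(f"OLS slope on masked data: {slope:.3f}")  # ~ 2 * 1/(1+1) = 1.0
```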
European legislation affects countless aspects of daily life in modern Europe, but just how does the European Union make such significant legislative decisions? How important are the formal decision-making procedures in defining decision outcomes, and how important is the bargaining that takes place among the actors involved? Using a combination of detailed evidence and theoretical rigour, this volume addresses these questions and others that are central to understanding how the EU works in practice. It focuses on the practice of day-to-day decision-making in Brussels and the interactions that take place among the Member States in the Council and among the Commission, the Council and the European Parliament. The authors examine a unique data set of actual Commission proposals, against which they develop, apply and test a range of explanatory models of decision-making, exemplifying how to study decision-making in other political systems using advanced theoretical tools and appropriate research design.