Liquid markets generate hundreds or thousands of ticks (the minimum change in price a security can have, either up or down) every business day. Data vendors such as Reuters transmit more than 275,000 prices per day for foreign exchange spot rates alone. Thus, high-frequency data can be a fundamental object of study, as traders make decisions by observing high-frequency or tick-by-tick data. Yet most studies published in the financial literature deal with low-frequency, regularly spaced data. For a variety of reasons, high-frequency data are becoming essential to understanding market microstructure. This book discusses the best mathematical models and tools for dealing with such vast amounts of data. It provides a framework for the analysis, modeling, and inference of high-frequency financial time series. With particular emphasis on foreign exchange markets, as well as currency, interest rate, and bond futures markets, this unified view of high-frequency time series methods investigates the price formation process and concludes by reviewing techniques for constructing systematic trading models for financial assets.
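As a rough illustration of what handling such data involves, here is a minimal Python sketch, using hypothetical timestamps, prices, and column names (none taken from the book), that converts irregularly spaced ticks into a regularly spaced series by previous-tick interpolation and computes log-returns on the resulting grid:

    import numpy as np
    import pandas as pd

    # Hypothetical tick records: irregularly spaced timestamps and quoted prices.
    ticks = pd.DataFrame(
        {"price": [1.1012, 1.1013, 1.1011, 1.1014]},
        index=pd.to_datetime([
            "2024-01-02 09:00:00.120",
            "2024-01-02 09:00:00.470",
            "2024-01-02 09:00:02.910",
            "2024-01-02 09:00:05.330",
        ]),
    )

    # Previous-tick interpolation onto a regular one-second grid:
    # take the last tick in each bin and carry it forward through empty bins.
    regular = ticks["price"].resample("1s").last().ffill()

    # Log-returns on the regular grid, the usual object of study.
    returns = np.log(regular).diff().dropna()
    print(returns)

The one-second interval here is an illustrative choice; in practice the sampling frequency is itself a modeling decision the book's methods address.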
Computational finance, an exciting new cross-disciplinary research area, draws extensively on the tools and techniques of computer science, statistics, information systems, and financial economics. This book covers the techniques of data mining, knowledge discovery, genetic algorithms, neural networks, bootstrapping, machine learning, and Monte Carlo simulation. These methods are applied to a wide range of problems in finance, including risk management, asset allocation, style analysis, dynamic trading and hedging, forecasting, and option pricing. The book is based on the sixth annual international conference Computational Finance 1999, held at New York University's Stern School of Business.
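To give a flavor of one technique on that list, the following minimal Python sketch prices a European call by Monte Carlo simulation under geometric Brownian motion; all parameter values are illustrative assumptions, not examples drawn from the conference volume:

    import numpy as np

    rng = np.random.default_rng(0)
    S0, K, r, sigma, T = 100.0, 105.0, 0.05, 0.2, 1.0  # spot, strike, rate, vol, maturity
    n_paths = 100_000

    # Simulate terminal prices under the risk-neutral measure.
    Z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

    # The discounted average payoff estimates the call price.
    price = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()
    print(f"Monte Carlo call price: {price:.3f}")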
Risk control and derivative pricing have become of major concern to financial institutions, and there is a real need for adequate statistical tools to measure and anticipate the amplitude of the potential moves of the financial markets. Summarising theoretical developments in the field, this 2003 second edition has been substantially expanded. Additional chapters now cover stochastic processes, Monte Carlo methods, Black-Scholes theory, the theory of the yield curve, and the Minority Game. There are discussions of aspects of data analysis, financial products, non-linear correlations, and herding, feedback, and agent-based models. This book has become a classic reference for graduate students and researchers working in econophysics and mathematical finance, and for quantitative analysts working on risk management, derivative pricing and quantitative trading strategies.
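As a small taste of the Black-Scholes material mentioned above, here is a minimal Python sketch of the closed-form call price; the parameters are illustrative and chosen to match the Monte Carlo example earlier, whose estimate this formula should reproduce to within sampling error (roughly 8.0 for these inputs):

    import numpy as np
    from scipy.stats import norm

    def bs_call(S, K, r, sigma, T):
        """Black-Scholes price of a European call option."""
        d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

    print(f"Black-Scholes call price: {bs_call(100.0, 105.0, 0.05, 0.2, 1.0):.3f}")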
This book covers the latest theories and empirical findings on financial risk, its measurement and management, and its applications in the world of finance.
Twenty-four contributions, intended for a wide audience from various disciplines, cover a variety of applications of heavy-tailed modeling involving telecommunications, the Web, insurance, and finance. Along with discussion of specific applications, several papers are devoted to time series analysis, regression, classical signal/noise detection problems, and the general structure of stable processes, viewed from a modeling standpoint. Emphasis is placed on developments in handling the numerical problems associated with stable distributions (a main technical difficulty until recently).
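To make the heavy-tail theme concrete, the following minimal Python sketch, with illustrative parameter choices, draws samples from an alpha-stable distribution via scipy.stats.levy_stable and compares the frequency of extreme values with that of a normal sample:

    import numpy as np
    from scipy.stats import levy_stable, norm

    rng = np.random.default_rng(0)
    alpha, beta = 1.7, 0.0  # tail index < 2 implies infinite variance; beta = 0 is symmetric

    stable_draws = levy_stable.rvs(alpha, beta, size=100_000, random_state=rng)
    normal_draws = norm.rvs(size=100_000, random_state=rng)

    # Empirical frequency of draws beyond 5 standard-normal units;
    # the stable sample's tail frequency is noticeably larger.
    print("stable tail freq:", np.mean(np.abs(stable_draws) > 5))
    print("normal tail freq:", np.mean(np.abs(normal_draws) > 5))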
In the late 1980s, as the empirical appeal of macroeconomic exchange rate models began to fade, a few people, including Professor Charles Goodhart at the London School of Economics and researchers at Olsen & Associates in Zurich, started to collect intra-daily exchange rate data. The resulting database provides new insight into the foreign exchange markets and thereby opens up previously unexplored avenues of research. Intra-Daily Exchange Rate Movements presents an extensive study of the Olsen & Associates database and is one of the first monographs in this exciting new area. This book aims to provide a systematic study of the characteristics of intra-daily exchange rate data as well as an ...
As the world's political and economic leaders struggle with the aftermath of the Financial Debacle of 2008, this book asks the question: have financial crises presented opportunities to rebuild the financial system? Examining eight global financial crises since the late nineteenth century, this new historical study offers insights into how the financial landscape - banks, governance, regulation, international cooperation, and balance of power - has been (or failed to be) reshaped after a systemic shock. It includes careful consideration of the Great Depression of the 1930s, the only experience of comparable moment to the recession of the early twenty-first century, yet also marked in its differences. Taking into account not only the economic and business aspects of financial crises, but also their political and socio-cultural dimensions, the book highlights both their idiosyncrasies and common features, and assesses their impact in the broader context of long-term historical development.
Today, investment in financial technology and digital transformation is reshaping the financial landscape and generating many opportunities. Too often, however, engineers and professionals in financial institutions lack a practical and comprehensive understanding of the concepts, problems, techniques, and technologies necessary to build a modern, reliable, and scalable financial data infrastructure. This is where financial data engineering is needed. A data engineer developing a data infrastructure for a financial product needs not only technical data engineering skills but also a solid understanding of financial domain-specific challenges, methodologies, data ecosystems, providers, form...
The authors have done an admirable job... This book is a revealing and fascinating glimpse of the technologies which may rule the financial world in the years to come. --The Financial Times, February 1997. [This] new book looks at the progress made, both in practice and in theory, toward producing a usable model of the market. Some of the theoretical foundations of efficient market theory are being demolished.
The subprime crisis has shown that the sophisticated risk management models used by banks and insurance companies had serious flaws. Some people even suggest that these models are completely useless. Others claim that the crisis was just an unpredictable accident that was largely amplified by the lack of expertise and even naivety of many investors. This book takes the middle view. It shows that these models have been designed for "tranquil times", when financial markets behave smoothly and efficiently. However, we are living in more and more "turbulent times": large risks materialize much more often than predicted by "normal" models, and financial markets periodically go through bubbles and crashes. Moreover, financial risks result from the decisions of economic actors who can have incentives to take excessive risks, especially when their remunerations are ill-designed. The book provides a clear account of the fundamental hypotheses underlying the most popular models of risk management and shows that these hypotheses are flawed. However, it also shows that simple models can still be useful, provided they are well understood and used with caution.
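The gap between "tranquil" and "turbulent" models can be made concrete with a short calculation. The minimal Python sketch below, using an illustrative threshold and a Student-t distribution with 3 degrees of freedom as a stand-in heavy-tailed alternative (neither taken from the book), compares how often each model expects an extreme daily move:

    from scipy.stats import norm, t

    threshold = 5.0  # a "five-sigma" daily loss, in standardized units

    p_normal = norm.sf(threshold)    # tail probability under the normal model
    p_student = t.sf(threshold, 3)   # same threshold under a Student-t with 3 d.o.f.

    # Translate each probability into an expected waiting time,
    # assuming roughly 252 trading days per year.
    print(f"normal model: {p_normal:.2e} (~once in {1 / (252 * p_normal):,.0f} years)")
    print(f"Student-t(3): {p_student:.2e} (~once in {1 / (252 * p_student):.2f} years)")

Under these assumptions, the normal model treats a five-sigma day as a once-in-millennia event, while the heavy-tailed alternative expects one roughly every few months, which is the essence of the book's warning about models built for tranquil times.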