Computational finance, an exciting new cross-disciplinary research area, draws extensively on the tools and techniques of computer science, statistics, information systems, and financial economics. This book covers the techniques of data mining, knowledge discovery, genetic algorithms, neural networks, bootstrapping, machine learning, and Monte Carlo simulation. These methods are applied to a wide range of problems in finance, including risk management, asset allocation, style analysis, dynamic trading and hedging, forecasting, and option pricing. The book is based on the sixth annual international conference, Computational Finance 1999, held at New York University's Stern School of Business.
Risk control and derivative pricing have become of major concern to financial institutions, and there is a real need for adequate statistical tools to measure and anticipate the amplitude of the potential moves of the financial markets. Summarising theoretical developments in the field, this 2003 second edition has been substantially expanded. Additional chapters now cover stochastic processes, Monte Carlo methods, Black-Scholes theory, the theory of the yield curve, and the Minority Game. There are discussions on aspects of data analysis, financial products, non-linear correlations, and herding, feedback, and agent-based models. This book has become a classic reference for graduate students and researchers working in econophysics and mathematical finance, and for quantitative analysts working on risk management, derivative pricing, and quantitative trading strategies.
In the late 1980s, as the empirical appeal of macroeconomic exchange rate models began to fade, a few people, including Professor Charles Goodhart at the London School of Economics and researchers at Olsen & Associates in Zurich, started to collect intra-daily exchange rate data. The resulting database provides new insight into the foreign exchange markets and thereby opens up previously unexplored avenues of research. Intra-Daily Exchange Rate Movements presents an extensive study of the Olsen & Associates database and is one of the first monographs in this exciting new area. This book aims to provide a systematic study of the characteristics of intra-daily exchange rate data as well as an ...
The interaction between mathematicians, statisticians, and econometricians working in actuarial sciences and finance is producing numerous meaningful scientific results. This volume introduces new ideas, in the form of four-page papers, presented at the international conference Mathematical and Statistical Methods for Actuarial Sciences and Finance (MAF), held at Universidad Carlos III de Madrid (Spain), 4-6 April 2018. The book covers a wide variety of subjects in actuarial science and financial fields, all discussed in the context of the cooperation between the three quantitative approaches. The topics include: actuarial models; analysis of high frequency financial data; behavioural fin...
A hands-on guide to the fast and ever-changing world of high-frequency, algorithmic trading. Financial markets are undergoing rapid innovation due to the continuing proliferation of computer power and algorithms, developments that have created a new investment discipline called high-frequency trading. This book covers all aspects of high-frequency trading, from the business case and formulation of ideas, through the development of trading systems, to the application of capital and subsequent performance evaluation. It also includes numerous quantitative trading strategies, with market microstructure, event arbitrage, and deviations arbitrage discussed in great detail. The book contains the tools and techniques needed for building a high-frequency trading system, and details the post-trade analysis process, including key performance benchmarks and trade quality evaluation. Written by well-known industry professional Irene Aldridge, at a time when interest in high-frequency trading has exploded, this book has what you need to gain a better understanding of how it works and what it takes to apply this approach to your trading endeavors.
As the world's political and economic leaders struggle with the aftermath of the Financial Debacle of 2008, this book asks the question: have financial crises presented opportunities to rebuild the financial system? Examining eight global financial crises since the late nineteenth century, this new historical study offers insights into how the financial landscape - banks, governance, regulation, international cooperation, and balance of power - has been (or failed to be) reshaped after a systemic shock. It includes careful consideration of the Great Depression of the 1930s, the only experience of comparable moment to the recession of the early twenty-first century, yet also marked in its differences. Taking into account not only the economic and business aspects of financial crises, but also their political and socio-cultural dimensions, the book highlights both their idiosyncrasies and common features, and assesses their impact in the broader context of long-term historical development.
Today, investment in financial technology and digital transformation is reshaping the financial landscape and generating many opportunities. Too often, however, engineers and professionals in financial institutions lack a practical and comprehensive understanding of the concepts, problems, techniques, and technologies necessary to build a modern, reliable, and scalable financial data infrastructure. This is where financial data engineering is needed. A data engineer developing a data infrastructure for a financial product must possess not only technical data engineering skills but also a solid understanding of financial domain-specific challenges, methodologies, data ecosystems, providers, form...
"The authors have done an admirable job... This book is a revealing and fascinating glimpse of the technologies which may rule the financial world in the years to come." --The Financial Times, February 1997. [This] new book looks at the progress made, both in practice and in theory, toward producing a usable model of the market. Some of the theoretical foundations of efficient market theory are being demolished.
The subprime crisis has shown that the sophisticated risk management models used by banks and insurance companies had serious flaws. Some people even suggest that these models are completely useless. Others claim that the crisis was just an unpredictable accident that was largely amplified by the lack of expertise and even naivety of many investors. This book takes the middle view. It shows that these models were designed for "tranquil times", when financial markets behave smoothly and efficiently. However, we are living in more and more "turbulent times": large risks materialize much more often than predicted by "normal" models, and financial markets periodically go through bubbles and crashes. Moreover, financial risks result from the decisions of economic actors who can have incentives to take excessive risks, especially when their remunerations are ill-designed. The book provides a clear account of the fundamental hypotheses underlying the most popular models of risk management and shows that these hypotheses are flawed. However, it also shows that simple models can still be useful, provided they are well understood and used with caution.
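The claim that "normal" models understate large risks can be illustrated with a short simulation. This sketch is not taken from the book: it compares a Gaussian return model with a heavy-tailed alternative (a Student-t with 3 degrees of freedom, rescaled to unit variance, an illustrative assumption) and counts how often each produces a move beyond four standard deviations.

```python
import math
import random

random.seed(42)
N = 200_000  # number of simulated "daily returns" (illustrative)

def student_t(df):
    # Standard construction of a Student-t draw:
    # T = Z / sqrt(V/df), where Z ~ N(0,1) and V ~ chi-squared(df).
    z = random.gauss(0, 1)
    v = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    return z / math.sqrt(v / df)

# A t(df) variable has standard deviation sqrt(df/(df-2));
# divide by this so both models have unit variance.
scale = math.sqrt(3 / (3 - 2))

gauss_tail = sum(abs(random.gauss(0, 1)) > 4 for _ in range(N)) / N
t_tail = sum(abs(student_t(3) / scale) > 4 for _ in range(N)) / N

print(f"P(|move| > 4 sigma), Gaussian model    : {gauss_tail:.5f}")
print(f"P(|move| > 4 sigma), heavy-tailed model: {t_tail:.5f}")
```

Even though both distributions have the same variance, the heavy-tailed model generates four-sigma moves orders of magnitude more often, which is exactly the regime where "tranquil times" models break down.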
This volume presents the most recent achievements in risk measurement and management, as well as regulation of the financial industry, with contributions from prominent scholars and practitioners, and provides a comprehensive overview of recent emerging standards in risk management from an interdisciplinary perspective.