The proceedings of the international conference “SMSEC2014”, a joint conference of the first “Social Modeling and Simulations” and the 10th “Econophysics Colloquium”, held in Kobe in November 2014 with 174 participants, are gathered herein. Cutting-edge scientific research on various social phenomena is reviewed. New methods for the analysis of big data such as financial markets, automobile traffic, epidemic spreading, world trade and social media communications are provided to clarify the complex interactions and distributions underlying these social phenomena. The robustness and fragility of social systems are discussed on the basis of agent models and complex network models. High-performance computing techniques are introduced for the simulation of complicated social phenomena. Readers will sense the researchers' conviction that deep, quantitative understanding will make it possible to realize comprehensive simulations of our whole society in the near future, which will contribute to a wide range of industries as well as to science policy decisions.
Computer Simulation and Computer Algebra. Starting from simple examples in classical mechanics, these introductory lectures proceed to simulations in statistical physics (using FORTRAN) and then explain in detail the use of computer algebra (by means of Reduce). This third edition takes into account the most recent version of Reduce (3.4.1) and updates the description of large-scale simulations to subjects such as the 170000 × 170000 Ising model. Furthermore, an introduction to both vector and parallel computing is given.
Covering research topics in system software, such as programming languages, compilers, runtime systems, operating systems, communication middleware, and large-scale file systems, as well as application development support software and big-data processing software, this book presents cutting-edge software technologies for extreme-scale computing. The findings presented here will provide researchers in these fields with important insights for the further development of exascale computing technologies. This book grew out of the post-peta CREST research project funded by the Japan Science and Technology Agency, the goal of which was to establish software technologies for exploring extreme performance computing beyond petascale computing. The respective chapters were contributed by the 14 research teams involved in the project. In addition to advanced technologies for large-scale numerical computation, the project addressed the technologies required for big data and graph processing, the complexity of the memory hierarchy, and the power problem. Mapping the direction of future high-performance computing was also a central priority.
This text presents the proceedings of the Fifth Conference on Parallel Processing for Scientific Computing.
This book presents a systematic and coherent approach to phase transitions and critical phenomena, namely the coherent-anomaly method (CAM theory) based on cluster mean-field approximations. The first part gives a brief review of the CAM theory and the second part a collection of reprints covering the CAM basic calculations, the Blume-Emery-Griffiths model, the extended Baxter model, the quantum Heisenberg model, zero-temperature phase transitions, the KT-transition, spin glasses, the self-avoiding walk, contact processes, branching processes, the gas-liquid transition and even non-equilibrium phase transitions.
Contains 20 papers presented at the Sixth International Nobeyama Workshop on the New Century of Computational Fluid Dynamics, Nobeyama, Japan, April 21-24, 2003. These papers cover computational electromagnetics, astrophysical topics, CFD research and applications in general, large-eddy simulation, mesh generation topics, visualization, and more.
This book provides an easy-to-read introduction to quantum computing as well as to the classical simulation of quantum circuits. The authors showcase the enormous potential that can be unleashed when these simulations are performed using decision diagrams, a data structure common in the design automation community but hardly used in quantum computing yet. In fact, the covered algorithms and methods are able to outperform previously proposed solutions in certain use cases and hence provide a complementary solution to established approaches. The award-winning methods are implemented, available as open source under free licenses, and can easily be integrated into existing frameworks such as IBM’s Qiskit or Atos’ QLM.
Artificial intelligence (AI) is regarded as the science and technology of producing an intelligent machine, in particular an intelligent computer program. Machine learning is an approach to realizing AI that comprises a collection of statistical algorithms, of which deep learning is one example. Owing to the rapid development of computer technology, AI has been actively explored for a variety of academic and practical purposes in the context of financial markets. This book focuses on the broad topic of “AI and Financial Markets” and includes novel research associated with this topic. The book includes contributions on the application of machine learning, agent-based artificial market simulation, and other related techniques to the analysis of various aspects of financial markets.