Since the early seventies, computer hardware has made programmable computers available for a variety of tasks. During the nineties the hardware evolved from large mainframes to personal workstations. Today it is not only the hardware that is far more powerful: a workstation can do much more work than a mainframe of the seventies. In parallel, the software became specialized. Languages like COBOL for business-oriented programming or Fortran for scientific computing marked only the beginning. Already at the beginning of the seventies, special languages like SAS or SPSS were available for statisticians, and the introduction of personal computers in the eighties gave new impulses for further development. Now that personal computers have become very popular, the number of programs has started to explode. Today we find a wide variety of programs for almost any statistical purpose (Koch & Haag 1995).
This book covers all the topics found in introductory descriptive statistics courses, including simple linear regression and time series analysis, the fundamentals of inferential statistics (probability theory, random sampling and estimation theory), and inferential statistics itself (confidence intervals, testing). Each chapter starts with the necessary theoretical background, which is followed by a variety of examples. The core examples are based on the content of the respective chapter, while the advanced examples, designed to deepen students' knowledge, also draw on information and material from previous chapters. The enhanced online version helps students grasp the complexity and the practical relevance of statistical analysis through interactive examples, and is suitable for undergraduate and graduate students taking their first statistics courses, as well as for undergraduate students in non-mathematical fields such as economics and the social sciences.
With a wealth of examples and exercises, this is a brand-new edition of a classic work on multivariate data analysis. A key advantage of the work is its accessibility: it presents tools and concepts in a way that is understandable for non-mathematicians.
This book describes an interactive statistical computing environment called XploRe. As the name suggests, support for exploratory statistical analysis is given by a variety of computational tools. XploRe is a matrix-oriented statistical language with a comprehensive set of basic statistical operations that provides highly interactive graphics, as well as a programming environment for user-written macros; it offers hard-wired smoothing procedures for effective high-dimensional data analysis. Its highly dynamic graphic capabilities make it possible to construct student-level front ends for teaching basic elements of statistics. Hot keys make it an easy-to-use computing environment for stat...
International Association for Statistical Computing. The International Association for Statistical Computing (IASC) is a Section of the International Statistical Institute. The objectives of the Association are to foster world-wide interest in effective statistical computing and to exchange technical knowledge through international contacts and meetings between statisticians, computing professionals, organizations, institutions, governments and the general public. The IASC organises its own conferences, the IASC World Conferences, and COMPSTAT in Europe. The 17th Conference of ERS-IASC, the biennial meeting of the European Regional Section of the IASC, was held in Rome, August 28 - September 1, 2006. Th...
This COMPSTAT 2002 book contains the Keynote, Invited, and Full Contributed papers presented in Berlin, August 2002. A companion volume including Short Communications and Posters is published on CD. COMPSTAT 2002 is the 15th conference in a series of biennial conferences whose objective is to present the latest developments in Computational Statistics; it takes place from August 24th to August 28th, 2002. Previous COMPSTATs were held in Vienna (1974), Berlin (1976), Leiden (1978), Edinburgh (1980), Toulouse (1982), Prague (1984), Rome (1986), Copenhagen (1988), Dubrovnik (1990), Neuchâtel (1992), Vienna (1994), Barcelona (1996), Bristol (1998) and Utrecht (2000). COMPSTAT 2002 is organise...
This is the first book to bring together in one place the techniques for regression curve smoothing involving more than one variable.
The Handbook of Computational Statistics: Concepts and Methodology is divided into four parts. It begins with an overview of the field of Computational Statistics. The second part presents several topics in the supporting field of statistical computing. Emphasis is placed on the need for fast and accurate numerical algorithms, and some of the basic methodologies for transformation, database handling and graphics treatment are discussed. The third part focuses on statistical methodology. Special attention is given to smoothing, iterative procedures, simulation and visualization of multivariate data. Finally, a set of selected applications such as Bioinformatics, Medical Imaging, Finance and Network Intrusion Detection highlights the usefulness of computational statistics.
This is one book that can genuinely be said to be straight from the horse’s mouth. Written by the originator of the technique, it examines parallel coordinates as the leading methodology for multidimensional visualization. Starting from geometric foundations, this is the first systematic and rigorous exposition of the methodology's mathematical and algorithmic components. It covers, among many others, the visualization of multidimensional lines, minimum distances, planes, hyperplanes, and clusters of "near" planes. The last chapter explains in a non-technical way the methodology's application to visual and automatic data mining. The principles of the latter, along with guidelines, strategies and algorithms are illustrated in detail on real high-dimensional datasets.
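The geometric idea behind parallel coordinates can be sketched in a few lines: a d-dimensional point is drawn as a polyline crossing d parallel vertical axes, one axis per dimension, with each dimension rescaled to a common range. The sketch below is a minimal illustration of that mapping only; the function name and the toy data are assumptions for demonstration, not the book's implementation.

```python
# Minimal sketch of the parallel-coordinates mapping: each d-dimensional
# point becomes a polyline of d vertices, one per parallel axis, with every
# dimension min-max rescaled to [0, 1]. (Illustrative only; names and data
# are hypothetical, not taken from the book.)

def parallel_coordinates(points):
    """Map each point to a list of (axis_index, normalized_value) vertices."""
    dims = len(points[0])
    # Per-dimension minima and maxima, used to rescale each axis to [0, 1].
    lows = [min(p[j] for p in points) for j in range(dims)]
    highs = [max(p[j] for p in points) for j in range(dims)]
    polylines = []
    for p in points:
        verts = []
        for j, v in enumerate(p):
            span = highs[j] - lows[j]
            # A constant dimension is drawn at the axis midpoint.
            y = (v - lows[j]) / span if span else 0.5
            verts.append((j, y))
        polylines.append(verts)
    return polylines

# Three 3-dimensional points -> three polylines over three parallel axes.
data = [(1.0, 10.0, 3.0), (2.0, 20.0, 1.0), (3.0, 15.0, 2.0)]
lines = parallel_coordinates(data)
# lines[0] -> [(0, 0.0), (1, 0.0), (2, 1.0)]
```

Joining each polyline's vertices with line segments (in any plotting library) yields the familiar parallel-coordinates display; clusters of similar points appear as bundles of nearly coincident polylines.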
Recent years have witnessed a growing importance of quantitative methods in both financial research and industry. This development requires the use of advanced techniques on a theoretical and applied level, especially when it comes to the quantification of risk and the valuation of modern financial products. Applied Quantitative Finance (2nd edition) provides a comprehensive and state-of-the-art treatment of cutting-edge topics and methods. It presents solutions to, and theoretical developments in, many practical problems such as risk management, pricing of credit derivatives, quantification of volatility and copula modelling. The synthesis of theory and practice supported by computat...