Written by the co-managers of the Kermit Project, this is a revised and updated tutorial on data communications, with new material on today's high-speed modems and how to make the best use of them.
The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapters contributed by a panel of renowned industry professionals, the new edition has increased coverage i...
If you have ever looked at a fantastic adventure or science fiction movie, or an amazingly complex and rich computer game, or a TV commercial where cars or gas pumps or biscuits behaved like people and wondered, “How do they do that?”, then you’ve experienced the magic of 3D worlds generated by a computer. 3D in computers began as a way to represent automotive designs and illustrate the construction of molecules. The use of 3D graphics evolved into visualizations of simulated data and artistic representations of imaginary worlds. In order to overcome the processing limitations of the computer, graphics had to exploit the characteristics of the eye and brain, and develop visual tricks to simula...
The history of American elections changed profoundly on the night of November 4, 1952. An outside-the-box approach to predicting winners from early returns with new tools—computers—was launched live and untested on the newest medium for news: television. Like exhibits in a freak show, computers were referred to as “electronic brains” and “mechanical monsters.” Yet this innovation would help fuel an obsession with numbers as a way of understanding and shaping politics. It would engender controversy down to our own time. And it would herald a future in which the public square would go digital. The gamble was fueled by a crisis of credibility stemming from faulty election-night fore...
Examining the science behind everyday predictions—such as why the supermarket sends particular coupons to the appropriate people and how a bank can foretell within a few minutes whether someone will default on a loan—this guide explains the basics of what data mining is, details a variety of data mining techniques, and profiles the key figures behind the data-mining process. After first demonstrating fundamental approaches such as nearest neighbor and association rules, the resource goes on to analyze probabilistic techniques that use Bayes' theorem and artificial intelligence algorithms using neural networks. With chapters on a wide range of topics—from calculating similarity to dealing with uncertainty and modeling the brain—this comprehensive volume reveals how anyone with enough information can get an intimate view of someone's life and what to do about it.
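The "nearest neighbor" approach named above is simple enough to sketch in a few lines. The following is a minimal illustration rather than code from the book; the toy coupon data and the function name are invented for the example, and a real data-mining system would scale the features and use more than one neighbor.

    import math

    def nearest_neighbor(query, examples):
        """Return the label of the training example closest to `query`.

        `examples` is a list of (feature_vector, label) pairs; closeness is
        plain Euclidean distance over the feature vectors.
        """
        def euclidean(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

        _, best_label = min(examples, key=lambda ex: euclidean(query, ex[0]))
        return best_label

    # Toy example: pick a coupon for a customer described by (age, average basket size).
    training = [
        ((25, 30.0), "coupon_A"),
        ((52, 80.0), "coupon_B"),
        ((31, 45.0), "coupon_A"),
    ]
    print(nearest_neighbor((29, 40.0), training))  # -> "coupon_A"

The same pattern underlies the coupon and loan-default predictions the blurb mentions: represent each person as a feature vector, then predict from the most similar people already seen.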
It’s the founding myth of humanities computing and digital humanities: In 1949, the Italian Jesuit scholar Roberto Busa, S.J., persuaded IBM to offer technical and financial support for the mechanized creation of a massive lemmatized concordance to the works of St. Thomas Aquinas. Using Busa’s own papers, recently accessioned in Milan, as well as IBM archives and other sources, Jones illuminates this DH origin story. He examines relationships between the layers of hardware, software, human agents, culture, and history, and answers the question of how specific technologies afford and even constrain cultural practices, including, in this case, the academic research agendas of humanities computing and, later, digital humanities.
The focus of this book is on the birth and historical development of permutation statistical methods from the early 1920s to the near present. Beginning with the seminal contributions of R.A. Fisher, E.J.G. Pitman, and others in the 1920s and 1930s, permutation statistical methods were initially introduced to validate the assumptions of classical statistical methods. Permutation methods have advantages over classical methods in that they are optimal for small data sets and non-random samples, are data-dependent, and are free of distributional assumptions. Permutation probability values may be exact, or estimated via moment- or resampling-approximation procedures. Because permutation methods are inherently computationally intensive, the evolution of computers and computing technology that made modern permutation methods possible accompanies the historical narrative. Permutation analogs of many well-known statistical tests are presented in a historical context, including multiple correlation and regression, analysis of variance, contingency table analysis, and measures of association and agreement. A non-mathematical approach makes the text accessible to readers of all levels.
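The resampling approximation of a permutation probability value mentioned above can also be sketched. This is a minimal illustration, not material from the book; the two small samples and the function name are invented, and an exact test would enumerate all possible relabelings rather than sample them.

    import random

    def permutation_p_value(sample_a, sample_b, n_resamples=10_000, seed=0):
        """Estimate a two-sided permutation p-value for a difference in means.

        Observations are repeatedly shuffled between the two groups; the p-value
        is the fraction of shuffles whose absolute mean difference is at least as
        large as the observed one (a resampling approximation, not an exact test).
        """
        rng = random.Random(seed)
        observed = abs(sum(sample_a) / len(sample_a) - sum(sample_b) / len(sample_b))
        pooled = list(sample_a) + list(sample_b)
        n_a = len(sample_a)
        hits = 0
        for _ in range(n_resamples):
            rng.shuffle(pooled)
            mean_a = sum(pooled[:n_a]) / n_a
            mean_b = sum(pooled[n_a:]) / (len(pooled) - n_a)
            if abs(mean_a - mean_b) >= observed:
                hits += 1
        return hits / n_resamples

    # Toy example with two small, non-random samples; the estimated p-value is small
    # because every value in the first group exceeds every value in the second.
    print(permutation_p_value([12.1, 9.8, 11.5, 10.9], [8.2, 7.9, 9.1, 8.8]))

Because nothing beyond shuffling and averaging is required, the procedure is free of distributional assumptions, which is exactly the appeal of permutation methods the blurb describes; the cost is the repeated recomputation, which is why the book ties their history to the evolution of computing.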