This comprehensive volume is the product of an intensive collaborative effort among researchers across the United States, Europe, and Japan. The result is a change in the way we think about humans and computers.
Papers presented at HCI '91, held in Edinburgh.
This book brings together a number of researchers and developers from industry and academia who report on their work. It is of interest to language designers and the creators of toolkits, UIMSs, and other user interface tools.
This book challenges the ways we read, write, store, and retrieve information in the digital age. Computers—from electronic books to smart phones—play an active role in our social lives. Our technological choices thus entail theoretical and political commitments. Dennis Tenen takes up today's strange enmeshing of humans, texts, and machines to argue that our most ingrained intuitions about texts are profoundly alienated from the physical contexts of their intellectual production. Drawing on a range of primary sources from both literary theory and software engineering, he makes a case for a more transparent practice of human–computer interaction. Plain Text is thus a rallying call, a frame of mind as much as a file format. It reminds us, ultimately, that our devices also encode specific modes of governance and control that must remain available to interpretation.
User Error doesn't argue that we should simply reject computers, nor does it uncritically embrace the current state of affairs; instead, it offers other options.
The first paper describes a parallel model designed to solve a class of relatively simple problems from elementary physics, and discusses the implications for models of problem solving in general. The authors show how one of the most salient features of problem solving, sequentiality, can emerge naturally within a parallel model that has no explicit knowledge of how to sequence the analysis. This model exploits a new type of parallel distributed processing that employs stochastic processors and is based on a formal mapping between parallel computation and thermal physics. The mathematical theory of this type of processing, harmony theory, is discussed in the second and third papers.
Hypermedia technology needs a creative approach from the outset in the design of software to facilitate human thinking and learning. This book opens a discussion of the potential of hypermedia and related approaches to provide open exploratory learning environments. The papers in the book are based on contributions to a NATO Advanced Research Workshop held in July 1990 and are grouped into six sections: semantic networking as cognitive tools; expert systems as cognitive tools; hypertext as cognitive tools; collaborative communication tools; microworlds as context-dependent cognitive tools; and implementing cognitive tools. The book will be valuable for those who design, implement, and evaluate learning programs and who seek to escape from rigid tactics like programmed instruction and behavioristic approaches. The book presents principles for exploratory systems that go beyond existing metaphors of instruction and provokes the reader to think in a new way about the cognitive level of human-computer interaction.
This handbook provides a systematic overview of the present state of international research in digital public history. Individual studies by internationally renowned public historians, digital humanists, and digital historians elucidate central issues in the field and present a critical account of the major public history accomplishments, research activities, and practices with the public, as well as of their digital context. The handbook applies an international and comparative approach, looks at the historical development of the field, and focuses on the technical background and the use of specific digital media and tools. Furthermore, the handbook analyzes connections with local communities and different publics worldwide when engaging in digital activities with the past, indicating directions for future research and teaching activities.
Minimalism is an action- and task-oriented approach to instruction and documentation that emphasizes the importance of realistic activities and experiences for effective learning and information seeking. Since 1990, when the approach was defined in John Carroll's The Nurnberg Funnel, much work has been done to apply, refine, and broaden the minimalist approach to technical communication. This volume presents fourteen major contributions to the current theory and practice of minimalism. Contributors evaluate the development of minimalism up to now, analyze the acceptance of minimalism by the mainstream technical communications community, report on specific innovations and investigations, and discuss future challenges and directions. The book also includes an appendix containing a bibliography of published research and development work on minimalism since 1990. Contributors: Tricia Anson, R. John Brockmann, John M. Carroll, Steve Draper, David K. Farkas, JoAnn T. Hackos, Robert R. Johnson, Greg Kearsley, Barbara Mirel, Janice (Ginny) Redish, Stephanie Rosenbaum, Karl L. Smart, Hans van der Meij. Published in association with the Society for Technical Communication.
Claude Draude analyzes embodied software agents, interface solutions designed to talk back and give emotional feedback, from a gender and media studies perspective. She addresses technological and sociocultural concepts in their interplay, which shifts the boundary between what is considered human and what machine. The author discusses the technological realization of specific personality models that define the design of embodied software agents, in particular emotion and gaze models. Finally, she explores these models in their broader cultural context by relating them to the prominent topic of the Turing test and the notion of the Uncanny Valley.