This volume presents state-of-the-art tools and techniques for automatically detecting, diagnosing, and predicting the effects of adverse events in an engineered system. It emphasizes the importance of these techniques in managing the intricate interactions within and between engineering systems to maintain a high degree of reliability. Reflecting the interdisciplinary nature of the field, the book explains how the fundamental algorithms and methods of both physics-based and data-driven approaches effectively address systems health management in application areas such as data centers, aircraft, and software systems.
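As a rough illustration of the data-driven side of systems health management, the sketch below flags sensor readings that drift far from a trailing window of recent values. The sensor trace, window size, and z-score threshold are invented for the example and are not taken from the book.

```python
import statistics

def detect_adverse_events(readings, window=20, z_threshold=3.0):
    """Flag readings whose z-score against a trailing window exceeds a threshold.

    A toy data-driven health-monitoring check; real systems health management
    combines many such detectors with physics-based models and diagnostics.
    """
    alerts = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1e-9  # avoid division by zero
        z = (readings[i] - mean) / stdev
        if abs(z) > z_threshold:
            alerts.append((i, readings[i], round(z, 1)))
    return alerts

# Example: a steady sensor trace with an injected spike at index 30.
trace = [10.0 + 0.1 * (i % 5) for i in range(40)]
trace[30] = 25.0
print(detect_adverse_events(trace))  # reports the adverse event at index 30
```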
The papers in this volume were presented at the 9th Annual International Computing and Combinatorics Conference (COCOON 2003), held July 25–28, 2003, in Big Sky, MT, USA. The topics cover most aspects of theoretical computer science and combinatorics related to computing. Submissions to the conference this year were conducted electronically. A total of 114 papers were submitted, of which 52 were accepted. The papers were evaluated by an international program committee consisting of Nina Amenta, Tetsuo Asano, Bernard Chazelle, Zhixiang Chen, Francis Chin, Kyung-Yong Chwa, Robert Cimikowski, Anne Condon, Michael Fellows, Anna Gal, Michael Hallett, Daniel Huson, Naoki Katoh, D.T. Lee, Bernard Moret, Brendan Mume...
Access control is one of the fundamental services that any Data Management System should provide. Its main goal is to protect data from unauthorized read and write operations. This is particularly crucial in today's open and interconnected world, where any kind of information can easily be made available to a huge user population, and where damage or misuse of data may have unpredictable consequences that go beyond the boundaries where the data reside or were generated. This book provides an overview of the various developments in access control for data management systems. Discretionary, mandatory, and role-based access control will be discussed, by surveying the most relevant proposals...
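As a minimal sketch of one of these models, role-based access control can be reduced to checking whether any of a user's roles carries the requested permission. The users, roles, and permissions below are hypothetical and purely illustrative.

```python
# Minimal role-based access control (RBAC) sketch; all names are made up.
ROLE_PERMISSIONS = {
    "analyst": {("reports", "read")},
    "admin": {("reports", "read"), ("reports", "write"), ("users", "write")},
}

USER_ROLES = {
    "alice": {"analyst"},
    "bob": {"admin"},
}

def is_authorized(user, resource, action):
    """Grant the operation only if one of the user's roles carries the permission."""
    return any(
        (resource, action) in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )

print(is_authorized("alice", "reports", "write"))  # False: analysts may only read
print(is_authorized("bob", "reports", "write"))    # True: admins may write
```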
Declarative Networking is a programming methodology that enables developers to concisely specify network protocols and services, which are directly compiled to a dataflow framework that executes the specifications. Declarative networking proposes the use of a declarative query language for specifying and implementing network protocols, and employs a dataflow framework at runtime for communication and maintenance of network state. The primary goal of declarative networking is to greatly simplify the process of specifying, implementing, deploying and evolving a network design. In addition, declarative networking serves as an important step towards an extensible, evolvable network architecture ...
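To give a feel for the style, a declarative language such as NDlog expresses reachability as two recursive rules over link facts; the sketch below evaluates the same rules as a naive fixpoint in plain Python. The link facts are made up, and the code is only an analogy for how a dataflow engine would derive new tuples.

```python
# Declarative reachability, roughly:
#   reachable(S, D) :- link(S, D).
#   reachable(S, D) :- link(S, Z), reachable(Z, D).
# evaluated here as a naive fixpoint over a set of illustrative link facts.
links = {("a", "b"), ("b", "c"), ("c", "d")}

def reachable(link_facts):
    paths = set(link_facts)              # base rule: every link is a path
    while True:
        derived = {(s, d)                # recursive rule: extend a path by one link
                   for (s, z) in link_facts
                   for (z2, d) in paths
                   if z == z2}
        if derived <= paths:             # fixpoint reached: nothing new derived
            return paths
        paths |= derived

print(sorted(reachable(links)))
```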
Workflows may be defined as abstractions used to model the coherent flow of activities in the context of an in silico scientific experiment. They are employed in many domains of science, such as bioinformatics, astronomy, and engineering. Such workflows usually comprise a considerable number of activities and activations (i.e., tasks associated with activities) and may need a long time to execute. Due to the continuous need to store and process data efficiently (making them data-intensive workflows), high-performance computing environments combined with parallelization techniques are used to run these workflows. At the beginning of the 2010s, cloud technologies emerged as a promising environmen...
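A toy illustration of that execution model: activities form a dependency graph, and activations whose dependencies are satisfied can run in parallel. The activity names, dependencies, and thread-pool scheduler below are invented for the example and do not correspond to any particular workflow system.

```python
# Toy workflow runner: activities form a DAG; ready activities run in parallel.
from concurrent.futures import ThreadPoolExecutor

workflow = {
    "extract": [],
    "align": ["extract"],
    "filter": ["extract"],
    "merge": ["align", "filter"],
}

def run_activity(name):
    print(f"running {name}")
    return name

def run_workflow(dag):
    done = set()
    with ThreadPoolExecutor() as pool:
        while len(done) < len(dag):
            # Every activity whose dependencies are all finished is ready to run.
            ready = [a for a, deps in dag.items()
                     if a not in done and set(deps) <= done]
            for finished in pool.map(run_activity, ready):
                done.add(finished)

run_workflow(workflow)  # extract first, then align and filter together, then merge
```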
We met again in front of the statue of Gottfried Wilhelm von Leibniz in the city of Leipzig. Leibniz, a famous son of Leipzig, envisioned automatic logical inference through symbolic computation, aiming to collate all human knowledge. Today, artificial intelligence deals with large amounts of data and knowledge and finds new information using machine learning and data mining. Machine learning and data mining are indispensable subjects and tools for the theory of pattern recognition and for applications of pattern recognition such as bioinformatics and data retrieval. This was the fourth edition of MLDM in Pattern Recognition, which is the main event of Technical Committee 17 of the International Ass...
The two-volume set LNCS 3032 and LNCS 3033 constitutes the thoroughly refereed post-proceedings of the Second International Workshop on Grid and Cooperative Computing, GCC 2003, held in Shanghai, China, in December 2003. The 176 full papers and 173 poster papers presented were carefully selected from a total of over 550 paper submissions during two rounds of reviewing and revision. The papers are organized in topical sections on grid applications; peer-to-peer computing; grid architectures; grid middleware and toolkits; Web security and Web services; resource management, scheduling, and monitoring; network communication and information retrieval; grid QoS; algorithms, economic models, and theoretical models of the grid; semantic grid and knowledge grid; remote data access, storage, and sharing; and computer-supported cooperative work and cooperative middleware.
Data profiling refers to the activity of collecting data about data, i.e., metadata. Most IT professionals and researchers who work with data have engaged in data profiling, at least informally, to understand and explore an unfamiliar dataset or to determine whether a new dataset is appropriate for a particular task at hand. Data profiling results are also important in a variety of other situations, including query optimization, data integration, and data cleaning. Simple metadata are statistics, such as the number of rows and columns, schema and datatype information, the number of distinct values, statistical value distributions, and the number of null or empty values in each column. More...
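A minimal sketch of that kind of single-column profiling is shown below; the sample records are invented, and a real profiler would of course compute far richer metadata and scale to large datasets.

```python
# Minimal data-profiling sketch: row count plus per-column distinct-value,
# null, and type statistics. The sample records are illustrative only.
def profile(rows):
    columns = rows[0].keys() if rows else []
    report = {"row_count": len(rows), "columns": {}}
    for col in columns:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        report["columns"][col] = {
            "distinct_values": len(set(non_null)),
            "null_count": len(values) - len(non_null),
            "inferred_type": type(non_null[0]).__name__ if non_null else "unknown",
        }
    return report

sample = [
    {"id": 1, "city": "Leipzig", "population": 600000},
    {"id": 2, "city": "Shanghai", "population": None},
    {"id": 3, "city": "Leipzig", "population": 600000},
]
print(profile(sample))
```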
As data represent a key asset for today's organizations, the problem of how to protect this data from theft and misuse is at the forefront of these organizations' minds. Even though several data security techniques are available today to protect data and computing infrastructures, many of them -- such as firewalls and network security tools -- are unable to protect data from attacks posed by those working on an organization's "inside." These "insiders" usually have authorized access to relevant information systems, making it extremely challenging to block the misuse of information while still allowing them to do their jobs. This book discusses several techniques that can provide effe...
Advances in hardware technology have increased the capability to store and record personal data. This has caused concerns that personal data may be abused. This book proposes a number of techniques to perform data mining tasks in a privacy-preserving way. This edited volume contains surveys by distinguished researchers in the privacy field. Each survey covers the key research content as well as future research directions for a particular topic in privacy. The book is designed for researchers, professors, and advanced-level students in computer science, but is also suitable for practitioners in industry.
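One family of techniques in this space perturbs query answers with calibrated noise, as in differential privacy. The sketch below adds Laplace noise to a simple count query; the dataset, predicate, and privacy parameter are invented for the example and are not drawn from any particular chapter.

```python
import random

def laplace_noise(scale):
    """Laplace(0, scale) noise as the difference of two exponential samples."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_count(records, predicate, epsilon=0.5):
    """Answer a count query with Laplace noise of scale 1/epsilon added."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

people = [{"age": a} for a in (23, 35, 41, 52, 67, 29)]
print(noisy_count(people, lambda p: p["age"] >= 40))  # close to 3, plus noise
```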