This book explains how to perform data de-noising at large scale with a satisfactory level of accuracy. Three main issues are considered. First, how to eliminate error propagation from one stage to the next while developing a filtered model. Second, how to maintain the positional importance of data while purifying it. Finally, preserving the memory in the data is crucial for extracting smart data from noisy big data. If, after any form of smoothing or filtering, the memory of the corresponding data changes heavily, the final data may lose important information, which can lead to erroneous conclusions. Yet even when anticipating some loss of information, one cannot avoid denoising, since any analysis of big data in the presence of noise can be misleading. The entire process therefore demands careful execution with efficient and smart models.
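The tension the blurb describes, smoothing away noise without destroying the trend (the "memory") of the data, can be illustrated with a minimal moving-average filter. This is a generic sketch, not a method from the book; the function name and window parameter are illustrative:

```python
def moving_average(x, window=3):
    """Smooth a sequence with a simple centered moving average.

    Illustrative only: a wide window suppresses more noise but also
    flattens genuine features, which is the information-loss risk
    discussed above.
    """
    half = window // 2
    out = []
    for i in range(len(x)):
        # Clip the window at the sequence boundaries.
        seg = x[max(0, i - half): i + half + 1]
        out.append(sum(seg) / len(seg))
    return out
```

For example, `moving_average([1, 1, 10, 1, 1], 3)` spreads the isolated spike at position 2 across its neighbours, reducing the noise peak but also altering the local structure of the series.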
Almost every element of life, from commerce and agriculture to communication and entertainment, has been profoundly altered by computing. Around the world, people rely on computers for the creation of systems for energy, transportation, and military use. Additionally, computing fosters scientific advancements that deepen our basic understanding of the world and help find answers to pressing health and environmental issues. Novel Research and Development Approaches in Heterogeneous Systems and Algorithms addresses novel research and developmental approaches in heterogeneous systems and algorithms for information-centric networks of the future. Covering topics such as image identification and segmentation, materials data extraction, and wireless sensor networks, this premier reference source is a valuable resource for engineers, consultants, practitioners, computer scientists, students and educators in higher education, librarians, researchers, and academicians.
Rapid digital transformation is forcing the manufacturing industry to drastically alter its current trajectory for future success. The remarkable convergence of digitalization and manufacturing is reshaping industries, ushering in an era known as Industry 5.0. This revolutionary transition has given birth to digital manufacturing and smart factories, heralding a new dawn in the way we produce goods. The amalgamation of artificial intelligence (AI), robotics, the internet of things (IoT), augmented reality (AR), virtual reality (VR), big data analytics, cloud computing, and additive manufacturing stands poised to unlock unprecedented avenues in the realm of production. Practitioners, research...
Algorithms are ubiquitous in the contemporary technological world, and they ultimately consist of finite sequences of instructions used to accomplish tasks with necessary input values. This book analyses the top performing algorithms in areas as diverse as Big Data, Artificial Intelligence, Optimization Techniques and Cloud & Cyber Security Systems in order to explore their power and limitations.
The book introduces the outcomes of the latest research in the field of Chemical Engineering. It also illustrates the application of Chemical Engineering principles to provide innovative, state-of-the-art solutions to problems associated with chemical industries. It covers a wide spectrum of topics in Chemical Engineering, such as transfer operations, novel separation processes, adsorption, photooxidation, process control, modelling, and simulation. The book provides a timely contribution towards implementing recent approaches and methods in Chemical Engineering research, and presents chapters focussed on several Chemical Engineering principles and methodologies of wide multidisciplinary applicability. The intended audience will mainly consist of researchers, research students, and practitioners in Chemical Engineering and allied fields; the book can also serve researchers and students involved in multidisciplinary research.
Without mathematics no science would survive. This especially applies to the engineering sciences, which depend heavily on applications of mathematics and mathematical tools such as optimization techniques, finite element methods, differential equations, fluid dynamics, mathematical modelling, and simulation. Neither optimization in engineering, nor the performance of safety-critical systems and system security, nor high-assurance software architecture and design would be possible without the development of mathematical applications. The De Gruyter Series on the Applications of Mathematics in Engineering and Information Sciences (AMEIS) focusses on the latest applications of engineering and information technology that are possible only with the use of mathematical methods. By identifying gaps in the knowledge of engineering applications, the AMEIS series fosters international interchange between the sciences and keeps the reader informed about the latest developments.
This volume is authored by Rajat K. Baisya, an alumnus of the department of Food Technology and Biochemical Engineering and a distinguished scholar, author, and management consultant. The foundations of Jadavpur University and its origins as a technological institution imagined in a nationalist mould, established as a counter to colonial British education and as a part of the movement for independence, are relatively well known. What is less explored is the journey that the National Council of Education underwent to transform itself into Jadavpur University. As a premier institution of higher learning in India at the present time, Jadavpur University has a number of stalwart professors to thank for its worldwide reputation. This book covers the biographies of twenty-two such professors of the Faculty of Engineering and Technology. Written from the ‘technological perspective’, the book attempts to trace a history of Jadavpur University through the microhistories of the individuals responsible for its beginnings and subsequent growth.
Steganography is the art of hiding and transmitting data through apparently innocuous carriers in an effort to conceal the existence of the secret data. Least Significant Bit (LSB) steganography, which replaces the least significant bits of the host medium, is a widely used technique with low computational complexity and high insertion capacity. Although it has good perceptual transparency, it is vulnerable to steganalysis based on statistical analysis. Many other steganography algorithms have been developed, such as Discrete Cosine Transform (DCT), Discrete Wavelet Transform (DWT), and Spread Spectrum Embedding, but the insertion capacities of all the above methods were not satisfi...
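The LSB technique the blurb describes can be sketched in a few lines: each message bit overwrites the least significant bit of one carrier byte, so each carrier byte changes by at most 1, which is why the method is perceptually transparent yet statistically detectable. This is a generic illustration, not the book's implementation; function names are hypothetical:

```python
def lsb_embed(carrier: bytes, message: bytes) -> bytearray:
    """Hide `message` in the least significant bits of `carrier`.

    Illustrative sketch: one message bit per carrier byte, so the
    carrier must hold at least 8 * len(message) bytes.
    """
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    stego = bytearray(carrier)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit  # overwrite the LSB only
    return stego

def lsb_extract(stego: bytes, n_bytes: int) -> bytes:
    """Recover `n_bytes` of hidden data from the LSBs of `stego`."""
    out = bytearray()
    for j in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (stego[j * 8 + i] & 1)
        out.append(byte)
    return bytes(out)
```

Because only the lowest bit of each byte is touched, the carrier's appearance is nearly unchanged, but the distribution of LSB values shifts toward that of the message bits, which is exactly what statistical steganalysis exploits.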
This book constitutes the proceedings of the Third International Conference on Algorithms and Discrete Applied Mathematics, CALDAM 2017, held in Goa, India, in February 2017. The 32 papers presented in this volume were carefully reviewed and selected from 103 submissions. They deal with the following areas: algorithms, graph theory, codes, polyhedral combinatorics, computational geometry, and discrete geometry.