In this work, a novel knowledge discovery framework for analyzing data produced in the Gasoline Direct Injection (GDI) context through machine learning is presented and validated. The approach explores and exploits the investigated design spaces based on a limited number of observations, discovering and visualizing connections and correlations in complex phenomena. The extracted knowledge is then validated against domain expertise, revealing the potential and the limitations of the method.
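A minimal sketch of the kind of analysis the abstract describes, assuming a surrogate-model workflow over a small set of design observations; the variable names, data, and modelling choices below are illustrative assumptions, not taken from the cited work.

```python
# Hypothetical sketch: fit a surrogate over a small GDI design-space sample
# and inspect correlations between design variables and a response.
# All column names and values are invented for illustration.
import numpy as np
import pandas as pd
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
# 20 synthetic observations: injection pressure [bar], spray angle [deg]
X = rng.uniform([100.0, 60.0], [350.0, 90.0], size=(20, 2))
# Notional emissions response with some noise
y = 0.01 * X[:, 0] - 0.05 * X[:, 1] + rng.normal(0.0, 0.1, size=20)

df = pd.DataFrame(X, columns=["injection_pressure", "spray_angle"])
df["emissions_index"] = y

# Correlation matrix: a first look at connections in the data
print(df.corr())

# Gaussian-process surrogate: interpolates the limited observations so the
# design space can be queried at unobserved points
gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[50.0, 10.0]),
    normalize_y=True,
).fit(X, y)
mean, std = gp.predict([[225.0, 75.0]], return_std=True)
print(f"predicted response: {mean[0]:.3f} +/- {std[0]:.3f}")
```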
This book constitutes the refereed proceedings of the 13th IFIP WG 5.1 International Conference on Product Lifecycle Management, PLM 2016, held in Columbia, SC, USA, in July 2016. The 57 revised full papers presented were carefully reviewed and selected from 77 submissions. The papers are organized in the following topical sections: knowledge sharing, re-use and preservation; collaborative development architectures; interoperability and systems integration; lean product development and the role of PLM; PLM and innovation; PLM tools; cloud computing and PLM tools; traceability and performance; building information modeling; big data analytics and business intelligence; information lifecycle management; industry 4.0; metrics, standards and regulation; and product, service and systems.
This book constitutes the refereed proceedings of the 12th IFIP WG 5.1 International Conference on Product Lifecycle Management, PLM 2015, held in Doha, Qatar, in October 2015. The 79 revised full papers were carefully reviewed and selected from 130 submissions. The papers are organized in the following topical sections: smart products, assessment approaches, PLM maturity, building information modeling (BIM), languages and ontologies, product service systems, future factory, knowledge creation and management, simulation and virtual environments, sustainability and systems improvement, configuration and engineering change, education studies, cyber-physical and smart systems, design and integration issues, and PLM processes and applications.
Bridging theoretical modelling and advanced empirical techniques is a central aim of current linguistic research. The progress in empirical methods contributes to the precise estimation of the properties of linguistic data and promises new ways for justifying theoretical models and testing their implications. The contributions to the present collective volume take up this challenge and focus on the relevance of empirical results achieved through up-to-date methodology for the theoretical analysis and modelling of argument structure. They tackle issues of argument structure from different perspectives addressing questions related to diverse verb types (unaccusatives, unergatives, (di)transiti...
The German language, due to its verb-final nature, relatively free order of constituents, and morphological Case system, poses challenges for models of human syntactic processing, which have mainly been developed on the basis of head-initial languages with little or no morphological Case. The verb-final order means that the parser has to make predictions about the input before receiving the verb. What are these predictions? What happens when the predictions turn out to be wrong? Furthermore, the German morphological Case system contains ambiguities. How are these ambiguities resolved under the normal time pressure of comprehension? Based on theoretical as well as experimental work, the present monograph develops a detailed account of the processing steps that underlie language comprehension. At its core is a model of linking noun phrases to arguments of the verb in the developing phrase structure and checking the result with respect to features such as person, number and Case. This volume contains detailed introductions to human syntactic processing as well as to German syntax, which will be helpful especially for readers less familiar with psycholinguistics and with Germanic.
The derivation, exposition, and justification of the Selective Tuning model of vision and attention. Although William James declared in 1890, "Everyone knows what attention is," today there are many different and sometimes opposing views on the subject. This fragmented theoretical landscape may be because most of the theories and models of attention offer explanations in natural language or in a pictorial manner rather than providing a quantitative and unambiguous statement of the theory. They focus on the manifestations of attention instead of its rationale. In this book, John Tsotsos develops a formal model of visual attention with the goal of providing a theoretical explanation for why hu...
Collaboration-oriented knowledge and process management must support the integration of all systems involved and address specific questions of the division of labor within the control system, but it must also enable the planning and engineering of new variants and contribute to a continuous increase in quality through the persistence and analysis of process data.
The dimensional accuracy of the body-in-white manufacturing process is analysed, largely confirming the simulation scope already in use. An enhanced prediction approach and a visualisation tool enable a precise and cost-efficient mapping of stamped-part deviations. As a result of joining operations, parts can be deformed plastically, which calls into question the practical use of elastic tolerance simulation. A modified approach based on rigid body tolerance simulation is presented.
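A minimal Monte Carlo sketch of a rigid body tolerance stack-up, assuming a simple linear stack of stamped-part deviations; the dimensions, tolerances, and distributions are illustrative assumptions, not values from the study.

```python
# Hypothetical rigid-body tolerance stack-up via Monte Carlo sampling.
# Each part is treated as rigid: its deviation enters the stack unchanged,
# with no elastic compliance. Tolerances below are invented examples.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000

# +/-3-sigma band of each normal distribution equals the part tolerance [mm]
tolerances_mm = [0.3, 0.2, 0.25]
deviations = [rng.normal(0.0, t / 3.0, n_samples) for t in tolerances_mm]

# Resulting assembly-gap deviation for a linear stack of the three parts
gap = np.sum(deviations, axis=0)
print(f"mean gap deviation: {gap.mean():+.3f} mm")
print(f"6-sigma spread:     {6 * gap.std():.3f} mm")
```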