This collection of essays provides a timely assessment of the life and work of one of the twentieth century's most original thinkers.
Popper’s Critical Rationalism presents Popper’s views on science, knowledge, and inquiry, and examines their significance and tenability in light of recent developments in the philosophy of science, the philosophy of probability, and epistemology. It develops a novel philosophical position on science, one that employs key insights from Popper while rejecting other elements of his philosophy. Its central theses include: crucial questions about scientific method arise at the level of the group rather than that of the individual; although criticism is vital for science, dogmatism is important too; belief in scientific theories is permissible even in the absence of evidence in their favour; the aim of science is to eliminate false theories; and critical rationalism can be understood as a form of virtue epistemology.
The general concept of information is here, for the first time, defined mathematically by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters: 1. Information can be measured in different units, in anything from bits to dollars. We argue here that any measure is acceptable if it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical information theory' is one of the measures conforming to the Law of Diminishing Information, but it has, however, ...
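As a point of orientation (not drawn from the book itself), the entropy-based special case mentioned above is usually stated via the classical data-processing inequality: if a signal passes along a chain of channels forming a Markov chain X → Y → Z, the information the output carries about the source can only diminish,

\[ I(X;Z) \;\le\; I(X;Y) \;\le\; H(X), \]

where H(X) is the Shannon entropy of the source and I(\cdot\,;\cdot) denotes mutual information. How the book's single additional axiom extends this idea to measures other than entropy is developed in the chapters themselves.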
We are happy to present the first volume of the Handbook of Defeasible Reasoning and Uncertainty Management Systems. Uncertainty pervades the real world and must therefore be addressed by every system that attempts to represent reality. The representation of uncertainty is a major concern of philosophers, logicians, artificial intelligence researchers and computer scientists, psychologists, statisticians, economists and engineers. The present Handbook volumes provide frontline coverage of this area. This Handbook was produced in the style of previous handbook series like the Handbook of Philosophical Logic, the Handbook of Logic in Computer Science, the Handbook of Logic in Artificial Int...
The modern discussion of the concept of truthlikeness started in 1960. In his influential Word and Object, W. V. O. Quine argued that Charles Peirce's definition of truth as the limit of inquiry is faulty because the notion 'nearer than' is only "defined for numbers and not for theories". In his contribution to the 1960 International Congress for Logic, Methodology, and Philosophy of Science at Stanford, Karl Popper defended the opposite view by defining a comparative notion of verisimilitude for theories. The concept of verisimilitude was originally introduced by the Ancient sceptics to moderate their radical thesis of the inaccessibility of truth. But soon verisimilitudo,...
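For readers who want the technical core, Popper's comparative definition can be sketched as follows (a standard paraphrase, not a quotation from this volume). Writing A_T and A_F for the sets of true and false consequences of a theory A, Popper proposed

\[ A \preceq B \quad\text{iff}\quad A_T \subseteq B_T \ \text{and}\ B_F \subseteq A_F, \]

with B strictly more verisimilar than A when at least one of the inclusions is proper. Much of the subsequent literature on truthlikeness is driven by the difficulties this definition faces when both theories compared are false.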
The local justification of beliefs and hypotheses has recently become a major concern for epistemologists and philosophers of induction. As such, the problem of local justification is not entirely new. Most pragmatists had addressed themselves to it, and so did, to some extent, many classical inductivists in the Bacon-Whewell-Mill tradition. In the last few decades, however, the use of logic and semantics, probability calculus, statistical methods, and decision-theoretic concepts in the reconstruction of inductive inference has revealed some important technical respects in which inductive justification can be local: the choice of a language, with its syntactic and semantic features, the rel...
There are two competing pictures of science. One considers science as a system of inferences, whereas the other looks at science as a system of actions. The essays included in this collection offer a view that intends to combine both pictures. This compromise is well illustrated by Szaniawski's analysis of statistical inferences. It is shown that traditional approaches to the foundations of statistics need not be regarded as conflicting with each other. Thus, statistical rules can be treated as rules of behaviour as well as rules of inference. Szaniawski's uniform approach relies on the concept of rationality, analyzed from the point of view of decision theory. Applications of formal tools to the problems of justice and the division of goods show that the concept of rationality has a wider significance. Audience: The book will be of interest to philosophers of science, logicians, ethicists and mathematicians.
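A minimal sketch of the decision-theoretic reading of a statistical rule (a generic schema, not Szaniawski's own formalism): a rule is a function \delta from data to actions, judged by its risk

\[ R(\theta,\delta) \;=\; \mathbb{E}_{\theta}\big[L(\theta,\delta(X))\big], \]

where L is a loss function. Read as a rule of behaviour, \delta is chosen so that the risk is acceptable, for instance by minimizing the Bayes risk \int R(\theta,\delta)\,d\pi(\theta) or the worst case \sup_\theta R(\theta,\delta); read as a rule of inference, the same \delta is taken to license a conclusion, such as accepting or rejecting a hypothesis, on the basis of the data.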
Although there is an abundance of highly specialized monographs, learned collections and general introductions to the philosophy of science, only a few synthetic monographs and advanced textbooks have appeared in the last 25 years. The philosophy of science seems to have lost its self-confidence. The main reason for such a loss is that the traditional analytical, logical-empiricist approaches to the philosophy of science had to make a number of concessions, especially in response to the work of Popper, Kuhn and Lakatos. With Structures in Science I intend to present both a synthetic monograph and an advanced textbook that accommodates and integrates the insights of these philosophers, in what I like to call a neo-classical approach. The resulting monograph elaborates several important topics from one or more perspectives, by distinguishing various kinds of research programs, and various ways of explaining and reducing laws and concepts, and by summarizing an integrated explication (presented in From Instrumentalism to Constructive Realism, ICR) of the notions of confirmation, empirical progress and truth approximation.