The communication complexity of a function f(x, y) measures the number of bits that two players, one who knows x and the other who knows y, must exchange to determine the value of f(x, y). Communication complexity is a fundamental measure of the complexity of functions. Lower bounds on this measure lead to lower bounds on many other measures of computational complexity. This monograph surveys lower bounds in the field of communication complexity. Our focus is on lower bounds that work by first representing the communication complexity measure in Euclidean space. That is to say, the first step in these lower bound techniques is to find a geometric complexity measure, such as rank or trace norm, that serves as a lower bound to the underlying communication complexity measure. Lower bounds on this geometric complexity measure are then found using algebraic and geometric tools.
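To make the first step of this approach concrete, here is a worked example that is mine rather than the monograph's: the classic log-rank bound states that the deterministic communication complexity D(f) is at least log2 of the real rank of the communication matrix M_f, whose (x, y) entry is f(x, y). For EQUALITY on n-bit inputs, M_f is the 2^n x 2^n identity matrix and has full rank, so D(EQ) >= n. A minimal Python sketch (the length n = 3 and the use of NumPy are my illustrative choices):

    import numpy as np

    n = 3  # illustrative input length, an arbitrary choice for this demo

    # Communication matrix of EQUALITY: M[x, y] = 1 iff x == y,
    # i.e. the 2^n x 2^n identity matrix.
    M = np.eye(2 ** n, dtype=int)

    # Log-rank lower bound: D(f) >= log2(rank(M_f)).
    r = np.linalg.matrix_rank(M)
    print(f"rank = {r}; D(EQ) >= {int(np.log2(r))} bits")  # rank = 8; D(EQ) >= 3 bits

The same template applies to the other geometric measures the monograph surveys, such as the trace norm: bound the geometric measure from below with algebraic tools, and the communication lower bound follows.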
The two-volume set LNCS 5125 and LNCS 5126 constitutes the refereed proceedings of the 35th International Colloquium on Automata, Languages and Programming, ICALP 2008, held in Reykjavik, Iceland, in July 2008. The 126 revised full papers presented together with 4 invited lectures were carefully reviewed and selected from a total of 407 submissions. The papers are grouped in three major tracks: on algorithms, automata, complexity, and games; on logic, semantics, and theory of programming; and on security and cryptography foundations. LNCS 5125 contains 70 contributions of track A, selected from 269 submissions, as well as 2 invited lectures. The papers are organized in topical sections on complexity: Boolean functions and circuits; data structures; random walks and random structures; design and analysis of algorithms; scheduling; codes and coding; coloring; randomness in computation; online and dynamic algorithms; approximation algorithms; property testing; parameterized algorithms and complexity; graph algorithms; computational complexity; games and automata; group testing, streaming, and quantum; algorithmic game theory; and quantum computing.
This book constitutes the refereed proceedings of the 18th Annual Conference on Learning Theory, COLT 2005, held in Bertinoro, Italy, in June 2005. The 45 revised full papers presented, together with three articles on open problems, were carefully reviewed and selected from a total of 120 submissions. The papers are organized in topical sections on: learning to rank, boosting, unlabeled data, multiclass classification, online learning, support vector machines, kernels and embeddings, inductive inference, unsupervised learning, generalization bounds, query learning, attribute efficiency, compression schemes, economics and game theory, separation results for learning models, and survey and prospects on open problems.
This book constitutes the refereed proceedings of the 34th International Symposium on Mathematical Foundations of Computer Science, MFCS 2009, held in Novy Smokovec, High Tatras, Slovakia, in August 2009. The 56 revised full papers presented together with 7 invited lectures were carefully reviewed and selected from 148 submissions. All current aspects of theoretical computer science and its mathematical foundations are addressed, including algorithmic game theory, algorithmic learning theory, algorithms and data structures, automata, grammars and formal languages, bioinformatics, complexity, computational geometry, computer-assisted reasoning, concurrency theory, cryptography and security, databases and knowledge-based systems, formal specifications and program development, foundations of computing, logic in computer science, mobile computing, models of computation, networks, parallel and distributed computing, quantum computing, semantics and verification of programs, and theoretical issues in artificial intelligence.
A deterministic extractor is a function that extracts almost perfect random bits from a weak random source. In this research monograph the author constructs deterministic extractors for several types of sources. A basic theme in this work is a methodology of recycling randomness, which enables increasing the output length of deterministic extractors to near-optimal length. The author's main work examines deterministic extractors for bit-fixing sources, deterministic extractors for affine sources and polynomial sources over large fields, and increasing the output length of zero-error dispersers. This work will be of interest to researchers and graduate students in combinatorics and theoretical computer science.
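As a toy illustration of the bit-fixing setting (a standard warm-up observation, not one of the monograph's constructions; all names below are my own): in an oblivious bit-fixing source, an adversary fixes some of the n bits and the remaining bits are independent fair coins. XORing all the bits then extracts one perfectly random bit whenever at least one bit is free; the monograph's constructions go much further, extracting output of near-optimal length. A minimal Python sketch:

    import random
    from functools import reduce
    from operator import xor

    def sample_bit_fixing_source(n, free_positions):
        # Oblivious bit-fixing source: positions outside free_positions are
        # fixed (here to 0); positions inside are independent fair coin flips.
        x = [0] * n
        for i in free_positions:
            x[i] = random.randint(0, 1)
        return x

    def xor_extractor(bits):
        # Parity of all bits: uniform as soon as one input bit is free.
        return reduce(xor, bits)

    # Demo: an 8-bit source in which only positions 2 and 5 are truly random.
    counts = [0, 0]
    for _ in range(10_000):
        counts[xor_extractor(sample_bit_fixing_source(8, {2, 5}))] += 1
    print(counts)  # roughly balanced, e.g. close to [5000, 5000]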
This book constitutes the joint refereed proceedings of the 10th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, APPROX 2007, and the 11th International Workshop on Randomization and Computation, RANDOM 2007, held in Princeton, NJ, USA, in August 2007. The 44 revised full papers presented were carefully reviewed and selected from 99 submissions. Topics of interest covered by the papers are design and analysis of approximation algorithms, hardness of approximation, small space and data streaming algorithms, sub-linear time algorithms, embeddings and metric space methods, mathematical programming methods, coloring and partitioning, cuts and connectivity...
The Proceedings of the ICM publishes the talks given by invited speakers at the congress organized by the International Mathematical Union every four years. It covers several areas of mathematics and includes the laudations for the Fields Medal, the Nevanlinna, Gauss, and Leelavati Prizes, and the Chern Medal.