SEPTEMBER 2005 (Vol. 38, No. 9) p. 98
0018-9162/05/$31.00 © 2005 IEEE

Published by the IEEE Computer Society
Probability and Computing: Randomized Algorithms and Probabilistic Analysis, Michael Mitzenmacher and Eli Upfal. Randomization and probabilistic techniques play an important role in modern computer science, with applications ranging from combinatorial optimization and machine learning to communication networks and secure protocols.
Designed to accompany a one- or two-semester course for advanced undergraduates or beginning graduate students in computer science and applied mathematics, this textbook gives an introduction to the techniques and paradigms used in the development of probabilistic algorithms and analyses.
The book's first half covers core material, including random sampling, Markov's inequality, Chebyshev's inequality, Chernoff bounds, balls-and-bins models, and Markov chains. In the second half, the authors delve into more advanced topics such as continuous probability, applications of limited independence, Monte Carlo methods, martingales, and balanced allocations.
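To give a flavor of this material, here is a minimal illustrative sketch (not drawn from the book) of a Monte Carlo estimator whose error is controlled by Chernoff-type concentration bounds:

```python
import random

def estimate_pi(n_samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling points uniformly in the unit square.

    Each sample is an independent Bernoulli trial (the point lands inside
    the quarter circle or not), so a Chernoff bound guarantees that the
    empirical mean concentrates exponentially fast around pi/4 as the
    number of samples grows.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = sum(
        1
        for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # close to 3.14159
```

Doubling the number of samples roughly halves the variance; the book develops the tail bounds that make such accuracy guarantees precise.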

Cambridge University Press; 0-521-83540-2; 368 pp.
The Nature of Mathematical Modeling, Neil Gershenfeld. This book explores mathematical modeling through the simple and efficient implementation of such models on computers. Its first section covers exact and approximate analytical techniques, the second examines numerical methods, the third focuses on model inference based on observations, and the fourth addresses the special role of time in modeling.
Each chapter presents a concise summary of the core results in an area, providing an orientation to what they can and cannot do, enough background to use them to solve typical problems, and pointers to access the literature for particular applications. Extensively worked problems complement the text.
Cambridge University Press; 0-521-57095-6; 356 pp.
Performance Tuning for Linux Servers, Sandra K. Johnson, Gerrit Huizenga, and Badari Pulavarty, eds. An IBM team of experienced Linux performance specialists describes how to find bottlenecks, measure performance, and identify effective optimizations.
The book goes beyond kernel tuning to provide tips on maximizing the end-to-end performance of real-world applications and databases running on Linux. Throughout, the authors present realistic examples based on today's most popular enterprise Linux platforms. These examples guide readers through installing and configuring Linux for maximum performance from the outset; help them evaluate and select the right hardware architecture for their Linux environment; provide insight into Linux kernels 2.4 through 2.6 in terms of components, performance issues, and optimization opportunities; and help them master core Linux performance tuning principles and strategies.
IBM Press/Prentice Hall PTR; 0-13-144753-X; 576 pp.
Web Standards Design Guide, Kevin Ruse. This text, which helps readers understand and implement the vast array of Web standards information, is really three books in one: a detailed overview of standardization; a reference book for the standards; and a step-by-step, how-to guide for creating standards-compliant Web sites.
The book details the essential Web standards, including XML, CSS, XForms, and XLink. It uses practical Web page examples throughout to help explain why readers should work within standards, which standards are relevant to specific projects, how to use them, and how to convert existing code.
Ultimately, the author strives to teach readers how to write standards-compliant code. Insights into where the Web is headed accompany tips on how designers can prepare for this evolution. The skills and wisdom provided throughout the book can help readers create high-quality sites that meet today's standards. A companion CD-ROM includes tutorial files, Web software, and image files of all figures that appear in the text.
Charles River Media; 1-58450-387-4; 364 pp.
Quantum Information Processing, 2nd ed., Thomas Beth and Gerd Leuchs, eds. This revised edition provides up-to-date insights into the current research of quantum superposition, entanglement, and the quantum measurement process—the key ingredients of quantum information processing.
In this volume, leading experts bring together the latest results in quantum information and address the subject area's relevant questions. The contributions have been carefully revised and enlarged to reflect recent developments in quantum computation, quantum teleportation, and cryptography.
Wiley-Interscience; 3-527-40541-0; 471 pp.