Beginning Algorithms A good understanding of algorithms, and the knowledge of when to apply them, is crucial to producing software that not only works correctly, but also performs efficiently. This i
Great management is difficult to see as it occurs. It's possible to see the results of great management, but it's not easy to see how managers achieve those results. Great management happens in one-o
We live in a world, according to N. Katherine Hayles, where new languages are constantly emerging, proliferating, and fading into obsolescence. These are languages of our own making: the programming l
This textbook provides a clear and concise introduction to computer architecture and implementation. Two important themes are interwoven throughout the book. The first is an overview of the major concepts and design philosophies of computer architecture and organization. The second is the early introduction and use of analytic modeling of computer performance. The author begins by describing the classic von Neumann architecture, and then presents in detail a number of performance models and evaluation techniques. He goes on to cover user instruction set design, including RISC architecture. A unique feature of the book is its memory-centric approach - memory systems are discussed before processor implementations. The author also deals with pipelined processors, input/output techniques, queueing models, and extended instruction set architectures. Each topic is illustrated with reference to actual IBM and Intel architectures. The book contains many worked examples and over 130 homework exercises.
* The new edition of the classic bestseller that launched the data warehousing industry covers new approaches and technologies, many of which have been pioneered by Inmon himself
* In addition t
Create queries that make forms and reports useful. Develop forms to access the data you need, and make reports that make sense! If you thought you had to use a spreadsheet program to produce repo
The Barnes & Noble Review: Thousands of security and law enforcement professionals desperately want to master digital forensics. Hands-on experience is crucial, but where can you get it? Here: Real
Computer vision is a rapidly growing field which aims to make computers 'see' as effectively as humans. In this book Dr Shapiro presents a new computer vision framework for interpreting time-varying imagery. This is an important task, since movement reveals valuable information about the environment. The fully-automated system operates on long, monocular image sequences containing multiple, independently-moving objects, and demonstrates the practical feasibility of recovering scene structure and motion in a bottom-up fashion. Real and synthetic examples are given throughout, with particular emphasis on image coding applications. Novel theory is derived in the context of the affine camera, a generalisation of the familiar scaled orthographic model. Analysis proceeds by tracking 'corner features' through successive frames and grouping the resulting trajectories into rigid objects using new clustering and outlier rejection techniques. The three-dimensional motion parameters are then compu
The Basics of Computer Arithmetic Made Enjoyable and Accessible, with a Special Program Included for Hands-on Learning. "The combination of this book and its associated virtual computer is fantastic! Exp
AAA (Authentication, Authorization, Accounting) describes a framework for intelligently controlling access to network resources, enforcing policies, and providing the information necessary to bill for
Measuring Computer Performance sets out the fundamental techniques used in analyzing and understanding the performance of computer systems. Throughout the book, the emphasis is on practical methods of measurement, simulation, and analytical modeling. The author discusses performance metrics and provides detailed coverage of the strategies used in benchmark programs. He gives intuitive explanations of the key statistical tools needed to interpret measured performance data. He also describes the general 'design of experiments' technique, and shows how the maximum amount of information can be obtained for the minimum effort. The book closes with a chapter on the technique of queueing analysis. Appendices listing common probability distributions and statistical tables are included, along with a glossary of important technical terms. This practically-oriented book will be of great interest to anyone who wants a detailed, yet intuitive, understanding of computer systems performance analysis.
Become a cyber-hero - know the common wireless weaknesses "Reading a book like this one is a worthy endeavor toward becoming an experienced wireless security professional." --Devin Akin - CTO, Th
Selected by Choice magazine as an Outstanding Academic Title, Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web provides for the first time a plainspoken and thoroug
This book is a complete account of the predicate transformation calculus semantics of sequential programs, including repetitions, recursive procedures, computational induction and unbounded nondeterminacy. Predicate transformation semantics are the best specification method for the development of correct and well-structured computer programs. The author develops this theory to a greater depth than has been achieved before, describes it in a way that makes it readily compatible with programming rules for partial and total correctness of repetitions and recursive procedures, supplies new rules for proving incorrectness, and gives a stronger rule for proving that two programs satisfy the same specifications. Finally, the semantics are extended so that non-terminating programs can be specified as well. This will be essential reading for all computer scientists working in specification and verification of programs.
The author presents a theory of concurrent processes where three different semantic description methods that are usually studied in isolation are brought together. Petri nets describe processes as concurrent and interacting machines; algebraic process terms describe processes as abstract concurrent processes; and logical formulas specify the intended communication behaviour of processes. At the heart of this theory are two sets of transformation rules for the top-down design of concurrent processes. The first set can be used to transform logical formulas stepwise into process terms, whilst process terms can be transformed into Petri nets by the second set. These rules are based on novel techniques for the operational and denotational semantics of concurrent processes. Various results and relationships between nets, terms and formulas are established, starting with formulas and illustrated by examples. The use of transformations is demonstrated in a series of case studies, and the author also identifies
The authors describe here a framework in which the type notation of functional languages is extended to include a notation for binding times (that is, run-time and compile-time) that distinguishes between them. Consequently the ability to specify code and verify program correctness can be improved. Two developments are needed, the first of which introduces the binding time distinction into the lambda calculus, in a manner analogous with the introduction of types into the untyped lambda calculus. Methods are also presented for introducing combinators for run-time. The second concerns the interpretation of the resulting language, which is known as the mixed lambda-calculus and combinatory logic. The notion of 'parametrized semantics' is used to describe code generation and abstract interpretation. The code generation is for a simple abstract machine designed for the purpose; it is close to the categorical abstract machine. The abstract interpretation focuses on a strictness analysis that
The major reason for the lack of use of parallel computing is the mismatch between the complexity and variety of parallel hardware, and the software development tools to program it. The cost of developing software needs to be amortised over decades, but the platforms on which it executes change every few years, requiring complete rewrites. The evident cost-effectiveness of parallel computation has not been realised because of this mismatch. This book presents an integrated approach to parallel software development by addressing both software and performance issues together. It presents a methodology for software construction that produces architecture-independent and intellectually abstract software. The software can execute efficiently on a range of existing and potential hardware configurations. The approach is based on the construction of categorical data types, a generalization of abstract data types, and of objects. Categorical data types abstract both from the representation of a
Put your phone system on your computer network and see the savings. See how to get started with VoIP, how it works, and why it saves you money. VoIP is techspeak for "voice over Internet protocol," but it
Video Production Workshop is the first book written to be accessible and appealing to a younger, digitally savvy audience interested in learning the full range of skills involved in planning and exec