
Publisher: Academic Press

Publication Date: 2005

Number of Pages: 456

Format: Hardcover

Price: 104.00

ISBN: 978-0120885084

Category: General

Reviewed by Sarah Boslaugh, on 07/2/2006

There are many introductory textbooks on probability and statistics, for two reasons: first, there is a huge market for such books because many university courses require students to take one or more semesters of statistics; second, it is difficult to present this material well. If it were easy to strike a balance between theoretical rigor and practical application, after all, the perfect text would already have been written.

Oliver Ibe's *Fundamentals of Applied Probability and Random Processes* is informed by his experience teaching the introductory probability and statistics course to junior and senior engineering students at the University of Massachusetts Lowell, where he is a professor in the Department of Electrical and Computer Engineering. The book presents a straightforward exposition of the basics of probability and statistics, starting with the definitions of sample space and events and proceeding through fairly advanced topics not always included in a one-semester statistics course. Each chapter is broken into small subunits, making this a useful reference book as well as a textbook. The material is presented clearly, and solved problems are included in the text. The layout is particularly good, with plenty of white space and a logical use of headers that makes it easy to locate a particular topic within a chapter. There are exercises at the end of each chapter, but no solutions are provided, and no reference is made to a web page or any other supporting materials.

*Fundamentals of Applied Probability and Random Processes* could be used as a probability text in many contexts, including beginning statistics classes at the graduate level. Its usefulness need not be limited to engineering departments: the illustrative examples are drawn from many fields. It could also serve as a self-teaching text, although the apparent lack of solutions to the end-of-chapter problems makes it less useful for that purpose. Note, however, that the text assumes readers are comfortable with mathematical notation and competent in at least freshman calculus: students without good mathematical preparation (I'm thinking of many graduate students in the social sciences, for instance) may find their eyes glazing over midway through the second chapter.

Sarah Boslaugh is a Senior Statistical Data Analyst in the Department of Pediatrics at the Washington University School of Medicine in St. Louis, MO. She wrote *An Intermediate Guide to SPSS Programming: Using Syntax for Data Management*, published by Sage in 2005, and is currently writing *Secondary Data Sources for Public Health: A Practical Guide* for Cambridge University Press. She is also Editor-in-Chief of *The Encyclopedia of Epidemiology*, to be published by Sage in 2007.

Preface

Acknowledgments

Chapter 1 Basic Probability Concepts

1.1 Introduction

1.2 Sample Space and Events

1.3 Definitions of Probability

1.3.1 Axiomatic Definition

1.3.2 Relative-Frequency Definition

1.3.3 Classical Definition

1.4 Applications of Probability

1.4.1 Reliability Engineering

1.4.2 Quality Control

1.4.3 Channel Noise

1.4.4 System Simulation

1.5 Elementary Set Theory

1.5.1 Set Operations

1.5.2 Number of Subsets of a Set

1.5.3 Venn Diagram

1.5.4 Set Identities

1.5.5 Duality Principle

1.6 Properties of Probability

1.7 Conditional Probability

1.7.1 Total Probability and the Bayes’ Theorem

1.7.2 Tree Diagram

1.8 Independent Events

1.9 Combined Experiments

1.10 Basic Combinatorial Analysis

1.10.1 Permutations

1.10.2 Circular Arrangement

1.10.3 Applications of Permutations in Probability

1.10.4 Combinations

1.10.5 The Binomial Theorem

1.10.6 Stirling’s Formula

1.10.7 Applications of Combinations in Probability

1.11 Reliability Applications

1.12 Summary

1.13 Problems

1.14 References

Chapter 2 Random Variables

2.1 Introduction

2.2 Definition of a Random Variable

2.3 Events Defined by Random Variables

2.4 Distribution Functions

2.5 Discrete Random Variables

2.5.1 Obtaining the PMF from the CDF

2.6 Continuous Random Variables

2.7 Chapter Summary

2.8 Problems

Chapter 3 Moments of Random Variables

3.1 Introduction

3.2 Expectation

3.3 Expectation of Nonnegative Random Variables

3.4 Moments of Random Variables and the Variance

3.5 Conditional Expectations

3.6 The Chebyshev Inequality

3.7 The Markov Inequality

3.8 Chapter Summary

3.9 Problems

Chapter 4 Special Probability Distributions

4.1 Introduction

4.2 The Bernoulli Trial and Bernoulli Distribution

4.3 Binomial Distribution

4.4 Geometric Distribution

4.4.1 Modified Geometric Distribution

4.4.2 “Forgetfulness” Property of the Geometric Distribution

4.5 Pascal (or Negative Binomial) Distribution

4.6 Hypergeometric Distribution

4.7 Poisson Distribution

4.7.1 Poisson Approximation to the Binomial Distribution

4.8 Exponential Distribution

4.8.1 “Forgetfulness” Property of the Exponential Distribution

4.8.2 Relationship between the Exponential and Poisson Distributions

4.9 Erlang Distribution

4.10 Uniform Distribution

4.10.1 The Discrete Uniform Distribution

4.11 Normal Distribution

4.11.1 Normal Approximation to the Binomial Distribution

4.11.2 The Error Function

4.11.3 The Q-Function

4.12 The Hazard Function

4.13 Chapter Summary

4.14 Problems

Chapter 5 Multiple Random Variables

5.1 Introduction

5.2 Joint CDFs of Bivariate Random Variables

5.2.1 Properties of the Joint CDF

5.3 Discrete Random Variables

5.4 Continuous Random Variables

5.5 Determining Probabilities from a Joint CDF

5.6 Conditional Distributions

5.6.1 Conditional PMF for Discrete Random Variables

5.6.2 Conditional PDF for Continuous Random Variables

5.6.3 Conditional Means and Variances

5.6.4 Simple Rule for Independence

5.7 Covariance and Correlation Coefficient

5.8 Many Random Variables

5.9 Multinomial Distributions

5.10 Chapter Summary

5.11 Problems

Chapter 6 Functions of Random Variables

6.1 Introduction

6.2 Functions of One Random Variable

6.2.1 Linear Functions

6.2.2 Power Functions

6.3 Expectation of a Function of One Random Variable

6.3.1 Moments of a Linear Function

6.4 Sums of Independent Random Variables

6.4.1 Moments of the Sum of Random Variables

6.4.2 Sum of Discrete Random Variables

6.4.3 Sum of Independent Binomial Random Variables

6.4.4 Sum of Independent Poisson Random Variables

6.4.5 The Spare Parts Problem

6.5 Minimum of Two Independent Random Variables

6.6 Maximum of Two Independent Random Variables

6.7 Comparison of the Interconnection Models

6.8 Two Functions of Two Random Variables

6.8.1 Application of the Transformation Method

6.9 Laws of Large Numbers

6.10 The Central Limit Theorem

6.11 Order Statistics

6.12 Chapter Summary

6.13 Problems

Chapter 7 Transform Methods

7.1 Introduction

7.2 The Characteristic Function

7.2.1 Moment-Generating Property of the Characteristic Function

7.3 The s-Transform

7.3.1 Moment-Generating Property of the s-Transform

7.3.2 The s-Transforms of Some Well-Known PDFs

7.3.2.1 The s-Transform of the Exponential Distribution

7.3.2.2 The s-Transform of the Uniform Distribution

7.3.3 The s-Transform of the PDF of the Sum of Independent Random Variables

7.3.3.1 The s-Transform of the Erlang Distribution

7.4 The z-Transform

7.4.1 Moment-Generating Property of the z-Transform

7.4.2 The z-Transform of the Bernoulli Distribution

7.4.3 The z-Transform of the Binomial Distribution

7.4.4 The z-Transform of the Geometric Distribution

7.4.5 The z-Transform of the Poisson Distribution

7.4.6 The z-Transform of the PMF of Sum of Independent Random Variables

7.4.7 The z-Transform of the Pascal Distribution

7.5 Random Sum of Random Variables

7.6 Chapter Summary

7.7 Problems

Chapter 8 Introduction to Random Processes

8.1 Introduction

8.2 Classification of Random Processes

8.3 Characterizing a Random Process

8.3.1 Mean and Autocorrelation Function of a Random Process

8.3.2 The Autocovariance Function of a Random Process

8.4 Crosscorrelation and Crosscovariance Functions

8.4.1 Review of Some Trigonometric Identities

8.5 Stationary Random Processes

8.5.1 Strict-Sense Stationary Processes

8.5.2 Wide-Sense Stationary Processes

8.5.2.1 Properties of Autocorrelation Functions for WSS Processes

8.5.2.2 Autocorrelation Matrices for WSS Processes

8.5.2.3 Properties of Crosscorrelation Functions for WSS Processes

8.6 Ergodic Random Processes

8.7 Power Spectral Density

8.7.1 White Noise

8.8 Discrete-Time Random Processes

8.8.1 Mean, Autocorrelation Function and Autocovariance Function

8.8.2 Power Spectral Density

8.8.3 Sampling of Continuous-Time Processes

8.9 Chapter Summary

8.10 Problems

Chapter 9 Linear Systems with Random Inputs

9.1 Introduction

9.2 Overview of Linear Systems with Deterministic Inputs

9.3 Linear Systems with Continuous-Time Random Inputs

9.4 Linear Systems with Discrete-Time Random Inputs

9.5 Autoregressive Moving Average Process

9.5.1 Moving Average Process

9.5.2 Autoregressive Process

9.5.3 ARMA Process

9.6 Chapter Summary

9.7 Problems

Chapter 10 Some Models of Random Processes

10.1 Introduction

10.2 The Bernoulli Process

10.3 Random Walk

10.3.1 Gambler’s Ruin

10.4 The Gaussian Process

10.4.1 White Gaussian Noise Process

10.5 Poisson Process

10.5.1 Counting Processes

10.5.2 Independent Increment Processes

10.5.3 Stationary Increments

10.5.4 Definitions of a Poisson Process

10.5.5 Interarrival Times for the Poisson Process

10.5.6 Conditional and Joint PMFs for Poisson Processes

10.5.7 Compound Poisson Process

10.5.8 Combinations of Independent Poisson Processes

10.5.9 Competing Independent Poisson Processes

10.5.10 Subdivision of a Poisson Process and the Filtered Poisson Process

10.5.11 Random Incidence

10.5.12 Nonhomogeneous Poisson Process

10.6 Markov Processes

10.7 Discrete-time Markov Chains

10.7.1 State Transition Probability Matrix

10.7.2 The n-step State Transition Probability

10.7.3 State Transition Diagrams

10.7.4 Classification of States

10.7.5 Limiting-state Probabilities

10.7.6 Doubly Stochastic Matrix

10.8 Continuous-time Markov Chains

10.8.1 Birth and Death Processes

10.9 Gambler’s Ruin as a Markov Chain

10.10 Chapter Summary

10.11 Problems

Chapter 11 Introduction to Statistics

11.1 Introduction

11.2 Sampling Theory

11.2.1 The Sample Mean

11.2.2 The Sample Variance

11.2.3 Sampling Distributions

11.3 Estimation Theory

11.3.1 Point Estimate, Interval Estimate and Confidence Interval

11.3.2 Maximum Likelihood Estimation

11.3.3 Minimum Mean Squared Error Estimation

11.4 Hypothesis Testing

11.4.1 Hypothesis Test Procedure

11.4.2 Type I and Type II Errors

11.4.3 One-Tailed and Two-Tailed Tests

11.5 Curve Fitting and Linear Regression

11.6 Problems

Appendix 1: Table for the CDF of the Standard Normal Random Variable
