Probability and Random Processes for Electrical Engineering
Pearson (publisher)
978-0-321-18963-9 (ISBN)
- This title is being published in a new edition
This textbook offers an engaging, straightforward introduction to probability and random processes. While helping students develop their problem-solving skills, it shows them how to move from real-world problems to probability models of those problems. To keep students motivated, the author draws on practical applications from across electrical and computer engineering that demonstrate the relevance of probability theory to engineering practice. Discrete-time random processes bridge the transition between random variables and continuous-time random processes. Additional material has been added to the second edition to provide a more substantial introduction to random processes.
1. Probability Models in Electrical and Computer Engineering.
Mathematical models as tools in analysis and design. Deterministic models. Probability models.
Statistical regularity. Properties of relative frequency. The axiomatic approach to a theory of probability. Building a probability model.
A detailed example: a packet voice transmission system. Other examples.
Communication over unreliable channels. Processing of random signals. Resource sharing systems. Reliability of systems.
Overview of book. Summary. Problems.
2. Basic Concepts of Probability Theory.
Specifying random experiments.
The sample space. Events. Set operations.
The axioms of probability.
Discrete sample spaces. Continuous sample spaces.
Computing probabilities using counting methods.
Sampling with replacement and with ordering. Sampling without replacement and with ordering. Permutations of n distinct objects. Sampling without replacement and without ordering. Sampling with replacement and without ordering.
Conditional probability.
Bayes' Rule.
Independence of events. Sequential experiments.
Sequences of independent experiments. The binomial probability law. The multinomial probability law. The geometric probability law. Sequences of dependent experiments.
A computer method for synthesizing randomness: random number generators. Summary. Problems.
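To illustrate the chapter's closing topic on random number generators, here is a minimal Python sketch of a linear congruential generator. The class name and the Park-Miller constants are illustrative choices, not taken from the text.

```python
# Minimal linear congruential generator (LCG) sketch.
# The modulus and multiplier are the classic "minimal standard"
# (Park-Miller) parameters, used here purely for illustration.
class LCG:
    def __init__(self, seed=1):
        self.m = 2**31 - 1      # modulus
        self.a = 16807          # multiplier
        self.state = seed

    def next_uniform(self):
        """Return a pseudo-random number approximately uniform on (0, 1)."""
        self.state = (self.a * self.state) % self.m
        return self.state / self.m

gen = LCG(seed=42)
print([gen.next_uniform() for _ in range(5)])
```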
3. Random Variables.
The notion of a random variable. The cumulative distribution function.
The three types of random variables.
The probability density function.
Conditional cdf's and pdf's.
Some important random variables.
Discrete random variables. Continuous random variables.
Functions of a random variable. The expected value of random variables.
The expected value of X. The expected value of Y = g(X). Variance of X.
The Markov and Chebyshev inequalities. Testing the fit of a distribution to data. Transform methods.
The characteristic function. The probability generating function. The Laplace transform of the pdf.
Basic reliability calculations.
The failure rate function. Reliability of systems.
Computer methods for generating random variables.
The transformation method. The rejection method. Generation of functions of a random variable. Generating mixtures of random variables.
Entropy.
The entropy of a random variable. Entropy as a measure of information. The method of maximum entropy.
Summary. Problems.
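As an illustration of the transformation method for generating random variables listed in this chapter, here is a minimal Python sketch that produces exponential samples from uniform ones via the inverse cdf. The function name and parameters are hypothetical.

```python
import math
import random

def exponential_via_inverse_cdf(lam, n, rng=random.random):
    """Generate n Exponential(lam) samples by the transformation
    (inverse-cdf) method: if U ~ Uniform(0, 1), then
    X = -ln(1 - U) / lam is exponentially distributed."""
    return [-math.log(1.0 - rng()) / lam for _ in range(n)]

samples = exponential_via_inverse_cdf(lam=2.0, n=10000)
print(sum(samples) / len(samples))   # should be close to 1/lam = 0.5
```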
4. Multiple Random Variables.
Vector random variables.
Events and probabilities. Independence.
Pairs of random variables.
Pairs of discrete random variables. The joint cdf of X and Y. The joint pdf of two jointly continuous random variables. Random variables that differ in type.
Independence of two random variables. Conditional probability and conditional expectation.
Conditional probability. Conditional expectation.
Multiple random variables.
Joint distributions. Independence.
Functions of several random variables.
One function of several random variables. Transformation of random vectors. pdf of linear transformations. pdf of general transformations.
Expected value of functions of random variables.
The correlation and covariance of two random variables. Joint characteristic function.
Jointly Gaussian random variables.
n jointly Gaussian random variables. Linear transformation of Gaussian random variables. Joint characteristic function of Gaussian random variables.
Mean square estimation.
Linear prediction.
Generating correlated vector random variables.
Generating vectors of random variables with specified covariances. Generating vectors of jointly Gaussian random variables.
Summary. Problems.
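The chapter's last topic, generating vectors of jointly Gaussian random variables with a specified covariance, can be sketched in a few lines. The example below assumes NumPy is available and uses a Cholesky factor as the linear transformation; names are illustrative.

```python
import numpy as np

def gaussian_vectors(mean, cov, n, rng=np.random.default_rng(0)):
    """Generate n jointly Gaussian vectors with the given mean and
    covariance by linearly transforming independent N(0, 1) samples:
    if Z has identity covariance and X = A Z + m with A A^T = cov,
    then X has covariance cov."""
    A = np.linalg.cholesky(cov)              # cov = A @ A.T
    Z = rng.standard_normal((n, len(mean)))  # independent N(0, 1) entries
    return Z @ A.T + mean

cov = np.array([[2.0, 0.8],
                [0.8, 1.0]])
X = gaussian_vectors(mean=np.array([0.0, 1.0]), cov=cov, n=100000)
print(np.cov(X, rowvar=False))   # should be close to cov
```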
5. Sums of Random Variables and Long-Term Averages.
Sums of random variables.
Mean and variance of sums of random variables. pdf of sums of independent random variables. Sum of a random number of random variables.
The sample mean and the laws of large numbers. The central limit theorem.
Gaussian approximation for binomial probabilities. Proof of the central limit theorem.
Confidence intervals.
Case 1: Xj's Gaussian; unknown mean and known variance. Case 2: Xj's Gaussian; mean and variance unknown. Case 3: Xj's non-Gaussian; mean and variance unknown.
Convergence of sequences of random variables. Long-term arrival rates and associated averages. Long-term time averages. A computer method for evaluating the distribution of a random variable using the discrete Fourier transform. Discrete random variables. Continuous random variables. Summary. Problems. Appendix: subroutine FFT(A,M,N).
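As a small illustration of the Gaussian approximation for binomial probabilities covered in this chapter, the sketch below applies the central limit theorem with the usual continuity correction; the function names are hypothetical.

```python
import math

def normal_cdf(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def binomial_tail_gaussian(n, p, k):
    """Gaussian (central-limit) approximation to P(X >= k) for
    X ~ Binomial(n, p), with a 0.5 continuity correction."""
    mu = n * p
    sigma = math.sqrt(n * p * (1.0 - p))
    return 1.0 - normal_cdf((k - 0.5 - mu) / sigma)

# P(X >= 60) for 100 fair coin flips; the exact value is about 0.028.
print(binomial_tail_gaussian(n=100, p=0.5, k=60))
```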
6. Random Processes.
Definition of a random process. Specifying a random process.
Joint distributions of time samples. The mean, autocorrelation, and autocovariance functions. Gaussian random processes. Multiple random processes.
Examples of discrete-time random processes.
iid random processes. Sum processes; the binomial counting and random walk processes.
Examples of continuous-time random processes.
Poisson process. Random telegraph signal and other processes derived from the Poisson process. Wiener process and Brownian motion.
Stationary random processes.
Wide-sense stationary random processes. Wide-sense stationary Gaussian random processes. Cyclostationary random processes.
Continuity, derivatives, and integrals of random processes.
Mean square continuity. Mean square derivatives. Mean square integrals. Response of a linear system to random input.
Time averages of random processes and ergodic theorems. Fourier series and Karhunen-Loeve expansion.
Karhunen-Loeve expansion.
Summary. Problems.
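One of the continuous-time examples in this chapter, the Poisson process, is easy to simulate by summing independent exponential interarrival times. The sketch below is illustrative and assumes Python's standard random module.

```python
import random

def poisson_arrival_times(rate, T, rng=random.Random(0)):
    """Simulate one realization of a Poisson process of the given rate
    on [0, T] by summing independent exponential interarrival times."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)   # exponential interarrival time
        if t > T:
            return times
        times.append(t)

arrivals = poisson_arrival_times(rate=2.0, T=10.0)
print(len(arrivals))   # number of arrivals; its mean is rate * T = 20
```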
7. Analysis and Processing of Random Signals.
Power spectral density.
Continuous-time random processes. Discrete-time random processes. Power spectral density as a time average.
Response of linear systems to random signals.
Continuous-time systems. Discrete-time systems.
Amplitude modulation by random signals. Optimum linear systems.
The orthogonality condition. Prediction. Estimation using the entire realization of the observed process. Estimation using causal filters.
The Kalman filter. Estimating the power spectral density.
Variance of periodogram estimate. Smoothing of periodogram estimate.
Summary. Problems.
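To illustrate the chapter's material on estimating the power spectral density, here is a minimal sketch of a periodogram estimate smoothed by averaging over non-overlapping segments. It assumes NumPy, and the segmentation scheme is one simple choice, not necessarily the one used in the text.

```python
import numpy as np

def averaged_periodogram(x, num_segments):
    """Estimate the power spectral density of x by averaging the
    periodograms of non-overlapping segments, which reduces the
    variance of the raw periodogram estimate."""
    x = np.asarray(x, dtype=float)
    seg_len = len(x) // num_segments
    segs = x[:seg_len * num_segments].reshape(num_segments, seg_len)
    periodograms = np.abs(np.fft.rfft(segs, axis=1)) ** 2 / seg_len
    return periodograms.mean(axis=0)

# White Gaussian noise: the estimate should be roughly flat at the variance.
rng = np.random.default_rng(0)
psd = averaged_periodogram(rng.standard_normal(4096), num_segments=16)
print(psd.mean())   # should be close to 1
```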
8. Markov Chains.
Markov processes. Discrete-time Markov chains.
The n-step transition probabilities. The state probabilities. Steady state probabilities.
Continuous-time Markov chains.
State occupancy times. Transition rates and time-dependent state probabilities. Steady state probabilities and global balance equations.
Classes of states, recurrence properties, and limiting probabilities.
Classes of states. Recurrence properties. Limiting probabilities. Limiting probabilities for continuous-time Markov chains.
Time-reversed Markov chains.
Time-reversible Markov chains. Time-reversible continuous-time Markov chains.
Summary. Problems.
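A common computation from this chapter, finding the steady state probabilities of a discrete-time Markov chain, can be sketched by iterating pi <- pi P until convergence. The example below assumes NumPy and uses an illustrative two-state chain.

```python
import numpy as np

def steady_state(P, tol=1e-12, max_iter=10000):
    """Compute the steady state probabilities of a discrete-time Markov
    chain with transition matrix P by repeatedly applying pi <- pi P
    until the state probabilities stop changing."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)          # start from the uniform distribution
    for _ in range(max_iter):
        new_pi = pi @ P
        if np.max(np.abs(new_pi - pi)) < tol:
            return new_pi
        pi = new_pi
    return pi

# Two-state chain: stays in state 0 w.p. 0.9 and in state 1 w.p. 0.8.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(steady_state(P))   # approximately [2/3, 1/3]
```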
9. Introduction to Queueing Theory.
The elements of a queueing system. Little's formula. The M/M/1 queue.
Distribution of number in the system. Delay distribution in the M/M/1 system and arriving customer's distribution. The M/M/1 system with finite capacity.
Multi-server systems: M/M/c, M/M/c/c, and M/M/∞.
Distribution of number in the M/M/c system. Waiting time distribution for M/M/c. The M/M/c/c queueing system. The M/M/∞ queueing system.
Finite-source queueing systems.
Arriving customer's distribution.
M/G/1 queueing systems.
The residual service time. Mean delay in M/G/1 systems. Mean delay in M/G/1 systems with priority service discipline.
M/G/1 analysis using embedded Markov chains.
The embedded Markov chains. The number of customers in an M/G/1 system. Delay and waiting time distribution in an M/G/1 system.
Burke's theorem: departures from M/M/c systems. Proof of Burke's theorem using time reversibility. Networks of queues: Jackson's theorem.
Open networks of queues. Proof of Jackson's theorem. Closed networks of queues. Mean value analysis. Proof of the arrival theorem.
Summary. Problems.
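As a small worked example of the M/M/1 results and Little's formula from this chapter, the sketch below computes the utilization, mean number in the system, and mean delay for illustrative arrival and service rates.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Standard M/M/1 formulas: utilization, mean number in the system,
    and mean delay obtained from Little's formula T = N / lambda."""
    rho = arrival_rate / service_rate
    assert rho < 1.0, "queue is unstable unless arrival rate < service rate"
    mean_number = rho / (1.0 - rho)          # E[N]
    mean_delay = mean_number / arrival_rate  # E[T], by Little's formula
    return rho, mean_number, mean_delay

rho, N, T = mm1_metrics(arrival_rate=8.0, service_rate=10.0)
print(rho, N, T)   # 0.8 utilization, 4 customers, 0.5 time units of delay
```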
Appendix A. Mathematical Tables.
Appendix B. Tables of Fourier Transforms.
Appendix C. Computer Programs for Generating Random Variables.
Answers to Selected Problems.
Index.
Publication date (per publisher) | 9 June 2005
Language | English
Dimensions | 188 x 232 mm
Weight | 900 g
Subject area | Mathematics / Computer Science ► Mathematics ► Applied Mathematics
  | Mathematics / Computer Science ► Mathematics ► Probability / Combinatorics
  | Technology ► Electrical Engineering / Power Engineering
ISBN-10 | 0-321-18963-9 / 0321189639
ISBN-13 | 978-0-321-18963-9 / 9780321189639
Condition | New