
Probability and Stochastic Processes (eBook)

Ionut Florescu (Author)

eBook Download: EPUB
2014
John Wiley & Sons (Publisher)
978-1-118-59313-4 (ISBN)

112,99 € incl. VAT
(CHF 109,95)
eBook sales are handled by Lehmanns Media GmbH (Berlin) at the price in euros incl. VAT.
  • Download available immediately

A comprehensive and accessible presentation of probability and stochastic processes with emphasis on key theoretical concepts and real-world applications

With a sophisticated approach, Probability and Stochastic Processes successfully balances theory and applications in a pedagogical and accessible format. The book's primary focus is on key theoretical notions in probability to provide a foundation for understanding concepts and examples related to stochastic processes.

Organized into two main sections, the book begins by developing probability theory with topical coverage on probability measure; random variables; integration theory; product spaces, conditional distribution, and conditional expectations; and limit theorems. The second part explores stochastic processes and related concepts including the Poisson process, renewal processes, Markov chains, semi-Markov processes, martingales, and Brownian motion. Featuring a logical combination of traditional and complex theories as well as practices, Probability and Stochastic Processes also includes:

  • Multiple examples from disciplines such as business, mathematical finance, and engineering
  • Chapter-by-chapter exercises and examples to allow readers to test their comprehension of the presented material
  • A rigorous treatment of all probability and stochastic processes concepts

An appropriate textbook for probability and stochastic processes courses at the upper-undergraduate and graduate level in mathematics, business, and electrical engineering, Probability and Stochastic Processes is also an ideal reference for researchers and practitioners in the fields of mathematics, engineering, and finance.

Ionut Florescu, PhD, is Research Associate Professor of Financial Engineering and Director of the Hanlon Financial Systems Lab at Stevens Institute of Technology. His areas of research interest include stochastic volatility, stochastic partial differential equations, Monte Carlo methods, and numerical methods for stochastic processes. He is also the coauthor of Handbook of Probability and coeditor of Handbook of Modeling High-Frequency Data in Finance, both published by Wiley.

List of Figures xvii

List of Tables xxi

Preface i

Acknowledgments iii

Introduction 1

PART I PROBABILITY

1 Elements of Probability Measure 3

1.1 Probability Spaces 4

1.2 Conditional Probability 16

1.3 Independence 23

1.4 Monotone Convergence properties of probability 25

1.5 Lebesgue measure on the unit interval (0,1] 31

Problems 34

2 Random Variables 39

2.1 Discrete and Continuous Random Variables 42

2.2 Examples of commonly encountered Random Variables 46

2.3 Existence of random variables with prescribed distribution. Skorohod representation of a random variable 59

2.4 Independence 62

2.5 Functions of random variables. Calculating distributions 66

Problems 76

3 Applied chapter: Generating Random Variables 81

3.1 Generating one dimensional random variables by inverting the CDF 82

3.2 Generating one dimensional normal random variables 85

3.3 Generating random variables. Rejection sampling method 88

3.4 Generating random variables. Importance sampling 104

Problems 113

4 Integration Theory 117

4.1 Integral of measurable functions 118

4.2 Expectations 124

4.3 Moments of a random variable. Variance and the correlation coefficient. 137

4.4 Functions of random variables. The Transport Formula. 139

4.5 Applications. Exercises in probability reasoning. 142

4.6 A Basic Central Limit Theorem: The DeMoivre-Laplace Theorem 144

Problems 146

5 Conditional Distribution and Conditional Expectation 149

5.1 Product Spaces 150

5.2 Conditional distribution and expectation. Calculation in simple cases 154

5.3 Conditional expectation. General definition 157

5.4 Random Vectors. Moments and distributions 160

Problems 169

6 Moment Generating Function. Characteristic Function. 173

6.1 Sums of Random Variables. Convolutions 173

6.2 Generating Functions and Applications 174

6.3 Moment generating function 180

6.4 Characteristic function 184

6.5 Inversion and Continuity Theorems 191

6.6 Stable Distributions. Lévy Distribution 196

Problems 200

7 Limit Theorems 205

7.1 Types of Convergence 205

7.2 Relationships between types of convergence 213

7.3 Continuous mapping theorem. Joint convergence. Slutsky's theorem 222

7.4 The two big limit theorems: LLN and CLT 224

7.5 Extensions of CLT 237

7.6 Exchanging the order of limits and expectations 243

Problems 244

8 Statistical Inference 251

8.1 The classical problems in statistics 251

8.2 Parameter Estimation problem 252

8.3 Maximum Likelihood Estimation Method 257

8.4 The Method of Moments 268

8.5 Testing, The likelihood ratio test 269

8.6 Confidence Sets 276

Problems 278

PART II STOCHASTIC PROCESSES

9 Introduction to Stochastic Processes 285

9.1 General characteristics of Stochastic processes 286

9.2 A Simple process? The Bernoulli process 293

Problems 296

10 The Poisson process 299

10.1 Definitions. 299

10.2 Interarrival and waiting time for a Poisson process 302

10.3 General Poisson Processes 309

10.4 Simulation techniques. Constructing Poisson Processes 315

Problems 318

11 Renewal Processes 323

11.1 Limit Theorems for the renewal process 326

11.2 Discrete Renewal Theory. 335

11.3 The Key Renewal Theorem 340

11.4 Applications of the Renewal Theorems 342

11.5 Special cases of renewal processes 344

11.6 The renewal Equation 350

11.7 Age dependent Branching processes 354

Problems 357

12 Markov Chains 361

12.1 Basic concepts for Markov Chains 361

12.2 Simple Random Walk on integers in d-dimensions 373

12.3 Limit Theorems 376

12.4 States in a MC. Stationary Distribution 377

12.5 Other issues: Graphs, first step analysis 384

12.6 A general treatment of the Markov Chains 385

Problems 395

13 Semi-Markov and Continuous time Markov Processes 401

13.1 Characterization Theorems for the general semi-Markov process 403

13.2 Continuous time Markov Processes 407

13.3 The Kolmogorov Differential Equations 410

13.4 Calculating transition probabilities for a Markov process. General Approach 415

13.5 Limiting Probabilities for the Continuous time Markov Chain 416

13.6 Reversible Markov process 419

Problems 422

14 Martingales 427

14.1 Definition and examples 428

14.2 Martingales and Markov chains 430

14.3 Previsible process. The Martingale transform 432

14.4 Stopping time. Stopped process 434

14.5 Classical examples of Martingale reasoning 439

14.6 Convergence theorems. L1 convergence. Bounded martingales in L2 446

Problems 448

15 Brownian Motion 455

15.1 History 455

15.2 Definition 457

15.3 Properties of Brownian motion 461

15.4 Simulating Brownian motions 470

Problems 471

16 Stochastic Differential Equations 475

16.1 The construction of the stochastic integral 477

16.2 Properties of the stochastic integral 484

16.3 Ito lemma 485

16.4 Stochastic Differential equations. SDE's 489

16.5 Examples of SDE's 492

16.6 Linear systems of SDE's 503

16.7 A simple relationship between SDE's and PDE's 505

16.8 Monte Carlo Simulations of SDE's 507

Problems 512

A Appendix: Linear Algebra and solving difference equations and systems of differential equations 517

A.1 Solving difference equations with constant coefficients 518

A.2 Generalized matrix inverse and pseudodeterminant 519

A.3 Connection between systems of differential equations and matrices 520

A.4 Linear Algebra results 523

A.5 Finding fundamental solution of the homogeneous system 526

A.6 The nonhomogeneous system 528

A.7 Solving systems when P is nonconstant 530

Index 533

Introduction


What is Probability? In essence:

Mathematical modeling of random events and phenomena. It is fundamentally different from modeling deterministic events and functions, which constitutes the traditional study of Mathematics.

However, the study of probability uses concepts and notions straight from Mathematics; in fact Measure Theory and Potential Theory are expressions of abstract mathematics generalizing the Theory of Probability.

Like so many other branches of mathematics, the development of probability theory has been stimulated by the variety of its applications. In turn, each advance in the theory has enlarged the scope of its influence. Mathematical statistics is one important branch of applied probability; other applications occur in such widely different fields as genetics, biology, psychology, economics, finance, engineering, mechanics, optics, thermodynamics, quantum mechanics, computer vision, geophysics, etc. In fact, I challenge the reader to find one area in today's science where no applications of probability theory can be found.

Early history


In the XVII-th century the first notions of Probability Theory appeared. More precisely, in 1654 Antoine Gombaud, Chevalier de Méré, a French nobleman with an interest in gaming and gambling questions, was puzzled by an apparent contradiction concerning a popular dice game. The game consisted of throwing a pair of dice 24 times; the problem was to decide whether or not to bet even money on the occurrence of at least one "double six" during the 24 throws. A seemingly well-established gambling rule led de Méré to believe that betting on a double six in 24 throws would be profitable (based on the payoff of the game). However, his own calculations based on many repetitions of the 24 throws indicated just the opposite. In modern probability language, de Méré was trying to establish whether such an event has probability greater than 0.5 (we look at this question in example 1.7). Puzzled by this and other similar gambling problems, he called on the famous mathematician Blaise Pascal. This, in turn, led to an exchange of letters between Pascal and another famous French mathematician, Pierre de Fermat. This is the first known documentation of the fundamental principles of the theory of probability. Before this famous exchange of letters, a few other simple problems on games of chance had been solved in the XV-th and XVI-th centuries by Italian mathematicians; however, no general principles had been formulated before this famous correspondence.
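For the impatient reader (the question is treated carefully in example 1.7), the calculation is short: assuming the 24 throws are independent and that a single throw of a pair of fair dice produces a double six with probability 1/36, the probability of at least one double six in 24 throws is 1 - (35/36)^24 ≈ 0.4914, just below one half. Betting even money on the event is therefore slightly unfavorable, which is exactly what de Méré's records suggested.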

In 1655 during his first visit to Paris, the Dutch scientist Christian Huygens learned of the work on probability carried out in this correspondence. On his return to Holland in 1657, Huygens wrote a small work De Ratiociniis in Ludo Aleae, the first printed work on the calculus of probabilities. It was a treatise on problems associated with gambling. Because of the inherent appeal of games of chance, probability theory soon became popular, and the subject developed rapidly during the XVIII-th century.

The XVIII-th century


The major contributors during this period were Jacob Bernoulli (1654–1705) and Abraham de Moivre (1667–1754). Jacob (Jacques) Bernoulli was a Swiss mathematician who was the first to use the term integral. He was the first mathematician in the Bernoulli family, a family of famous scientists of the XVIII-th century. Jacob Bernoulli's most original work was Ars Conjectandi, published in Basel in 1713, eight years after his death. The work was incomplete at the time of his death, but it was still a work of the greatest significance in the development of the Theory of Probability. De Moivre was a French mathematician who lived most of his life in England. De Moivre pioneered the modern approach to the Theory of Probability in his work The Doctrine of Chance: A Method of Calculating the Probabilities of Events in Play, published in 1718. A Latin version of the book had been presented to the Royal Society and published in the Philosophical Transactions in 1711. The definition of statistical independence appears in this book for the first time. The Doctrine of Chance appeared in new, expanded editions in 1718, 1738, and 1756. The birthday problem (example 1.12) appeared in the 1738 edition, the gambler's ruin problem (example 1.11) in the 1756 edition. The 1756 edition of The Doctrine of Chance contained what is probably de Moivre's most significant contribution to probability, namely the approximation of the binomial distribution by the normal distribution in the case of a large number of trials, which is now known in most probability textbooks as "The First Central Limit Theorem" (we will discuss this theorem in Chapter 4). He understood the notion of standard deviation and was the first to write down the normal integral (and the corresponding density). In Miscellanea Analytica (1730) he derived Stirling's formula (wrongly attributed to Stirling), which he used in his proof of the central limit theorem. In the second edition of that book, in 1738, de Moivre gave credit to Stirling for an improvement to the formula. De Moivre wrote:

“I desisted in proceeding farther till my worthy and learned friend Mr James Stirling, who had applied after me to that inquiry, [discovered that ].”
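Stated in modern notation, the approximation de Moivre obtained can be summarized as follows: if S_n denotes the number of successes in n independent trials, each with success probability p, then for large n the standardized count (S_n - np)/sqrt(np(1-p)) is approximately standard normal; that is, P(a ≤ (S_n - np)/sqrt(np(1-p)) ≤ b) tends to Φ(b) - Φ(a), where Φ is the standard normal distribution function. The book returns to this statement, the DeMoivre-Laplace theorem, in Chapter 4.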

De Moivre also investigated mortality statistics and the foundations of the theory of annuities. In 1724 he published one of the first statistical applications to finance, Annuities on Lives, based on population data for the city of Breslau. In fact, in A History of the Mathematical Theory of Probability (London, 1865), Isaac Todhunter says that probability:

... owes more to [de Moivre] than any other mathematician, with the single exception of Laplace.

De Moivre died in poverty. He did not hold a university position despite his influential friends Leibniz, Newton, and Halley, and his main income came from tutoring.

De Moivre, like Cardan (Girolamo Cardano), predicted the day of his own death. He discovered that he was sleeping 15 minutes longer each night and, summing the arithmetic progression, calculated that he would die on the day he slept for 24 hours. He was right!

The XIX-th century


This century saw the development and generalization of the early Probability Theory. Pierre-Simon de Laplace (1749–1827) published Théorie Analytique des Probabilités in 1812. This is the first fundamental book in probability ever published (the second being Kolmogorov's 1933 monograph). Before Laplace, probability theory was solely concerned with developing a mathematical analysis of games of chance. The first edition was dedicated to Napoleon-le-Grand, but the dedication was removed in later editions!

The work consisted of two books, and a second edition two years later increased the material by about 30 percent. The work studies generating functions, Laplace's definition of probability, Bayes' rule (so named by Poincaré many years later), the notion of mathematical expectation, probability approximations, a discussion of the method of least squares, Buffon's needle problem, and the inverse Laplace transform. Later editions of the Théorie Analytique des Probabilités also contain supplements that consider applications of probability to determining errors in observations arising in astronomy, the other passion of Laplace.

On the morning of Monday, 5 March 1827, Laplace died. Few events would cause the Academy to cancel a meeting, but it did so on that day as a mark of respect for one of the greatest scientists of all time.

The XX-th century and modern times


Many scientists have contributed to the theory since Laplace's time; among the most important are Chebyshev, Markov, von Mises, and Kolmogorov.

One of the difficulties in developing a mathematical theory of probability has been to arrive at a definition of probability that is precise enough for use in mathematics, yet comprehensive enough to be applicable to a wide range of phenomena. The search for a widely acceptable definition took nearly three centuries and was marked by much controversy. The matter was finally resolved in the 20th century by treating probability theory on an axiomatic basis. In 1933, a monograph by the great Russian mathematician Andrey Nikolaevich Kolmogorov (1903–1987) outlined an axiomatic approach that forms the basis of the modern theory. In 1925, the year he started his doctoral studies, Kolmogorov published his first paper on probability theory, together with Khinchin. The paper contains, among other inequalities about partial sums of random variables, the three-series theorem, which provides important tools for stochastic calculus. By 1929, when he finished his doctorate, he had already published 18 papers, among them versions of the strong law of large numbers and the law of the iterated logarithm.

In 1933, two years after his appointment as a professor at Moscow University, Kolmogorov published Grundbegriffe der Wahrscheinlichkeitsrechnung, his most fundamental book. In it he builds up probability theory from fundamental axioms in a rigorous way, comparable with Euclid's treatment of geometry. He gives a rigorous definition of conditional expectation, which later became fundamental for the definition of Brownian motion, stochastic integration, and the Mathematics of Finance. (Kolmogorov's monograph is available in English translation as Foundations of Probability Theory,...

Published (per publisher) 4.12.2014
Language English
Subject areas Mathematics / Computer Science > Mathematics > Statistics
Mathematics / Computer Science > Mathematics > Probability / Combinatorics
Technology
Keywords Angewandte Wahrscheinlichkeitsrechnung u. Statistik • applied probability • Applied Probability & Statistics • Betriebswirtschaft u. Operationsforschung • Business & Management • Integration Theory • limit theorems • probability • Management Science/Operational Research • Markov chains • Numerical Methods • Probability & Mathematical Statistics • probability measure • Statistics • Statistik • Stochastic Processes • Wahrscheinlichkeitsrechnung • Wahrscheinlichkeitsrechnung u. mathematische Statistik • Wirtschaft u. Management
ISBN-10 1-118-59313-8 / 1118593138
ISBN-13 978-1-118-59313-4 / 9781118593134
EPUB (Adobe DRM)

Copy protection: Adobe DRM
Adobe DRM is a copy protection scheme intended to protect the eBook against misuse. The eBook is authorized to your personal Adobe ID at the time of download. You can then read the eBook only on devices that are also registered to your Adobe ID.

File format: EPUB (Electronic Publication)
EPUB is an open standard for eBooks and is particularly well suited to fiction and non-fiction. The reflowable text adapts dynamically to the display and font size, which also makes EPUB a good fit for mobile reading devices.

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need an Adobe ID and the free Adobe Digital Editions software. We advise against using the OverDrive Media Console, since experience shows it frequently causes problems with Adobe DRM.
eReader: This eBook can be read on (almost) all eBook readers. It is, however, not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need an Adobe ID and a free app.

Buying eBooks from abroad
For tax law reasons we can sell eBooks only within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
