
American Mathematical Society

John B. Walsh

Knowing the Odds
An Introduction to Probability

Graduate Studies in Mathematics

Volume 139


Knowing the Odds
An Introduction to Probability

John B. Walsh

American Mathematical Society
Providence, Rhode Island

Graduate Studies in Mathematics

Volume 139

http://dx.doi.org/10.1090/gsm/139


EDITORIAL COMMITTEE

David Cox (Chair)
Daniel S. Freed
Rafe Mazzeo

Gigliola Staffilani

2010 Mathematics Subject Classification. Primary 60-01.

For additional information and updates on this book, visit www.ams.org/bookpages/gsm-139

Library of Congress Cataloging-in-Publication Data

Walsh, John B.
Knowing the odds : an introduction to probability / John B. Walsh.
p. cm. – (Graduate studies in mathematics ; v. 139)
Includes bibliographical references and index.
ISBN 978-0-8218-8532-1 (alk. paper)
1. Probabilities. I. Title.

QA273.W24 2011
519.2–dc23    2012013119

Copying and reprinting. Individual readers of this publication, and nonprofit libraries acting for them, are permitted to make fair use of the material, such as to copy a chapter for use in teaching or research. Permission is granted to quote brief passages from this publication in reviews, provided the customary acknowledgment of the source is given.

Republication, systematic copying, or multiple reproduction of any material in this publication is permitted only under license from the American Mathematical Society. Requests for such permission should be addressed to the Acquisitions Department, American Mathematical Society, 201 Charles Street, Providence, Rhode Island 02904-2294 USA. Requests can also be made by e-mail to [email protected].

© 2012 by the American Mathematical Society. All rights reserved.
The American Mathematical Society retains all rights except those granted to the United States Government.
Printed in the United States of America.

∞ The paper used in this book is acid-free and falls within the guidelines established to ensure permanence and durability.

Visit the AMS home page at http://www.ams.org/



To my wife, Joke


Contents

Preface xi

Introduction xiii

Chapter 1. Probability Spaces 1

§1.1. Sets and Sigma-Fields 1

§1.2. Elementary Properties of Probability Spaces 6

§1.3. The Intuition 8

§1.4. Conditional Probability 15

§1.5. Independence 18

§1.6. Counting: Permutations and Combinations 22

§1.7. The Gambler’s Ruin 30

Chapter 2. Random Variables 39

§2.1. Random Variables and Distributions 39

§2.2. Existence of Random Variables 45

§2.3. Independence of Random Variables 48

§2.4. Types of Distributions 49

§2.5. Expectations I: Discrete Random Variables 54

§2.6. Moments, Means and Variances 60

§2.7. Mean, Median, and Mode 63

§2.8. Special Discrete Distributions 65

Chapter 3. Expectations II: The General Case 75

§3.1. From Discrete to Continuous 75


§3.2. The Expectation as an Integral 81

§3.3. Some Moment Inequalities 85

§3.4. Convex Functions and Jensen’s Inequality 86

§3.5. Special Continuous Distributions 89

§3.6. Joint Distributions and Joint Densities 96

§3.7. Conditional Distributions, Densities, and Expectations 103

Chapter 4. Convergence 117

§4.1. Convergence of Random Variables 117

§4.2. Convergence Theorems for Expectations 122

§4.3. Applications 127

Chapter 5. Laws of Large Numbers 133

§5.1. The Weak and Strong Laws 134

§5.2. Normal Numbers 137

§5.3. Sequences of Random Variables: Existence* 140

§5.4. Sigma Fields as Information 142

§5.5. Another Look at Independence 144

§5.6. Zero-one Laws 145

Chapter 6. Convergence in Distribution and the CLT 151

§6.1. Characteristic Functions 151

§6.2. Convergence in Distribution 162

§6.3. Lévy’s Continuity Theorem 170

§6.4. The Central Limit Theorem 176

§6.5. Stable Laws* 182

Chapter 7. Markov Chains and Random Walks 191

§7.1. Stochastic Processes 191

§7.2. Markov Chains 192

§7.3. Classification of States 201

§7.4. Stopping Times 204

§7.5. The Strong Markov Property 208

§7.6. Recurrence and Transience 211

§7.7. Equilibrium and the Ergodic Theorem for Markov Chains 218

§7.8. Finite State Markov Chains 226

§7.9. Branching Processes 234

§7.10. The Poisson Process 242


§7.11. Birth and Death Processes* 250

Chapter 8. Conditional Expectations 265

§8.1. Conditional Expectations 265

§8.2. Elementary Properties 268

§8.3. Approximations and Projections 272

Chapter 9. Discrete-Parameter Martingales 275

§9.1. Martingales 275

§9.2. System Theorems 282

§9.3. Convergence 290

§9.4. Uniform Integrability 295

§9.5. Applications 304

§9.6. Financial Mathematics I: The Martingale Connection* 315

Chapter 10. Brownian Motion 335

§10.1. Standard Brownian Motion 336

§10.2. Stopping Times and the Strong Markov Property 344

§10.3. The Zero Set of Brownian Motion 348

§10.4. The Reflection Principle 351

§10.5. Recurrence and Hitting Properties 352

§10.6. Path Irregularity 354

§10.7. The Brownian Infinitesimal Generator* 359

§10.8. Related Processes 363

§10.9. Higher Dimensional Brownian Motion 368

§10.10. Financial Mathematics II: The Black-Scholes Model* 374

§10.11. Skorokhod Embedding* 377

§10.12. Lévy’s Construction of Brownian Motion* 388

§10.13. The Ornstein-Uhlenbeck Process* 390

§10.14. White Noise and the Wiener Integral* 394

§10.15. Physical Brownian Motion* 404

§10.16. What Brownian Motion Really Does 410

Bibliography 413

Index 415


Preface

In the long-forgotten days of pre-history, people would color peach pits differently on the two sides, toss them in the air, and bet on the color that came up. We, with a more advanced technology, toss coins. We flip a coin into the air. There are only two possible outcomes, heads or tails, but until the coin falls, we have no way of knowing which. The result of the flip may decide a bet, it may decide which football team kicks off, which tennis player serves, who does the dishes, or it may decide a hero’s fate.

The coin flip may be the most basic of all random experiments. If the coin is reasonably well-made, heads is as likely as tails to occur. But... what does that mean?

Suppose we flip a coin, and call “Heads” or “Tails” while it is in the air. Coins are subject to the laws of physics. If we could measure the exact position, velocity, and angular velocity of the coin as it left the hand—its initial conditions—we could use Newton’s laws to predict exactly how it would land. Of course, that measurement is impractical, but not impossible. The point is that the result is actually determined as soon as the coin is in the air and, in particular, it is already determined when we call it; the result is (theoretically) known, but not to us. As far as we are concerned, it is just as unpredictable as it was before the flip. Let us look at the physics to see why.

The outcome is determined by the exact position, angular position, velocity, and angular velocity at the time of the flip. Physicists represent these all together as a point in what they call phase space. We can picture it as follows.


[Figure 1. Phase space: small regions of initial conditions leading to heads (H) interleaved with regions leading to tails (T).]

This represents the initial condition of the coin in phase space. Some points lead to heads, some to tails. But a small difference in initial conditions completely changes the result. The conditions leading to heads are a union of very small regions, which are evenly mixed up with those leading to tails.

This means that no matter how we try to toss the coin, we cannot zero in on a particular result—our toss will be smeared out, so to speak, over the “Heads” and “Tails” regions, and this will happen no matter how carefully we toss it. This leads us to say things like: “Heads and tails are equally likely,” or “Heads and tails each have probability one-half.”
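The interleaving of the heads and tails regions is easy to see in a toy calculation. The sketch below is not from the book; the model and all the numbers in it are illustrative assumptions. It keeps only two coordinates of phase space, an upward launch speed v and a spin rate omega, and decides the landing face from the number of half-turns completed during the flight:

```python
import math

def toy_flip(v, omega, g=9.8):
    """Toy deterministic coin flip: heads-up at launch, and the face on
    landing is decided by the number of half-turns made during the flight."""
    flight_time = 2.0 * v / g              # time to return to launch height
    half_turns = int(omega * flight_time / math.pi)
    return "H" if half_turns % 2 == 0 else "T"

# Scan a narrow band of spin rates (around 200 rad/s, roughly 30 rev/s):
# the outcome flips back and forth, so the H and T regions interleave.
print("".join(toy_flip(v=5.0, omega=200.0 + k / 2.0) for k in range(40)))
```

In this toy model the bands of heads and tails alternate every few radians per second of spin; since no hand controls the spin anywhere near that precisely, the toss is effectively smeared over both kinds of region.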

Philosophers ask deep questions about the meaning of randomness and probability. Is randomness something fundamental? Or is it just a measure of our ignorance? Gamblers just want to know the odds.

Mathematicians by and large prefer to duck the question. If pressed, they will admit that most probability deals with chaotic situations, like the flip of a coin, where the seeming randomness comes from our ignorance of the true situation. But they will then tell you that the really important thing about randomness is that it can be measured—for probabilities measure likelihood—and that we can construct a mathematical model which enables us to compute all of the probabilities, and that, finally, this model is the proper subject of study.

So you see, mathematicians side with the gamblers: they just want to know the odds.

From now on, probability is mathematics. We will be content just to note that it works—which is why so few casino owners go broke—and we will leave the deeper meanings of randomness to the philosophers.


Introduction

There is an order to chaos. Unpredictability is predictable. In fact, randomness itself is so regular that we can assign a number to a random occurrence which tells us in a precise way how likely it is. The number is called its probability.

That is not to say that we can predict the result of a single toss of a fair coin. We cannot. But we can predict that between forty and sixty out of a hundred tosses will be heads. We might—rarely—be wrong about that, but only a few times in a hundred tries, and if we continue to toss: a thousand times, a million times, and so on, we can be sure that the proportion of heads will approach 1/2.
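A few lines of simulation (not from the book, just a sanity check) illustrate both claims: the exact binomial probability that 100 fair tosses land between forty and sixty heads, and the way the proportion of heads settles toward 1/2 as the number of tosses grows.

```python
import random
from math import comb

# Chance that the number of heads in 100 fair tosses lies in [40, 60].
p = sum(comb(100, k) for k in range(40, 61)) / 2**100
print(f"P(40 <= heads <= 60) = {p:.3f}")

# Law of large numbers in action: the fraction of heads approaches 1/2.
for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} tosses: fraction of heads = {heads / n:.4f}")
```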

So randomness has its own patterns. Our aim is to understand them.

Probability is a rather unusual part of mathematics. While its full birth as a mathematical subject can be traced to the correspondence between Fermat and Pascal¹ in the summer of 1654, the subject wasn’t put on a rigorous footing until 1933, 279 years later, when A. N. Kolmogorov showed it was properly a part of measure theory². But probability had been around for several centuries before measure theory existed, and it is quite possible to study the subject without it. In fact, probability is taught at many different levels, according to the mathematics the students know: in elementary and high school, first year college, third or fourth year college, as well as in graduate school. Certain things are common to all of these courses, but the more mathematics the student knows, the deeper he or she can go. This particular text is drawn from a two-semester course taught over the years at the University of British Columbia, mainly to fourth-year mathematics honors students. It assumes the student is familiar with calculus and knows some analysis, but not measure theory. Many of the students, but by no means all, take a concurrent course in Lebesgue measure. It is not necessary, but it adds depth, and gives the student some “Aha!” moments, such as the sudden realization: “Aha! The expectation is nothing but a Lebesgue integral³!”

¹ Pascal and Fermat were by no means the first to study probability, but their work on the “problem of points” was so much deeper than what had gone before that it is properly considered the true beginning of the subject. See Keith Devlin’s “The Unfinished Game” [13] for an account.

² See [22] for an English translation of Kolmogorov’s landmark paper. It showed that all of probability theory could be regarded as a part of measure theory, giving a general existence theorem for stochastic processes (not present, alas, in this book, but see [12] or [9]) and a rigorous definition of conditional expectations (see Chapter 8), which had previously been confined to special cases. This was quite a change from the more intuitive approach, and it took some time to replace “could be taken” by “is.” That was completed by Doob, culminating in his seminal book Stochastic Processes [12].

We begin with the basic axioms of probability, and the all-important ideas of conditional probability and independence. Then we quickly develop enough machinery to allow the students to solve some interesting problems and to analyze card games and lotteries. Just to show how quickly one can get into non-trivial questions, we work out the problem of the gambler’s ruin.
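For the fair-coin case the classical answer is simple: a gambler who starts with a dollars, bets one dollar at a time, and stops at 0 (ruin) or at a target of N dollars is ruined with probability 1 - a/N. The little Monte Carlo check below is only a sketch, not the book’s treatment (the problem is worked out exactly in §1.7):

```python
import random

def ruin_probability(stake, goal, trials=20_000):
    """Estimate by simulation the chance that a gambler betting $1 on fair
    coin flips hits 0 before reaching `goal`, starting from `stake`."""
    ruined = 0
    for _ in range(trials):
        x = stake
        while 0 < x < goal:
            x += 1 if random.random() < 0.5 else -1
        ruined += (x == 0)
    return ruined / trials

# Simulation vs. the classical formula 1 - stake/goal for a fair game.
print(ruin_probability(stake=5, goal=50), 1 - 5 / 50)
```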

The systematic study of classical probability begins in Chapter Two. Its aim is to prove two of the basic classical theorems of the subject: the law of large numbers and the central limit theorem. Far from being recondite, these theorems are practically part of Western folklore. Who has not heard of the law of averages? That is another name for the law of large numbers. What student has not been subject to “grading on a curve”, a direct (and often mistaken) application of the central limit theorem? It is surprising how much of the curriculum is determined by the modest aim of understanding those two results: random variables, their expectations and variances, their distributions, the idea of independence, and the ideas of convergence are needed merely to state the theorems. A number of inequalities, the theory of convergence in distribution, and the machinery of characteristic functions are necessary to prove them. This, along with enough examples to supply the intuition necessary to understanding, determines the first six chapters.

The second part of the book introduces stochastic processes, and changes the viewpoint. Stochastic processes evolve randomly in time. Instead of limit theorems at infinity, the emphasis is on what the processes actually do; we look at their sample paths, study their dynamics, and see that many interesting things happen between zero and infinity. There is a large selection of stochastic processes to study, and too little time to study them.

³ On the other hand, students who take probability before measure theory have their “Aha!” moment later, when they realize that the Lebesgue integral is nothing but an expectation.


We want to introduce processes which are major building blocks of the theory, and we aim the course towards Brownian motion and some of its weird and wonderful sample path properties. Once more, this determines much of the curriculum. We introduce the Markov property and stopping times with a study of discrete-parameter Markov chains and random walks, including special cases such as branching processes. Poisson and birth and death processes introduce continuous parameter processes, which prepares for Brownian motion and several related processes.

The one non-obvious choice is martingales. This deserves some explanation. The subject was once considered esoteric, but has since shown itself to be so useful⁴ that it deserves inclusion early in the curriculum. There are two obstructions. The first is that its whole setting appears abstract, since it uses sigma-fields to describe information. Experience has shown that it is a mistake to try to work around this; it is better to spend the necessary time to make the abstract concrete by showing how sigma-fields encode information, and, hopefully, make them intuitive. The second obstruction is the lack of a general existence theorem for conditional expectations: that requires mathematics the students will not have seen, so that the only case in which we can actually construct conditional expectations is for discrete sigma-fields, where we can do it by hand. It would be a pity to restrict ourselves to this case, so we do some unashamed bootstrapping. Once we show that our hand-constructed version satisfies the defining properties of the general conditional expectation, we use only these properties to develop the theory. When we have proved the necessary martingale theorems, we can construct the conditional expectation with respect to a general sigma field as the limit of conditional expectations on discrete sigma fields. This gives us the desired existence theorem . . . and shows that what we did was valid for general sigma-fields all along. We make free use of martingales in the sequel. In particular, we show how martingale theory connects with a certain part of mathematical finance, the option pricing, or Black-Scholes theory.

The final chapter on Brownian motion uses most of what we have learned to date, and could pull everything together, both mathematically and artistically. It would have done so, had we been able to resist the temptation to spoil any possible finality by showing—or at least hinting at—some of the large mathematical territory it opens up: white noise, stochastic integrals, diffusions, financial mathematics, and probabilistic potential theory, for example.

⁴ The tipping point was when engineers started using martingales to solve applied problems, and, in so doing, beat the mathematicians to some very nice theorems. The coup de grâce was struck by the surprising realization that the celebrated Black-Scholes theory of finance, used by all serious option-traders in financial markets, was, deeply, martingale theory in disguise. See Sections 9.6 and 10.10.

A last word. To teach a course with pleasure, one should learn at the same time. Fair is fair: the students should not be the only learners. This is automatic the first time one teaches a course, less so the third or fourth time. So we tried to include enough sidelights and interesting byways to allow the instructor some choice, a few topics which might be substituted at each repetition. Most of these are starred: *. In fact, we indulged ourselves somewhat, and included personal favorites that we seldom have time to cover in the course, such as the Wiener stochastic integral, the Langevin equation, and the physical model of Brownian motion.


Bibliography

[1] Bachelier, Louis, 1900. Théorie de la spéculation, Ann. Sci. École Norm. Sup. 17, 21-86.

[2] Bachelier, Louis, 1964. Theory of speculation, The Random Character of Stock Market Prices (P. Cootner, ed.), MIT Press, pp. 17-78. Translated from French by A. James Boness.

[3] Baxter, M. and A. Rennie, Financial Calculus: An Introduction to Derivative Pricing, Cambridge University Press, Cambridge, 1996.

[4] Billingsley, Patrick, 1999. Convergence of Probability Measures, 2nd ed., John Wiley & Sons, New York.

[5] Black, F. and M. Scholes, 1973. The pricing of options and corporate liabilities, Journal of Political Economy, 81, 637-654.

[6] Borel, E., 1909. “Les probabilités dénombrables et leurs applications arithmétiques”, Rendiconti del Circolo Matematico di Palermo 27, 247-271.

[7] Bayer, D. and P. Diaconis, 1992. “Trailing the Dovetail Shuffle to its Lair”, Annals of Applied Probability 2, 294.

[8] Chacon, R. and J. B. Walsh, 1976. One-dimensional potential embeddings, Séminaire de Probabilités X de l’Univ. de Strasbourg, Lecture Notes in Math., vol. 511, 19-23.

[9] Chung, K. L., A Course in Probability Theory, Academic Press, 1974.

[10] Dalang, R. C., A. Morton, and W. Willinger, 1990. Equivalent martingale measures and no-arbitrage in stochastic securities market models, Stochastics and Stochastics Reports 29, 185–201.

[11] Delbaen, F. and W. Schachermayer, The Mathematics of Arbitrage, Springer, 2006.

[12] Doob, J. L., 1953. Stochastic Processes, John Wiley & Sons Inc., New York.

[13] Devlin, Keith, The Unfinished Game, Basic Books, 2008.

[14] Durrett, R., Probability: Theory and Examples, Second edition, Duxbury Press, Belmont, CA, 1996.

[15] Dudley, R. M., Real Analysis and Probability, Cambridge University Press, Cambridge, 1989.


[16] Feller, William, An Introduction to Probability Theory and its Applications, Vol. 1, Third edition, John Wiley and Sons, New York, 1968.

[17] Feller, William, An Introduction to Probability Theory and its Applications, Vol. 2, Third edition, John Wiley and Sons, New York, 1971.

[18] Keynes, John Maynard, A Treatise on Probability, Macmillan Co., London, 1921.

[19] Karlin, Samuel, A First Course in Stochastic Processes, Academic Press, New York, 1966.

[20] Khoshnevisan, Davar, Probability, American Mathematical Society, Graduate Studies in Mathematics 80, Providence, RI, 2007.

[21] Kolmogorov, A. N., Grundbegriffe der Wahrscheinlichkeitsrechnung, Springer, Berlin, 1933.

[22] Kolmogorov, A. N., Foundations of Probability, Chelsea Publishing Company, New York, 1950. [Translation edited by Nathan Morrison].

[23] Laplace, P. S., Théorie Analytique des Probabilités, Vol. I and II, Paris, Courcier, 1812.

[24] Lehmann, E. L., Testing Statistical Hypotheses, Wiley, New York, 1959.

[25] Lévy, P., Calcul des Probabilités, Gauthier-Villars, Paris, 1925.

[26] Lévy, P., Processus stochastiques et mouvement brownien, Gauthier-Villars, 1948. Reprinted in 1992 by Jacques Gabay.

[27] Lindeberg, J. W., 1922. Eine neue Herleitung des Exponentialgesetzes in der Wahrscheinlichkeitsrechnung, Math. Z. 15, 211-225.

[28] Markov, A. A., 1910. Recherches sur un cas remarquable d’épreuves dépendantes, Acta Math. 33, 87-104.

[29] Mazliak, Laurent, 2009. How Paul Lévy saw Jean Ville and Martingales, Electronic Journal for History of Probability and Statistics, Vol. 5, no. 1.

[30] Nelson, Edward, Dynamical Theories of Brownian Motion, Princeton University Press, Princeton, NJ, 1967.

[31] Royden, H. L., Real Analysis, 3rd edition, Macmillan, New York, 1988.

[32] Scheffé, H., The Analysis of Variance, Wiley, New York, 1959.

[33] Wiener, Norbert, Extrapolation, Interpolation, and Smoothing of Stationary Time Series, Wiley, New York, 1949.

[34] Williams, D., Probability with Martingales, Cambridge University Press, Cambridge,1991.


Index

L², 394
σ-field, xv, 1–3, 8, 43, 192, 204, 205, 207, 266
absorption, 228
accessible, 201, 213
algebra, 1
aperiodic, 203, 218, 223, 226
arbitrage, 318–320, 327–330
ascii, 23
Bachelier, Louis, 335
bad luck, 249
balayage, 379
Bayes rule, 18
Bernstein, S., 136
Bernstein polynomials, 137
bi-stochastic, 228
binary tree, 330
binomial coefficient, 25
binomial theorem, 66
Black-Scholes, 374, 377
bond, 317–319
Borel field, 1
Borel measurable, 44
Borel sets, 3, 207, 395
Borel, Émile, 138
Borel-Cantelli, 135, 356
bounded convergence theorem, 126
branching process, 312
Brown, Robert, 335
Brownian
  bridge, 365
  Markov property, 340
  self-similarity, 339
  time-reversal, 339
  transformations, 338
  zero set, 348
Brownian density, 337
Brownian motion, xv, 346, 349, 353, 354, 356, 374–377, 380, 381, 383, 384, 388, 392–394, 396, 402, 404, 405, 408, 409
  absorbing, 363
  construction, 388
  in [0,1], 364
  killed, 363
  logarithmic, 374
  nowhere-differentiability, 354
  quadratic variation, 355
  reflecting, 363
  standard, 336
  unbounded variation, 355
Buffon’s needle problem, 51
Cantelli strong law, 134
Cantor set, 349
capture-recapture, 57
card shuffling, 227
central limit theorem, 383
characteristic function, 337
combination, 22, 25
communicate, 201, 202, 215
conditional expectation
  existence, 305


continuity
  almost sure, 342
  Brownian, 341
continuous
  version, 343
continuous parameter, 191
convergence
  almost everywhere, 117
  in distribution, 118, 409
    for processes, 408
  in probability, 117
  in the mean, 117
  pointwise, 117
  vague, 168
  weak, 168, 169, 191
convex function, 86
convolution, 107
correlation, 99, 100, 107, 112
countable additivity, 4, 7, 395
counting principle, 22
covariance, 99, 100, 108, 112, 407
  function, 391
  matrix, 108, 391
covariance function, 337, 366
covariance matrix, 158
craps, 39
cylinder sets, 192
density, 50, 83
  bivariate normal, 112
  Cauchy, 92, 93
  conditional, 104–106
  joint, 97, 99, 105, 111
  marginal, 98, 99
  multivariate Gaussian, 110
  normal, 90
  transformation, 94, 101
  uniform, 89
derivative, 317, 322, 323, 327, 329, 330, 332, 376, 377
  Brownian motion, 354
  Radon-Nikodym, 376
difference equation, 32, 33
discontinuity
  jump, 41, 342, 343
  oscillatory, 342
discrete parameter, 191
distribution, 43, 195, 200, 337, 391
  absolutely continuous, 50, 97
  Bernoulli, 65, 70
  binomial, 65
  Cauchy, 92
  conditional, 103
  continuous, 49
  discrete, 49
  exponential, 90
  finite dimensional, 192
  Gamma, 93
  Gaussian, 90
    tails, 342
  geometric, 68
  hypergeometric, 69, 70, 72
  initial, 194, 195, 223, 226
  joint, 96
  negative binomial, 69
  normal, 90
  Poisson, 66, 67
  standard normal, 90
  stationary, 223, 225, 226, 228
  uniform, 51, 89, 140
distribution function, 40–42, 44, 45, 50
  conditional, 103, 104
  joint, 96–98
  marginal, 98
dollar
  discounted, 321, 376
  today’s, 321
dominated convergence theorem, 125
Donsker, M., 386
Doob decomposition, 281, 301
Doob, J. L., 37, 85, 277, 290, 292
Dow-Jones, 192
drunkard’s walk, 196
dyadic approximation, 75, 346
Einstein, Albert, 335, 408
embedded chain, 257
embedding, 377, 382
  random walk, 382
equally likely, 10
equation
  backward differential, 255
  Chapman-Kolmogorov, 195, 254
  difference, 33
  forward differential, 256
  Kolmogorov, 254
  Langevin, 405, 406
  renewal, 219
  stochastic differential, 405
  stochastic integral, 405
equilibrium, 196, 198, 218, 223, 225
equivalence class, 202, 213, 394
equivalence relation, 201
ergodic, 218


events, 8
exchangeability, 70
exchangeable, 70, 71
expectation, 54, 81
  as integral, 81
  conditional, xv, 103, 105, 265, 266, 268, 270, 298, 305, 306, 375
  discrete, 55, 76
  general, 75
  linearity, 57
  matrix, 108
  non-existence, Cauchy, 93
  properties, 56
  vector, 108
explosion, 261
extinction, 237–239
fair game, 275, 283
false positives, 17
Fatou’s Lemma, 124, 297, 302
Feller, W., 410
Fermat, P., xiii
field, 1, 2, 192, 396
filtration, 277, 278, 291, 345
  right continuous, 350
finitely additive, 395, 397
first exits, 353
first hitting time, 206, 211
flush, 25
Fourier inversion, 152
full house, 26
function
  Borel, 44, 49, 400
  characteristic, xiv, 79, 151–153, 155, 337
    Bernoulli, 155
    binomial, 155
    Cauchy, 156
    exponential, 156
    Gaussian, 156
    joint, 155
    joint Gaussian, 158
    Poisson, 156
  compact support, 360
  concave, 86, 379, 381
  convex, 86, 278
  covariance, 337, 390, 391
  distribution, 40, 42, 152
    normal, 376
  elementary, 398
  excessive, 314
  gamma, 93
  generating, 79
  indicator, 43
  moment generating, 91, 151
  non-differentiable, 335
  probability generating, 235, 236, 238
  probability mass, 50
    joint, 97
  Rademacher, 141
  right continuous, 343
  Schauder, 388
  simple, 399, 401
  subharmonic, 277
  superharmonic, 277
  tent, 379
future, 317
Galton, Francis, 234
gambler’s ruin, 30, 32, 191, 196, 286, 293
  duration, 286
gambling systems, 210
Gaussian
  bivariate, 107
  bound, 342
  existence, 110
  joint, 109
  linear combinations, 109
  multivariate, 108, 109, 157
  multivariate density, 110
generating function, 79
geometry, 112
hand, 25
  bridge, 25
  poker, 25
Hilbert space, 113, 401
hitting time, 344, 350
horse-racing, 328
increments
  independent, 389
independence, xiv, 18, 113
  conditional, 199, 200, 347, 366
  events, 19
  of random variables, 48
  pairwise, 20
independent increments, 243, 337, 341
inequality
  Chebyshev, 85, 134, 135, 287
  Jensen, 88, 89, 271, 278
  Lyapounov, 89
  martingale maximal, 287


  martingale minimal, 288
  Schwarz, 85, 100
  upcrossing, 290, 291
infinitesimal generator, 255, 359
  Brownian, 360, 392, 408
  domain, 360
  Ornstein-Uhlenbeck, 392, 407
  Poisson, 256
inflation, 321
inner product, 395, 401
integrability
  uniform, 295, 299, 301, 307
integrable, 55, 56, 76
integral
  expectation, 81
  Ito, 398, 402
  Lebesgue, xiv, 82, 402
  Lebesgue-Stieltjes, 82
  Riemann, 82, 106
  stochastic, xvi, 398, 401, 402, 408
  Wiener, xvi, 398, 401, 402
integration
  Monte Carlo, 141
integration by parts, 402
invariance principle, 386
irreducible, 202, 215, 218, 221, 223, 226, 228
Jacobian, 101, 102, 107, 110
joint density, 97
joint distribution
  Brownian, 337
joint distribution function, 98
jump chain, 260
jump time, 259
jump times, 246
Khintchine, A., 356, 384
Kolmogorov differential equations, 254
Kolmogorov strong law, 307
Kolmogorov, A. N., xiii
Lévy, Paul, 152, 388
Laplace transform, 84
law of large numbers, 133, 161
  Cantelli, 134, 136, 139
  Kolmogorov, 135, 383
  strong, 134
  weak, 134
law of the iterated logarithm, 356, 358, 384
law of the unconscious statistician, 56, 82
Lazzerini, M., 52
Lebesgue measure, 350, 354, 396
Lebesgue sets, 3
Lebesgue, Henri, 398
lemma
  Kronecker, 135
likelihood ratio, 310
limit
  Cesàro, 221
linear function of Gaussian, 91
market, 317, 318, 322, 329, 374
  stock, 279
market price, 330, 332
Markov chain, xv, 192–194, 196, 200, 207–209, 211, 218, 223, 225, 226, 228
  finite state, 226
Markov chains, 359
  periodic, 226
Markov property, 346
  Brownian, 340
  Poisson process, 245
  strong, 246, 251, 260, 344, 346, 349–351, 381, 385
Markov time, 205
Markov, A. A., 205
martingale, xv, 211, 276–278, 281, 284, 285, 287, 294, 298, 304, 311, 313, 315, 316, 328, 336, 351, 353, 362, 375, 378
  backward, 294, 295, 300, 308
  Brownian, 361
martingale measure, 328, 330, 332, 376
matrix
  covariance, 108, 110, 337, 391
  transition probability, 193, 194, 200, 228
    n-step, 195
maximal inequality, 351
mean, 60, 61, 63, 64, 112
  geometric, 69
  hypergeometric, 72
  Poisson, 67
  uniform, 89
measure
  L²-valued, 395
  change of, 374
  equivalent, 374
  martingale, 328, 330, 332


  probability, 4
  synthetic, 328
measure theory, xiii
median, 63, 64
  Bernoulli, 65
  exponential, 90
  geometric, 69
  uniform, 89
memoryless, 90
method of images, 363
mode, 63
  Bernoulli, 65
  geometric, 69
moment, 61
  absolute, 61
  central, 61
moment generating function, 62
  Bernoulli, 65
  binomial, 66
  Gaussian, 91
  generates moments, 62
  geometric, 68
  Poisson, 66
moments, 85
monotone class, 1, 2
monotone class theorem, 4
monotone convergence theorem, 123
Monte Carlo method, 52
mutation, 198
normal
  bivariate, 107
  multivariate, 108
normal numbers, 137, 138, 140
nowhere-differentiability
  of Brownian motion, 354
null recurrent, 220, 223, 225, 226
odds, 315, 316
option, 317, 330, 376, 377
  European, 376, 377
optional stopping, 282
optional time, 205
orthogonal, 100, 113
partition, 55
Pascal’s triangle, 26, 27
Pascal, Blaise, xiii
past before T, 207
perfect set, 349
period, 203, 215, 221
periodic, 221
permutation, 22, 23
Poisson process, 243
  distribution, 245
portfolio, 323
  self-financing, 323, 329
positive definite, 108, 109
positive recurrent, 220, 223, 225, 226, 394
potential, 378, 381, 382
  of a measure, 378
potential field, 378
potential theory, xvi, 277, 378
power series, 59
principle
  duck, 106
probability
  absorption, 228
  conditional, xiv, 15, 103, 106
  density, 50
  equivalent, 327
  measure, 4, 52
  space, 8, 11, 39
  synthetic, 315
probability generating function, 59
probability integral transformation, 46
probability space, 6
process
  birth and death, xv, 251, 360
  branching, 197, 234, 235, 312
  continuous parameter, 191
  diffusion, 336
  Gaussian, 336, 337, 390, 395
  Markov, 192, 336, 341, 363, 391
  Ornstein-Uhlenbeck, 390, 391, 394, 405, 407
  Poisson, xv, 243, 342
  stable, 336
  stationary, 390, 407
  stochastic, xiv, 191
  velocity, 405, 408
  Wiener, 335
projection, 272
property
  Markov, 192, 194, 200, 208, 229
  memoryless, 90
  strong Markov, 210, 214, 391
    discrete case, 209
Pyncheon, T., 67
quadratic form, 111
quadratic variation, 355
queuing, 197


race track, 316
radius of convergence, 59
random variable, 39, 42, 75
  Bernoulli, 140
  Cauchy, 160
  discrete, 54
  existence, 45
  extended real values, 81
  Gaussian, 91, 391
  independent, 78
  uniform, 136, 140
random walk, xv, 196, 203, 215, 286, 377
  embedded, 378
  null recurrent, 286
  reflecting, 196
  three dimensions, 216
  transient in three dimensions, 217
  two dimensions, 215
recurrence, 211, 215, 352
recurrence class, 213, 226
recurrent, 212, 213, 218, 393
reflection principle, 351
restarting, 194
reversible, 391
Riemann integral, 80
roulette, 204, 282
sample path, 341
  irregularity, 354
sample space, 8
sampling
  with replacement, 24
  without replacement, 24
semimartingale, 277
sequential analysis, 280, 310
short-sell, 279, 318
Skorokhod embedding, 377, 378, 380, 382–384, 386
Skorokhod, A. V., 377
Snell, Laurie, 290
standard deviation, 60
state
  recurrent, 212
  transient, 212
state space, 193
Stirling’s formula, 214–216
stochastic differential equation, 405, 408
stock, 317, 322, 331, 374
stopping time, 204–208, 210, 211, 283, 284, 302, 344, 350
strike price, 377
strong Markov property, 209, 391
submartingale, 276, 277, 284, 287, 288, 293, 294, 298
  backward, 300
success runs, 199
supermartingale, 276, 284
supporting line, 88
symmetric difference, 6
synthetic probabilities, 315, 316
tail field, 347
Taylor’s formula, 360, 392
  remainder, 360
tent function, 379
  Schauder, 388
theorem
  bounded convergence, 126
  Cameron-Martin, 374–376
  central limit, 90, 135, 356, 383
  Dini’s, 356
  dominated convergence, 125, 127, 220, 296
  ergodic, 225
  Fourier inversion, 159
  Fubini, 106
  Glivenko-Cantelli, 136
  Helly-Bray, 169
  Lévy inversion, 159
  martingale convergence, 292, 294
  monotone class, 4, 49
  monotone convergence, 123, 155, 353
  normal number, 138, 140
  Poisson limit, 68
  Radon-Nikodym, 268
  Riesz-Fischer, 394
  Skorokhod, 380
  system, 282, 301
  Weierstrass, 136
tight, 169
tightness, 169
tote board, 315, 328
trading rules, 318
transformation
  scaling, 338
transient, 211–213, 223, 225, 228
transition probability, 193, 341, 359
  n-step, 194
  Poisson process, 245
  stationary, 193

trinomial tree, 332

uniform integrability, 295–297


upcrossing inequality, 291
upcrossings, 291
vague convergence, 168
de la Vallée Poussin, 351
value, 322
variance, 60, 407
  geometric, 69
  hypergeometric, 72
  Poisson, 67
viscosity, 405
volatility, 374, 377
Watson, Henry William, 234
weak convergence, 168
Weierstrass, 335
white noise, 394, 395, 405, 408
Wiener, Norbert, 335
Wright, Sewall, 198
zero set, 348
zero-one law, 307
  Blumenthal, 347
  Borel, 347


GSM/139

For additional information and updates on this book, visit

www.ams.org/bookpages/gsm-139

AMS on the Web
www.ams.org

John Walsh, one of the great masters of the subject, has written a superb book on probability at exactly this level. It covers at a leisurely pace all the important topics that students need to know, and provides excellent examples. I regret his book was not available when I taught such a course myself, a few years ago.

—Ioannis Karatzas, Columbia University

In this wonderful book, John Walsh presents a panoramic view of Probability Theory, starting from basic facts on mean, median and mode, continuing with an excellent account of Markov chains and martingales, and culminating with Brownian motion. Throughout, the author’s personal style is apparent; he manages to combine rigor with an emphasis on the key ideas so the reader never loses sight of the forest by being surrounded by too many trees. As noted in the preface, “To teach a course with pleasure, one should learn at the same time.” Indeed, almost all instructors will learn something new from the book (e.g., the potential-theoretic proof of Skorokhod embedding), and at the same time it is attractive and approachable for students.

—Yuval Peres, Microsoft

With many examples in each section that enhance the presentation, this book is a welcome addition to the collection of books that serve the needs of advanced under-graduate as well as first year graduate students. The pace is leisurely which makes it more attractive as a text.

—Srinivasa Varadhan, Courant Institute, New York

This book covers in a leisurely manner all the standard material that one would want in a full year probability course with a slant towards applications in financial analysis at the graduate or senior undergraduate honors level. It contains a fair amount of measure theory and real analysis built in but it introduces sigma-fields, measure theory, and expectation in an especially elementary and intuitive way. A large variety of examples and exercises in each chapter enrich the presentation in the text.
