Acknowledgement
First of all, I would like to say Alhamdulillah for giving me the strength and health to carry out this project work.
I must not forget my parents, who provided everything needed for this project work: money to buy materials, their advice (which was what I needed most), and access to the Internet, books, and computers. They also supported and encouraged me to complete this task so that I would not procrastinate.
Then I would like to thank my teacher, Madam Zaiton, for guiding me and my friends throughout this project. We had some difficulties with this task, but she taught us patiently until we knew what to do, and she persevered until we understood what we were supposed to do with the project work.
Last but not least, I thank my friends, who did this project alongside me and shared their ideas. They were so helpful that, by combining our work and discussing it together, we got this task done.
Objective
The aims of carrying out this project work are:
to apply and adapt a variety of problem-solving strategies to solve problems;
to improve thinking skills;
to promote effective mathematical communication;
to develop mathematical knowledge through problem solving in a way that increases students' interest and confidence;
to use the language of mathematics to express mathematical ideas precisely;
to provide a learning environment that stimulates and enhances effective learning;
to develop a positive attitude towards mathematics.
PART ONE
Introduction
Probability is a way of expressing knowledge or belief that an event will occur or has occurred. In mathematics the concept has been given an exact meaning in probability theory, which is used extensively in areas of study such as mathematics, statistics, finance, gambling, science, and philosophy to draw conclusions about the likelihood of potential events and the underlying mechanics of complex systems.
Probability has a dual aspect: on the one hand the probability or likelihood of hypotheses given the evidence for them and on the other hand the behavior of stochastic processes such as the throwing of dice or coins. The study of the former is historically older in, for example, the law of evidence, while the mathematical treatment of dice began with the work of Pascal and Fermat in the 1650s.
Probability is distinguished from statistics. While statistics deals with data and inferences from it, (stochastic) probability deals with the stochastic (random) processes which lie behind data or outcomes.
IMPORTANCE OF PROBABILITY IN LIFE
Statistics is based on an understanding of probability theory, and many professions require a basic understanding of statistics; in these cases, probability is clearly important.
Probability theory also goes beyond mathematics: it involves logic and reasoning abilities. Marketing and politics have one thing in common, namely biased statistics. Since we are exposed to so many statistics, a basic understanding of this area allows more critical thinking. The book How to Lie with Statistics is a classic and still in print.
So, while many people would probably say that probability theory has little importance in their lives, if they knew more about it, it might well have more.
History of Probability
Etymology
Probable and likely and their cognates in other modern languages derive from medieval learned Latin probabilis and verisimilis, deriving from Cicero and generally applied to an opinion to mean plausible or generally approved.
Origins
Ancient and medieval law of evidence developed a grading of degrees of proof, probabilities, presumptions and half-proof to deal with the uncertainties of evidence in court. In Renaissance times, betting was discussed in terms of odds such as "ten to one" and maritime insurance premiums were estimated based on intuitive risks, but there was no theory on how to calculate such odds or premiums.
The mathematical methods of probability arose in the correspondence of Pierre de Fermat and Blaise Pascal (1654) on such questions as the fair division of the stake in an interrupted game of chance. Christiaan Huygens (1657) gave a comprehensive treatment of the subject.
18th Century
Jacob Bernoulli's Ars Conjectandi (posthumous, 1713) and Abraham de Moivre's The Doctrine of Chances (1718) put probability on a sound mathematical footing, showing how to calculate a wide range of complex probabilities. Bernoulli proved a version of the fundamental law of large numbers, which states that in a large number of trials, the average of the outcomes is likely to be very close to the expected value - for example, in 1000 throws of a fair coin, it is likely that there are close to 500 heads (and the larger the number of throws, the closer to half-and-half the proportion is likely to be).
19th Century
The power of probabilistic methods in dealing with uncertainty was shown by Gauss's determination of the orbit of Ceres from a few observations. The theory of errors used the method of least squares to correct error-prone observations, especially in astronomy, based on the assumption of a normal distribution of errors to determine the most likely true value.
Towards the end of the nineteenth century, a major success of explanation in terms of probabilities was the statistical mechanics of Ludwig Boltzmann and J. Willard Gibbs, which explained properties of gases such as temperature in terms of the random motions of large numbers of particles.
The field of the history of probability itself was established by Isaac Todhunter's monumental History of the Mathematical Theory of Probability from the Time of Pascal to that of Lagrange (1865).
20th Century
Probability and statistics became closely connected through the work on hypothesis testing of R. A. Fisher and Jerzy Neyman, which is now widely applied in biological and psychological experiments and in clinical trials of drugs. A hypothesis, for example that a drug is usually effective, gives rise to a probability distribution of what would be observed if the hypothesis is true. If observations approximately agree with the hypothesis, it is confirmed; if not, it is rejected.
The theory of stochastic processes broadened into such areas as Markov processes and Brownian motion, the random movement of tiny particles suspended in a fluid. That provided a model for the study of random fluctuations in stock markets, leading to the use of sophisticated probability models in mathematical finance, including such successes as the widely-used Black-Scholes formula for the valuation of options.
The twentieth century also saw long-running disputes on the interpretations of probability. In the mid-century frequentism was dominant, holding that probability means long-run relative frequency in a large number of trials. At the end of the century there was some revival of the Bayesian view, according to which the fundamental notion of probability is how well a proposition is supported by the evidence for it.
The mathematical treatment of probabilities, especially when there are infinitely many possible outcomes, was facilitated by Kolmogorov's axioms (1933).
Applications
Two major applications of probability theory in everyday life are in risk assessment and in trade on commodity markets. Governments typically apply probabilistic methods in environmental regulation, where it is called "pathway analysis", often measuring well-being using methods that are stochastic in nature, and choosing projects to undertake based on statistical analyses of their probable effect on the population as a whole.
A good example is the effect of the perceived probability of any widespread Middle East conflict on oil prices - which have ripple effects in the economy as a whole. An assessment by a commodity trader that a war is more likely vs. less likely sends prices up or down, and signals other traders of that opinion. Accordingly, the probabilities are not assessed independently nor necessarily very rationally. The theory of behavioral finance emerged to describe the effect of such groupthink on pricing, on policy, and on peace and conflict.
It can reasonably be said that the discovery of rigorous methods to assess and combine probability assessments has had a profound effect on modern society. Accordingly, it may be of some importance to most citizens to understand how odds and probability assessments are made, and how they contribute to reputations and to decisions, especially in a democracy.
Another significant application of probability theory in everyday life is reliability. Many consumer products, such as automobiles and consumer electronics, utilize reliability theory in the design of the product in order to reduce the probability of failure. The probability of failure may be closely associated with the product's warranty.
Theoretical Probability
Definition of Theoretical Probability
Probability is the likelihood that an event will happen.
We can find the theoretical probability of an event using the following ratio:
P(event) = (number of favorable outcomes) / (total number of possible outcomes)
Let’s do a couple of examples.
Solved Examples on Theoretical Probability
Example 1
If we toss a fair coin, what is the probability that a tail will show up?
Solution:
Tossing a tail is the favorable outcome here.
When you toss a coin there are only 2 possible outcomes: a Head or a Tail
So the probability of tossing a tail is 1 out of 2, that is, 1/2.
We can also represent this probability as a decimal (0.5) or as a percent (50%).
Example 2
A bag contains 20 marbles. 15 of them are red and 5 of them are blue in color. Find the probability of picking a red marble.
Let’s first answer a few questions here:
If I am going to randomly pick a marble from the bag, what results can I have?
I'll either pick a red marble or a blue one.
My next question is: what are the chances of picking a red marble?
There are 15 red marbles and just 5 blue marbles.
It’s obvious that we have three times as many red marbles as blue marbles.
So the chance of picking a red marble is greater than that of picking a blue one.
Therefore, the probability of picking a red marble is:
P(red) = 15/20 = 3/4
Example 3
Find the probability of getting a sum of 7 when you roll two dice.
Two dice are being rolled. The possible outcomes are as follows:
Let’s use the representation (a, b) for the outcomes where a = number on dice 1 and b = number on dice 2.
(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
(3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
(4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
(5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
(6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)
There are 36 possible outcomes in all.
The question is when you roll two dice, what are the chances of getting a sum of 7?
From the list above identify the pairs with outcomes that add up to 7.
The pairs whose numbers add up to 7 are:
(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)
Observe that these pairs lie along one diagonal of the grid above. There are 6 such pairs.
So, the probability of getting a sum of 7 when we roll two dice is:
P(sum of 7) = 6/36 = 1/6
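As a quick check, this enumeration can be reproduced with a few lines of code. The following Python sketch (the script and its variable names are mine, added for illustration) lists all 36 outcomes and counts the pairs that sum to 7:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes (a, b) of rolling two dice.
outcomes = list(product(range(1, 7), repeat=2))

# Keep only the pairs whose dots sum to 7.
favorable = [pair for pair in outcomes if sum(pair) == 7]

probability = Fraction(len(favorable), len(outcomes))
print(len(favorable), len(outcomes), probability)  # 6 36 1/6
```

Using exact fractions avoids any rounding in the final probability.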
Empirical Probability
Empirical probability, also known as relative frequency or experimental probability, is the ratio of the number of favorable outcomes to the total number of trials,[1][2] not in a sample space but in an actual sequence of experiments. In a more general sense, empirical probability estimates probabilities from experience and observation.[3] The phrase "a posteriori probability" has also been used as an alternative to empirical probability or relative frequency.[4] This unusual usage of the phrase is not directly related to Bayesian inference and should not be confused with its equally occasional use to refer to posterior probability, which is something else.
In statistical terms, the empirical probability is an estimate of a probability. If modeling using a binomial distribution is appropriate, it is the maximum likelihood estimate. It is the Bayesian estimate for the same case if certain assumptions are made for the prior distribution of the probability.
An advantage of estimating probabilities using empirical probabilities is that this procedure is relatively free of assumptions. For example, consider estimating the probability among a population of men that they satisfy two conditions: (i) that they are over 6 feet in height; (ii) that they prefer strawberry jam to raspberry jam. A direct estimate could be found by counting the number of men who satisfy both conditions to give the empirical probability of the combined condition. An alternative estimate could be found by multiplying the proportion of men who are over 6 feet in height by the proportion of men who prefer strawberry jam to raspberry jam, but this estimate relies on the assumption that the two conditions are statistically independent.
A disadvantage in using empirical probabilities arises in estimating probabilities which are either very close to zero or very close to one. In these cases very large sample sizes would be needed in order to estimate such probabilities to a good standard of relative accuracy. Here statistical models can help, depending on the context, and in general one can hope that such models would provide improvements in accuracy compared to empirical probabilities, provided that the assumptions involved actually do hold. For example, consider estimating the probability that the lowest of the daily-maximum temperatures at a site in February in any one year is less than zero degrees Celsius. A record of such temperatures in past years could be used to estimate this probability. A model-based alternative would be to select a family of probability distributions and fit it to the dataset containing past yearly values: the fitted distribution would provide an alternative estimate of the required probability. This alternative method can provide an estimate of the probability even if all values in the record are greater than zero.
Empirical and Theoretical Probability
Empirical probability of an event is an "estimate" that the event will happen based on how often the event occurs after collecting data or running an experiment (in a large number of trials). It is based specifically on direct observations or experiences.
Empirical Probability Formula
P(E) = (number of ways the specific event occurs) / (number of ways the experiment could occur)
where P(E) is the probability that an event, E, will occur.
Example: A survey was conducted to determine students' favorite breeds of dogs. Each student chose only one breed.
Dog    Collie   Spaniel   Lab   Boxer   Pit-bull   Other
#      10       15        35    8       5          12

What is the probability that a student's favorite dog breed is Lab?
Answer: 35 out of the 85 students chose Lab. The probability is P(Lab) = 35/85 = 7/17.
Theoretical Probability of an event is the number of ways that the event can occur, divided by the total number of outcomes. It is finding the probability of events that come from a sample space of known equally likely outcomes.
Theoretical Probability Formula
P(E) = n(E) / n(S)
where P(E) = probability that an event, E, will occur; n(E) = number of equally likely outcomes of E; n(S) = number of equally likely outcomes of sample space S.
Example 1: Find the probability of rolling a six on a fair die.
Answer: The sample space for rolling a die has 6 equally likely results: {1, 2, 3, 4, 5, 6}.
The probability of rolling a 6 is one out of 6, or 1/6.
Example 2: Find the probability of tossing a fair die and getting an odd number.
Answer:
event E: tossing an odd number
outcomes in E: {1, 3, 5}
sample space S: {1, 2, 3, 4, 5, 6}
P(E) = n(E)/n(S) = 3/6 = 1/2
Comparing Empirical and Theoretical Probabilities
Karen and Jason roll two dice 50 times and record their results in the accompanying chart.
1) What is their empirical probability of rolling a 7?
2) What is the theoretical probability of rolling a 7?
3) How do the empirical and theoretical probabilities compare?
Sum of the rolls of two dice
3, 5, 5, 4, 6, 7, 7, 5, 9, 10, 12, 9, 6, 5, 7, 8, 7, 4, 11, 6,
8, 8, 10, 6, 7, 4, 4, 5, 7, 9, 9, 7, 8, 11, 6, 5, 4, 7, 7, 4,
3, 6, 7, 7, 7, 8, 6, 7, 8, 9
Solution:
1) The empirical probability (experimental or observed probability) is 13/50 = 26%.
2) The theoretical probability (based upon what is possible when working with two dice) is 6/36 = 1/6 ≈ 16.7%.
3) Karen and Jason rolled more 7's than would be expected theoretically.
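The empirical count above can be verified directly from the recorded data. The following Python sketch (mine, added as a check; the list is transcribed from the chart) counts the 7's among the 50 sums:

```python
from fractions import Fraction

# The 50 recorded sums, transcribed from Karen and Jason's chart.
rolls = [3, 5, 5, 4, 6, 7, 7, 5, 9, 10, 12, 9, 6, 5, 7, 8, 7, 4, 11, 6,
         8, 8, 10, 6, 7, 4, 4, 5, 7, 9, 9, 7, 8, 11, 6, 5, 4, 7, 7, 4,
         3, 6, 7, 7, 7, 8, 6, 7, 8, 9]

empirical = Fraction(rolls.count(7), len(rolls))  # observed relative frequency
theoretical = Fraction(6, 36)                     # 6 favorable pairs out of 36

print(empirical)    # 13/50
print(theoretical)  # 1/6
```

The empirical value 13/50 = 26% is noticeably higher than the theoretical 1/6 ≈ 16.7%, as the solution states.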
PART TWO
a) My friends and I are playing Monopoly. At the beginning, each of us will toss a die once. The player who obtains the highest number will start the game. The possible outcomes are:
{1, 2, 3, 4, 5, 6}
b) Instead of one die, two dice can also be tossed simultaneously by each player. The player will move the token according to the sum of all dots on both turned-up faces. For example, if the two dice are tossed simultaneously and "2" appears on one die and "3" appears on the other, the outcome of the toss is (2, 3). Hence, the player shall move the token 5 spaces. The possible outcomes when two dice are tossed simultaneously are:
{(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6), (2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6), (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6), (4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6), (5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6), (6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}
PART THREE
Table 1 shows the sum of all dots on both turned-up faces when two dice are tossed simultaneously.
Sum of dots on both turned-up faces (x)   Possible outcomes                          Probability, P(x)
2     (1,1)                                                                          1/36
3     (1,2),(2,1)                                                                    2/36
4     (1,3),(2,2),(3,1)                                                              3/36
5     (1,4),(2,3),(3,2),(4,1)                                                        4/36
6     (1,5),(2,4),(3,3),(4,2),(5,1)                                                  5/36
7     (1,6),(2,5),(3,4),(4,3),(5,2),(6,1)                                            6/36
8     (2,6),(3,5),(4,4),(5,3),(6,2)                                                  5/36
9     (3,6),(4,5),(5,4),(6,3)                                                        4/36
10    (4,6),(5,5),(6,4)                                                              3/36
11    (5,6),(6,5)                                                                    2/36
12    (6,6)                                                                          1/36
TABLE 1
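Table 1 can be regenerated mechanically. The following Python sketch (mine, added as a check) counts how many of the 36 outcomes produce each sum:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Count how many of the 36 outcomes give each sum x (Table 1's middle column).
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

# Convert counts to probabilities P(x) = count/36 (Table 1's last column).
P = {x: Fraction(n, 36) for x, n in sorted(counts.items())}
for x, p in P.items():
    print(x, p)
```

The eleven probabilities sum to 1, as any complete probability distribution must.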
A = {the two numbers are not the same}
= {(1,2),(2,1),(1,3),(3,1),(1,4),(2,3),(3,2),(4,1),(1,5),(2,4),(4,2),(5,1),(1,6),(2,5),(3,4),(4,3),(5,2),(6,1),(2,6),(3,5),(5,3),(6,2),(3,6),(4,5),(5,4),(6,3),(4,6),(6,4),(5,6),(6,5)}
P(A) = n(A)/n(S) = 30/36 = 5/6
B = {the product of the two numbers is greater than 36}
= ∅ (the largest possible product is 6 × 6 = 36, so no outcome qualifies)
P(B) = n(B)/n(S) = 0/36 = 0
C = {both numbers are prime or the difference between the two numbers is odd}
= {(2,2),(2,3),(2,5),(3,2),(3,3),(3,5),(5,2),(5,3),(5,5)} ∪
{(1,2),(2,1),(1,4),(4,1),(1,6),(6,1),(2,3),(3,2),(2,5),(5,2),(3,4),(4,3),(3,6),(6,3),(4,5),(5,4),(5,6),(6,5)}
The first set (both prime) has 9 outcomes and the second (odd difference) has 18, but they share the 4 outcomes (2,3),(3,2),(2,5),(5,2), so
P(C) = 9/36 + 18/36 − 4/36 = 23/36
D = {the sum of the two numbers is even and both numbers are prime}
= {(1,1),(1,3),(1,5),(2,2),(2,4),(2,6),(3,1),(3,3),(3,5),(4,2),(4,4),(4,6),(5,1),(5,3),(5,5),(6,2),(6,4),(6,6)} ∩
{(2,2),(2,3),(2,5),(3,2),(3,3),(3,5),(5,2),(5,3),(5,5)}
= {(2,2),(3,3),(3,5),(5,3),(5,5)}
P(D) = n(D)/n(S) = 5/36
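All four event probabilities can be double-checked by brute-force enumeration of the 36-outcome sample space. This Python sketch (mine, added as a check; the set names mirror the events above) does exactly that:

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))  # sample space, 36 outcomes
primes = {2, 3, 5}

A = [(a, b) for a, b in S if a != b]       # the two numbers are not the same
B = [(a, b) for a, b in S if a * b > 36]   # product greater than 36 (impossible)
C = [(a, b) for a, b in S
     if (a in primes and b in primes) or (a - b) % 2 != 0]  # both prime OR odd difference
D = [(a, b) for a, b in S
     if (a + b) % 2 == 0 and a in primes and b in primes]   # even sum AND both prime

for name, E in (("A", A), ("B", B), ("C", C), ("D", D)):
    print(name, Fraction(len(E), len(S)))  # A 5/6, B 0, C 23/36, D 5/36
```

For C, note that enumeration automatically handles the overlap between the two conditions, so no inclusion-exclusion step is needed.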
PART FOUR
a) An activity was conducted by tossing two dice simultaneously 50 times. The sums of all dots on both turned-up faces were observed. The frequency table is completed below.
Sum of the two numbers (x)   Frequency (f)
2     3
3     8
4     6
5     7
6     9
7     3
8     2
9     4
10    4
11    3
12    1
TABLE 2
- Mean = Σfx / N = 301/50 = 6.02
- Variance = Σfx²/N − x̄² = 2185/50 − 6.02² = 7.4596
- Standard deviation = √(Σfx²/N − x̄²) = √7.4596 = 2.731226831
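These three statistics can be recomputed directly from the frequency table. The following Python sketch (mine, added as a check) applies the same formulae:

```python
import math

# Table 2 from the 50-toss experiment: sum x -> frequency f.
freq = {2: 3, 3: 8, 4: 6, 5: 7, 6: 9, 7: 3, 8: 2, 9: 4, 10: 4, 11: 3, 12: 1}

N = sum(freq.values())                               # total number of tosses
mean = sum(x * f for x, f in freq.items()) / N       # mean = sum(f*x) / N
variance = sum(x * x * f for x, f in freq.items()) / N - mean ** 2
std_dev = math.sqrt(variance)

print(N, mean)                                 # 50 6.02
print(round(variance, 4), round(std_dev, 4))   # 7.4596 2.7312
```

The results agree with the hand calculation above.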
b) If the number of tosses is increased to 100, the mean is predicted to move closer to the theoretical mean of 7.
c) The prediction in (b) is tested by continuing the activity in (a) until the total number of tosses reaches 100. The values of the (i) mean, (ii) variance, and (iii) standard deviation of the new data are calculated below.
x     f     fx     fx²
2     6     12     24
3     9     27     81
4     11    44     176
5     12    60     300
6     13    78     468
7     10    70     490
8     7     56     448
9     12    108    972
10    7     70     700
11    6     66     726
12    7     84     1008

Σf = 100    Σfx = 675    Σfx² = 5393
- Mean = Σfx / N = 675/100 = 6.75
- Variance = Σfx²/N − x̄² = 5393/100 − 6.75² = 8.3675
- Standard deviation = √8.3675 = 2.892663133
The prediction is confirmed: the mean moved from 6.02 to 6.75, closer to the theoretical value of 7.
PART FIVE
When two dice are tossed simultaneously, the actual mean and variance of the sum of all dots on the turned-up faces can be determined by using the formulae below:

Mean = Σ x·P(x)
Variance = Σ x²·P(x) − (mean)²
Standard deviation = √Variance
a) Based on Table 1, the actual mean, the variance and the standard deviation of the sum of all dots on the turned-up faces are determined by using the formulae given.
x     x²      P(x)    x·P(x)    x²·P(x)
2     4       1/36    1/18      1/9
3     9       2/36    1/6       1/2
4     16      3/36    1/3       4/3
5     25      4/36    5/9       25/9
6     36      5/36    5/6       5
7     49      6/36    7/6       49/6
8     64      5/36    10/9      80/9
9     81      4/36    1         9
10    100     3/36    5/6       25/3
11    121     2/36    11/18     121/18
12    144     1/36    1/3       4

Mean = Σ x·P(x) = 7
Variance = Σ x²·P(x) − (mean)²
= 329/6 − 7²
= 35/6 = 5.8333
Standard deviation = √(35/6) = 2.4152
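The theoretical mean and variance follow directly from the exact distribution, so they can be verified by enumeration. This Python sketch (mine, added as a check) computes E[X] and E[X²] − (E[X])² over all 36 outcomes:

```python
import math
from fractions import Fraction
from itertools import product

# Sum of dots for each of the 36 equally likely outcomes.
sums = [a + b for a, b in product(range(1, 7), repeat=2)]

mean = Fraction(sum(sums), len(sums))                                  # E[X]
variance = Fraction(sum(x * x for x in sums), len(sums)) - mean ** 2   # E[X^2] - (E[X])^2

print(mean)                           # 7
print(variance)                       # 35/6
print(round(math.sqrt(variance), 4))  # 2.4152
```

Working with exact fractions confirms the variance 35/6 without any rounding error.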
b) The table below compares the mean, variance and standard deviation from Part 4 and Part 5.

                        PART 4              PART 5
                        n=50      n=100     (theoretical)
Mean                    6.02      6.75      7.00
Variance                7.4596    8.3675    5.8333
Standard deviation      2.7312    2.8926    2.4152
We can see that the mean, variance and standard deviation obtained through the experiment in Part 4 differ from, but are reasonably close to, the theoretical values in Part 5.
For the mean, when the number of trials increased from n = 50 to n = 100, its value moved closer to the theoretical value (from 6.02 to 6.75). This is in accordance with the Law of Large Numbers, which we discuss in the next section.
Nevertheless, the empirical variance and standard deviation obtained in Part 4 moved further from the theoretical values in Part 5. This is not what the Law of Large Numbers would lead us to expect, probably because:
a. The sample (n = 100) is not large enough for the values of the mean, variance and standard deviation to settle down.
b. The Law of Large Numbers describes a long-run tendency, not a guarantee for any particular finite sample; such deviations are still possible, though their probability becomes small as the sample grows.
In conclusion, the empirical mean, variance and standard deviation can differ from the theoretical values. As the number of trials (the sample size) gets bigger, the empirical values should get closer to the theoretical ones. However, deviations are still possible, especially when the number of trials is not large enough.
c) The range of the mean
6 ≤ mean ≤ 7
Conjecture: as the number of tosses, n, increases, the mean gets closer to 7, the theoretical mean.
The image below supports this conjecture: we can see that after about 500 tosses the empirical mean becomes very close to the theoretical mean, which is 3.5. (Note that this is an experiment of tossing one die, not two dice as in our experiment, so its theoretical mean is 3.5 rather than 7.)
FURTHER EXPLORATION
Law of Large Numbers
In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times. According to the law, the average of the results obtained from a large number of trials should be close to the expected value, and will tend to become closer as more trials are performed.
For example, a single roll of a six-sided die produces one of the numbers 1, 2, 3, 4, 5, 6, each with equal probability. Therefore, the expected value of a single die roll is
(1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5
According to the law of large numbers, if a large number of dice are rolled, the average of their values (sometimes called the sample mean) is likely to be close to 3.5, with the accuracy increasing as more dice are rolled.
Similarly, when a fair coin is flipped once, the expected value of the number of heads is equal to one half. Therefore, according to the law of large numbers, the proportion of heads in a large number of coin flips should be roughly one half. In particular, the proportion of heads after n flips will almost surely converge to one half as n approaches infinity.
Though the proportion of heads (and tails) approaches one half, almost surely the absolute (nominal) difference in the number of heads and tails will become large as the number of flips becomes large. That is, the probability that the absolute difference is a small number approaches zero as the number of flips becomes large. Also, almost surely the ratio of the absolute difference to the number of flips will approach zero. Intuitively, the expected absolute difference grows, but at a slower rate than the number of flips.
The LLN is important because it "guarantees" stable long-term results for random events. For example, while a casino may lose money in a single spin of the roulette wheel, its earnings will tend towards a predictable percentage over a large number of spins. Any winning streak by a player will eventually be overcome by the parameters of the game. It is important to remember that the LLN only applies (as the name indicates) when a large number of observations are considered. There is no principle that a small number of observations will converge to the expected value or that a streak of one value will immediately be "balanced" by the others. See the Gambler's fallacy.
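The LLN is easy to watch in action with a short simulation. This Python sketch (mine, added for illustration; the seed value is arbitrary) prints the running average of single-die rolls for increasing sample sizes, which should drift toward the expected value 3.5:

```python
import random

random.seed(2024)  # fixed seed so the sketch is reproducible

# Average of n single-die rolls: it should approach the expected
# value 3.5 as the number of rolls n grows.
for n in (100, 10_000, 1_000_000):
    average = sum(random.randint(1, 6) for _ in range(n)) / n
    print(n, average)
```

With small n the average can wander noticeably; by a million rolls it is typically within a few thousandths of 3.5, which is exactly the behavior the LLN describes.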
REFLECTION
TEAM WORK IS IMPORTANT
BE HELPFUL
ALWAYS READY TO LEARN NEW THINGS
BE A HARDWORKING STUDENT
BE PATIENT
ALWAYS CONFIDENT
CONCLUSION
In conclusion, I now know:
the history of probability from the 18th century to the 20th century;
how to apply the theory of probability in daily life, and its importance;
the two categories of probability, empirical and theoretical, and their differences;
how to conduct a dice-tossing activity to find a probability;
how to calculate probabilities;
how to calculate the mean, variance and standard deviation using the formulae:

Mean: x̄ = Σfx / N, or x̄ = Σ x·P(x)
Variance: σ² = Σfx²/N − x̄², or σ² = Σ x²·P(x) − x̄²
Standard deviation: σ = √(Σfx²/N − x̄²), or σ = √(Σ x²·P(x) − x̄²)

about the Law of Large Numbers (LLN) and its role in this project work;
and the moral values gained from this project work.