From cfcd3eefcbc547a8b69d9f5830b7a27c0bdb60ce Mon Sep 17 00:00:00 2001
From: Julian T
Date: Wed, 17 Feb 2021 12:35:27 +0100
Subject: Changes to prob

---
 sem6/prob/m2/opgaver.md |  20 +++++++
 sem6/prob/m3/noter.md   | 141 ++++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 161 insertions(+)
 create mode 100644 sem6/prob/m3/noter.md

diff --git a/sem6/prob/m2/opgaver.md b/sem6/prob/m2/opgaver.md
index a5d0cdb..0ce9c77 100644
--- a/sem6/prob/m2/opgaver.md
+++ b/sem6/prob/m2/opgaver.md
@@ -50,4 +50,24 @@ $$
 
 ## Opgave 3
 
+First we have to find $\lambda$.
+
+$$
+    \int_{0}^{\infty} \lambda e^{- \frac x {100}} \mathrm{dx} = 1 \\
+    \left[ - \lambda \cdot 100 \cdot e^{- \frac x {100}} \right]_{0}^{\infty} = 1 \\
+    \lambda \cdot 100 = 1 \\
+    \lambda = \frac 1 {100}
+$$
+
+Now we can insert the limits 50 and 150.
+
+$$
+    P(50 < x \leq 150) = \int_{50}^{150} f(x) \mathrm{dx} = - e^{- \frac {150} {100}} + e^{- \frac {50} {100}} = 0.3834
+$$
+
+Then we can take the interval from 0 to 100.
+
+$$
+    P(x < 100) = \int_{0}^{100} f(x) \mathrm{dx} = - e^{- \frac {100} {100}} + e^{0} = 1 - \frac 1 e \approx 0.6321
+$$

diff --git a/sem6/prob/m3/noter.md b/sem6/prob/m3/noter.md
new file mode 100644
index 0000000..23990ef
--- /dev/null
+++ b/sem6/prob/m3/noter.md
@@ -0,0 +1,141 @@
+# Notes for third probability lecture
+
+The *moment generating function* is a third way to describe the behavior of a random variable.
+
+We can also find *key performance indicators* of a random variable,
+which are explained in this document.
+
+## Expectation
+
+The expectation is simply the mean value.
+
+$$
+    E[X] = \sum_{i} x_i P(X = x_i) = \sum_{i} x_i p(x_i)
+$$
+
+In the continuous case an integral is used instead.
+
+$$
+    E[X] = \int_{-\infty}^{\infty} x f(x) \mathrm{dx}
+$$
+
+For a non-negative random variable the expectation can also be calculated from the distribution function:
+
+$$
+    E[X] = \sum_{k=0}^{\infty} P(X > k) \\
+    E[X] = \int_{0}^{\infty} (1 - F(x)) \mathrm{dx}
+$$
+
+Tatiana's recommendation is to calculate the expectation from the PDF.
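The numbers in Opgave 3, and the rule that the expectation can be read off the CDF as $\int_0^{\infty}(1 - F(x))\,dx$, can be sanity-checked numerically. A minimal Python sketch (the CDF helper `F` is an assumed name, not from the notes):

```python
import math

lam = 1 / 100  # rate found from the normalization condition above

def F(x: float) -> float:
    """CDF of the exponential distribution: F(x) = 1 - e^(-x/100)."""
    return 1.0 - math.exp(-lam * x)

# P(50 < X <= 150) = F(150) - F(50)
p_mid = F(150) - F(50)
print(round(p_mid, 4))   # 0.3834

# P(X < 100) = F(100) = 1 - 1/e
p_low = F(100)
print(round(p_low, 4))   # 0.6321

# E[X] = integral_0^inf (1 - F(x)) dx, here a truncated Riemann sum
dx = 0.01
expectation = sum((1.0 - F(k * dx)) * dx for k in range(200_000))
print(round(expectation, 1))  # 100.0
```

The last value matches the known mean $1/\lambda = 100$ of the exponential distribution.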
+
+### LOTUS
+
+The *law of the unconscious statistician* (LOTUS) gives the expectation of a function $g(X)$ directly from the distribution of $X$:
+
+$$
+    E[g(X)] = \sum_{i} g(x_i) p(x_i) \\
+    E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x) \mathrm{dx}
+$$
+
+### Some properties
+
+$$
+E[a X + b] = a E[X] + b
+$$
+
+The mean of a constant is the constant itself:
+
+$$
+E[b] = b
+$$
+
+### Multiple variables
+
+If $Z = g(X,Y)$ one can find the expectation with:
+
+$$
+    E[Z] = \sum_{i} \sum_{j} g(x_i, y_j) \cdot p(x_i, y_j)
+$$
+
+In the continuous case one uses integrals instead.
+
+$$
+    E[Z] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \cdot f(x, y) \mathrm{dx} \mathrm{dy}
+$$
+
+The following rule can be used:
+
+$$
+E[X+Y] = E[X] + E[Y]
+$$
+
+If $X$ and $Y$ are **independent**, the following holds:
+
+$$
+    E[g_1(X) \cdot g_2(Y)] = E[g_1(X)] \cdot E[g_2(Y)] \\
+    E[X \cdot Y] = E[X] \cdot E[Y]
+$$
+
+## Variance
+
+The variance is the mean of the squared distance between the outcomes and the overall mean,
+and is a good way to describe the spread of the random variable.
+
+$$
+Var(X) = E[(X - E[X])^2] \\
+Var(X) = E[X^2] - E[X]^2
+$$
+
+Without the square the expression reduces to $E[X - E[X]] = E[X] - E[X] = 0$, which carries no information.
+
+One can define the *standard deviation* to bring the unit back from its squared form.
+
+$$
+    Std(X) = \sqrt{Var(X)}
+$$
+
+A rule for variance:
+
+$$
+Var(a X + b) = a^2 Var(X)
+$$
+
+The variance of a constant is therefore $0$.
+
+### Summing
+
+$$
+Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
+$$
+
+If $X$ and $Y$ are independent, the covariance term disappears.
+
+## Covariance
+
+$$
+    Cov(X,Y) = E[(X - E[X]) \cdot (Y - E[Y])] \\
+    Cov(X,Y) = E[XY] - E[X] \cdot E[Y]
+$$
+
+The covariance shows whether two variables vary together; it can be both positive and negative.
+If it is positive, $X$ and $Y$ tend to deviate from their averages in the same direction.
+
+Some rules below:
+
+$$
+    Cov(X, X) = Var(X) \\
+    Cov(a X, Y) = a Cov(X, Y) \\
+    Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)
+$$
+
+If $X$ and $Y$ are independent, then the covariance is zero ($X$ and $Y$ are *uncorrelated*).
+Note that $X$ and $Y$ can be uncorrelated without being independent.
+
+## Correlation coefficient
+
+Covariances are hard to compare, as their value depends on the scale of the $X$ and $Y$ values.
+We can therefore use the correlation coefficient instead.
+
+$$
+    Corr(X,Y) = \frac {Cov(X,Y)} {\sqrt{Var(X)} \cdot \sqrt{Var(Y)}}
+$$
+
+ vim: spell spelllang=da,en
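The variance, covariance, and correlation rules in noter.md can be checked on sample data. A minimal Python sketch (the sample data and helper names are invented for illustration):

```python
import math
import random

random.seed(0)

def mean(v):
    return sum(v) / len(v)

def var(v):
    # Var(X) = E[(X - E[X])^2]
    m = mean(v)
    return mean([(t - m) ** 2 for t in v])

def cov(u, v):
    # Cov(X, Y) = E[(X - E[X]) (Y - E[Y])]
    mu, mv = mean(u), mean(v)
    return mean([(a - mu) * (b - mv) for a, b in zip(u, v)])

def corr(u, v):
    return cov(u, v) / (math.sqrt(var(u)) * math.sqrt(var(v)))

# Made-up sample: Y depends on X, so the two covary positively
xs = [random.gauss(0, 1) for _ in range(50_000)]
ys = [2 * x + random.gauss(0, 1) for x in xs]

# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) holds as an algebraic identity
lhs = var([x + y for x, y in zip(xs, ys)])
rhs = var(xs) + var(ys) + 2 * cov(xs, ys)
print(math.isclose(lhs, rhs))  # True

# Cov(X, X) = Var(X)
print(math.isclose(cov(xs, xs), var(xs)))  # True

# Covariance depends on the scale of the data, correlation does not:
xs_scaled = [1000 * x for x in xs]
print(round(cov(xs_scaled, ys) / cov(xs, ys)))          # 1000
print(math.isclose(corr(xs_scaled, ys), corr(xs, ys)))  # True
```

Because $Corr(X,Y)$ always lies in $[-1, 1]$, it can be compared across different pairs of variables, which the raw covariance cannot.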