Diffstat (limited to 'sem6/prob/m3/noter.md')
-rw-r--r--  sem6/prob/m3/noter.md  40
1 file changed, 20 insertions(+), 20 deletions(-)
diff --git a/sem6/prob/m3/noter.md b/sem6/prob/m3/noter.md
index 23990ef..c784a6a 100644
--- a/sem6/prob/m3/noter.md
+++ b/sem6/prob/m3/noter.md
@@ -19,7 +19,7 @@ $$
E[X] = \int_{-\infty}^{\infty} x f(x) \mathrm{dx}
$$
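As a quick worked example (the uniform distribution is a choice for illustration, not from the notes): take $X \sim \mathrm{Unif}(0, 1)$, so $f(x) = 1$ on $[0, 1]$:
$$
E[X] = \int_{0}^{1} x \cdot 1 \, \mathrm{dx} = \frac{1}{2}
$$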
-Can also calculate expectation distribution function:
+Can also calculate the expectation from the distribution function; however, this only works if $X$ takes non-negative integer values:
$$
E[X] = \sum_{k=0}^{\infty} P(X > k)
$$
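A sketch of why the tail-sum formula holds, assuming $X$ takes values in $\{0, 1, 2, \dots\}$; each term $P(X = j)$ is counted once for every $k < j$, i.e. $j$ times:
\begin{align*}
  \sum_{k=0}^{\infty} P(X > k) &= \sum_{k=0}^{\infty} \sum_{j=k+1}^{\infty} P(X = j) \\
  &= \sum_{j=1}^{\infty} j \cdot P(X = j) = E[X]
\end{align*}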
@@ -55,7 +55,7 @@ $$
E[Z] = \sum_{i} \sum_{j} g(x_i, y_j) \cdot p(x_i, y_j)
$$
-If discrete just use integrals instead.
+If continuous, just use integrals instead.
$$
E[Z] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \cdot f(x, y) \, \mathrm{dx} \, \mathrm{dy}
$$
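As an illustrative example (the distributions are assumed here): take independent $X, Y \sim \mathrm{Unif}(0, 1)$ and $g(x, y) = xy$, so $f(x, y) = 1$ on the unit square:
$$
E[XY] = \int_{0}^{1} \int_{0}^{1} x y \, \mathrm{dx} \, \mathrm{dy} = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}
$$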
@@ -69,27 +69,27 @@ $$
If $X$ and $Y$ are **independent**, the following is true:
-$$
- E[g_1(X) \cdot g_2(Y)] = E[g_1(X)] \cdot E[g_2(Y)] \\
- E[X \cdot Y] = E[X] \cdot E[Y]
-$$
+\begin{align*}
+ E[g_1(X) \cdot g_2(Y)] &= E[g_1(X)] \cdot E[g_2(Y)] \\
+ E[X \cdot Y] &= E[X] \cdot E[Y]
+\end{align*}
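For example (dice chosen for illustration), with two independent fair dice $X$ and $Y$:
$$
E[X \cdot Y] = E[X] \cdot E[Y] = 3.5 \cdot 3.5 = 12.25
$$
Independence matters here: with $Y = X$ instead, $E[X \cdot Y] = E[X^2] = \frac{91}{6} \approx 15.17$.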
## Variance
Describes the mean of the squared distance between outcomes and the overall mean.
Good way to describe the spread of the random variable.
-$$
-Var(X) = E[(X - E[X])^2] \\
-Var(X) = E[X^2] - E[X]^2
-$$
+\begin{align*}
+ Var(X) &= E[(X - E[X])^2] \\
+ Var(X) &= E[X^2] - E[X]^2
+\end{align*}
Without the power of two it would be $E[X - E[X]] = E[X] - E[X] = 0$, which won't work as a measure of spread.
One can define the *standard deviation* to bring the unit back from its squared form.
$$
- Std(X) = \sqrt{ (Var(X)) }
+ Std(X) = \sqrt{ Var(X) }
$$
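A worked example, assuming $X \sim \mathrm{Bernoulli}(p)$: since $X$ only takes the values $0$ and $1$, we have $X^2 = X$ and thus $E[X^2] = E[X] = p$:
\begin{align*}
  Var(X) &= E[X^2] - E[X]^2 = p - p^2 = p(1 - p) \\
  Std(X) &= \sqrt{p(1 - p)}
\end{align*}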
A rule for variance:
@@ -110,21 +110,21 @@ If X and Y are independent the Cov part disappears.
## Covariance
-$$
- Cov(X,Y) = E[(X - E[X]) \cdot (Y - E[Y])] \\
- Cov(X,Y) = E[XY] - E[X] \cdot E[Y]
-$$
+\begin{align*}
+ Cov(X,Y) &= E[(X - E[X]) \cdot (Y - E[Y])] \\
+ Cov(X,Y) &= E[XY] - E[X] \cdot E[Y]
+\end{align*}
Shows whether two variables vary together; it can be both positive and negative.
If it is positive, $X$ and $Y$ tend to vary from their averages together.
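A small example (the joint pmf is assumed for illustration): let $X, Y \in \{0, 1\}$ with $p(1, 1) = p(0, 0) = 0.4$ and $p(1, 0) = p(0, 1) = 0.1$. Then $E[X] = E[Y] = 0.5$ and $E[XY] = 0.4$, so
$$
Cov(X, Y) = 0.4 - 0.5 \cdot 0.5 = 0.15 > 0
$$
matching the intuition that $X$ and $Y$ tend to be high (or low) together.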
Some rules below:
-$$
- Cov(X, X) = Var(X) \\
- Cov(a X, Y) = a Cov(X, Y) \\
- Cov(X + Y, Z) = Voc(X, Z) + Cov(Y, Z)
-$$
+\begin{align*}
+ Cov(X, X) &= Var(X) \\
+ Cov(a X, Y) &= a Cov(X, Y) \\
+ Cov(X + Y, Z) &= Cov(X, Z) + Cov(Y, Z)
+\end{align*}
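These rules give a short derivation of the variance rule mentioned above, using that $Cov$ is symmetric and linear in each argument:
\begin{align*}
  Var(X + Y) &= Cov(X + Y, X + Y) \\
  &= Cov(X, X) + 2 Cov(X, Y) + Cov(Y, Y) \\
  &= Var(X) + Var(Y) + 2 Cov(X, Y)
\end{align*}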
If $X$ and $Y$ are independent, then the covariance is zero ($X$ and $Y$ are *uncorrelated*).
However, $X$ and $Y$ can be uncorrelated without being independent.
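A classic example of this, with the distribution chosen for illustration: let $X$ be uniform on $\{-1, 0, 1\}$ and $Y = X^2$. Then $E[X] = 0$ and $E[XY] = E[X^3] = E[X] = 0$, so
$$
Cov(X, Y) = E[XY] - E[X] \cdot E[Y] = 0
$$
yet $Y$ is a function of $X$, so they are clearly not independent.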