Expectation

Expectation according to the joint distribution equals expectation according to the single (marginal) distribution

For discrete random variables X and Y, the marginal distribution of X is obtained from the joint distribution:

P(X = x) = \sum_y P(X = x, Y = y)

and the expectation of X according to its own (marginal) distribution is:

E[X] = \sum_x x · P(X = x)

Hence:

E[X] = \sum_x x · \sum_y P(X = x, Y = y) = \sum_x \sum_y x · P(X = x, Y = y)

Also:

E[Y] = \sum_x \sum_y y · P(X = x, Y = y)
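As a quick numeric check (a made-up joint distribution, not from the original notes): take P(X=0, Y=0) = 0.1, P(X=0, Y=1) = 0.3, P(X=1, Y=0) = 0.2, P(X=1, Y=1) = 0.4.

P(X = 1) = 0.2 + 0.4 = 0.6, so E[X] = 0 · 0.4 + 1 · 0.6 = 0.6

\sum_x \sum_y x · P(X = x, Y = y) = 0 · 0.1 + 0 · 0.3 + 1 · 0.2 + 1 · 0.4 = 0.6

Both ways give the same value, as claimed.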

linearity

E[X + Y] = \sum_x \sum_y (x + y) · P(X = x, Y = y)

= \sum_x \sum_y x · P(X = x, Y = y) + \sum_x \sum_y y · P(X = x, Y = y)

= E[X] + E[Y]

hence:

E[X + Y] = E[X] + E[Y]

Similarly, for a constant a:

E[aX] = \sum_x a · x · P(X = x) = a · \sum_x x · P(X = x)

hence:

E[aX] = a · E[X]
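A small worked check of linearity (a hypothetical example, not part of the original notes): let X be a fair die roll and Y an independent fair coin taking the values 0 and 1.

E[X] = (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5,  E[Y] = 0.5

E[2X + Y] = 2 E[X] + E[Y] = 7 + 0.5 = 7.5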

Variance & Standard deviation

definitions

Var(X) = E[(X - E[X])^2]

SD(X) = \sqrt{Var(X)}
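A worked example of the definitions (this anticipates the coin-flip example further below): let X take the values +5 and -5, each with probability 1/2.

E[X] = 0

Var(X) = E[(X - 0)^2] = (1/2) · 25 + (1/2) · 25 = 25

SD(X) = \sqrt{25} = 5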

The meaning of standard deviation

One way to look at the standard deviation is as an approximation of the "expected drift" from the expectation. The "expected drift" could be defined as:

E[ |X - E[X]| ]

This value is probably not easy to manipulate.

Suppose that X can have only the two values a and -a, and that P(X = a) = P(X = -a) = 1/2. Then:

E[X] = 0

and

E[ |X - E[X]| ] = (1/2) · |a| + (1/2) · |-a| = |a|

and

SD(X) = \sqrt{E[(X - E[X])^2]} = \sqrt{a^2} = |a|

The drift |X - E[X]|, the variance and the standard deviation do not change when a constant is added to X, so any random variable X whose drifts all have the same absolute value |a| has SD(X) = E[ |X - E[X]| ] = |a|.

Whenever the drift values are not all the same, SD(X) averages them with bigger weights on the bigger values (because of the squaring), while E[ |X - E[X]| ] keeps a plain, equally weighted average, so SD(X) >= E[ |X - E[X]| ]. This follows from Var(|X - E[X]|) >= 0, which gives E[(X - E[X])^2] >= E[ |X - E[X]| ]^2.

Example: suppose you are performing the following experiment: you flip a coin; if it's heads you go 5 meters to the left, if it's tails you go 5 meters to the right. The variance in this case is 25 and the standard deviation is 5. The expected drift is also 5 (all the drift values are equal). There is more on this example below.
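A minimal C++ simulation sketch of this experiment (not part of the original notes; the seed and the sample size are arbitrary choices). It estimates the mean, the variance, the standard deviation and the expected drift of a single ±5 step:

#include <cmath>
#include <iostream>
#include <random>

int main() {
  // Hypothetical Monte Carlo check of the +-5 coin-flip step.
  std::mt19937 gen(42);                  // arbitrary seed
  std::bernoulli_distribution coin(0.5);
  const int n = 1000000;                 // arbitrary sample size
  double sum = 0.0, sum_sq = 0.0, sum_abs = 0.0;
  for (int i = 0; i < n; ++i) {
    double x = coin(gen) ? 5.0 : -5.0;
    sum += x;
    sum_sq += x * x;
    sum_abs += std::fabs(x);             // drift |x - E[X]| with E[X] = 0
  }
  double mean = sum / n;
  double var = sum_sq / n - mean * mean; // E[X^2] - E[X]^2
  std::cout << "mean ~ " << mean << "\n";                   // ~ 0
  std::cout << "variance ~ " << var << "\n";                // ~ 25
  std::cout << "sd ~ " << std::sqrt(var) << "\n";           // ~ 5
  std::cout << "expected drift ~ " << sum_abs / n << "\n";  // ~ 5
}

The printed estimates should come out close to 0, 25, 5 and 5 respectively.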

Alternative definition of variance

Var(X) = E[(X - E[X])^2] = E[X^2 - 2 X E[X] + E[X]^2]

= E[X^2] - 2 E[X] E[X] + E[X]^2

hence:

Var(X) = E[X^2] - E[X]^2
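A quick check of the alternative form on the ±5 variable from above (same hypothetical example):

E[X^2] = 25,  E[X]^2 = 0,  so  Var(X) = E[X^2] - E[X]^2 = 25

which matches the computation directly from the definition.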

variance (and SD) do not change by adding a constant

Var(X + c) = E[(X + c - E[X + c])^2] = E[(X + c - E[X] - c)^2] = E[(X - E[X])^2] = Var(X)

variance of multiplication by a constant

Var(aX) = E[(aX - E[aX])^2] = E[(aX - a E[X])^2] = E[a^2 (X - E[X])^2] = a^2 E[(X - E[X])^2]

hence:

Var(aX) = a^2 Var(X)

SD of multiplication by a constant

SD(aX) = \sqrt{Var(aX)} = \sqrt{a^2 Var(X)}

hence:

SD(aX) = |a| SD(X)

Variance of sum of random variables

Var(X + Y) = E[(X + Y - E[X + Y])^2]

= E[((X - E[X]) + (Y - E[Y]))^2]

= E[(X - E[X])^2] + E[(Y - E[Y])^2] + 2 E[(X - E[X])(Y - E[Y])]

hence:

Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)

When X and Y are independent, Cov(X, Y) = 0 (shown below in the Covariance section) and hence:

X, Y independent:  Var(X + Y) = Var(X) + Var(Y)

When X and Y are i.i.d. (independent and identically distributed) then:

X, Y i.i.d.:  Var(X + Y) = 2 Var(X)

Or more generally:

X_1, ..., X_n i.i.d.:  Var(X_1 + ... + X_n) = n Var(X_1)

hence:

X_1, ..., X_n i.i.d.:  SD(X_1 + ... + X_n) = \sqrt{n} SD(X_1)


Note the difference from summing the variable with itself (identically distributed but not independent):

Var(X + X) = Var(2X) = 4 Var(X)

and

SD(X + X) = SD(2X) = 2 SD(X)
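A minimal C++ simulation sketch contrasting the two cases with the ±5 step (not part of the original notes; seed and sample size are arbitrary choices): for independent copies the variance of the sum should come out near 2 · 25 = 50, while for X + X it should come out near 4 · 25 = 100.

#include <iostream>
#include <random>

int main() {
  // Hypothetical Monte Carlo contrast: Var(X + Y) for i.i.d. X, Y
  // versus Var(X + X) = Var(2X), using the +-5 step from the example.
  std::mt19937 gen(7);
  std::bernoulli_distribution coin(0.5);
  const int n = 1000000;
  double s_iid = 0, s_iid_sq = 0, s_same = 0, s_same_sq = 0;
  for (int i = 0; i < n; ++i) {
    double x = coin(gen) ? 5.0 : -5.0;
    double y = coin(gen) ? 5.0 : -5.0;  // independent copy of the same step
    double iid = x + y;                 // X + Y, i.i.d.
    double same = x + x;                // X + X, identically distributed but not independent
    s_iid += iid;   s_iid_sq += iid * iid;
    s_same += same; s_same_sq += same * same;
  }
  double m1 = s_iid / n, m2 = s_same / n;
  std::cout << "Var(X + Y) ~ " << s_iid_sq / n - m1 * m1 << "\n";   // ~ 50
  std::cout << "Var(X + X) ~ " << s_same_sq / n - m2 * m2 << "\n";  // ~ 100
}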

more on the last result

We've shown that:

X_1, ..., X_n i.i.d.:  SD(X_1 + ... + X_n) = \sqrt{n} SD(X_1)

Why is this important?

The standard deviation is a measure of the expected drift. The last result shows that the expected drift grows like \sqrt{n} (less than linearly) with successive experiments, which means that the mean drift per experiment tends to zero:

lim_{n \to \infty} SD(X_1 + ... + X_n) / n = lim_{n \to \infty} \sqrt{n} SD(X_1) / n = lim_{n \to \infty} SD(X_1) / \sqrt{n} = 0

Recall the example of the ±5 random walk. Now suppose you repeat the process n times. What is the expected drift?

The standard deviation, which can be considered a measure of that drift, is: SD(X_1 + ... + X_n) = 5 \sqrt{n}

The mean drift per step is: 5 \sqrt{n} / n = 5 / \sqrt{n}

For example, for 10,000 iterations the mean drift per step is 5 / \sqrt{10000} = 0.05 meter. Instead of 5 meters in each step it is 5 centimeters. The total drift is only about 5 · \sqrt{10000} = 500 meters instead of the maximal 50,000.
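A minimal C++ simulation sketch of the n-step walk (not part of the original notes; the number of simulated walks and the seed are arbitrary choices). It estimates the standard deviation of the final position for n = 10000 and the corresponding mean drift per step:

#include <cmath>
#include <iostream>
#include <random>

int main() {
  // Hypothetical check of the n-step +-5 random walk: the standard deviation
  // of the final position should be about 5 * sqrt(n) = 500 for n = 10000,
  // i.e. a mean drift per step of about 0.05 meter.
  std::mt19937 gen(123);
  std::bernoulli_distribution coin(0.5);
  const int n = 10000;      // steps per walk
  const int walks = 10000;  // number of simulated walks
  double sum = 0, sum_sq = 0;
  for (int w = 0; w < walks; ++w) {
    double pos = 0;
    for (int i = 0; i < n; ++i)
      pos += coin(gen) ? 5.0 : -5.0;
    sum += pos;
    sum_sq += pos * pos;
  }
  double mean = sum / walks;
  double sd = std::sqrt(sum_sq / walks - mean * mean);
  std::cout << "SD of final position ~ " << sd << "\n";     // ~ 500
  std::cout << "mean drift per step ~ " << sd / n << "\n";  // ~ 0.05
}

The estimates should come out close to 500 and 0.05 respectively.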

  • todo: ...example of random walk ±5 gnuplot picture; the relation to the law of large numbers; is the fact that the frequency ratio converges an assumption in probability theory or a result?..

misc

E[c] = c, where c is constant.

hence:

E[E[X]] = E[X], since E[X] is constant.

Covariance

Alternative definition

Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY - X E[Y] - Y E[X] + E[X] E[Y]]

= E[XY] - E[X] E[Y] - E[Y] E[X] + E[X] E[Y]

hence:

Cov(X, Y) = E[XY] - E[X] E[Y]

A special case is the covariance of a random variable with itself:  Cov(X, X) = E[X^2] - E[X]^2 = Var(X)
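A worked check (reusing the made-up joint distribution from the expectation section): P(X=0, Y=0) = 0.1, P(X=0, Y=1) = 0.3, P(X=1, Y=0) = 0.2, P(X=1, Y=1) = 0.4.

E[X] = 0.6,  E[Y] = 0.7,  E[XY] = 1 · 1 · 0.4 = 0.4

Cov(X, Y) = E[XY] - E[X] E[Y] = 0.4 - 0.42 = -0.02

The original definition gives the same value: E[(X - 0.6)(Y - 0.7)] = 0.1 · 0.42 - 0.3 · 0.18 - 0.2 · 0.28 + 0.4 · 0.12 = -0.02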

Covariance of independent variables

Assume that X and Y are independent:

E[XY] = \sum_x \sum_y x · y · P(X = x) · P(Y = y) = (\sum_x x · P(X = x)) · (\sum_y y · P(Y = y)) = E[X] E[Y]

And hence:

X, Y independent:  Cov(X, Y) = E[XY] - E[X] E[Y] = 0

The converse is not true, however. For example, if X is a constant random variable then

Cov(X, X) = Var(X) = 0

But of course X completely determines X, so zero covariance does not mean that there is no relationship between the variables.

Wiener processes

(also known as "Brownian motion")

Let Z be a stochastic process with the following properties:

1. The change ΔZ in a small period of time Δt is

ΔZ = ε · \sqrt{Δt}

where:

ε is a random sample from the standard normal distribution N(0, 1).
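A minimal C++ sketch of a discretized Wiener process (not part of the original notes; the time step, the horizon and the seed are arbitrary choices):

#include <cmath>
#include <iostream>
#include <random>

int main() {
  // Repeatedly apply dZ = eps * sqrt(dt) with eps ~ N(0, 1).
  std::mt19937 gen(2024);
  std::normal_distribution<double> eps(0.0, 1.0);
  const double dt = 0.01;  // arbitrary small time step
  const int steps = 100;   // simulate up to t = steps * dt = 1
  double z = 0.0;
  for (int i = 0; i < steps; ++i) {
    z += eps(gen) * std::sqrt(dt);
  }
  // The increments are independent, so Var(Z(1)) = steps * dt = 1,
  // i.e. over many runs Z(1) is approximately N(0, 1) distributed.
  std::cout << "Z(1) = " << z << "\n";
}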

Summary

Expectation

  •  E[X + Y] = E[X] + E[Y]
  •  E[aX] = a E[X]
  •  E[E[X]] = E[X]

Variance and standard deviation

  •  Var(X) = E[(X - E[X])^2] = E[X^2] - E[X]^2
  •  SD(X) = \sqrt{Var(X)}
  •  Var(X + c) = Var(X),  Var(aX) = a^2 Var(X),  SD(aX) = |a| SD(X)
  •  Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
  •  X, Y independent:  Var(X + Y) = Var(X) + Var(Y)
  •  X, Y i.i.d.:  Var(X + Y) = 2 Var(X)
  •  X_1, ..., X_n i.i.d.:  Var(X_1 + ... + X_n) = n Var(X_1),  SD(X_1 + ... + X_n) = \sqrt{n} SD(X_1)

Covariance

  •  Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X] E[Y]


Misc

#include <iostream>

int main() {
  std::cout << "hello lord\n";
}

Determinant is the area of the parallelogram

Let u = (u_1, u_2) and v = (v_1, v_2) be two vectors in R^2. We will show that the absolute value of the determinant u_1 v_2 - u_2 v_1 is equal to the area of the parallelogram spanned by u and v.

short way

let w be a vector orthogonal to u and of norm equal to 1:

w = (-u_2, u_1) / |u|

(a word about left/right systems? why we didn't choose w = (u_2, -u_1) / |u|?)

Let S be the area of the parallelogram; it is the base |u| times the height, and the height is the projection of v on w:

S = |u| · (v · w) = |u| · (u_1 v_2 - u_2 v_1) / |u| = u_1 v_2 - u_2 v_1 = det(u, v)

(If v lies on the other side of the line spanned by u, then v · w is negative and the area is the absolute value |det(u, v)|.)

longer way

Let h be the vector that represents the height of the parallelogram, i.e. the component of v orthogonal to u:

h = v - ((v · u) / |u|^2) · u

Let S be the area of the parallelogram:

S = |u| · |h|

Then, since |h|^2 = |v|^2 - (v · u)^2 / |u|^2 (Pythagoras: v is the sum of h and its projection on u):

S^2 = |u|^2 · |h|^2 = |u|^2 |v|^2 - (u · v)^2

= (u_1^2 + u_2^2)(v_1^2 + v_2^2) - (u_1 v_1 + u_2 v_2)^2

= u_1^2 v_2^2 + u_2^2 v_1^2 - 2 u_1 v_1 u_2 v_2

= (u_1 v_2 - u_2 v_1)^2

hence:

S = |u_1 v_2 - u_2 v_1| = |det(u, v)|
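A quick numeric check (a hypothetical pair of vectors): take u = (3, 0) and v = (1, 2).

det(u, v) = u_1 v_2 - u_2 v_1 = 3 · 2 - 0 · 1 = 6

The parallelogram has base |u| = 3 and height 2 (the component of v orthogonal to u), so its area is 3 · 2 = 6 as well.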

(-1)*(-1) =?= 1

0 = (-1) · 0

= (-1) · (1 + (-1))

= (-1) · 1 + (-1) · (-1)

= -1 + (-1) · (-1)

Therefore:

(-1) · (-1) = 1

Step one: 0 times any number is 0.

Step two: 0 is the sum of additive inverses (0 = 1 + (-1)).

Step three: distributivity.

Step four: 1 is the multiplicative identity.

Conclusion: (-1) · (-1) is the additive inverse of -1, which is 1.