Examples of using Random variables in English and their translations into Thai
They are independent random variables.
Like this, with the random variables, and that's a little bit confusing.
Let's say we have two random variables.
So one instantiation of the random variables, you have-- you sample once from the universe, and you get X=1 and Y=3.
This is equal to the expected value of the product of the random variables X and Y, X times Y.
So you can perform this experiment a bunch of times, but this tells you the frequency, the frequency of that random variable.
You can have discrete random variables, or continuous.
What I want to do in this video is introduce you to the idea of the covariance between 2 random variables.
So what can the covariance of 2 random variables be approximated by?
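The covariance of two random variables can be approximated by the sample covariance: average the product of each variable's deviation from its sample mean over many draws. A minimal sketch in Python, using illustrative Gaussian variables (the choice of distributions and the names `xs`, `ys` are assumptions for the demonstration):

```python
import random

random.seed(0)

# Draw paired samples (X, Y) where Y depends on X, so the covariance is positive.
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [2 * x + random.gauss(0, 1) for x in xs]

mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Sample covariance: average of (x - mean_x) * (y - mean_y) over the draws.
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n

print(cov)
```

With Y = 2X plus independent noise, Cov(X, Y) = 2 Var(X) = 2, and the printed estimate lands close to that.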
What I want to do in this video is build up some tools in our tool kit for dealing with sums and differences of random variables.
So let's say that we have two random variables, x and y, and they are completely independent.
Correlation (from the Latin correlatio, "relationship") is a definite statistical relationship between two or more random variables.
But what do we have left? We have the covariance of these 2 random variables X and Y, equal to the expected value of.
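The expected-value expression this sentence trails off into is usually the identity Cov(X, Y) = E[XY] - E[X]E[Y]. A short sketch checking that this shortcut matches the definitional form E[(X - E[X])(Y - E[Y])] on sampled data (the distributions and variable names are illustrative assumptions):

```python
import random

random.seed(3)

n = 100_000
xs = [random.gauss(1, 1) for _ in range(n)]
ys = [x + random.gauss(0, 1) for x in xs]

mean_x = sum(xs) / n
mean_y = sum(ys) / n
mean_xy = sum(x * y for x, y in zip(xs, ys)) / n

# Definition: Cov(X, Y) = E[(X - E[X]) * (Y - E[Y])]
cov_def = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n

# Shortcut identity: Cov(X, Y) = E[XY] - E[X] * E[Y]
cov_identity = mean_xy - mean_x * mean_y

print(cov_def, cov_identity)  # the two estimates agree
```

On sample averages the two forms are algebraically identical, so they agree to floating-point precision.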
The role of measuring the correlation of random variables is performed by the correlation ratio and the correlation coefficient.
And I'm doing that because we just talked about random variables and all of that.
If we're taking essentially the difference of two random variables, the variance is going to be the sum of the variances of those two random variables.
So what we just showed you is that the variance of the difference of two independent random variables is equal to the sum of the variances.
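The claim that Var(X - Y) = Var(X) + Var(Y) for independent X and Y can be checked by simulation. A minimal sketch, assuming two independent Gaussians with variances 1 and 4 (these choices are illustrative):

```python
import random

random.seed(1)

n = 200_000
# Two independent random variables: X ~ N(0, 1), Y ~ N(0, 2^2).
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 2) for _ in range(n)]

def variance(samples):
    """Population variance of a list of samples."""
    m = sum(samples) / len(samples)
    return sum((s - m) ** 2 for s in samples) / len(samples)

diff = [x - y for x, y in zip(xs, ys)]

# Var(X - Y) should approximate Var(X) + Var(Y) = 1 + 4 = 5.
print(variance(diff))
```

Subtracting Y flips its sign but not its spread, which is why the variances add rather than cancel.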
Now what I need to show you is that the variance of negative y, of the negative of that random variable, is going to be the same thing as the variance of y.
So another way of thinking about the slope of our regression line: it can be literally viewed as the covariance of our 2 random variables over the variance of X, which you can kind of view as the independent random variable.
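The slope formula above, slope = Cov(X, Y) / Var(X), is the least-squares regression slope, and it is easy to verify on simulated data. A sketch assuming a true linear relationship with slope 3 plus Gaussian noise (the numbers and names are illustrative):

```python
import random

random.seed(2)

n = 50_000
xs = [random.uniform(0, 10) for _ in range(n)]
ys = [3 * x + 1 + random.gauss(0, 2) for x in xs]  # true slope is 3

mean_x = sum(xs) / n
mean_y = sum(ys) / n

cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
var_x = sum((x - mean_x) ** 2 for x in xs) / n

# Least-squares regression slope: Cov(X, Y) / Var(X).
slope = cov_xy / var_x
print(slope)
```

With this much data the estimated slope recovers the true slope of 3 to within a small sampling error.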
And the random variable is just that function mapping.
So the expected value of our random variable is equal to the sum.
It's the expected value of the random variable minus the expected value of X.
Let's define a random variable, X, like we always do.
So I have random variable x.
Random variable y.
And I'm going to define my random variable.
This is for our random variable, x.
So that's our random variable.
That's a random variable.
It isn't true for any random variable, X.