Examples of using Squared distance in English and their translations into Thai
Now, we have the total squared distance.
And if I want the squared distance, I square it-- that's how we calculate variance.
You take the average of all of these squared distances.
It figures out the squared distance of this y value from the mean.
And let me just review what those squared distances are.
The variance is the average squared distance from the mean, the standard deviation is the square root of that.
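The definition in the example above can be sketched in Python (a minimal illustration with made-up data, not from the source):

```python
# Variance is the average squared distance from the mean;
# the standard deviation is the square root of the variance.
data = [1.0, 2.0, 4.0, 5.0]
mean = sum(data) / len(data)
variance = sum((x - mean) ** 2 for x in data) / len(data)
std_dev = variance ** 0.5
print(variance, std_dev)  # 2.5 1.5811...
```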
And then take the average of all of these squared distances.
But you see here, the average squared distance from the mean in that first data set is 0.25.
What we saw is that there is a line that we can find that minimizes the squared distance.
Where the fit minimizes the squared distance to each of the points.
Minus our mean, which is p plus the probability that we get a 1, which is just p-- this is the squared distance.
So this is the sum of the squared distances, right?
If you divide this by n, you're going to get what we typically associate as the variance of y, which is kind of the average squared distance.
And here what's the average squared distance from the mean?
The total squared distance between each of the points, or their kind of spread, their variation, is not explained by the variation in x.
Remember, that is the weighted sum of the squared distances from the mean.
If you think about the squared distance from some central tendency, and the best central measure we can have of y is the arithmetic mean.
At least, when you measure the error by the squared distances from the points.
I'm going to find essentially the mean of these squared distances. I have five squared distances right over here, so let me divide by five. What will I get when I make this calculation right over here?
But what I want to do is find a line that minimizes the squared distances to these different points.
Well the squared distance from 0 to our mean-- let me write it over here-- it's going to be 0, that's the value we're taking on-- let me do that in blue since I already wrote the 0-- 0 minus our mean-- let me do this in a new color-- minus our mean.
Sample variance is going to be equal to the sum of my squared distances to the mean, divided by my number of samples minus 1.
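The sample-variance formula mentioned above can be sketched as follows (a minimal example with invented data; the division by n − 1 is the standard Bessel correction for a sample):

```python
# Sample variance: sum of squared distances to the sample mean,
# divided by (n - 1) rather than n.
sample = [2.0, 4.0, 6.0]
n = len(sample)
mean = sum(sample) / n
sample_variance = sum((x - mean) ** 2 for x in sample) / (n - 1)
print(sample_variance)  # 4.0
```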
We got to a formula for the slope and y-intercept of the best fitting regression line when you measure the error by the squared distance to that line.
And then, the next thing we want to do is the squared distance. So this is equal to the squared distance of our y value from y's mean.
So if you have gotten this far, you have been waiting for several videos to get to the optimal line that minimizes the squared distance to all of those points.
And we have shown ourselves that the slope of this line-- the one that best minimizes the squared distance to each of those points-- is going to be the mean of the xy's minus the mean of x times the mean of y.
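The slope formula quoted above is the numerator of the standard least-squares expression; the full formula divides by the mean of x squared minus the mean of the x squares. A minimal sketch with hypothetical data:

```python
# Least-squares slope of the best-fitting line:
# m = (mean(xy) - mean(x)*mean(y)) / (mean(x^2) - mean(x)^2)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
mean_xy = sum(x * y for x, y in zip(xs, ys)) / n
mean_x2 = sum(x * x for x in xs) / n
slope = (mean_xy - mean_x * mean_y) / (mean_x2 - mean_x ** 2)
intercept = mean_y - slope * mean_x  # the line passes through the point of means
print(slope, intercept)  # 2.0 0.0
```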
So if this squared distance on average is some variance, and this one is completely independent, its squared distance on average is some distance, then the variance of their sum is actually going to be the sum of their variances.
If we talk about the variance of the random variable x, that is the same thing as the expected value of the squared distance between our random variable x and its mean.
So the average squared distance, or the mean squared distance, from our population mean is equal to 20. You might say, "Wait, these aren't 20 away." Remember, it's the squared distance away from my population mean, so I squared each of these things. I like to because it makes everything positive, and we will see later that it has other nice properties. Now the last thing is: how can we represent this mathematically?
So the variance-- let me write it over here, let me pick a new color-- the variance is just-- you could view it as the probability weighted sum of the squared distances from the mean, or the expected value of the squared distances from the mean.
