Examples of using Squared error in English and their translations into Bulgarian
We have the squared error of the line.
Sometimes, it's called the squared error.
This is the squared error of the line axis.
You put it into this expression for the squared error of the line.
The total squared error with the line is 2.73.
And then, you could imagine the vertical axis to be the squared error.
So this is the squared error of the line.
And so you can even visualize it the same way we visualized the squared error from the line.
If the squared error of the line is really small what does that mean?
Let me call it the squared error with the line.
If the squared error of the line is small, it tells us that the line is a good fit.
So what is the total squared error with the line?
So our squared error versus our line, our total squared error, we just computed to be 2.74.
Let me call this the squared error from the average.
So our squared error to the line, the sum of the squared errors from the n points to the line, is going to be equal to-- this term right here is n times the mean of the y squared values.
Maybe I will call this the squared error from the mean of y.
And so if the squared error of the line is large, this whole thing's going to be close to 1.
Our goal is to simplify this expression for the squared error between those n points.
So let me define the squared error against this line as being equal to the sum of these squared errors.
And over the next few videos, I want to find the m and b that minimize the squared error of this line right here.
And the partial derivative of our squared error with respect to b is going to be equal to 0.
Netflix could just have a computer compare the predicted ratings with the held-out ratings using a prespecified metric (the particular metric they used was the square root of the mean squared error).
So once again, this is just the squared error of that line with n points.
And we're taking the sum of the squared error between each of those n points and our actual line, y equals mx plus b.
The total variation in y, which is the squared error from the mean of the y's.
In the last video, we showed that the squared error between some line, y equals mx plus b and each of these n data points is this expression right over here.
And so what we calculated next was the total error, the squared error, from the means of our y values.
So if you wanted the total error, if you want the total squared error-- this is actually how we started off this whole discussion-- the total squared error between the points and the line, you literally just take the y value of each point.
So at that point, the partial derivative of our squared error with respect to m is going to be equal to 0.
And what we want to do is minimize this squared error from each of these points to the line.
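Taken together, the example sentences above describe computing the total squared error of a line y = mx + b against n points, and minimizing it by setting the partial derivatives with respect to m and b to zero. A minimal sketch of that idea in Python (the sample points here are made up for illustration, not from the source):

```python
# Sketch of the "squared error of the line" the example sentences describe,
# with best-fit m and b derived by setting the partial derivatives to zero.

def squared_error(points, m, b):
    # Sum of squared vertical distances from each point to y = m*x + b.
    return sum((y - (m * x + b)) ** 2 for x, y in points)

def best_fit(points):
    # Setting the partial derivatives of the squared error with respect
    # to m and b to zero yields the standard least-squares formulas.
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    mean_xy = sum(x * y for x, y in points) / n
    mean_x2 = sum(x * x for x, _ in points) / n
    m = (mean_x * mean_y - mean_xy) / (mean_x ** 2 - mean_x2)
    b = mean_y - m * mean_x
    return m, b

points = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # hypothetical data
m, b = best_fit(points)
total = squared_error(points, m, b)
```

The fitted line's squared error is, by construction, no larger than that of any other line through the same points, which is the sense in which "a small squared error means the line is a good fit."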