Proof (part 1) minimizing squared error to regression line | Khan Academy

The least squares regression line minimizes the sum of the squared errors between the data points and the line. What follows is a transcript of the Khan Academy video Proof (part 1) minimizing squared error to regression line.

“In the last video, we showed that the squared error between some line, y = mx + b, and each of these n data points is this expression right over here. In this video, I'm really just going to algebraically manipulate this expression so that it's ready for the calculus stage, so we can actually optimize; we can actually find the m and b values that minimize this value right over here.

So this is just going to be a ton of algebraic manipulation, but I'll try to color code it well so we don't get lost in the math. So let me just rewrite this expression over here.
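For reference, the expression being manipulated, carried over from the previous video, is the total squared error between the line and the n points (x1, y1) through (xn, yn). Written in LaTeX notation, with SE_line as shorthand for the squared error of the line:

```latex
SE_{\text{line}} = \bigl(y_1 - (m x_1 + b)\bigr)^2 + \bigl(y_2 - (m x_2 + b)\bigr)^2 + \cdots + \bigl(y_n - (m x_n + b)\bigr)^2
```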

So this whole video is just going to be rewriting this over and over again, just simplifying it a bit with algebra. So this first term right over here, (y1 - (m·x1 + b))², this is all going to be the squared error of the line. So this first term over here, I'll keep it in blue, is going to be, if we just expand it, y1² - 2·y1·(m·x1 + b) + (m·x1 + b)². All I did is I just squared this binomial right here; you can imagine, if this was (a - b)², it would be a² - 2ab + b². That's all I did.

Now I'll just have to do that for each of the terms, and each term is only different by the x and y coordinates right over here. And I'll go down, so that we can kind of combine like terms. So this term over here, squared, is going to be y2² - 2·y2·(m·x2 + b) + (m·x2 + b)², the same exact thing as up here, except now it's with x2 and y2 as opposed to x1 and y1.

And then we're just going to keep doing that n times. We're going to do it for the third one, (x3, y3), and keep going, keep going, all the way until we get to this nth term over here. And this nth term over here, when we square it, is going to be yn² - 2·yn·(m·xn + b) + (m·xn + b)².
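In LaTeX notation, the step applied to each point (xi, yi) is just the identity (u - v)² = u² - 2uv + v² with u = yi and v = m·xi + b:

```latex
\bigl(y_i - (m x_i + b)\bigr)^2 = y_i^2 - 2 y_i (m x_i + b) + (m x_i + b)^2
```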

Now the next thing I want to do is actually expand these out a little bit more, so let's actually scroll down. This whole expression, I'm just going to rewrite it, and remember, this is just the squared error of the line. So let me rewrite this top line over here. This top line over here is y1², and then I'm going to distribute this 2y1. So this is going to be minus 2·y1·m·x1, that's just that times that, minus 2·y1·b. And then plus, and now let's expand (m·x1 + b)². So that's going to be m²·x1² plus 2·m·x1·b plus b². All I did: if this was (a + b)², this is a² plus 2ab plus b². And we're going to do that for each of these terms, or for each of these colors, I guess you could say.
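For reference, the first term after both expansions, in LaTeX notation; this is the same six-term pattern the video then repeats for every point:

```latex
\bigl(y_1 - (m x_1 + b)\bigr)^2 = y_1^2 - 2 y_1 m x_1 - 2 y_1 b + m^2 x_1^2 + 2 m x_1 b + b^2
```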

So now let's move to the second term. It's going to be the same thing, but instead of y1's and x1's, it's going to be y2's and x2's. So it is y2² - 2·y2·m·x2 - 2·y2·b + m²·x2² + 2·m·x2·b + b².

And we're going to keep doing this all the way to get the nth term, the nth color, I guess we should say. So this is going to be yn² - 2·yn·m·xn, and you don't even have to think, you just have to kind of substitute these with n's; we could actually look at this, but it's going to be the exact same thing: minus 2·yn·b, plus m²·xn², plus 2·m·xn·b, plus b².

So once again, this is just the squared error of that line with n points: between those n points and the line y = mx + b.
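Collected in summation notation (which the video avoids in favor of writing the rows out), the fully expanded squared error at this stage is:

```latex
SE_{\text{line}} = \sum_{i=1}^{n} \bigl( y_i^2 - 2 y_i m x_i - 2 y_i b + m^2 x_i^2 + 2 m x_i b + b^2 \bigr)
```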

So let's see if we can simplify this somehow. And to do that, what I'm going to do is kind of try to add up a bunch of these terms here. So if I were to add up all of these terms right here, if I were to add up this column right over there, what do I get? It's going to be y1² plus y2², all the way to yn². That's those terms right over there, so I'm going to have that.

And then I have this common 2m amongst all of these terms over here. So let me write that down: you have this 2m here, 2m here, 2m here. Let me put parentheses around here, so you have these terms all added up, and then you have minus 2m times all of these terms. Actually, let me color code it so you see what we're doing; I want to be very careful with this math so nothing seems too confusing, although this is really just algebraic manipulation. If I add all of these up, I get y1² plus y2², all the way to yn², and I'll put some parentheses around that. And then, to that, we have this common term: we have this minus 2m, minus 2m, minus 2m, and we can distribute those out. So I should actually write it like this: we have a minus 2m, and once we distribute it out, up here we're just going to be left with a y1x1, or maybe I can call it an x1y1; that's that over there with the 2m factored out. Let me do that in another color, I want to make this easy to read. Plus x2y2, and we're going to keep adding up; we're going to do this n times, all the way to plus xnyn, this last term over here, ynxn, same thing. So that's the sum: the sum of all of this stuff right over here is the same thing as this term right over here, minus 2m times (x1y1 + x2y2 + ... + xnyn).

And then we have to sum this right over here, and you see, again, we can factor out here a minus 2b out of all of these terms.

So we have minus 2b times (y1 + y2 + ... + yn). So this business, these terms right over here, when you add them up, give you this term right over there. And let's just keep going; we're probably going to run out of time in this one, and in the next video I'll simplify this more and clean up the algebra a good bit.

So then the next term: what is this going to be? Same drill. We can factor out an m², so we have m² times (x1² plus x2² plus, all the way to, xn²). Actually, I want to color code them; I forgot to color code these over here. Let me color code these: this was a yn², and this over here was a y2². So this is exactly this: in this last step, we just did this thing over here, which is this thing right over here, and of course we have to add it, so I'll put a plus out front. We're almost done with this stage of the simplification.

So over here, we have a common 2mb. So let's put a plus 2mb times, once again, (x1 plus x2 plus, all the way to, xn). So this term right over here is the exact same thing as this term over here.

And then, finally, we have a b² in each of these. And how many of these b²'s do we have? Well, we have n of these lines, right? This is the first line, the second line, and so on, all the way to the nth line. So we have b² added to itself n times, and that's just b² times n. So we'll just write that as plus n·b².
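Putting the column sums together, the expression the video has arrived at reads, in summation notation:

```latex
SE_{\text{line}} = \sum_{i=1}^{n} y_i^2 - 2m \sum_{i=1}^{n} x_i y_i - 2b \sum_{i=1}^{n} y_i + m^2 \sum_{i=1}^{n} x_i^2 + 2mb \sum_{i=1}^{n} x_i + n b^2
```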

Let me remind ourselves what this is all about. This is all just algebraic manipulation of the squared error between those n points and the line y = mx + b. It doesn't look like I've simplified it much, and I'm going to stop the video right now. In the next video, we're just going to take off right here and try to simplify this thing.”
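As a quick sanity check on the algebra (an addition for this post, not something from the video), here is a minimal Python sketch comparing the direct squared error with the expanded form above on random data; the two agree up to floating-point rounding:

```python
import random

# Sanity check (an addition, not from the video): the expanded form of the
# squared error derived above should equal the direct sum of squared
# residuals for any data set and any slope m and intercept b.
n = 10
xs = [random.uniform(-5, 5) for _ in range(n)]
ys = [random.uniform(-5, 5) for _ in range(n)]
m, b = random.uniform(-3, 3), random.uniform(-3, 3)

# Direct form: sum of (y_i - (m*x_i + b))^2 over all n points.
se_direct = sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys))

# Expanded form:
#   sum(y_i^2) - 2m*sum(x_i y_i) - 2b*sum(y_i)
#   + m^2*sum(x_i^2) + 2mb*sum(x_i) + n*b^2
se_expanded = (
    sum(y ** 2 for y in ys)
    - 2 * m * sum(x * y for x, y in zip(xs, ys))
    - 2 * b * sum(ys)
    + m ** 2 * sum(x ** 2 for x in xs)
    + 2 * m * b * sum(xs)
    + n * b ** 2
)

print(se_direct, se_expanded)
assert abs(se_direct - se_expanded) < 1e-9 * max(1.0, se_direct)
```

The check passes for any choice of m and b, since the expansion is an algebraic identity rather than a property of the optimal line.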
