Minimizing the sum of squares of the distances


Let's start by doing it for two points. We know the point that minimizes the sum of the squared distances is the midpoint of the segment between them, but let's pretend we don't know that and want to invoke the machinery of calculus. Let the two points be $(a_1,b_1)$ and $(a_2,b_2)$. We want to minimize $$A=(a_1-x)^2+(b_1-y)^2+(a_2-x)^2+(b_2-y)^2$$ with respect to $x,y$. We can't set $A$ to zero: as a sum of squares it is positive unless $(x,y)$ coincides with both fixed points. What we can do is take the partial derivatives of $A$ with respect to $x$ and $y$ and set them to zero. We find $$\frac {\partial A}{\partial x}=-2a_1+2x-2a_2+2x$$ If we set that to zero we find $$x=\frac 12(a_1+a_2)$$ and similarly in $y$, so the point that minimizes $A$ is the midpoint of the segment between the two points.
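As a quick sanity check of the two-point case, here is a small numerical sketch (the two fixed points below are arbitrary examples): the sum of squared distances $A$ should be strictly larger at every point near the midpoint than at the midpoint itself.

```python
import itertools

a1, b1 = 1.0, 5.0    # example fixed points, chosen arbitrarily
a2, b2 = 7.0, -3.0

def A(x, y):
    """Sum of squared distances from (x, y) to the two fixed points."""
    return (a1 - x) ** 2 + (b1 - y) ** 2 + (a2 - x) ** 2 + (b2 - y) ** 2

# The claimed minimizer: the midpoint of the segment.
mx, my = (a1 + a2) / 2, (b1 + b2) / 2

# Perturbing the midpoint in any direction should strictly increase A.
for dx, dy in itertools.product([-0.5, 0.0, 0.5], repeat=2):
    if (dx, dy) != (0.0, 0.0):
        assert A(mx + dx, my + dy) > A(mx, my)
```

This is not a proof, of course; the calculus above is. But it makes the claim concrete for a specific pair of points.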

You can make the same argument for an arbitrary number of points. It works the same way, but you need a bunch of summation signs in your expressions.
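The general $n$-point version of that argument can be sketched numerically as well: with fixed points $(a_i,b_i)$, the partial derivative $\partial A/\partial x=\sum_i(-2a_i+2x)$ vanishes exactly at $x=\frac 1n\sum_i a_i$, and similarly for $y$. The points below are illustrative, not from the original question.

```python
# Example fixed points (a_i, b_i); any finite list works.
pts = [(1.0, 2.0), (4.0, -1.0), (0.5, 3.5), (-2.0, 0.0)]
n = len(pts)

def A(x, y):
    """Sum of squared distances from (x, y) to all fixed points."""
    return sum((a - x) ** 2 + (b - y) ** 2 for a, b in pts)

# The claimed minimizer: the centroid (average of each coordinate).
xbar = sum(a for a, _ in pts) / n
ybar = sum(b for _, b in pts) / n

def dA_dx(x):
    """Partial derivative of A with respect to x: sum of -2*a_i + 2*x."""
    return sum(-2 * a + 2 * x for a, _ in pts)

def dA_dy(y):
    """Partial derivative of A with respect to y: sum of -2*b_i + 2*y."""
    return sum(-2 * b + 2 * y for _, b in pts)

# The gradient vanishes at the centroid...
assert abs(dA_dx(xbar)) < 1e-9 and abs(dA_dy(ybar)) < 1e-9
# ...and A is strictly larger at perturbed points.
assert A(xbar + 0.3, ybar) > A(xbar, ybar)
assert A(xbar, ybar - 0.3) > A(xbar, ybar)
```

Setting $\sum_i(-2a_i+2x)=0$ gives $nx=\sum_i a_i$, which is exactly what the code checks.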


Author by

O. Bates

Updated on April 22, 2020

I'm given $n$ fixed points $(a_1,b_1)...(a_n,b_n)$ and told to show that the sum of the squares of the distances from a point $P(x,y)$ to those fixed points is minimized when $x$ is the average of all the $a$ coordinates and $y$ is the average of all the $b$ coordinates.
I initially went about it by setting the sum of the squares of the distances equal to zero, so $0=(a_1-x)^2+(b_1-y)^2+(a_2-x)^2+(b_2-y)^2+...+(a_n-x)^2+(b_n-y)^2$, and then said that this would only work if $a_1+a_2+...+a_n=nx$, which, when simplified, gives $x$ equal to the average of all the $a$ coordinates. I'm not entirely sure I did this right, though, because I believe I'm expected to use gradients, and I'm stuck on where I would include them. Could someone verify that what I did makes sense, or get me on the right track?