Expected value of the squared distance
As was mentioned in the comments, the upper bound is easily obtained from the points lying along the straight line between $X$ and $Y$ (no matter where they lie on the border). The lower bound is more difficult, and working out all the details would take quite some time, but I'll give the needed ideas and some literature.

First I assume that the number of points is not fixed at $n$ but Poisson distributed with parameter $n$. This is almost the same: with high probability you will then have between, say, $n/2$ and $2n$ points, which is enough for our purpose. Now cover the square with smaller squares of side length $\sqrt{1/n}$ and compute the probability that one of these squares contains no point. The number of points in a small square is Poisson distributed with parameter $n \cdot (\sqrt{1/n})^2 = 1$. Hence each of your small squares is, independently, empty with probability $e^{-1}$.

Now there are two possible arguments. Call a small square black if it contains no points, and white otherwise. If there is no white path (i.e. connected set of small white squares) from $X$ to $Y$, then at least one step of any path is larger than the side length $\sqrt{1/n}$ of a small square. Hence you obtain a lower bound of order $1/n$. This argument may be refined, as a path will have to take about $\sqrt{n}$ of these larger jumps.

The general topic you need here is "percolation", and there is a wonderful book about it called "Percolation" by Geoffrey Grimmett. To refine the argument, you will find the tools you need in a paper of Martin called "Linear growth for greedy lattice animals". Each path from $X$ to $Y$ is a greedy lattice animal of size at least $\sqrt{n}$, and if you choose the side length as $\sqrt{1/cn}$, then for large enough $c$ you obtain the result that any path of length $\sqrt{n}$ has to contain at least $c_2 \sqrt{n}$ black cells. Hence your bound becomes of order $\sqrt{n}/n = 1/\sqrt{n}$, which is what you wanted.
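The empty-cell probability above is easy to check with a quick simulation. A minimal sketch (using a fixed $n$ points rather than a Poisson number, and rounding the grid size, both of which only perturb the cell parameter slightly):

```python
import math
import random

random.seed(0)

def empty_cell_fraction(n):
    """Drop n uniform points in the unit square, cover it with a k-by-k
    grid of cells of side length ~ sqrt(1/n), and return the fraction of
    cells containing no point (the 'black' cells)."""
    k = round(math.sqrt(n))  # k*k cells, each of area ~ 1/n
    occupied = set()
    for _ in range(n):
        x, y = random.random(), random.random()
        occupied.add((min(int(x * k), k - 1), min(int(y * k), k - 1)))
    return 1 - len(occupied) / (k * k)

frac = empty_cell_fraction(100_000)
print(frac, math.exp(-1))  # the two values should be close
```

The fraction of empty cells should hover near $e^{-1} \approx 0.368$, as the Poisson(1) heuristic predicts.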
Golbez
Updated on August 01, 2022

Comments

Golbez 10 months
Given two points $X,Y$ on two sides of the square $[0,1]\times [0,1]$, with $X=(0,1/2)$, $Y=(1,1/2)$ (PS: my original question had $X,Y$ at opposite corners of the square, but I think that's not the essential case), and $n$ points distributed uniformly (i.i.d.) in the square (where $n$ is large, and $A$ denotes the set of $n$ points), can I calculate the asymptotic behavior of the value $M(n)$, where $M$ is defined as $$M(n)=E\left[\min_{B\subset A} \sum_{k=1}^{m+1} d(B_k,B_{k-1})^2\right],$$ where $B_k$ is the $k$th element of $B$, $B_0=X$, $B_{m+1}=Y$ (we let $m=|B|$), and the expected value is taken over all possible $A$? That is to say, I would like to compute the expected value of the minimal weight, defined as the sum of the squared distances.
I know that when $n\to\infty$, $M(n)\to 0$. In the $1$-dimensional case this is easy, since it is just a Poisson process, and the distances between consecutive points are exponentially distributed. (Calculation suggests it's about $(n+3)/((n+1)(n+2))$, where $n$ is the number of points added.) But in the two-dimensional case I got stuck and don't know how to tackle it. This is a problem that arose from the calculation of the cost of a network. Any hint or reference is welcome, thanks!
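The $1$-dimensional value can be checked by Monte Carlo. The sketch below uses the observation that with squared-distance weights the optimal path visits every point in sorted order (for $a,b \ge 0$, $(a+b)^2 \ge a^2+b^2$, so inserting an intermediate point never increases the cost), so the minimum is just the sum of squared consecutive gaps; the values of $n$ and the trial count are illustrative:

```python
import random

random.seed(1)

def one_dim_weight(n):
    """Minimal path weight from 0 to 1 through n uniform points on [0, 1].
    With squared-distance weights the optimum uses all points in order,
    so it equals the sum of squared gaps between consecutive points."""
    pts = sorted([0.0, 1.0] + [random.random() for _ in range(n)])
    return sum((b - a) ** 2 for a, b in zip(pts, pts[1:]))

n, trials = 50, 5000
estimate = sum(one_dim_weight(n) for _ in range(trials)) / trials
print(estimate)  # decays like a constant over n
```

The estimate should be of order $1/n$, consistent with the closed-form expression quoted above.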
(Some computer experiments suggest that the weight is about $1.1/\sqrt{n}=O(1/\sqrt{n})$. I also wonder if there are similar known results?)
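An experiment of this kind can be set up as a shortest-path computation: build the complete graph on $A \cup \{X, Y\}$ with squared Euclidean distance as the edge weight (all weights positive, so Dijkstra applies) and average the optimal weight over random samples. A sketch, with illustrative values of $n$ and the trial count:

```python
import math
import random

random.seed(2)

def min_sq_path(points, start, goal):
    """Dijkstra on the complete graph over points plus the two endpoints,
    with squared Euclidean distance as edge weight (dense O(V^2) variant)."""
    nodes = [start] + points + [goal]
    V = len(nodes)
    dist = [math.inf] * V
    dist[0] = 0.0
    done = [False] * V
    for _ in range(V):
        u = min((i for i in range(V) if not done[i]), key=lambda i: dist[i])
        done[u] = True
        if u == V - 1:
            break
        ux, uy = nodes[u]
        for v in range(V):
            if not done[v]:
                w = (nodes[v][0] - ux) ** 2 + (nodes[v][1] - uy) ** 2
                if dist[u] + w < dist[v]:
                    dist[v] = dist[u] + w
    return dist[V - 1]

X, Y, n, trials = (0.0, 0.5), (1.0, 0.5), 200, 20
est = sum(
    min_sq_path([(random.random(), random.random()) for _ in range(n)], X, Y)
    for _ in range(trials)
) / trials
print(est * math.sqrt(n))  # roughly constant if M(n) = Theta(1/sqrt(n))
```

Printing $\hat M(n)\sqrt{n}$ for a few values of $n$ makes the conjectured $\Theta(1/\sqrt{n})$ scaling visible as an approximately constant output.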

you can call me Al over 9 years Here is an attempt at an upper bound. Consider only the points in the $\epsilon$-neighborhood of the diagonal $XY$. This neighborhood is approximately a $\sqrt{2}\times 2\epsilon$ rectangle and contains approximately $2\sqrt{2}\epsilon n$ points. The orthogonal projections of the points within this rectangle onto the diagonal are uniformly distributed (neglecting what happens near the endpoints). Apply the 1-dimensional asymptotic to them. The deviations from the diagonal contribute at most $4m\epsilon^2$ to the sum of squares (by the Pythagorean theorem).
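This strip construction can be tried numerically. A sketch assuming the version of the question with $X=(0,1/2)$, $Y=(1,1/2)$ (so the segment $XY$ is horizontal of length $1$ rather than a diagonal of length $\sqrt{2}$), taking $\epsilon = 1/\sqrt{n}$; the chosen $n$ and trial count are illustrative:

```python
import math
import random

random.seed(3)

def strip_upper_bound(n, eps):
    """Cost of the path through the points within distance eps of the
    segment from X=(0,0.5) to Y=(1,0.5), visited in order of x-coordinate.
    Any particular path upper-bounds the minimal weight."""
    pts = [(random.random(), random.random()) for _ in range(n)]
    strip = sorted(p for p in pts if abs(p[1] - 0.5) < eps)
    path = [(0.0, 0.5)] + strip + [(1.0, 0.5)]
    return sum(
        (b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2
        for a, b in zip(path, path[1:])
    )

n, trials = 2000, 50
est = sum(strip_upper_bound(n, 1 / math.sqrt(n)) for _ in range(trials)) / trials
print(est * math.sqrt(n))  # stays bounded as n grows
```

Since this is just one admissible path, its average cost is an upper bound on $M(n)$, and it already exhibits the $O(1/\sqrt{n})$ behavior.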

Golbez over 9 years @youcancallmeAl Thank you! This gives me a new viewpoint!

Golbez over 9 years@youcancallmeAl Thanks! Edited.

Did over 9 yearsNotation fixes: $m=n$? $E(n)=M(n)$?

Golbez over 9 years @Did $m\not= n$; rather $m=|B|$, where $B$ is a subset of $A$. Thanks for your correction, edited.

Golbez over 9 years The minimum is taken over all ordered subsets of $A$ (named $B$), and $m$ is the cardinality of the subset. Sorry for the ambiguity.

Did over 9 yearsThen $M(n)\geqslant1/m$ for every $n$ hence $M(n)$ does not go to zero when $n\to\infty$. Please explain.

Golbez over 9 years @Did Well, I mean that we first generate $A$, where $n$ points are distributed randomly. Then we select a "path" with smallest weight from $X$ to $Y$, all of whose points are selected from $A$, and $m$ is not fixed ($m$ is the number of points in the path of smallest weight). It is a number depending on $A$ (we could call it $m(A)$, but I don't care what it is; when two paths have the same weight, $m$ can be taken from either path). Then we take the expectation when $A$ is randomly generated.

Golbez over 9 years$m$ is not a number I decided earlier, maybe I should not write $m$ in the minimum symbol, sorry for that.

Did over 9 years Clearer now. Then @youcancallmeAl's reasoning with $\epsilon=1/\sqrt{n}$ indeed yields $\Theta(1/\sqrt{n})$ asymptotics (optimum over $\epsilon$ of $(n\epsilon)^{-1}+(\epsilon n)\epsilon^2$).
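Spelling out that optimization, with unnamed positive constants $c_1, c_2$ absorbed from the two terms:
$$f(\epsilon) = \frac{c_1}{n\epsilon} + c_2\, n\epsilon^3, \qquad f'(\epsilon) = -\frac{c_1}{n\epsilon^2} + 3c_2\, n\epsilon^2 = 0 \;\Longrightarrow\; \epsilon^* \propto n^{-1/2},$$
and substituting $\epsilon^*$ back gives both terms of order $n^{-1/2}$, i.e. $f(\epsilon^*) \propto 1/\sqrt{n}$, matching the claimed asymptotics.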

Golbez over 9 years@Did Ah yes, you are right, this gives an upper bound. And can you suggest some probable way to get a lower bound?
