# Why median?

**Stick Lengths.** So I solved this problem recently, and the solution is that the target length for the sticks is arrayOfLengths[len/2], aka the median. I just don’t get why the median works. Could anyone explain, or point me to a mathematical proof somewhere? Thank you!


I guess you could rephrase the problem as: given an increasing sequence (x_i)_{i=1}^{n}, find

\min_{k\in\mathbb{R}}\sum_{i=1}^{n}|k-x_i|.

Claim 1: We require k\in[x_1,x_n]

Proof: Assume some k_1<x_1 were optimal. Then its reflection across x_1, namely k=x_1+(x_1-k_1)=2x_1-k_1, always gives a strictly lower cost than k_1 ever could. Imagine a number line with x_1, k_1 and 2x_1-k_1, as follows:

k_1 ----------------x_1------------------(2x_1-k_1)---------------------------------------------------------x_n.

Clearly every x_j between x_1 and 2x_1-k_1 is closer to 2x_1-k_1 than to k_1, and the same holds for every x_j between 2x_1-k_1 and x_n. A very similar argument comparing some k_2>x_n against k=x_n-(k_2-x_n)=2x_n-k_2 proves the claim fully.

Therefore the required expression is just

\min_{x_1 \leqslant k \leqslant x_n}\sum_{i=1}^{n}|k-x_i|.
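Claim 1 is easy to sanity-check numerically. Here is a minimal sketch (the function name `cost` and the sample numbers are mine, not from the problem): reflecting an out-of-range k across the nearest endpoint never increases the cost.

```python
def cost(k, xs):
    """Total cost of moving every element of xs to k."""
    return sum(abs(k - x) for x in xs)

xs = [1, 4, 4, 7, 12]        # an increasing sequence x_1..x_n
k1 = -3                      # some k_1 < x_1
reflected = 2 * xs[0] - k1   # 2*x_1 - k_1, the reflection across x_1

# The reflected point is strictly cheaper, as Claim 1 predicts.
assert cost(reflected, xs) < cost(k1, xs)
```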

Claim 2: Consider some k=m such that x_j<m<x_{j+1}. Then this m leads to a lower cost than k=x_j iff j < n/2. Below is a visual representation:

x_1-------------…---------------x_j-----------------------------m--------------------x_{j+1}------…------x_n

Proof: Total cost C_{x_j} with k=x_j is

\sum_{i=1}^n |x_j-x_i|.

Total cost C_m with k=m is

j(m-x_j)-(n-j)(m-x_j)+\sum_{i=1}^n |x_j-x_i| \\= (2j-n)(m-x_j)+C_{x_j}.

The first 2 terms in the first line above come from considering that every x_i such that 1 \leqslant i \leqslant j will need to “travel” to x_j and then a further m-x_j to get to m, and every x_i such that j+1 \leqslant i \leqslant n will need to “travel” to x_j but then move back by m-x_j to get back to m. There are j and n-j such i respectively.

When the difference between the cost with m and the cost with x_j is negative, m gives a lower cost. Noting that m-x_j>0 by the definition of m,

C_m-C_{x_j} = (2j-n)(m-x_j) <0\iff j< n/2.

In other words, if x_j is an element of the sequence before the median element, then any point m as defined above will lower the cost. This proves the claim.
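The identity C_m - C_{x_j} = (2j-n)(m-x_j) can also be checked directly. A quick sketch (variable names and sample values are mine; note j here is 1-based to match the sequence notation):

```python
def cost(k, xs):
    """Total cost of moving every element of xs to k."""
    return sum(abs(k - x) for x in xs)

xs = [1, 4, 6, 7, 12]   # x_1..x_n, so n = 5
n = len(xs)
j = 2                   # 1-based: x_j = xs[j-1] = 4
m = 5.0                 # some m with x_j < m < x_{j+1}

lhs = cost(m, xs) - cost(xs[j - 1], xs)   # C_m - C_{x_j}
rhs = (2 * j - n) * (m - xs[j - 1])       # (2j - n)(m - x_j)
assert abs(lhs - rhs) < 1e-9

# Here j = 2 < n/2 = 2.5, so the difference is negative: m beats x_j.
assert lhs < 0
```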

Very similarly, you can prove that such an m is worse than the corresponding x_j iff j > n/2. Therefore taking j=n/2, so that k=x_{n/2}, i.e. making all elements of the sequence equal to the median, achieves the desired minimum. (When n is even, j=n/2 makes the difference above exactly zero, so every k between the two middle elements, in particular either of them, attains the same minimum; for odd n the median x_{\lceil n/2\rceil} is the unique minimiser, and it is exactly the 0-indexed arrayOfLengths[len/2].)

This is because, visually, we have

x_1------…-------x_{j_1}---P-----x_{j_1+1}----…-----x_{n/2}----…-----x_{j_2}----------x_{j_2+1}--------…----x_n,

and imagine trying to move P to the optimal point. You can always lower the cost by moving it towards x_{j_1+1}, then towards x_{j_1+2}, and so on until you reach the median; once you move past the median, moving back towards it lowers the cost again, so the process oscillates ever closer until you’re infinitesimally close to the median. Also note that, for fixed j, |(2j-n)(m-x_j)| increases with m, which shows the same thing another way.
