A $1000 \times 1000$ array is to be filled with integers between 1 and 1000 inclusive, with each integer appearing exactly 1000 times. What is the maximum value of $n$ such that there necessarily exists a row or column with at least $n$ distinct integers?
To minimize the number of distinct integers in any row or column, we cluster each integer as tightly as possible, packing all one thousand 1's into a near-square block, and likewise for the one thousand 2's, and so on. For example, we can put the 1's into a $31 \times 32$ almost-square (covering $992$ cells) with $8$ 1's left over, and place this block in the upper-left corner of the $1000 \times 1000$ grid. Continuing in this way, the grid is partitioned into 1000 near-square sections, and every row and column meets at most 32 different numbers, so no value of $n$ larger than 32 can be forced.

Conversely, 32 distinct integers can always be forced. Suppose the integer $i$ appears in $r_i$ rows and $c_i$ columns; its 1000 copies lie in the intersections of those lines, so $r_i c_i \ge 1000$ and hence $r_i + c_i \ge 2\sqrt{1000} > 63$, i.e. $r_i + c_i \ge 64$. Summing over all 1000 integers, the 2000 rows and columns carry at least $64000$ (line, integer) incidences in total, so some row or column contains at least $64000 / 2000 = 32$ distinct integers. Therefore the answer is $n = 32$.
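As an illustration (not part of the original solution), here is a minimal Python sketch of the clustering idea using a simpler uniform tiling: each integer fills one $25 \times 40$ rectangular block, so every row meets 25 blocks and every column meets 40. This already caps the distinct count at 40; the finer almost-square packing described above is what brings the cap down to 32.

```python
from collections import Counter

N = 1000
BLOCK_ROWS, BLOCK_COLS = 25, 40  # 25 * 40 = 1000 cells, one block per integer

# Build the tiling: 40 horizontal bands of 25 rows, 25 blocks per band,
# with block values 1..1000 assigned left to right, top to bottom.
grid = [[0] * N for _ in range(N)]
value = 1
for top in range(0, N, BLOCK_ROWS):
    for left in range(0, N, BLOCK_COLS):
        for r in range(top, top + BLOCK_ROWS):
            for c in range(left, left + BLOCK_COLS):
                grid[r][c] = value
        value += 1

# Sanity check: each integer 1..1000 appears exactly 1000 times.
counts = Counter(v for row in grid for v in row)
assert all(counts[v] == 1000 for v in range(1, N + 1))

# Each row crosses 25 blocks and each column crosses 40 blocks,
# so the maximum number of distinct values per line is 25 and 40.
row_max = max(len(set(row)) for row in grid)
col_max = max(len({grid[r][c] for r in range(N)}) for c in range(N))
print(row_max, col_max)  # -> 25 40
```

A uniform rectangular tiling cannot do better than 40 here: a $w \times h$ rectangle of 1000 cells would need $1000/w \le 32$ and $1000/h \le 32$, forcing $wh \ge 32 \times 32 > 1000$, which is why the optimal construction resorts to $31 \times 32$ almost-squares with leftover cells.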