> This algorithm has been in my mind for a long time, but I cannot find it described anywhere. It's so simple though that I can't be the only one who has thought of it. Here's
Contrary to what I read in other answers, this algorithm is actually quite popular for supersampling, at least in the image processing community.
It is implemented in Intel's Performance Primitives library under the name Super Sampling; the (rather uninformative) name is a way of saying that it is the library's only supersampling algorithm. In OpenCV it goes by the name INTER_AREA; it is listed among the other interpolation types, which could suggest they are interchangeable, but the documentation notes that "it may be a preferred method for image decimation", a rather conservative statement for my taste.
When you supersample an image by an integer factor, say by a factor of two, taking the mean of the underlying pixels for each resulting pixel (as done for example by scikit-image's downscale_local_mean) is indeed optimal, in a specific sense.
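As a minimal sketch of this integer-factor mean supersampling, here is a NumPy-only version (the helper name `downscale_mean` is mine; scikit-image's `downscale_local_mean` does the same job, with padding for non-divisible sizes):

```python
import numpy as np

def downscale_mean(img, factor):
    """Downscale a 2-D image by an integer factor, replacing each
    factor x factor block of pixels with its mean value."""
    h, w = img.shape
    # For simplicity this sketch assumes the dimensions divide evenly.
    assert h % factor == 0 and w % factor == 0
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
small = downscale_mean(img, 2)
# Each output pixel is the mean of a 2x2 block of input pixels;
# for this 4x4 ramp the result is [[2.5, 4.5], [10.5, 12.5]].
```

The `reshape` splits each axis into (blocks, within-block) pairs, so a single `mean` over the within-block axes computes all block means at once.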
Suppose your image is obtained by some quantitative measurement of a signal by a grid of receptors: a photograph or an X-ray, for instance, where each receptor counts photons. Pixel values are then proportional to the amount of signal received by the corresponding receptor.
If you assume that your camera is perfect — no spread of the signal, 100% coverage of the receiving area — then mean supersampling is optimal because it gives the exact image that would be received by a perfect camera with half the resolution.
Area averaging is the straightforward generalization of this optimal mean supersampling to non-integer ratios, which explains its popularity, although it cannot claim the same optimality for supersampling ratios other than integers.
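To make the generalization concrete, here is a 1-D sketch of area averaging for an arbitrary ratio: each output sample is the mean of the input over its footprint, with the input samples that straddle a footprint boundary weighted by their overlap. This is my own illustrative code, not the actual OpenCV INTER_AREA implementation:

```python
import numpy as np

def area_average_1d(x, n_out):
    """Resample a 1-D signal down to n_out samples by area averaging.
    Works for any (possibly non-integer) ratio n_in / n_out > 1."""
    n_in = len(x)
    ratio = n_in / n_out
    out = np.empty(n_out)
    for i in range(n_out):
        lo, hi = i * ratio, (i + 1) * ratio  # footprint of output sample i
        total = 0.0
        j = int(np.floor(lo))
        while j < hi:
            # Overlap of input sample j (covering [j, j+1)) with [lo, hi)
            w = min(j + 1, hi) - max(j, lo)
            total += w * x[j]
            j += 1
        out[i] = total / ratio
    return out
```

For an integer ratio this reduces exactly to the block mean above (e.g. `area_average_1d([0, 1, 2, 3], 2)` gives `[0.5, 2.5]`), and for a non-integer ratio the overlap weights preserve the total signal, which is precisely the receptor-counting interpretation of the previous paragraphs. The 2-D case applies the same weighting along each axis.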