Consider a black and white (binary) image.

What I am trying to do is to find the region with the highest concentration of white pixels.
You could create a sliding window (e.g. 10x10 pixels) that iterates over the image; for each position, count the number of white pixels inside this 10x10 field and store the positions with the highest counts.
This whole process is O(n*m), where n is the number of pixels in the image and m is the number of pixels in the sliding window.
In other words, you convolve the image with a mean filter (here a box filter) and then look at the maxima.
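A minimal sketch of this brute-force counting, assuming img is a 2D NumPy array of 0s and 1s (the function name best_window_naive, the NumPy representation, and the default window size are illustrative assumptions, not part of the description above):

    import numpy as np

    def best_window_naive(img, w=10):
        # Slide a w x w window over the image and keep the position
        # with the most white pixels.
        height, width = img.shape
        best_count, best_pos = -1, None
        for y in range(height - w + 1):
            for x in range(width - w + 1):
                count = int(img[y:y+w, x:x+w].sum())  # white pixels in this window
                if count > best_count:
                    best_count, best_pos = count, (x, y)
        return best_pos, best_count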
First, calculate a summed area table, which can be done very efficiently in a single pass:
Create a table sat with the same size as the original image img. Iterate over each index x and y and calculate (where out-of-range entries of sat count as 0)
sat[x, y] = img[x, y] + sat[x-1, y] + sat[x, y-1] - sat[x-1, y-1]
For example, given an image where 0 is dark and 1 is white, this is the result:
img            sat
0 0 0 1 0 0    0 0 0 1 1 1
0 0 0 1 0 0    0 0 0 2 2 2
0 1 1 1 0 0    0 1 2 5 5 5
0 1 0 0 0 0    0 2 3 6 6 6
0 0 0 0 0 0    0 2 3 6 6 6
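A minimal sketch of this single pass, again assuming a 0/1 NumPy image (the function name summed_area_table is illustrative):

    import numpy as np

    def summed_area_table(img):
        height, width = img.shape
        sat = np.zeros((height, width), dtype=np.int64)
        for y in range(height):
            for x in range(width):
                # Out-of-range neighbours count as 0.
                sat[y, x] = (img[y, x]
                             + (sat[y-1, x] if y > 0 else 0)
                             + (sat[y, x-1] if x > 0 else 0)
                             - (sat[y-1, x-1] if y > 0 and x > 0 else 0))
        return sat

With NumPy the same table can also be obtained with img.cumsum(axis=0).cumsum(axis=1).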
Now iterate over the summed area table's indices with a sliding window, and calculate the number of white pixels in it by using the corners A, B, C, D of the sliding window:
img            sat            window
0 0 0 1 0 0    0 0 0 1 1 1    0 A-----B 1
0 0 0 1 0 0    0 0 0 2 2 2    0 | 0 2 | 2
0 1 1 1 0 0    0 1 2 5 5 5    0 | 2 5 | 5
0 1 0 0 0 0    0 2 3 6 6 6    0 | 3 6 | 6
0 0 0 0 0 0    0 2 3 6 6 6    0 D-----C 6
Calculate
density(x', y') = sat(A) + sat(C) - sat(B) - sat(D)
Which in the above example is
density(1, 0) = 0 + 6 - 1 - 2 = 3
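A minimal sketch of evaluating this density for every window position, assuming the sat built above and a square window of w pixels; padding sat with a leading row and column of zeros (so that windows touching the top or left border are also covered) is omitted for brevity:

    import numpy as np

    def densities(sat, w):
        height, width = sat.shape
        out = np.zeros((height - w, width - w), dtype=np.int64)
        for y in range(height - w):
            for x in range(width - w):
                A = sat[y, x]          # just outside the window's top-left corner
                B = sat[y, x + w]      # top-right corner
                D = sat[y + w, x]      # bottom-left corner
                C = sat[y + w, x + w]  # bottom-right corner (inside the window)
                out[y, x] = A + C - B - D
        return out

The densest position is then, e.g., np.unravel_index(d.argmax(), d.shape) with d = densities(sat, w).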
This process requires a temporary image (the summed area table), but it is just O(n), so the speed is independent of the sliding window's size.