UPDATE:
Images that are projected onto the MKMapView using an MKOverlayView use the Mercator projection, while the image that I use as input data uses a different projection.
My guess is that Google Maps is stretching the image non-linearly to compensate for the map projection and that your code/Apple's code isn't.
One possible solution would be to subdivide the overlay image into smaller rectangles and call MKMapPointForCoordinate() separately for each rectangle. Then the result will be much closer to correct.
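A minimal sketch of that subdivision idea in Swift, under my own assumptions: the helper name, the grid size, and the idea that the source image is addressed by linear lat/lon bounds are all illustrative, not from the original answer. The answer only names MKMapPointForCoordinate(); its current Swift equivalent is the MKMapPoint(_:) initializer, which is what the sketch calls per cell corner.

```swift
import MapKit

/// Hypothetical helper: split the overlay's geographic bounds into a
/// gridSize x gridSize grid and compute a Mercator MKMapRect for each
/// cell. Drawing each cell as its own small overlay approximates the
/// non-linear stretch piecewise instead of with one big rectangle.
func mapRectsForSubdividedOverlay(topLeft: CLLocationCoordinate2D,
                                  bottomRight: CLLocationCoordinate2D,
                                  gridSize: Int = 8) -> [MKMapRect] {
    var rects: [MKMapRect] = []
    let latSpan = topLeft.latitude - bottomRight.latitude    // degrees, north to south
    let lonSpan = bottomRight.longitude - topLeft.longitude  // degrees, west to east

    for row in 0..<gridSize {
        for col in 0..<gridSize {
            // Corners of this cell, interpolated linearly in lat/lon
            // (i.e. in the input image's coordinate space).
            let cellTopLeft = CLLocationCoordinate2D(
                latitude: topLeft.latitude - latSpan * Double(row) / Double(gridSize),
                longitude: topLeft.longitude + lonSpan * Double(col) / Double(gridSize))
            let cellBottomRight = CLLocationCoordinate2D(
                latitude: topLeft.latitude - latSpan * Double(row + 1) / Double(gridSize),
                longitude: topLeft.longitude + lonSpan * Double(col + 1) / Double(gridSize))

            // Convert each corner to Mercator map points separately;
            // this per-cell conversion is where the correction happens.
            let p1 = MKMapPoint(cellTopLeft)
            let p2 = MKMapPoint(cellBottomRight)
            rects.append(MKMapRect(x: min(p1.x, p2.x),
                                   y: min(p1.y, p2.y),
                                   width: abs(p2.x - p1.x),
                                   height: abs(p2.y - p1.y)))
        }
    }
    return rects
}
```

Each returned MKMapRect would then back one sub-overlay drawing the matching slice of the source image; a finer grid trades more overlays for a smaller residual projection error within each cell.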