I was wondering if there is a way to determine if an image is blurry or not by analyzing the image data.
One approach I'm currently using measures the spread of edges in the image. See this paper:
@ARTICLE{Marziliano04perceptualblur,
  author  = {Pina Marziliano and Frederic Dufaux and Stefan Winkler and Touradj Ebrahimi},
  title   = {Perceptual blur and ringing metrics: Application to {JPEG2000}},
  journal = {Signal Processing: Image Communication},
  year    = {2004},
  pages   = {163--172}
}
It's usually behind a paywall, but I've seen some free copies around. Basically, they locate vertical edges in the image and then measure how wide those edges are. Averaging those widths gives the final blur estimate for the image: wider edges correspond to blurrier images, and vice versa.
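If it helps, here's a rough Python sketch of this kind of edge-width metric. It is not the authors' code: the Sobel edge detector, the gradient threshold, and the extrema-walking details are choices of mine, not necessarily what the paper does.

```python
import numpy as np
from PIL import Image
from scipy import ndimage

def edge_width_blur_metric(path, grad_threshold=20.0):
    """Average width (in pixels) of vertical edges; larger means blurrier."""
    gray = np.asarray(Image.open(path).convert("L"), dtype=float)

    # Locate vertical edges via the horizontal Sobel gradient.
    grad_x = ndimage.sobel(gray, axis=1)
    edge_mask = np.abs(grad_x) > grad_threshold

    widths = []
    rows, cols = gray.shape
    for r in range(rows):
        row = gray[r]
        for c in np.flatnonzero(edge_mask[r]):
            s = np.sign(grad_x[r, c])
            # Walk left/right from the edge pixel until the intensity profile
            # stops falling/rising; the distance between the two local extrema
            # is taken as the edge width.
            left = c
            while left > 0 and (row[left - 1] - row[left]) * s < 0:
                left -= 1
            right = c
            while right < cols - 1 and (row[right + 1] - row[right]) * s > 0:
                right += 1
            widths.append(right - left)

    return float(np.mean(widths)) if widths else 0.0
```

Dividing the result by the image width would give the size normalization mentioned further down.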
This problem belongs to the field of no-reference image quality estimation. If you look it up on Google Scholar, you'll get plenty of useful references.
EDIT
Here's a plot of the blur estimates obtained for the 5 images in nikie's post; higher values correspond to greater blur. I used a fixed-size 11x11 Gaussian filter and varied the standard deviation, using ImageMagick's convert command to obtain the blurred images.
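To reproduce a similar sweep without ImageMagick, you could generate the blurred variants in Python as well. This is only a sketch: scipy's gaussian_filter stands in for convert, the sigma values and the file name "reference.png" are placeholders, and edge_width_blur_metric is the function sketched above.

```python
import numpy as np
from PIL import Image
from scipy import ndimage

gray = np.asarray(Image.open("reference.png").convert("L"), dtype=float)

for sigma in (0.5, 1.0, 2.0, 3.0, 4.0):
    # truncate is chosen so the kernel radius stays at 5, i.e. a fixed 11x11 filter.
    blurred = ndimage.gaussian_filter(gray, sigma=sigma, truncate=5.0 / sigma)
    name = f"blurred_{sigma}.png"
    Image.fromarray(np.clip(blurred, 0, 255).astype(np.uint8)).save(name)
    print(name, edge_width_blur_metric(name))
```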
If you compare images of different sizes, don't forget to normalize by the image width, since larger images will have wider edges.
Finally, a significant problem is distinguishing between artistic blur and undesired blur (caused by missed focus, compression, or motion of the subject relative to the camera), but that is beyond simple approaches like this one. For an example of artistic blur, have a look at the Lenna image: Lenna's reflection in the mirror is blurry, but her face is perfectly in focus. This contributes to a higher blur estimate for the Lenna image.