I have an application that I am developing for the iPhone. It captures video from the camera and stores the video file on the file system.
I would like to generate a thumbnail image from that saved video file. How can I do this?
A better solution is actually to use the AVFoundation framework for this. It bypasses the need to construct an MPMoviePlayerController, which causes the camera's iris to remain closed if it is used in conjunction with UIImagePickerController (at least that's what I experienced).
The code I use:
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

+ (UIImage *)thumbnailFromVideoAtURL:(NSURL *)contentURL {
    UIImage *theImage = nil;

    // Load the video file as an asset and create an image generator for it.
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];

    // Apply the track's preferred transform so the thumbnail is oriented correctly.
    generator.appliesPreferredTrackTransform = YES;

    // Grab a frame from near the start of the video (1/60 of a second in).
    NSError *err = nil;
    CMTime time = CMTimeMake(1, 60);
    CGImageRef imgRef = [generator copyCGImageAtTime:time actualTime:NULL error:&err];
    theImage = [[[UIImage alloc] initWithCGImage:imgRef] autorelease];

    // Manual memory management (pre-ARC code).
    CGImageRelease(imgRef);
    [asset release];
    [generator release];

    return theImage;
}
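
To give an idea of how it is called, here is a minimal sketch. VideoThumbnailHelper is just a placeholder for whatever class you put the method on, and the file path is assumed to be wherever your app saved the captured movie:

    // Hypothetical usage (placeholder class name, file path, and image view):
    NSString *moviePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"captured.mov"];
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    UIImage *thumbnail = [VideoThumbnailHelper thumbnailFromVideoAtURL:movieURL];
    thumbnailImageView.image = thumbnail; // an existing UIImageView in your UI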