I have a UIImagePickerController being called with sourceType camera, and about 80% of the time I get a black preview. If I wait, say around 30 seconds, I get a good preview.
In my case I had to move some methods to the main thread.

My app creates a new image with a new context. I can create that context on a background thread and use the CGContext functions (like CGContextScaleCTM, CGContextTranslateCTM, or CGContextConcatCTM) and [uiimage drawInRect:Mybounds]; there without problems.

Basically, I had to move the renderInContext: calls to the main thread when drawing the layers into the context:
    CGSize sizeView = viewPrintBase.frame.size;
    UIGraphicsBeginImageContext(sizeView);
    CGContextRef currentContext = UIGraphicsGetCurrentContext();

    // -renderInContext: touches the layer tree, so it must run on the main thread.
    dispatch_sync(dispatch_get_main_queue(), ^{
        [viewPrintBase.layer renderInContext:currentContext];
        [imageViewPhotoUser.layer renderInContext:currentContext];
        if (imageViewMask) {
            [imageViewMask.layer renderInContext:currentContext];
        }
    });

    [imageSnapShotDrawinfView drawInRect:viewPrintBase.bounds];

    UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGImageRef imageRef = CGImageCreateWithImageInRect([finalImage CGImage], viewPrintBase.bounds);
    // or use the UIImage wherever you like
    imageSnapShot = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
Also, I had to move all UI object creation to the main thread, for example:
    __block UIView *viewPrintBase;
    dispatch_sync(dispatch_get_main_queue(), ^{
        viewPrintBase = [[UIView alloc] initWithFrame:CGRectMake(0, 0, WIDTH_FINAL_IMAGE, HEIGHT_FINAL_IMAGE)];
        viewPrintBase.backgroundColor = [UIColor whiteColor];
        [viewPrintBase setClipsToBounds:YES];
        // We get the user's photo here
    });
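The same pattern can be wrapped in a small helper. This is just a sketch under my assumptions (the SnapshotOfView name is mine, not from the code above): it must only be called from a background queue, because dispatch_sync to the main queue from the main thread would deadlock.

```objectivec
// Hypothetical helper: renders a view's layer into a UIImage.
// The bitmap context can be created on a background thread; only the
// layer rendering itself is hopped over to the main thread.
UIImage *SnapshotOfView(UIView *view) {
    UIGraphicsBeginImageContext(view.bounds.size);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // -renderInContext: reads the layer tree, so run it on the main thread.
    dispatch_sync(dispatch_get_main_queue(), ^{
        [view.layer renderInContext:context];
    });

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```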
I hope this helps :)