Question
I'm trying to blend a background image with a foreground image, where the foreground is a transparent image with lines on it.
This is how I'm trying to do it:
UIGraphicsBeginImageContext(CGSizeMake(320, 480));
CGContextRef context = UIGraphicsGetCurrentContext();
// create rect that fills screen
CGRect bounds = CGRectMake( 0,0, 320, 480);
// This is my bkgnd image
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"bkgnd.jpg"].CGImage);
CGContextSetBlendMode(context, kCGBlendModeSourceIn);
// This is my image to blend in
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"over.png"].CGImage);
UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
UIImageWriteToSavedPhotosAlbum(outputImage, self, nil, nil);
// clean up drawing environment
UIGraphicsEndImageContext();
but it does not seem to work.
Any suggestions would be appreciated.
Answer 1:
This is what I've done in my app, similar to Tyler's, but without the UIImageView:
UIImage *bottomImage = [UIImage imageNamed:@"bottom.png"];
UIImage *image = [UIImage imageNamed:@"top.png"];
CGSize newSize = CGSizeMake(width, height);
UIGraphicsBeginImageContext( newSize );
// Use existing opacity as is
[bottomImage drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
// Apply supplied opacity
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height) blendMode:kCGBlendModeNormal alpha:0.8];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
If the image already has opacity, you do not need to set it (as with bottomImage); otherwise you can set it (as with image).
Answer 2:
UIImage* bottomImage = [UIImage imageNamed:@"bottom.png"];
UIImage* topImage = [UIImage imageNamed:@"top.png"];
UIImageView* imageView = [[UIImageView alloc] initWithImage:bottomImage];
UIImageView* subView = [[UIImageView alloc] initWithImage:topImage];
subView.alpha = 0.5; // Customize the opacity of the top image.
[imageView addSubview:subView];
UIGraphicsBeginImageContext(imageView.frame.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* blendedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[subView release];
[imageView release];
[self doWhateverIWantWith: blendedImage];
Answer 3:
My answer is based on Eric's answer, but it allows @2x images to retain their resolution after the merge. Please note the URL REFs in my comments, acknowledging the sources that helped me develop this function, which I use in my iOS apps.
- (UIImage *)mergeTwoImages:(UIImage *)topImage :(UIImage *)bottomImage
{
    // URL REF: http://iphoneincubator.com/blog/windows-views/image-processing-tricks
    // URL REF: https://stackoverflow.com/questions/1309757/blend-two-uiimages?answertab=active#tab-top
    // URL REF: http://www.waterworld.com.hk/en/blog/uigraphicsbeginimagecontext-and-retina-display
    int width = bottomImage.size.width;
    int height = bottomImage.size.height;
    CGSize newSize = CGSizeMake(width, height);

    static CGFloat scale = -1.0;
    if (scale < 0.0)
    {
        UIScreen *screen = [UIScreen mainScreen];
        if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 4.0)
        {
            scale = [screen scale];
        }
        else
        {
            scale = 0.0; // Use the standard API
        }
    }

    if (scale > 0.0)
    {
        UIGraphicsBeginImageContextWithOptions(newSize, NO, scale);
    }
    else
    {
        UIGraphicsBeginImageContext(newSize);
    }

    [bottomImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    [topImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height) blendMode:kCGBlendModeNormal alpha:1.0];

    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
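Since the selector uses an unnamed second argument, a call site would look something like this (topImage and bottomImage are hypothetical UIImage variables, not part of the answer above):
// e.g. merge an overlay onto a photo at the photo's native size
UIImage *merged = [self mergeTwoImages:topImage :bottomImage];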
Answer 4:
Blending with alpha
UIGraphicsBeginImageContext(area.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextRetain(context);
// mirroring context
CGContextTranslateCTM(context, 0.0, area.size.height);
CGContextScaleCTM(context, 1.0, -1.0);
for (...) {
    CGContextBeginTransparencyLayer(context, nil);
    CGContextSetAlpha(context, alpha);
    CGContextDrawImage(context, area, tempimg.CGImage);
    CGContextEndTransparencyLayer(context);
}
// get created image
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
CGContextRelease(context);
UIGraphicsEndImageContext();
Answer 5:
Swift 3
This function takes two images and a CGSize and returns an optional UIImage. It works best when both images are the same size. If your top image has alpha, the bottom image will show through it.
// composit two images
func compositeTwoImages(top: UIImage, bottom: UIImage, newSize: CGSize) -> UIImage? {
// begin context with new size
UIGraphicsBeginImageContextWithOptions(newSize, false, 0.0)
// draw images to context
bottom.draw(in: CGRect(origin: CGPoint.zero, size: newSize))
top.draw(in: CGRect(origin: CGPoint.zero, size: newSize))
// return the new image
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
// returns an optional
return newImage
}
Usage
let outputSize = CGSize(width: 100, height: 100)
if let topImage = UIImage(named: "myTopImage") {
if let bottomImage = UIImage(named: "myBottomImage") {
// composite both images
if let finalImage = compositeTwoImages(top: topImage, bottom: bottomImage, newSize: outputSize) {
// do something with finalImage
}
}
}
Answer 6:
Can you provide detail on what you mean by "does not seem to work"? Does it draw only one image or the other? Draw black? Noise? Crash? Why have you chosen kCGBlendModeSourceIn; what effect are you trying to achieve (there are dozens of ways to blend images)? Do either of your images have alpha already?
I assume what you're trying to do is mix two images such that each has 50% opacity? Use CGContextSetAlpha() for that rather than CGContextSetBlendMode().
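For example, a minimal sketch of that approach, reusing the image names from the question (the 0.5 alpha and the fixed 320x480 size are assumptions):
UIGraphicsBeginImageContext(CGSizeMake(320, 480));
CGContextRef context = UIGraphicsGetCurrentContext();
CGRect bounds = CGRectMake(0, 0, 320, 480);
// draw the background at full opacity
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"bkgnd.jpg"].CGImage);
// draw the overlay at 50% opacity instead of changing the blend mode
CGContextSetAlpha(context, 0.5);
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"over.png"].CGImage);
UIImage *blended = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Note that CGContextDrawImage uses Core Graphics coordinates, so the result may come out vertically flipped; drawing with UIImage's drawInRect: (as in the other answers) avoids that.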
Answer 7:
You can use UIImage's drawInRect: or drawAtPoint: instead of CGContextDrawImage (they draw into the current context). Does using them give you any difference in output?
It may also be helpful to make sure the UIImage* values you are getting back from imageNamed: are valid.
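A minimal sketch of that suggestion, assuming the question's image names and a screen-sized output:
UIImage *background = [UIImage imageNamed:@"bkgnd.jpg"];
UIImage *overlay = [UIImage imageNamed:@"over.png"];
// make sure imageNamed: actually found both resources
NSAssert(background != nil && overlay != nil, @"Could not load one of the images");
UIGraphicsBeginImageContext(CGSizeMake(320, 480));
CGRect bounds = CGRectMake(0, 0, 320, 480);
// drawInRect: draws into the current context, right side up,
// and respects the PNG's existing alpha channel
[background drawInRect:bounds];
[overlay drawInRect:bounds];
UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();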
Source: https://stackoverflow.com/questions/1309757/blend-two-uiimages-based-on-alpha-transparency-of-top-image