core-graphics

Any idea why this image masking code does not work?

时光怂恿深爱的人放手 · Submitted on 2019-12-18 02:46:15
Question: I have this code to mask an image. I only work with PNG images. I have a 300x400 PNG image with 24 bits of color (PNG-24); I am not sure whether it also has an alpha channel, but there is no transparency in it. Then there is the image mask, an 8-bit PNG without an alpha channel that is just black, white, and shades of gray. I create both images as UIImages, and both display correctly when put into a UIImageView. Then I create a UIImage out of them which contains the result of the
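The original snippet is cut off above, so here is a minimal sketch of the usual CGImage masking approach, with hypothetical file names "photo.png" and "mask.png". A frequent cause of masking silently failing is the mask PNG being decoded with an alpha channel, so this sketch rebuilds it as a pure image mask first.

import UIKit

// A sketch, not the asker's original code: mask one CGImage with another.
// "photo.png" stands in for the 300x400 PNG-24 image, "mask.png" for the 8-bit grayscale mask.
func maskedImage(named imageName: String, maskNamed maskName: String) -> UIImage? {
    guard
        let image = UIImage(named: imageName)?.cgImage,
        let maskSource = UIImage(named: maskName)?.cgImage,
        let dataProvider = maskSource.dataProvider
    else { return nil }

    // CGImage masks must be grayscale with no alpha channel; rebuild the mask explicitly
    // so it is accepted even if the PNG decoder attached alpha.
    guard let mask = CGImage(
        maskWidth: maskSource.width,
        height: maskSource.height,
        bitsPerComponent: maskSource.bitsPerComponent,
        bitsPerPixel: maskSource.bitsPerPixel,
        bytesPerRow: maskSource.bytesPerRow,
        provider: dataProvider,
        decode: nil,
        shouldInterpolate: false
    ) else { return nil }

    guard let result = image.masking(mask) else { return nil }
    return UIImage(cgImage: result)
}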

CG Gradient runs on simulator, but not on iPhone

安稳与你 · Submitted on 2019-12-18 01:13:36
Question: I have code that compiles without problems. It runs fine on the iPhone Simulator, but on my device I get an EXC_BAD_ACCESS. This happens in a helper function that draws a gradient; I followed this tutorial to write it. The code I have is as follows: - (void)drawRect:(CGRect)rect { CGContextRef context = UIGraphicsGetCurrentContext(); CGColorRef whiteColor = [UIColor whiteColor].CGColor; CGColorRef lightGrayColor = [UIColor colorWithRed:230.0/255.0 green:230.0/255.0 blue:230.0/255.0 alpha:1.0]
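The snippet above is truncated, so here is a minimal sketch in Swift (not the asker's Objective-C) of the same two-stop gradient drawn in draw(_:). Keeping the UIColor values alive in locals sidesteps the classic EXC_BAD_ACCESS where a CGColor outlives the autoreleased UIColor it came from.

import UIKit

// A sketch of a white-to-light-gray vertical gradient drawn with Core Graphics.
class GradientView: UIView {
    override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext() else { return }

        // Keep the UIColor objects in locals so their CGColors stay valid while drawing.
        let white = UIColor.white
        let lightGray = UIColor(red: 230/255, green: 230/255, blue: 230/255, alpha: 1)
        let colors = [white.cgColor, lightGray.cgColor] as CFArray
        let locations: [CGFloat] = [0, 1]

        guard let gradient = CGGradient(colorsSpace: CGColorSpaceCreateDeviceRGB(),
                                        colors: colors,
                                        locations: locations) else { return }

        context.drawLinearGradient(gradient,
                                   start: CGPoint(x: rect.midX, y: rect.minY),
                                   end: CGPoint(x: rect.midX, y: rect.maxY),
                                   options: [])
    }
}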

How to reproduce this Xcode blue drag line

有些话、适合烂在心里 · Submitted on 2019-12-17 23:46:09
Question: I'd like to reproduce the Xcode blue drag line in my app. Do you know a way to code this? I know how to draw a line using Core Graphics, but this line has to be on top of all other items on the screen. Answer 1: I'm posting this after you've posted your own answer, so this is probably a huge waste of time. But your answer only covers drawing a really bare-bones line on the screen and doesn't cover a bunch of other interesting stuff that you need to take care of to really replicate
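For reference, here is a bare-bones sketch of the "on top of everything" part only: attaching a CAShapeLayer to the window's layer so the line sits above all view controller content. It deliberately ignores the rounded end caps, shadow, and live tracking that a faithful Xcode-style connector needs; the function name is made up for illustration.

import UIKit

// A sketch: draw a blue line above all other views by adding a layer to the window.
func drawConnector(from start: CGPoint, to end: CGPoint, in window: UIWindow) -> CAShapeLayer {
    let path = UIBezierPath()
    path.move(to: start)
    path.addLine(to: end)

    let line = CAShapeLayer()
    line.path = path.cgPath
    line.strokeColor = UIColor.systemBlue.cgColor
    line.lineWidth = 3
    line.lineCap = .round
    line.fillColor = nil

    // The window's layer sits above every view controller's content.
    window.layer.addSublayer(line)
    return line
}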

Turning an NSImage* into a CGImageRef?

我们两清 · Submitted on 2019-12-17 22:25:45
Question: Is there an easy way to do this that works in 10.5? In 10.6 I can use [nsImage CGImageForProposedRect:NULL context:NULL hints:NULL]. If I'm not using 1-bit black-and-white images (like Group 4 TIFF), I can use bitmaps, but CGBitmaps don't seem to like that setup... Is there a general way of doing this? I need it because I have an IKImageView that seems to only want to take CGImages, but all I've got are NSImages. Currently I'm using a private setImage:(NSImage*) method that I'd REALLY
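One commonly used route that works back to 10.5 is to round-trip through the image's TIFF data and pull the CGImage out of an NSBitmapImageRep; a minimal Swift sketch of that idea follows (on 10.6+ the cgImage(forProposedRect:context:hints:) one-liner is simpler).

import AppKit

// A sketch: convert an NSImage to a CGImage via its TIFF representation.
func cgImage(from nsImage: NSImage) -> CGImage? {
    guard
        let tiffData = nsImage.tiffRepresentation,
        let bitmap = NSBitmapImageRep(data: tiffData)
    else { return nil }
    return bitmap.cgImage   // NSBitmapImageRep.cgImage is available since macOS 10.5
}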

How to Convert UIImage to CIImage and vice versa

╄→гoц情女王★ · Submitted on 2019-12-17 21:49:02
Question: I'm trying to get a CIImage from the UIImage displayed in an image view on screen. UIImage *image = myImageView.image; then convert that UIImage to a CIImage: CIImage *cImage = [....]; How do I do this? Answer 1: CIImage *ciImage = [UIImage imageNamed:@"test.png"].CIImage; UIImage *uiImage = [[UIImage alloc] initWithCIImage:ciImage]; To handle the case where myUIImage.CIImage returns nil, as it does for [UIImageView image], you can instead do [CIImage imageWithCGImage:myUIImage.CGImage]. – Dylan Hand Swift version: let ciImage =
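The Swift version in the answer is cut off above; here is a sketch that pulls the pieces together, including the CGImage fallback for when UIImage.ciImage is nil (which is the usual case for an image taken from a UIImageView).

import UIKit
import CoreImage

// A sketch of both conversion directions.
func ciImage(from uiImage: UIImage) -> CIImage? {
    // ciImage is only non-nil when the UIImage was created from a CIImage.
    if let ci = uiImage.ciImage {
        return ci
    }
    guard let cg = uiImage.cgImage else { return nil }
    return CIImage(cgImage: cg)
}

func uiImage(from ciImage: CIImage) -> UIImage {
    // Rendering through a CIContext gives a UIImage backed by a real bitmap,
    // which displays reliably in a UIImageView.
    let context = CIContext()
    if let cg = context.createCGImage(ciImage, from: ciImage.extent) {
        return UIImage(cgImage: cg)
    }
    return UIImage(ciImage: ciImage)
}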

Adding a circle mask layer on an UIImageView

早过忘川 · Submitted on 2019-12-17 21:26:21
Question: I'm building a photo filter app (like Instagram, Camera+ and many more). My main screen is a UIImageView that presents the image to the user, plus a bottom bar with some filters and other options. One of the options is blur, where the user can pinch or move a circle that represents the non-blurred part (radius and position); all the pixels outside this circle will be blurred. When the user touches the screen I want to add a semi-transparent layer above my image that
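A minimal sketch of the semi-transparent overlay with a circular hole: a CAShapeLayer whose path is the full bounds rectangle plus the circle, filled with the even-odd rule so the circle is punched out. The function and parameter names are illustrative, and blurring the outside region itself is a separate step.

import UIKit

// A sketch: dimming overlay with a circular cut-out over an image view.
func addDimmingOverlay(to imageView: UIImageView, holeCenter: CGPoint, radius: CGFloat) -> CAShapeLayer {
    let path = UIBezierPath(rect: imageView.bounds)
    path.append(UIBezierPath(ovalIn: CGRect(x: holeCenter.x - radius,
                                            y: holeCenter.y - radius,
                                            width: radius * 2,
                                            height: radius * 2)))
    path.usesEvenOddFillRule = true

    let overlay = CAShapeLayer()
    overlay.path = path.cgPath
    overlay.fillRule = .evenOdd          // even-odd leaves the circle transparent
    overlay.fillColor = UIColor(white: 0, alpha: 0.5).cgColor
    imageView.layer.addSublayer(overlay)
    return overlay
}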

Trouble using callbacks with CGPattern in Swift3

﹥>﹥吖頭↗ · Submitted on 2019-12-17 20:37:37
Question: I'm attempting to create a colored pattern using CGPattern in Swift. Apple provides a nice Objective-C example in the Quartz 2D Programming Guide, in the section on Painting Colored Patterns, but converting all of that syntax from Objective-C is not straightforward. Plus, I'd like to make use of the info parameter in the drawing callback, and there is no example of doing that. Here's my first attempt: class SomeShape { func createPattern() -> CGPattern? { let bounds = CGRect(x: 0, y: 0
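The attempt above is cut off, so here is a sketch of one way the colored-pattern callbacks can look in Swift, passing self through the info pointer with Unmanaged. The 20x20 two-square drawing is illustrative, not the pattern from the question.

import UIKit

// A sketch: a colored CGPattern whose draw callback recovers `self` from the info pointer.
final class SomeShape {
    func createPattern() -> CGPattern? {
        let bounds = CGRect(x: 0, y: 0, width: 20, height: 20)
        var callbacks = CGPatternCallbacks(
            version: 0,
            drawPattern: { info, context in
                // C function pointers cannot capture state, so recover the instance from info.
                let shape = Unmanaged<SomeShape>.fromOpaque(info!).takeUnretainedValue()
                shape.draw(in: context)
            },
            releaseInfo: { info in
                // Balance the retain taken when the pattern was created.
                Unmanaged<SomeShape>.fromOpaque(info!).release()
            })

        return CGPattern(
            info: Unmanaged.passRetained(self).toOpaque(),
            bounds: bounds,
            matrix: .identity,
            xStep: bounds.width,
            yStep: bounds.height,
            tiling: .constantSpacing,
            isColored: true,   // colored pattern: the callback sets its own fill colors
            callbacks: &callbacks)
    }

    private func draw(in context: CGContext) {
        context.setFillColor(UIColor.red.cgColor)
        context.fill(CGRect(x: 0, y: 0, width: 10, height: 10))
        context.setFillColor(UIColor.blue.cgColor)
        context.fill(CGRect(x: 10, y: 10, width: 10, height: 10))
    }
}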

Drawing rounded rect in core graphics

♀尐吖头ヾ · Submitted on 2019-12-17 20:00:51
Question: I want to replicate the event markers of the default iPad Calendar, which look like this: [screenshot] I'm trying to use Core Graphics for this, painting a path of a rounded rect. This is the result I could come up with: [screenshot] As you can see, the iPad's version looks much smoother on the rounded corners. I tried using a bigger line width, which looks like this: [screenshot] My code looks like this (I got it from this website): UIColor* fillColor = [self.color colorByMultiplyingByRed:1 green:1 blue:1 alpha:0.2];
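Since the original code is truncated, here is a minimal Swift sketch of one common way to get smoother corners: let UIBezierPath build the arcs and inset the stroke by half the line width so it isn't clipped at the view's edge (a frequent cause of rough-looking rounded rects). The class name and 0.2-alpha fill are illustrative.

import UIKit

// A sketch: a calendar-style event marker drawn with a rounded UIBezierPath.
class EventMarkerView: UIView {
    var color: UIColor = .blue

    override func draw(_ rect: CGRect) {
        let lineWidth: CGFloat = 2
        let path = UIBezierPath(
            roundedRect: bounds.insetBy(dx: lineWidth / 2, dy: lineWidth / 2),
            cornerRadius: 4)
        path.lineWidth = lineWidth

        color.withAlphaComponent(0.2).setFill()
        path.fill()
        color.setStroke()
        path.stroke()
    }
}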

Which CGImageAlphaInfo should we use?

…衆ロ難τιáo~ · Submitted on 2019-12-17 18:43:28
Question: The Quartz 2D Programming Guide defines the availability of the various alpha storage modes. Which ones should we use for RGB contexts, and why? For non-opaque contexts, kCGImageAlphaPremultipliedFirst or kCGImageAlphaPremultipliedLast? For opaque contexts, kCGImageAlphaNoneSkipFirst or kCGImageAlphaNoneSkipLast? Does the choice of value affect performance? Typically I see kCGImageAlphaPremultipliedFirst for non-opaque contexts and kCGImageAlphaNoneSkipFirst for opaque contexts. Some state that
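For concreteness, here is a sketch of the two context configurations the question is weighing, paired with little-endian byte order as is commonly recommended for BGRA-friendly bitmap contexts; whether these are the fastest combinations on a given device is exactly the performance question being asked, not something this sketch settles.

import CoreGraphics

// A sketch: build an RGB bitmap context using the alpha modes under discussion.
func makeBitmapContext(width: Int, height: Int, opaque: Bool) -> CGContext? {
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let alphaInfo: CGImageAlphaInfo = opaque ? .noneSkipFirst : .premultipliedFirst
    let bitmapInfo = alphaInfo.rawValue | CGBitmapInfo.byteOrder32Little.rawValue

    return CGContext(data: nil,
                     width: width,
                     height: height,
                     bitsPerComponent: 8,
                     bytesPerRow: 0,          // 0 lets Quartz choose the row stride
                     space: colorSpace,
                     bitmapInfo: bitmapInfo)
}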

What's the difference and compatibility of CGLayer and CALayer?

放肆的年华 · Submitted on 2019-12-17 17:26:00
Question: I'm confused by CGLayer and CALayer. They look similar, so why are there separate implementations? What's the difference between, and compatibility of, CGLayer and CALayer? Answer 1: They are completely different, and not compatible. To be absolutely clear: it is strictly a coincidence that the word "layer" is used in both names; they are completely unrelated. CGLayers are a "special", "high performance" thingy. You could consider them "like bitmaps, but better." Apple sat down and said "We're
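To make the "like bitmaps, but better" idea concrete, here is a small sketch of the typical CGLayer usage pattern: draw something once into the layer's own context, then stamp it into the destination context repeatedly. The function and values are illustrative only.

import UIKit

// A sketch: reuse a CGLayer to stamp the same drawing many times.
func stampCircles(into context: CGContext, count: Int) {
    let size = CGSize(width: 40, height: 40)
    guard let layer = CGLayer(context, size: size, auxiliaryInfo: nil),
          let layerContext = layer.context else { return }

    // Draw the reusable content once, into the CGLayer's own context.
    layerContext.setFillColor(UIColor.orange.cgColor)
    layerContext.fillEllipse(in: CGRect(origin: .zero, size: size))

    // Stamp it repeatedly; Quartz can cache the layer's contents between draws.
    for i in 0..<count {
        context.draw(layer, at: CGPoint(x: CGFloat(i) * 50, y: 0))
    }
}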