core-graphics

How to remove the transparent area of a UIImageView after masking?

那年仲夏 submitted on 2019-12-05 05:37:47
Question: In one of my iOS applications, I am trying to cut out a portion of an image using a CGImageMask. I have succeeded in masking the image with the following code:

- (UIImage *)maskImage:(UIImage *)referenceImage withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    …
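A minimal sketch of one way to handle the question in the title, assuming the masked result is surrounded by fully transparent pixels: render the image into an alpha-only bitmap, find the bounding box of the non-zero alpha values, and crop with CGImageCreateWithImageInRect. The helper name trimmedImageByCroppingTransparentArea is hypothetical.

#import <UIKit/UIKit.h>

static UIImage *trimmedImageByCroppingTransparentArea(UIImage *image) {
    CGImageRef cgImage = image.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    // Render only the alpha channel into a one-byte-per-pixel buffer.
    uint8_t *alpha = calloc(width * height, 1);
    CGContextRef ctx = CGBitmapContextCreate(alpha, width, height, 8, width,
                                             NULL, kCGImageAlphaOnly);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(ctx);

    // Scan for the bounding box of pixels that are not fully transparent.
    size_t minX = width, minY = height, maxX = 0, maxY = 0;
    for (size_t y = 0; y < height; y++) {
        for (size_t x = 0; x < width; x++) {
            if (alpha[y * width + x] > 0) {
                if (x < minX) minX = x;
                if (x > maxX) maxX = x;
                if (y < minY) minY = y;
                if (y > maxY) maxY = y;
            }
        }
    }
    free(alpha);
    if (maxX < minX || maxY < minY) return image;   // image is fully transparent

    CGRect crop = CGRectMake(minX, minY, maxX - minX + 1, maxY - minY + 1);
    CGImageRef cropped = CGImageCreateWithImageInRect(cgImage, crop);
    UIImage *result = [UIImage imageWithCGImage:cropped
                                          scale:image.scale
                                    orientation:image.imageOrientation];
    CGImageRelease(cropped);
    return result;
}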

Obtaining modifier key pressed in CGEvent tap

筅森魡賤 submitted on 2019-12-05 05:36:44
Question: Having set up an event tap, I am not able to identify which modifier key was pressed, given a CGEvent.

CGEventFlags flagsP;
flagsP = CGEventGetFlags(event);
NSLog(@"flags: 0x%llX", flagsP);
NSLog(@"stored: 0x%llX", kCGEventFlagMaskCommand);
if (flagsP == kCGEventFlagMaskCommand) {
    NSLog(@"command pressed");
}

Given the above snippet, the first NSLog prints a different value from the second NSLog, so it is no surprise that the conditional is never triggered when the Command modifier key is pressed. I need to reliably detect which modifier key was pressed.
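A minimal sketch of the usual fix, offered as an assumption rather than the accepted answer: CGEventFlags is a bit mask that can carry several flags at once (including device-dependent bits), so individual modifiers should be tested with a bitwise AND instead of ==. The callback name modifierLoggingTapCallback is hypothetical.

#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

static CGEventRef modifierLoggingTapCallback(CGEventTapProxy proxy, CGEventType type,
                                             CGEventRef event, void *refcon) {
    CGEventFlags flags = CGEventGetFlags(event);

    if (flags & kCGEventFlagMaskCommand) {
        NSLog(@"command is down");
    }
    if (flags & kCGEventFlagMaskShift) {
        NSLog(@"shift is down");
    }
    // Testing for a combination: both bits must be set.
    CGEventFlags cmdOpt = kCGEventFlagMaskCommand | kCGEventFlagMaskAlternate;
    if ((flags & cmdOpt) == cmdOpt) {
        NSLog(@"command+option are down");
    }
    return event;   // pass the event through unmodified
}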

How to obtain a CGImageRef from the content of a UIView?

大城市里の小女人 submitted on 2019-12-05 05:23:48
I have a UIView where I was drawing some content inside -drawRect:. Now I need a CGImageRef of this graphics context, or a bitmap of the UIView. Is there an easy way to get that?

Like this (typed from memory, so it might not be 100% correct):

// Get a UIImage from the view's contents
UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, view.contentScaleFactor);
CGContextRef context = UIGraphicsGetCurrentContext();
[view.layer renderInContext:context];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Convert UIImage to CGImage
CGImageRef cgImage = image.CGImage;
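As a hedged aside that is not part of the original answer: on iOS 10 and later the same snapshot can be taken with UIGraphicsImageRenderer, which handles context setup and teardown; the variable names below are illustrative.

UIGraphicsImageRenderer *renderer =
    [[UIGraphicsImageRenderer alloc] initWithBounds:view.bounds];
UIImage *snapshot = [renderer imageWithActions:^(UIGraphicsImageRendererContext *rendererContext) {
    // Render the view's layer into the renderer-managed context.
    [view.layer renderInContext:rendererContext.CGContext];
}];
CGImageRef cgImage = snapshot.CGImage;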

Writing with a digital pen on iPad/iPhone while keeping the hand on the screen

会有一股神秘感。 submitted on 2019-12-05 03:50:18
Question: I am working on a drawing app with pen input on iPhone and iPad. So far I have implemented the basic drawing and tested it with a digital pen, and it works fine, but I have one issue: while drawing on the iPhone/iPad, if my fingers touch the screen before I draw with the pen, the pen won't work. What I want to achieve is to be able to write with the pen even if my fingers are touching the iPad screen. Regards, Ranjit

Answer 1: There is no way to differentiate between a finger and a stylus, as a capacitive stylus produces the same touch events as a finger.
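A rough palm-rejection heuristic, shown only as an illustration (it is not from the answer above, and the threshold would need per-device tuning): ignore touches whose reported contact radius is large, on the assumption that a resting palm covers far more of the screen than a stylus tip. The drawing helper addPointToCurrentStroke: is hypothetical.

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        // majorRadius (iOS 8+) reports the approximate contact radius in points.
        if (touch.majorRadius > 40.0) {
            continue;   // likely a palm or resting finger, not the pen tip
        }
        CGPoint point = [touch locationInView:self];
        [self addPointToCurrentStroke:point];   // hypothetical drawing helper
    }
}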

Smoothing a hand-drawn free shape

眉间皱痕 submitted on 2019-12-05 03:44:20
Question: I am creating an app that allows the user to do some hand drawing. The problem is that I draw straight lines between the points where the user moved their finger, so the resulting shape is somewhat jagged. My question is: how can I smooth the drawing? What is the best algorithm for dealing with this kind of situation?

Answer 1: You could use some kind of curve fitting (maybe a Bézier curve) to do it for you. There is also this very nice example of how it could work. I could not find source code for it, but I think …
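One common lightweight approach, sketched here as an illustration rather than the technique the answer links to: connect the recorded touch points with quadratic Bézier segments through the midpoints, using each raw point as a control point. The helper name smoothedPathFromPoints and the NSValue-wrapped point array are assumptions.

#import <UIKit/UIKit.h>

static UIBezierPath *smoothedPathFromPoints(NSArray<NSValue *> *points) {
    UIBezierPath *path = [UIBezierPath bezierPath];
    if (points.count < 2) return path;

    CGPoint previous = [points[0] CGPointValue];
    [path moveToPoint:previous];

    for (NSUInteger i = 1; i < points.count; i++) {
        CGPoint current = [points[i] CGPointValue];
        CGPoint mid = CGPointMake((previous.x + current.x) / 2.0,
                                  (previous.y + current.y) / 2.0);
        // Curve to the midpoint, using the previous raw point as the control point.
        [path addQuadCurveToPoint:mid controlPoint:previous];
        previous = current;
    }
    [path addLineToPoint:previous];   // finish at the last recorded point
    return path;
}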

CoreGraphics slower on iPhone4 than on 3G/3GS

时光怂恿深爱的人放手 submitted on 2019-12-05 02:52:59
Question: I have a chart drawn with Core Graphics. The chart can be scrolled horizontally, and it is redrawn as it scrolls. The problem is that on the 3G/3GS the scrolling speed and performance are good, but on the iPhone 4 it is slower than expected. I suppose this is an issue related to the higher resolution of the iPhone 4. Is that correct? How can I increase the performance on the iPhone 4? Does the framework automatically convert the drawing to the iPhone 4 resolution, or is that my job? Thank you very much.

Answer 1: The iPhone 4's Retina screen has four times as many pixels as the 3G/3GS screen, so the same Core Graphics drawing rasterizes four times as many pixels per frame …
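A hedged sketch of one way to cut the per-frame work, offered as an assumption rather than the answer's recommendation: when the chart scrolls, invalidate only the strip that has just become visible instead of the whole view, so far fewer Retina pixels are redrawn. This assumes the code lives in the scroll view's delegate; the lastOffsetX and chartView properties are hypothetical.

- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    CGFloat dx = scrollView.contentOffset.x - self.lastOffsetX;   // hypothetical stored offset
    self.lastOffsetX = scrollView.contentOffset.x;
    if (dx == 0) return;

    // The currently visible region of the scroll view's content.
    CGRect visible = (CGRect){ scrollView.contentOffset, scrollView.bounds.size };

    // Only the newly exposed vertical strip needs to be rasterized again.
    CGRect dirty = visible;
    dirty.size.width = fabs(dx);
    if (dx > 0) {
        dirty.origin.x = CGRectGetMaxX(visible) - dirty.size.width;
    }
    [self.chartView setNeedsDisplayInRect:
        [scrollView convertRect:dirty toView:self.chartView]];    // chartView is hypothetical
}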

Create the indented look found in UINavigationBarButton - programmatically

拈花ヽ惹草 submitted on 2019-12-05 02:51:24
Question: I'm trying to programmatically recreate the indented button look that can be seen on a UINavigationBarButton. Not the shiny two-tone look or the gradient, just the perimeter shading. It looks like an internal dark shadow around the entire view perimeter, slightly darker at the top, and then an external highlighting shadow around the lower view perimeter. I've played a bit with Core Graphics, and experimented with QuartzCore and shadowing with view.layer.shadowRadius and .shadowOffset, but I haven't been able to reproduce the inset effect.
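A hedged drawRect: sketch of one way to fake that inset look, assuming a rounded-rect shape (this is an illustration, not the asker's eventual solution): clip to the shape, then fill the area outside it with an even-odd path that has a shadow attached, so only the shadow falls inside the clip; a light stroke offset just below the shape supplies the external highlight. If the inner shadow comes out darker at the bottom instead of the top, flip the sign of the shadow offset.

- (void)drawRect:(CGRect)rect {
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGRect shape = CGRectInset(self.bounds, 2, 2);
    UIBezierPath *rounded = [UIBezierPath bezierPathWithRoundedRect:shape cornerRadius:6];

    // External highlight: a light line offset one point below the shape.
    CGContextSaveGState(ctx);
    CGContextSetStrokeColorWithColor(ctx, [UIColor colorWithWhite:1 alpha:0.4].CGColor);
    CGContextSetLineWidth(ctx, 1);
    CGContextAddPath(ctx, [UIBezierPath bezierPathWithRoundedRect:CGRectOffset(shape, 0, 1)
                                                     cornerRadius:6].CGPath);
    CGContextStrokePath(ctx);
    CGContextRestoreGState(ctx);

    // Inner shadow: clip to the shape, then shadow-fill everything outside it.
    CGContextSaveGState(ctx);
    CGContextAddPath(ctx, rounded.CGPath);
    CGContextClip(ctx);

    CGMutablePathRef outside = CGPathCreateMutable();
    CGPathAddRect(outside, NULL, CGRectInset(shape, -20, -20));
    CGPathAddPath(outside, NULL, rounded.CGPath);

    CGContextSetShadowWithColor(ctx, CGSizeMake(0, 2), 3,
                                [UIColor colorWithWhite:0 alpha:0.6].CGColor);
    CGContextSetFillColorWithColor(ctx, [UIColor blackColor].CGColor);
    CGContextAddPath(ctx, outside);
    CGContextEOFillPath(ctx);

    CGPathRelease(outside);
    CGContextRestoreGState(ctx);
}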

Flattening a CGPath

可紊 submitted on 2019-12-05 02:50:37
Question: Simply put, I'm looking for an equivalent of NSBezierPath's -bezierPathByFlatteningPath that can be used on iOS. It doesn't matter to me whether this is a function dealing directly with a CGPath or a method on UIBezierPath, because the two can easily be converted back and forth. Neither the CGPath Reference nor the UIBezierPath Class Reference indicates the presence of any such function or method. Also: I'm aware of CGPath's CGPathApply function, but I lack both the time and the skill set to write my own flattening routine on top of it.
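For illustration, here is a hedged, simplified flattening sketch built on CGPathApply. It uses a fixed subdivision count rather than the flatness-driven subdivision that -bezierPathByFlatteningPath performs, so it is an approximation, not an equivalent: each curve element is replaced by sampled line segments.

#import <UIKit/UIKit.h>

typedef struct {
    CGMutablePathRef flattened;
    CGPoint current;
} FlattenState;

// Evaluate a cubic Bézier segment at parameter t.
static CGPoint evalCubic(CGPoint p0, CGPoint c1, CGPoint c2, CGPoint p1, CGFloat t) {
    CGFloat u = 1 - t;
    return CGPointMake(u*u*u*p0.x + 3*u*u*t*c1.x + 3*u*t*t*c2.x + t*t*t*p1.x,
                       u*u*u*p0.y + 3*u*u*t*c1.y + 3*u*t*t*c2.y + t*t*t*p1.y);
}

static void flattenElement(void *info, const CGPathElement *element) {
    FlattenState *state = (FlattenState *)info;
    CGPoint *pts = element->points;
    switch (element->type) {
        case kCGPathElementMoveToPoint:
            CGPathMoveToPoint(state->flattened, NULL, pts[0].x, pts[0].y);
            state->current = pts[0];
            break;
        case kCGPathElementAddLineToPoint:
            CGPathAddLineToPoint(state->flattened, NULL, pts[0].x, pts[0].y);
            state->current = pts[0];
            break;
        case kCGPathElementAddQuadCurveToPoint:
        case kCGPathElementAddCurveToPoint: {
            BOOL quad = (element->type == kCGPathElementAddQuadCurveToPoint);
            CGPoint end = quad ? pts[1] : pts[2];
            // Promote a quadratic segment to a cubic so one evaluator handles both.
            CGPoint c1 = quad ? CGPointMake(state->current.x + 2.0/3.0 * (pts[0].x - state->current.x),
                                            state->current.y + 2.0/3.0 * (pts[0].y - state->current.y))
                              : pts[0];
            CGPoint c2 = quad ? CGPointMake(end.x + 2.0/3.0 * (pts[0].x - end.x),
                                            end.y + 2.0/3.0 * (pts[0].y - end.y))
                              : pts[1];
            for (int i = 1; i <= 16; i++) {          // 16 segments per curve (arbitrary)
                CGPoint p = evalCubic(state->current, c1, c2, end, (CGFloat)i / 16.0);
                CGPathAddLineToPoint(state->flattened, NULL, p.x, p.y);
            }
            state->current = end;
            break;
        }
        case kCGPathElementCloseSubpath:
            CGPathCloseSubpath(state->flattened);
            break;
    }
}

// Caller owns the returned path and must CGPathRelease it.
static CGPathRef createFlattenedPath(CGPathRef path) {
    FlattenState state = { CGPathCreateMutable(), CGPointZero };
    CGPathApply(path, &state, flattenElement);
    return state.flattened;
}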

Get screen resolution programmatically in OS X

眉间皱痕 submitted on 2019-12-05 02:23:21
I'd like to launch a fullscreen 3D C++ application at native resolution on the Mac. How can I retrieve the native screen resolution?

If you don't wish to use Objective-C, get the display ID that you wish to display on (using e.g. CGMainDisplayID), then use CGDisplayPixelsWide and CGDisplayPixelsHigh to get the screen width and height, in pixels. See "Getting Information About Displays" for how to get other display information. If you're willing to use a bit of Objective-C, simply use [[NSScreen mainScreen] frame]. Note that there are other concerns with full-screen display, namely ensuring …
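A small sketch of the Core Graphics route described above, with one added caveat that is my own note rather than part of the answer: on Retina Macs, CGDisplayPixelsWide/CGDisplayPixelsHigh report the current mode's point size, so CGDisplayCopyDisplayMode with CGDisplayModeGetPixelWidth/CGDisplayModeGetPixelHeight (OS X 10.8+) can be used when the true pixel dimensions are needed.

#include <stdio.h>
#include <ApplicationServices/ApplicationServices.h>

static void logMainDisplaySize(void) {
    CGDirectDisplayID display = CGMainDisplayID();

    // Size of the current display mode (points on Retina displays).
    size_t modeWidth  = CGDisplayPixelsWide(display);
    size_t modeHeight = CGDisplayPixelsHigh(display);

    // Actual pixel dimensions of the current mode.
    CGDisplayModeRef mode = CGDisplayCopyDisplayMode(display);
    size_t pixelWidth  = CGDisplayModeGetPixelWidth(mode);
    size_t pixelHeight = CGDisplayModeGetPixelHeight(mode);
    CGDisplayModeRelease(mode);

    printf("display: %zux%zu (mode), %zux%zu (pixels)\n",
           modeWidth, modeHeight, pixelWidth, pixelHeight);
}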

Optimize Core Graphics animated drawing (iPhone)

浪子不回头ぞ submitted on 2019-12-05 02:18:04
Question: I have a loop that fires a function 30 times per second. The function changes the positions of a couple of points that I use to animate. I draw lines through all of the points, which means the lines change 30 times per second. I draw these lines into a CGLayer, which is then drawn to a UIView in the drawRect: method. I do this because I understand that performance is improved when drawing offscreen. However, it seems that the CGLayer saves all of the actual lines instead of the drawn pixels, since even …
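A hedged sketch of a simpler structure, offered as an assumption rather than the accepted answer: when every line moves every frame, an accumulating offscreen layer has nothing worth caching, so it can be cheaper to stroke the current set of points directly in drawRect: and simply invalidate the view from the 30 Hz loop. The points property (an NSArray of NSValue-wrapped CGPoints updated by the animation loop) is hypothetical.

- (void)drawRect:(CGRect)rect {
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(ctx, 2);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blackColor].CGColor);

    BOOL first = YES;
    for (NSValue *value in self.points) {            // self.points is hypothetical
        CGPoint p = value.CGPointValue;
        if (first) { CGContextMoveToPoint(ctx, p.x, p.y); first = NO; }
        else       { CGContextAddLineToPoint(ctx, p.x, p.y); }
    }
    CGContextStrokePath(ctx);
}

// Called 30 times per second, after the points have been updated.
- (void)animationTick {
    [self setNeedsDisplay];
}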