quartz-graphics

In Objective-C (OS X), is the “global display” coordinate space used by Quartz Display Services the same as Cocoa's “screen” coordinate space?

Submitted by 走远了吗 on 2019-12-03 17:07:12
Question: I'm attempting to create an image "loupe" for my application that can be used to inspect images at different magnification levels, but I've run into a bit of a road bump. I'm using Quartz to create a CGImageRef snapshot of a selected portion of the display my app's window is on. The problem is that the nomenclature used by the various OS X technologies has me really confused. The function I'm using is CGDisplayCreateImageForRect(CGDirectDisplayID display, CGRect rect). The …
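The two spaces are not the same: Cocoa's screen space puts the origin at the bottom-left of the primary screen with y increasing upward, while Quartz's global display space puts it at the top-left of the primary display with y increasing downward, so a rect must have its y origin flipped before being passed to CGDisplayCreateImageForRect. A minimal framework-free sketch of that flip (the Rect type and cocoa_to_quartz name are stand-ins, not real API):

```c
#include <stddef.h>

/* Minimal rect type standing in for NSRect/CGRect; only origin and size
   are needed to illustrate the flip. */
typedef struct { double x, y, w, h; } Rect;

/* Convert a rect from Cocoa's screen space (origin at the bottom-left of
   the primary screen, y increasing upward) to Quartz's global display
   space (origin at the top-left of the primary display, y increasing
   downward). Only the y origin changes; x, width, and height carry over. */
Rect cocoa_to_quartz(Rect r, double primary_screen_height) {
    Rect out = r;
    out.y = primary_screen_height - (r.y + r.h);
    return out;
}
```

In real code the height would come from the primary NSScreen's frame; note the flip must account for the rect's own height, not just its origin.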

iPhone Performance Differences in Quartz Drawing vs. Pre-Baked Images (which I guess simplifies to Quartz vs. Quartz)

Submitted by 允我心安 on 2019-12-03 16:54:02
New to Quartz, and I'm curious about the drawing speeds of simple shapes, gradients, and shadows; specifically, comparing Quartz drawing functions to Quartz image drawing on the iPhone. Say I need to draw a filled, stroked, and shadowed rectangle. I'm assuming that importing a pre-baked rect as a PNG and drawing it using drawInRect: or drawAtPoint: is faster than using Quartz's drawing functions to draw the same thing, since the latter requires explicit calculations. On the other hand, I assume drawing an image increases memory use and application size, since I have to import the image and …
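The trade-off described above can be sketched without any graphics framework: render once into a cached buffer and reuse it with a plain copy (the analogue of a pre-baked PNG), versus recomputing every pixel per frame (the analogue of Quartz path drawing). This is an illustration of the caching idea only, with made-up names, not a Quartz benchmark:

```c
#include <string.h>
#include <stdint.h>

#define W 16
#define H 16

/* "Draw-every-frame" path: compute every pixel of a filled rectangle
   each time it is needed, as Quartz drawing functions would. */
void draw_rect(uint8_t dst[H][W], int x, int y, int w, int h, uint8_t color) {
    for (int r = y; r < y + h; r++)
        for (int c = x; c < x + w; c++)
            dst[r][c] = color;
}

/* "Pre-baked image" path: the shape was rendered once into `src`;
   reusing it is just a bulk copy, trading memory for per-frame work. */
void blit(uint8_t dst[H][W], uint8_t src[H][W]) {
    memcpy(dst, src, W * H);
}
```

Both paths produce identical pixels; the difference is where the cost lands (per-frame arithmetic vs. one-time bake plus memory).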

CATransform3D rotate causes half of image to disappear

Submitted by 拥有回忆 on 2019-12-03 15:04:12
Question: I'm using the following code to rotate an image, but the half of the image (split down the y-axis) that has been rotated "out of" the page disappears. How can I fix this? heading is in radians.

CALayer *layer = myUIImageView.layer;
CATransform3D rotationAndPerspectiveTransform = CATransform3DIdentity;
rotationAndPerspectiveTransform.m34 = 1.0 / 500;
rotationAndPerspectiveTransform = CATransform3DRotate(rotationAndPerspectiveTransform, heading, 0.0f, 1.0f, 0.0f);
layer.transform = rotationAndPerspectiveTransform;

Can CARemoteLayerServer and CARemoteLayerClient be used between processes?

Submitted by 为君一笑 on 2019-12-03 14:24:01
In Mac OS X Lion, CARemoteLayerServer and CARemoteLayerClient were added to QuartzCore. I've been trying to investigate whether they'd be suitable for splitting a graphical application across multiple processes, but without success. I can use them successfully within a single process, with code along these lines:

- (void)buildLayerSingleProcess {
    CARemoteLayerServer *server = [CARemoteLayerServer sharedServer];
    self.client = [[CARemoteLayerClient alloc] initWithServerPort:server.serverPort];
    uint32_t clientID = self.client.clientId;
    CALayer *layer1 = [CALayer layer];
    layer1.bounds = …

Combining UIView animation blocks and OpenGL ES rendering

Submitted by 孤街醉人 on 2019-12-03 13:07:38
I am developing an iP* game that makes use of both UIKit and OpenGL ES 2.0. The UIKit elements are rendered over the OpenGL view and occupy a significant (arbitrary) amount of screen space. I must admit that Apple has done excellent work: the frame rate of the game is always 60 FPS. To reach this conclusion I ran many performance tests: I added lots of static (non-moving) UIViews over the OpenGL view -- OK! I animated the same UIViews with custom code (modifying the center property in my game's drawFrame method) -- OK! I added many OpenGL ES elements under …

Drawing a path with CAKeyframeAnimation on iPhone

Submitted by 守給你的承諾、 on 2019-12-03 13:07:28
I want to draw a path gradually, i.e. I want the path to appear as if it were drawn by hand. I have managed to create the path I need, and I have also created a CAKeyframeAnimation that uses this path. But so far I can only move an object along it. I would like both to move an object (say, a pencil) along the path and to have the path appear as if it is being drawn. Any pointers?

Answer: Create a CAShapeLayer with your path and animate the layer's strokeEnd from 0.0 to 1.0. (This is new in iOS SDK 4.2 and won't work with earlier versions.)

Source: https://stackoverflow.com/questions/4390911/drawing-a

CGContextClipToMask returning blank image

Submitted by て烟熏妆下的殇ゞ on 2019-12-03 11:57:28
I'm new to Quartz. I have two images: a background, and a mask with a cutout shape that I want to lay over the background in order to cut out a section. The resulting image should be the shape of the cutout. This is my mask (the shape in the middle has 0 alpha). And this is my code:

UIView *canvas = [[[sender superview] subviews] objectAtIndex:0];
UIGraphicsBeginImageContext(canvas.bounds.size);
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef cgContext = CGBitmapContextCreate(NULL, canvas.bounds.size.width, canvas.bounds.size.height, 8, 0, colourSpace, …
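When CGContextClipToMask is given an image with an alpha channel, the alpha gates the drawing: pixels where the mask's alpha is 0 are clipped out entirely, alpha 255 draws fully, and intermediate values blend proportionally; a mask whose cutout is 0 alpha therefore removes the cutout rather than keeping it, which is one common cause of an unexpectedly blank result. A CPU-side sketch of that gating (apply_alpha_mask is a made-up name; this models the semantics, it is not the Quartz call):

```c
#include <stdint.h>
#include <stddef.h>

/* Model of alpha-based clip-mask semantics over a single channel:
   where mask_alpha is 0 the destination is left untouched (clipped),
   where it is 255 the source replaces the destination, and values in
   between blend proportionally. */
void apply_alpha_mask(uint8_t *dst, const uint8_t *src,
                      const uint8_t *mask_alpha, size_t n) {
    for (size_t i = 0; i < n; i++) {
        unsigned a = mask_alpha[i];
        dst[i] = (uint8_t)((src[i] * a + dst[i] * (255 - a)) / 255);
    }
}
```

To keep the cutout instead, the mask's alpha needs to be inverted (opaque inside the shape, transparent outside).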

Quartz 2D/OpenGL ES geometric distortions on images (preferably using CGImage)

Submitted by 你说的曾经没有我的故事 on 2019-12-03 10:16:39
Question: What is the preferred method for implementing geometric distortions such as pinch/fisheye/etc. using the iPhone SDK? I know that the Core Image library on OS X has all these types of filters built in, but the iPhone SDK does not. I can create a displacement map at a specific location and radius given the original source bitmap data, but I'm not sure how to apply this bitmap data as a transformation to my CGImage. This isn't an affine transformation, since lines are no longer parallel around the …

iOS: Improving speed of image drawing

Submitted by China☆狼群 on 2019-12-03 10:14:20
Question: I have a sequence of images that I want to animate (UIImageView supports some basic animation, but it's not sufficient for my needs). My first approach was to use a UIImageView and set its image property each time the image changed. This was too slow. The poor speed was due to the drawing of the images, which surprised me; I had assumed the bottleneck would be loading them. My second approach was to use a generic UIView and set view.layer.contents = image.CGImage. This gave no noticeable …

Optimizing a drawing (with finger touches) application for iPhone SDK

Submitted by 匆匆过客 on 2019-12-03 10:03:34
Question: I'm writing an application that uses your finger to draw simple diagrams. I have it working for the most part, but now I'm trying to optimize its performance. When the user swipes a finger quickly, I can't capture enough touch events to draw a smooth path. Here's my current approach:

1) I subclassed UIView and added a property for a CGLayer (created lazily, the same size as my UIView).
2) My UIView subclass responds to touch events by storing the current and last touch points in …
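A standard fix for sparse touch samples is midpoint smoothing: instead of joining raw touch points with straight segments, draw a quadratic curve from one midpoint to the next, using the raw point between them as the control point (with CGPathAddQuadCurveToPoint in Quartz). A framework-free sketch of the geometry (Pt, midpoint, and quad_bezier are made-up names):

```c
/* Midpoint smoothing for sparse touch samples: successive midpoints
   become curve endpoints and the raw touch point between them becomes
   the quadratic control point, which guarantees the joined curves meet
   with matching tangents. */
typedef struct { double x, y; } Pt;

Pt midpoint(Pt a, Pt b) {
    Pt m = { (a.x + b.x) / 2.0, (a.y + b.y) / 2.0 };
    return m;
}

/* Evaluate the quadratic Bezier with endpoints p0, p2 and control point
   p1 at parameter t in [0, 1]. */
Pt quad_bezier(Pt p0, Pt p1, Pt p2, double t) {
    double u = 1.0 - t;
    Pt out = {
        u * u * p0.x + 2.0 * u * t * p1.x + t * t * p2.x,
        u * u * p0.y + 2.0 * u * t * p1.y + t * t * p2.y,
    };
    return out;
}
```

In the drawing code, each new touch point would extend the path by one quad-curve segment from the previous midpoint to the new one, giving a smooth stroke even when events arrive sparsely.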