So of course the first thing I'm trying on my new iPad (4g) is the old GLGravity example. The only modification I've made is to set the target device to "iPad".
I think I kind of get this.
XIB files measure in pts, not pixels. A pt is a unit of distance, like a cm or a light-year. A pixel is not a unit of distance; it stands for "picture element". Different devices can have different pixel densities.
Your XIB file specifies the size of your display area in pts. iPhone apps default to 320x480 pts, iPad apps to 768x1024 pts. You can check this in your XIB file.
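You can also confirm this at runtime: a view's bounds are always reported in pts, regardless of pixel density. A quick sketch from inside a UIView subclass:

```objc
// self.bounds is in pts: on a retina iPad this logs a 768x1024 size,
// not 1536x2048, even though the screen has twice the pixel density.
NSLog(@"bounds in pts: %@", NSStringFromCGRect(self.bounds));
```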
Now, the self.contentScaleFactor that Brad refers to is the answer. What this value does is convert from the default logical coordinate space (768x1024 pts) into the device coordinate space of this screen (1536x2048 px).
In other words, self.contentScaleFactor is a measure of pixels/pt on the device, and it naturally varies from device to device. Xcode has you specify UI elements in pts (a resolution-independent unit), so things don't appear screwed up at different resolutions.
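Here's a sketch of how that plays out in a GL view like GLGravity's EAGLView (ES 1 style, matching that sample; the method name setupFramebuffer is illustrative): set the scale factor before allocating the renderbuffer, then ask GL for the actual pixel dimensions.

```objc
// Illustrative sketch for a CAEAGLLayer-backed view (assumes an EAGLContext
// ivar named `context`, as in GLGravity's EAGLView).
- (void)setupFramebuffer {
    // On a retina iPad this is 2.0, so 768x1024 pts -> 1536x2048 px.
    self.contentScaleFactor = [UIScreen mainScreen].scale;

    // Allocate renderbuffer storage from the layer; the storage is sized
    // in pixels (pts * contentScaleFactor).
    [context renderbufferStorage:GL_RENDERBUFFER_OES
                    fromDrawable:(CAEAGLLayer *)self.layer];

    // Ask GL for the actual pixel dimensions it allocated.
    GLint backingWidth, backingHeight;
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES,
                                    GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES,
                                    GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    // The GL viewport works in pixels, not pts.
    glViewport(0, 0, backingWidth, backingHeight);
}
```

The key point is that GL itself only ever sees pixels; pts exist purely on the UIKit side.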
You can even set self.contentScaleFactor = 4, which actually looks kinda cool (it supersamples), but can run slow.
Another gotcha here is that touch events are always reported in pts, never in pixels. That is, a retina iPad still only resolves touch events on a 768x1024 scale.
So, if your view supports touch picking, you need to multiply each incoming touch location by self.contentScaleFactor so that the location in pts maps to pixels in your framebuffer.
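A minimal sketch of that conversion inside a UIView subclass (the Y-flip shown here is only needed if your framebuffer's origin is at the bottom-left, as it is by default in GL):

```objc
// Convert a touch location from pts (UIKit) to pixels (framebuffer).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint pt = [touch locationInView:self];   // always in pts

    CGFloat scale = self.contentScaleFactor;    // pixels per pt
    CGPoint px = CGPointMake(pt.x * scale,
                             (self.bounds.size.height - pt.y) * scale);
    // px now addresses the same location in framebuffer pixels,
    // ready for glReadPixels-style picking.
}
```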