For anyone working on a project with Core Animation layer-backed views, it's unfortunately obvious that subpixel antialiasing (text smoothing) is disabled for text not rendered onto an opaque background.
If you are simply trying to get sharp-looking text to display on an opaque background and are hitting your head against the wall with CATextLayer, give up and use NSTextView with the text container inset disabled and lineFragmentPadding set to 0. The sample code is Swift, but you should be able to translate it easily...
let testV = NSTextView()
testV.backgroundColor = NSColor(calibratedRed: 0.73, green: 0.84, blue: 0.89, alpha: 1)
testV.frame = CGRect(x: 120, y: 100, width: 200, height: 30)
testV.string = "Hello World!"
testV.textContainerInset = .zero
testV.textContainer!.lineFragmentPadding = 0
self.addSubview(testV)
For me, this displays text equivalent to:
let testL = CATextLayer()
testL.backgroundColor = NSColor(calibratedRed: 0.73, green: 0.84, blue: 0.89, alpha: 1).cgColor
testL.bounds = CGRect(x: 0, y: 0, width: 200, height: 30)
testL.position = CGPoint(x: 100, y: 100)
testL.string = "Hello World!"
testL.fontSize = 14
testL.foregroundColor = NSColor.black.cgColor
self.layer!.addSublayer(testL)
NSTextView obviously comes with more overhead and possibly unnecessary features like text selection and copying, but hey, the text displays well.
As a simple hack, if text antialiasing is working (but not subpixel antialiasing), you can fake it by rendering to a view that is 3x as wide, then scaling down. This is nonportable, as I know of no way to query the subpixel element order of your display, but it should work.
E.g.,
RGB RGB RGB  ->  RGB
 |   |   |       |||
 +---|---|-------+||
     +---|--------+|
         +---------+
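The trick above can be sketched in pure pixel math. This is a hypothetical sketch (the names and types are mine, not an Apple API), assuming an RGB subelement order: each group of three source pixels in the 3x-wide rendering supplies one channel of the output pixel.

```swift
struct RGB {
    var r: UInt8
    var g: UInt8
    var b: UInt8
}

// Collapse one row of a 3x-wide grayscale rendering into subpixel-hinted
// RGB pixels: R from the first source pixel of each triple, G from the
// second, B from the third.
func collapseRow(_ wide: [UInt8]) -> [RGB] {
    precondition(wide.count % 3 == 0, "row width must be a multiple of 3")
    var out: [RGB] = []
    out.reserveCapacity(wide.count / 3)
    for i in stride(from: 0, to: wide.count, by: 3) {
        out.append(RGB(r: wide[i], g: wide[i + 1], b: wide[i + 2]))
    }
    return out
}
```

A BGR panel would just need the indices swapped, which is exactly why not being able to query the element order makes this nonportable.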
I'm not really sure I have grasped the question, so this is a real punt.
But I noticed that LaC's answer depends on the pixels being written on top of each other in the normal way, in what Photoshop would call the "Normal" blend mode.
You can merge two layers in lots of other ways including "Multiply":

This individually multiplies the R, G, and B values of each pixel. So if you had the text on a white background, you could get a further-back layer to show through by setting the top layer to "Multiply", which causes both layers to burn into each other like so.
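Per channel, "Multiply" is just a product of the two normalized values. A minimal sketch (my own helper, not an AppKit or Core Image call), with values in 0...1:

```swift
// "Multiply" blend mode: each output channel is the product of the
// corresponding channels in the top and bottom layers.
func multiplyBlend(top: (r: Double, g: Double, b: Double),
                   bottom: (r: Double, g: Double, b: Double))
    -> (r: Double, g: Double, b: Double) {
    return (top.r * bottom.r, top.g * bottom.g, top.b * bottom.b)
}
```

White (1.0) in the top layer leaves the bottom layer unchanged, and black (0.0) stays black, which is why a white background on the text layer effectively disappears under "Multiply".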
This should work for your use case too:

Both text layers in this shot have an opaque white background, but the one on the right has the blend mode set to "Multiply".
With "Multiply":
In other words, the same result you would have got by laying the text directly onto the background.
I haven't subpixel antialiased the text in this screenshot but it would work equally well with differing R, G, and B values.
I'm not remotely familiar with Mac's API, but according to this link, Core Animation does have this capability, which you get by writing:
myLayer.compositingFilter = [CIFilter filterWithName:@"CIMultiplyBlendMode"];
or
myLayer.compositingFilter = [CIFilter filterWithName:@"CIMultiplyCompositing"];
(I have no idea what "Blend Mode" vs "Compositing" relates to in Mac parlance so you'll have to try both!)
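In Swift, the same thing would look something like this; an untested sketch, assuming a layer-backed view on macOS (compositingFilter has no effect on iOS):

```swift
import QuartzCore
import CoreImage

// Sketch: attach a multiply compositing filter to a layer.
let myLayer = CALayer()
if let multiply = CIFilter(name: "CIMultiplyBlendMode") {
    myLayer.compositingFilter = multiply
}
```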
EDIT: I'm not sure you ever specified that your text was black. If it's white, you can use a white-on-black layer set to the "Screen" blend mode.
If you're using a transparent sheet, you don't know in advance what the pixels below it will be. They may change. Remember that you have a single alpha channel for all three colors: if you make it transparent, you won't see any subpixel effect, but if you make it opaque, all three subelements are going to get composited with the background. If you give an edge the right color for compositing over a white background, it won't look right if the background changes to some other color.
For example, let's say you're drawing black text on a white background, and the subelement order is RGB. A right edge may be a faint blue: high B value (full brightness on the side away from the glyph), slightly lower but still high R and G values (lower brightness on the side closer to the glyph). If you now composite that pixel over a dark gray background, you're going to make it lighter than it would have been if you had rendered black text on dark gray background directly.
Basically, you are not facing an arbitrary limitation of Core Animation: it simply makes no sense to use subpixel rendering on a transparent layer that might be composited over an arbitrary background. You'd need a separate alpha per color channel, but since the pixel format of your buffer is RGBA (or ARGB, or whatever it is), you can't have one.
But this leads us to the solution. If you know that the background will remain the same (e.g., the sheet displays over a window whose contents you control), then you can simply make your layer opaque, fill it with a copy of the covered region of the background window, and render subpixel-antialiased text on it. Basically, you'd be precompositing your layer with the background. If the background stays the same, this will look identical to what normal alpha compositing would do, except that you can now do subpixel text rendering; if the background changes, then you'd have to give up on subpixel text rendering anyway (although I guess you could keep copying the background and redrawing your opaque overlay whenever it changes).
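The precompositing idea reduces to ordinary per-channel blending once the destination pixels are known. A sketch in plain pixel math (my own names, values normalized to 0...1):

```swift
// With an opaque layer that already contains a copy of the background,
// the text renderer can blend each subelement's coverage against a KNOWN
// destination pixel, which is exactly what subpixel antialiasing needs.
func renderGlyphPixel(coverage: (r: Double, g: Double, b: Double),
                      textColor: Double,  // grayscale text, 0.0 = black
                      background: (r: Double, g: Double, b: Double))
    -> (r: Double, g: Double, b: Double) {
    return (textColor * coverage.r + background.r * (1 - coverage.r),
            textColor * coverage.g + background.g * (1 - coverage.g),
            textColor * coverage.b + background.b * (1 - coverage.b))
}
```

This is the same math that applies when text is drawn straight into an opaque window, which is why the precomposited layer looks identical as long as the background doesn't change.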
I don't know if this solution will work for you, but you can draw the text into an image with a transparent background using UIGraphicsBeginImageContext(). I'm pretty sure you can set up antialiasing for this; if not, draw your text at two or four times the size and then scale down when drawing the image in your view.