I am using AVFoundation in Swift to take pictures, but I can't convert this method from Objective-C to Swift. My code is:
- (void)capImage { // method to capture image from AVCaptureSession video feed
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (imageSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            [self processImage:[UIImage imageWithData:imageData]];
        }
    }];
}
This line gives me the error "AnyObject[] does not conform to protocol Sequence":
for (AVCaptureInputPort *port in [connection inputPorts]) {
In Swift:
for port:AnyObject in connection.inputPorts {
And I don't know how to convert this line:
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
Can you help me convert it to Swift? Thanks!
Regarding this line:
for (AVCaptureInputPort *port in [connection inputPorts]) {
Arrays of AnyObject should be cast to arrays of your actual type before iterating, like this:
for port in connection.inputPorts as! [AVCaptureInputPort] { }
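If you prefer not to force the cast, a conditional downcast with optional binding works too (a small sketch, not from the original answer):

if let ports = connection.inputPorts as? [AVCaptureInputPort] {
    for port in ports {
        // use each AVCaptureInputPort here
    }
}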
As for converting blocks to closures, you just have to get the syntax right:
stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
    (imageSampleBuffer, error) in // this line names the closure's parameters
    // ...
}
Note that this also uses trailing closure syntax. Do read up on the docs for more detail!
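For comparison, here is the same call without trailing closure syntax, with the closure's parameter types written out (a sketch using the same pre-Swift 3 method names as above):

stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (imageSampleBuffer: CMSampleBuffer!, error: NSError!) -> Void in
    // same body as above
})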
EDIT: In terms of initializers, they now look like this:
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
if let image = UIImage(data: imageData) { // UIImage(data:) is failable, so unwrap it
    self.processImage(image)
}
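Putting those pieces together, a complete Swift version of the original capImage method could look roughly like this (a sketch in pre-Swift 3 syntax; the CameraController class, the processImage stub, and the assumption that stillImageOutput is already attached to a running session are placeholders, not part of the original answer):

import AVFoundation
import UIKit

class CameraController: UIViewController { // hypothetical container class
    let stillImageOutput = AVCaptureStillImageOutput() // assumed to be added to a running AVCaptureSession elsewhere

    func capImage() { // capture an image from the AVCaptureSession video feed
        // find the connection that carries video, as in the Objective-C nested loop
        var videoConnection: AVCaptureConnection?
        for connection in stillImageOutput.connections as! [AVCaptureConnection] {
            for port in connection.inputPorts as! [AVCaptureInputPort] {
                if port.mediaType == AVMediaTypeVideo {
                    videoConnection = connection
                    break
                }
            }
            if videoConnection != nil {
                break
            }
        }

        print("about to request a capture from: \(stillImageOutput)")
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageSampleBuffer, error) in
            if imageSampleBuffer != nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
                if let image = UIImage(data: imageData) {
                    self.processImage(image)
                }
            }
        }
    }

    func processImage(image: UIImage) {
        // handle the captured image here
    }
}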
Try this:
if let videoConnection = self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
    self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
        // the EXIF attachment is only checked here to confirm a real frame came back
        if let exifAttachments = CMGetAttachment(buffer, kCGImagePropertyExifDictionary, nil) {
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
            if let image = UIImage(data: imageData) {
                self.previewImage.image = image
                UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
            }
        }
    })
}
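The snippet above assumes self.stillImageOutput is already attached to a running capture session and that previewImage is an image view on the controller. A minimal setup sketch in roughly Swift 2 syntax (the property names here are assumptions; note that kCGImagePropertyExifDictionary also needs import ImageIO):

import AVFoundation
import ImageIO

let captureSession = AVCaptureSession()
captureSession.sessionPreset = AVCaptureSessionPresetPhoto

let stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

if let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo),
   input = try? AVCaptureDeviceInput(device: device)
   where captureSession.canAddInput(input) && captureSession.canAddOutput(stillImageOutput) {
    captureSession.addInput(input)
    captureSession.addOutput(stillImageOutput)
    captureSession.startRunning()
}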
This should solve the problem with the ports:
if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
    // take a photo here
}
Source: https://stackoverflow.com/questions/24288048/how-to-convert-code-avfoundation-objective-c-to-swift