avfoundation

AVPlayerViewController doesn't play local videos

Submitted by 家住魔仙堡 on 2019-12-31 05:18:47

Question: I have added a DemoVideo.mp4 to my project and included it in the Copy Bundle Resources build phase, but when I run the app it doesn't play my video. Here is my method:

```swift
private func setUpAndPlayVideo() {
    guard let videoPath = Bundle.main.path(forResource: "DemoVideo.mp4", ofType: nil) else { return }
    let videoURL = NSURL(string: videoPath)
    let player = AVPlayer(url: videoURL! as URL)
    playerViewController = AVPlayerViewController()
    playerViewController.player = player
    playerViewController.view.frame = self // … (truncated)
```
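A likely culprit in the snippet above is `NSURL(string:)`, which expects a URL string with a scheme; handing it a plain file path produces a URL that `AVPlayer` cannot load. A minimal sketch of the fix, using `Bundle.url(forResource:withExtension:)` and assuming the method lives in a `UIViewController`:

```swift
import AVKit
import AVFoundation

private func setUpAndPlayVideo() {
    // Bundle.url(forResource:) returns a proper file:// URL, unlike
    // NSURL(string:) applied to a bare filesystem path.
    guard let videoURL = Bundle.main.url(forResource: "DemoVideo", withExtension: "mp4") else { return }
    let player = AVPlayer(url: videoURL)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player
    present(playerViewController, animated: true) {
        player.play()
    }
}
```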

How do I record a maker note or user comment in EXIF?

Submitted by 可紊 on 2019-12-31 04:43:09

Question: I'm building a camera app and want to record the settings that the user selected in the EXIF data. Most settings have standard fields, but one doesn't, so I thought I'd record it in the maker note field. I tried:

```swift
let attachments = NSMutableDictionary(dictionary: CMCopyDictionaryOfAttachments(nil, buffer, kCMAttachmentMode_ShouldPropagate) as! NSDictionary)
let exif = NSMutableDictionary(dictionary: attachments[kCGImagePropertyExifDictionary] as! NSDictionary)
exif[kCGImagePropertyExifMakerNote] // … (truncated)
```
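One caveat worth knowing: maker notes are vendor-private binary blobs, and Apple's image writers generally drop them. The standard writable free-text field is EXIF UserComment. A sketch of writing it with ImageIO (the function name and parameters are illustrative, not from the question):

```swift
import ImageIO

// Writes a free-text comment into the EXIF UserComment field by
// re-emitting the image through a CGImageDestination with merged
// metadata. UserComment survives round-trips where maker notes do not.
func writeUserComment(_ comment: String, imageData: Data, to url: URL) -> Bool {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let type = CGImageSourceGetType(source),
          let destination = CGImageDestinationCreateWithURL(url as CFURL, type, 1, nil) else {
        return false
    }
    let exif: [CFString: Any] = [kCGImagePropertyExifUserComment: comment]
    let properties: [CFString: Any] = [kCGImagePropertyExifDictionary: exif]
    CGImageDestinationAddImageFromSource(destination, source, 0, properties as CFDictionary)
    return CGImageDestinationFinalize(destination)
}
```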

MLKit Text detection on iOS working for photos taken from Assets.xcassets, but not the same photo taken on camera/uploaded from camera roll

Submitted by £可爱£侵袭症+ on 2019-12-31 03:33:14

Question: I'm using Google's text detection API from MLKit to detect text in images. It seems to work perfectly on screenshots, but when I try to use it on images taken in the app (using AVFoundation) or on photos uploaded from the camera roll, it spits out a small number of seemingly random characters. This is my code for running the actual text detection:

```swift
func runTextRecognition(with image: UIImage) {
    let visionImage = VisionImage(image: image)
    textRecognizer.process(visionImage) { features, error in
        // … (truncated)
```
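A common cause of this symptom: camera photos carry an EXIF orientation flag rather than upright pixel data, while screenshots are always `.up`, so a detector that ignores `UIImage.imageOrientation` sees rotated text. One hedged workaround is to bake the rotation into the pixels before handing the image to MLKit:

```swift
import UIKit

// Redraws the image so its pixel data matches its display orientation.
// After this, imageOrientation is .up and text appears upright to
// detectors that read raw pixels.
func normalizedImage(_ image: UIImage) -> UIImage {
    guard image.imageOrientation != .up else { return image }
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: image.size))
    return UIGraphicsGetImageFromCurrentImageContext() ?? image
}
```

Alternatively, MLKit's image metadata can be told the orientation directly, but normalizing the `UIImage` is the simplest fix to verify.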

Why is my sound making my game lag in Swift SpriteKit?

Submitted by 回眸只為那壹抹淺笑 on 2019-12-31 02:26:06

Question: I have a sound effect for when my hero node collects a coin, and there is a small hiccup in my game. It's not smooth like other games that play a sound when collecting a coin. What am I doing wrong? Here's my code for the sound:

```swift
class GameScene: SKScene, SKPhysicsContactDelegate {
    var coinSound = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("coin", ofType: "wav")!)
    var coinAudioPlayer = AVAudioPlayer()

    override func didMoveToView(view: SKView) {
        coinAudioPlayer = // … (truncated)
```
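The hiccup usually comes from creating or decoding the `AVAudioPlayer` on the render thread at collection time. A sketch of the usual SpriteKit remedy, in current Swift syntax: build one `SKAction.playSoundFileNamed` up front (it caches the decoded audio) and reuse it. The `collectCoin` hook is hypothetical, standing in for the question's contact handler:

```swift
import SpriteKit

class GameScene: SKScene, SKPhysicsContactDelegate {
    // Created once; SpriteKit preloads and caches the sound, so
    // running the action mid-frame does not stall rendering.
    let coinSound = SKAction.playSoundFileNamed("coin.wav", waitForCompletion: false)

    func collectCoin() {
        run(coinSound)
    }
}
```

If `AVAudioPlayer` is required (e.g. for volume control), calling `prepareToPlay()` during scene setup moves the decoding cost out of gameplay.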

Possible for AVAssetWriter to write files with transparency?

Submitted by 前提是你 on 2019-12-30 11:23:09

Question: Every file I write with AVAssetWriter has a black background if the images I include do not fill the entire render area. Is there any way to write with transparency? Here's the method I use to get the pixel buffer:

```objc
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
    CGSize size = self.renderSize;
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
        [NSNumber numberWithBool:YES], // … (truncated)
```
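The black background is mostly a codec limitation: H.264 has no alpha channel, so transparent pixels are composited onto black regardless of the pixel buffer contents. HEVC with alpha (iOS 13+/macOS 10.15+) can preserve transparency. A sketch of the writer-input side, in Swift for brevity, assuming `renderSize` as in the question:

```swift
import AVFoundation

// H.264 cannot carry alpha; hevcWithAlpha can. The pixel buffers fed
// to the adaptor must then use a BGRA format so the alpha survives.
func alphaVideoInput(renderSize: CGSize) -> AVAssetWriterInput {
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.hevcWithAlpha,
        AVVideoWidthKey: renderSize.width,
        AVVideoHeightKey: renderSize.height,
    ]
    return AVAssetWriterInput(mediaType: .video, outputSettings: settings)
}
```

On older systems the practical options are ProRes 4444 (macOS) or exporting a matte/separate alpha track.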

Swift Error : Use of module 'CMSampleBuffer' as a type

Submitted by 半城伤御伤魂 on 2019-12-30 11:12:14

Question: I saw the question below about using the captureStillImageAsynchronouslyFromConnection function in Swift: How to convert code AVFoundation objective c to Swift? When I try to use an AVFoundation function as follows:

```swift
var stillImageOutput: AVCaptureStillImageOutput!
// ... initialize stillImageOutput
stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (imageSampleBuffer, error) in
    if imageSampleBuffer {
        var imageData = AVCaptureStillImageOutput // … (truncated)
```
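The error stems from `if imageSampleBuffer`: Swift does not allow Objective-C-style truthiness tests on a `CMSampleBuffer`, so the compiler misreads the expression. The fix is an explicit nil check. A sketch in Swift 3 naming of this (since-deprecated) API:

```swift
// Unwrap the optional sample buffer instead of testing it like a Bool;
// `if imageSampleBuffer` does not compile in Swift.
stillImageOutput.captureStillImageAsynchronously(from: videoConnection) { imageSampleBuffer, error in
    guard let sampleBuffer = imageSampleBuffer, error == nil,
          let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer) else {
        return
    }
    let image = UIImage(data: imageData)
    // … use image
}
```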

Adjust AVPlayer Frame Rate During Playback

Submitted by 青春壹個敷衍的年華 on 2019-12-30 07:20:10

Question: Is there a way to change the rate at which frames are rendered during playback? I have a couple of short 5-second videos that I would like to play at 15, 30, or 60 FPS, as an option for the user. I did find a frameDuration property in the AVVideoComposition class, but all that did was adjust how many frames are rendered per second.

Answer 1: Similar question here. There is a rate property on AVPlayer. It worked very well for me, though the rate value ranges from 0 to 2, with 1 being played at … (truncated)
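The answer's suggestion boils down to scaling playback speed rather than re-rendering frames: a 30 fps asset at `rate = 0.5` effectively shows ~15 fps, and at `rate = 2.0` ~60 fps. A minimal sketch, where `videoURL` stands in for one of the question's local assets:

```swift
import AVFoundation

let player = AVPlayer(url: videoURL)
// Setting rate also starts playback, so it replaces play().
// rate = 1.0 is the asset's native speed.
player.rate = 0.5   // half speed: a 30 fps clip displays ~15 fps
```

Note that non-1.0 rates change duration as well as frame cadence; if the clip must keep its length while changing FPS, the asset has to be re-encoded (e.g. via `AVMutableComposition.scaleTimeRange`).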

Converting .m4a file to .aiff using AudioConverter Swift

Submitted by 半城伤御伤魂 on 2019-12-30 03:35:06

Question: I'm trying to convert a given audio file in .m4a format to .aiff format, using the answer from this post. I've converted the code to Swift 3.0:

```swift
func convertAudio(_ url: URL, outputURL: URL) {
    var error: OSStatus = noErr
    var destinationFile: ExtAudioFileRef? = nil
    var sourceFile: ExtAudioFileRef? = nil
    var srcFormat: AudioStreamBasicDescription = AudioStreamBasicDescription()
    var dstFormat: AudioStreamBasicDescription = AudioStreamBasicDescription()
    var audioConverter: AudioConverterRef? // … (truncated)
```
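The part of this conversion that most often goes wrong is the destination format: AIFF stores big-endian signed integer PCM, so `dstFormat` must set the big-endian flag or `ExtAudioFileCreateWithURL` rejects it. A sketch of a 16-bit stereo 44.1 kHz destination description (sample rate and channel count are assumptions matching a typical .m4a source):

```swift
import AudioToolbox

// AIFF requires big-endian, packed, signed-integer linear PCM.
// 2 channels × 2 bytes = 4 bytes per frame; 1 frame per packet for PCM.
var dstFormat = AudioStreamBasicDescription(
    mSampleRate: 44100,
    mFormatID: kAudioFormatLinearPCM,
    mFormatFlags: kAudioFormatFlagIsBigEndian
        | kAudioFormatFlagIsSignedInteger
        | kAudioFormatFlagIsPacked,
    mBytesPerPacket: 4,
    mFramesPerPacket: 1,
    mBytesPerFrame: 4,
    mChannelsPerFrame: 2,
    mBitsPerChannel: 16,
    mReserved: 0
)
```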

Unable to trim a video using AVAssetExportSession

Submitted by 烈酒焚心 on 2019-12-30 01:29:26

Question: I want to trim a video:

```objc
- (void)trimVideo:(NSURL *)outputURL {
    //[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:outputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                          presetName:AVAssetExportPresetLowQuality];
    NSString *outputFilePath = NSHomeDirectory();
    outputFilePath = [outputFilePath stringByAppendingPathComponent:@"Library"];
    outputFilePath = [outputFilePath // … (truncated)
```
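For the session to actually trim, it needs a `timeRange`; without one, `AVAssetExportSession` exports the whole asset. A sketch in Swift (the 1s–4s window and function shape are illustrative, not from the question):

```swift
import AVFoundation

// Exports only the [1s, 4s] window of the source asset.
// The output URL must not already exist, or the export fails.
func trimVideo(at sourceURL: URL, to outputURL: URL,
               completion: @escaping (Bool) -> Void) {
    let asset = AVURLAsset(url: sourceURL)
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetLowQuality) else {
        completion(false)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mov
    let start = CMTime(seconds: 1, preferredTimescale: 600)
    let end = CMTime(seconds: 4, preferredTimescale: 600)
    session.timeRange = CMTimeRange(start: start, end: end)
    session.exportAsynchronously {
        completion(session.status == .completed)
    }
}
```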

Get Camera Preview to AVCaptureVideoPreviewLayer

Submitted by 我怕爱的太早我们不能终老 on 2019-12-29 04:45:07

Question: I was trying to get the camera input to show in a preview layer view. self.cameraPreviewView is tied to a UIView in IB. Here is my current code, which I put together from the AV Foundation Programming Guide, but the preview never shows:

```objc
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = // … (truncated)
```
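The two steps most often missing after this setup are attaching the preview layer to the view hierarchy and starting the session; a configured session renders nothing on its own. A sketch of the remaining wiring, in Swift for brevity, where `session` and `cameraPreviewView` correspond to the question's variables:

```swift
import AVFoundation

// The preview layer only renders once it is in the layer tree,
// sized to its host view, and the session is running.
let previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.frame = cameraPreviewView.bounds
previewLayer.videoGravity = .resizeAspectFill
cameraPreviewView.layer.addSublayer(previewLayer)
session.startRunning()
```

Resizing the layer in `viewDidLayoutSubviews` (or the Obj-C equivalent) keeps it matched to the view on rotation.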