AVFoundation

How to convert base64 into NSDATA in swift

Submitted by 陌路散爱 on 2019-12-02 04:42:20
I am working on an iOS project. It stores audio on a web server in the form of base64 strings. When I request the base64 strings for all audios from the server and try to convert one into NSData, I get nil:

```swift
do {
    var audioData: NSData! = NSData(base64EncodedString: audioBase64String,
                                    options: NSDataBase64DecodingOptions(rawValue: 0))
    if audioData != nil {
        let sound = try AVAudioPlayer(data: audioData)
        sound.play()
    } else {
        print("Data Not Exist")
    }
} catch {
}
```

On Android the same base64 string is converted into a byte array and plays fine, but on iOS the NSData initializer returns nil for audioBase64String. This works:
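A frequent cause of a nil result here is stray whitespace or newline characters in the transmitted string: a strict decode rejects them, while the `.ignoreUnknownCharacters` option skips over them. A minimal sketch using the modern `Data` API (the `audioBase64String` payload below is a stand-in, not real audio data; in the original Swift 2 API the equivalent option is `NSDataBase64DecodingOptions.IgnoreUnknownCharacters`):

```swift
import Foundation

// Stand-in payload; a real server response would be much longer
// and may contain line breaks, which a strict decode rejects.
let audioBase64String = "SGVs\nbG8s\nIHdvcmxk"

if let audioData = Data(base64Encoded: audioBase64String,
                        options: .ignoreUnknownCharacters) {
    // With valid audio bytes this is where AVAudioPlayer would be created:
    // let player = try AVAudioPlayer(data: audioData)
    // player.play()
    print("Decoded \(audioData.count) bytes")
} else {
    print("Base64 decode failed")
}
```

Decoding with `NSDataBase64DecodingOptions(rawValue: 0)` (i.e. no options) returns nil as soon as the string contains any character outside the base64 alphabet, which is why the same string can work on Android yet fail on iOS.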

How to color manage AVAssetWriter output

Submitted by China☆狼群 on 2019-12-02 04:10:09
I'm having trouble getting a rendered video's colors to match the source content's colors. I'm rendering images into a CGContext, converting the backing data into a CVPixelBuffer, and appending that as a frame to an AVAssetWriterInputPixelBufferAdaptor. This causes slight color differences between the images I'm drawing into the CGContext and the resulting video file. It seems like there are three things that need to be addressed: tell AVFoundation what color space the video is in; make the AVAssetWriterInputPixelBufferAdaptor and the CVPixelBuffers I append to it match that color space; use
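The first of those steps, declaring the video's color space to AVFoundation, is done through `AVVideoColorPropertiesKey` in the writer input's output settings. A hedged sketch, assuming BT.709 content; the correct primaries, transfer function, and matrix depend on the actual source:

```swift
import AVFoundation

// Output settings that tag the encoded video as BT.709.
// Width/height are example values, not taken from the question.
let outputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080,
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_709_2,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_709_2,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_709_2
    ]
]

let writerInput = AVAssetWriterInput(mediaType: .video,
                                     outputSettings: outputSettings)
```

For the second step, the CGContext and the appended CVPixelBuffers would need to be created against the matching CGColorSpace (e.g. `CGColorSpace(name: CGColorSpace.itur_709)`), so the pixels the encoder tags are the pixels that were actually drawn.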

Deep Copy of CMImageBuffer or CVImageBuffer

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-02 03:59:44
Hi, I am currently working on an app which needs to capture video and at the same time should be able to take frames and blend them. The problem I am having is that my frames, coming from:

```swift
func captureOutput(captureOutput: AVCaptureOutput!,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                   fromConnection connection: AVCaptureConnection!)
```

start to drop after blending about 10-12 frames. I tried blending only every 10th frame, but it still drops after 10-12 blended frames. I know that I should copy the CVImageBuffer in order to release the imageBuffer, which I got using the following: let imageBuffer =
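The drops are consistent with exhausting the capture session's small pixel-buffer pool: holding on to the delivered buffers blocks the camera after roughly a dozen frames. A hedged sketch of a deep copy that lets the original buffer be released immediately (it assumes a single-plane format such as 32BGRA; planar formats like 420f would need a per-plane copy):

```swift
import CoreVideo

// Deep-copies a single-plane CVPixelBuffer so the source can go back
// to the capture pool. Returns nil if allocation fails.
func deepCopy(_ source: CVPixelBuffer) -> CVPixelBuffer? {
    CVPixelBufferLockBaseAddress(source, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(source, .readOnly) }

    var copyOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(source),
                        CVPixelBufferGetHeight(source),
                        CVPixelBufferGetPixelFormatType(source),
                        nil,
                        &copyOut)
    guard let copy = copyOut else { return nil }

    CVPixelBufferLockBaseAddress(copy, [])
    defer { CVPixelBufferUnlockBaseAddress(copy, []) }

    // Copy row by row, since the two buffers may have different strides.
    let srcStride = CVPixelBufferGetBytesPerRow(source)
    let dstStride = CVPixelBufferGetBytesPerRow(copy)
    let src = CVPixelBufferGetBaseAddress(source)!
    let dst = CVPixelBufferGetBaseAddress(copy)!
    for row in 0..<CVPixelBufferGetHeight(source) {
        memcpy(dst + row * dstStride,
               src + row * srcStride,
               min(srcStride, dstStride))
    }
    return copy
}
```

The blend step would then operate on the copy while the delegate returns promptly, keeping the capture pipeline from backing up.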

AVFoundation to reproduce a video loop

Submitted by 扶醉桌前 on 2019-12-02 03:31:28
I need to play a video indefinitely (restarting it when it ends) in my OpenGL application. To do so I'm trying to use AVFoundation. I created an AVAssetReader and an AVAssetReaderTrackOutput, and I use the copyNextSampleBuffer method to get a CMSampleBufferRef and create an OpenGL texture for each frame.

```objc
NSString *path = [[NSBundle mainBundle] pathForResource:videoFileName ofType:type];
_url = [NSURL fileURLWithPath:path];
//Create the AVAsset
_asset = [AVURLAsset assetWithURL:_url];
//Get the asset AVAssetTrack
NSArray *arrayAssetTrack = [_asset tracksWithMediaType
```
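One point worth flagging for the looping requirement: an AVAssetReader cannot be rewound or reused once it reaches the end of the asset, so the usual approach is to create a fresh reader/output pair over the same asset when `copyNextSampleBuffer()` returns nil. A hedged Swift sketch of that restart strategy (the render-loop names are hypothetical):

```swift
import AVFoundation

// Builds a new reader positioned at the start of the asset's video track.
func makeReader(for asset: AVAsset) throws -> (AVAssetReader, AVAssetReaderTrackOutput) {
    let track = asset.tracks(withMediaType: .video).first!
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                         kCVPixelFormatType_32BGRA])
    let reader = try AVAssetReader(asset: asset)
    reader.add(output)
    reader.startReading()
    return (reader, output)
}

// In the per-frame render loop (sketch; uploadTexture is hypothetical):
// if let sample = output.copyNextSampleBuffer() {
//     uploadTexture(from: sample)
// } else {
//     (reader, output) = try makeReader(for: asset)  // restart to loop
// }
```

Creating the new reader is cheap relative to decoding, but doing it on the render thread can cause a one-frame hitch; preparing the replacement reader slightly before the end of the asset avoids that.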

Using AVMutableComposition iPhone

Submitted by 北慕城南 on 2019-12-02 02:44:16
I am using the code below to play two videos sequentially, but it is not showing any video in the simulator; it's totally blank. Also, how can I seek through these two videos? Say one video is 2 minutes long and the second is 3 minutes. I need to get the total time of these videos and seek through them: when I slide the slider bar to 4 minutes, the second video should play from its 2-minute mark onward. Is it possible?

```objc
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSURL *url1 = [NSURL URLWithString:@"http://www
```
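The seeking behavior described falls out naturally from AVMutableComposition: inserting both assets back-to-back into one composition track produces a single timeline whose duration is the sum of the two, so seeking the player item to 4:00 lands 2 minutes into the second video with no extra bookkeeping. A hedged Swift sketch of the stitching step (error handling reduced to `try`; audio tracks omitted):

```swift
import AVFoundation

// Appends each asset's first video track end-to-end on one timeline.
func makeSequentialComposition(_ assets: [AVAsset]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let videoTrack = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid)!
    var cursor = CMTime.zero
    for asset in assets {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        try videoTrack.insertTimeRange(
            range,
            of: asset.tracks(withMediaType: .video)[0],
            at: cursor)
        cursor = CMTimeAdd(cursor, asset.duration)
    }
    return composition
}

// Usage sketch:
// let item = AVPlayerItem(asset: try makeSequentialComposition([asset1, asset2]))
// let player = AVPlayer(playerItem: item)
// player.seek(to: CMTime(seconds: 240, preferredTimescale: 600))  // 4:00
```

One caveat relevant to the blank-screen symptom: composition track insertion requires the source tracks to be loaded, so remote HTTP assets should have their `tracks` key loaded asynchronously before building the composition.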

MLKit Text detection on iOS working for photos taken from Assets.xcassets, but not the same photo taken on camera/uploaded from camera roll

Submitted by 强颜欢笑 on 2019-12-02 01:11:35
I'm using Google's text detection API from MLKit to detect text in images. It seems to work perfectly on screenshots, but when I try to use it on images taken in the app (using AVFoundation) or on photos uploaded from the camera roll, it spits out a small number of seemingly random characters. This is my code for running the actual text detection:

```swift
func runTextRecognition(with image: UIImage) {
    let visionImage = VisionImage(image: image)
    textRecognizer.process(visionImage) { features, error in
        self.processResult(from: features, error: error)
    }
}

func processResult(from text: VisionText?, error:
```
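A common culprit in exactly this screenshots-work-but-camera-fails pattern is EXIF orientation: screenshots are always stored upright (`.up`), while camera photos carry a rotation flag that the detector may not honor, so it effectively reads sideways or upside-down text. One hedged workaround is to redraw the UIImage so its pixel data is physically upright before handing it to MLKit (whether this is needed depends on the MLKit release; newer VisionImage APIs can also take the orientation directly):

```swift
import UIKit

// Redraws the image so imageOrientation becomes .up and the pixel
// data matches what the user saw, rather than the sensor layout.
func normalizedOrientation(of image: UIImage) -> UIImage {
    guard image.imageOrientation != .up else { return image }
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: image.size))
    return UIGraphicsGetImageFromCurrentImageContext() ?? image
}

// Usage sketch, in place of the direct call:
// let visionImage = VisionImage(image: normalizedOrientation(of: image))
```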

Can't use AVCaptureDevice with a flash

Submitted by 早过忘川 on 2019-12-01 22:53:12
I am having a difficult time with something I think ought to be simple: I just want to fire the flash when taking a picture in my iOS app. Everything I tried either failed outright or works only 20 percent of the time. Here is the code that lights the flash:

```swift
// Here we have: captureDevice.hasFlash && captureDevice.isFlashModeSupported(.On)
do {
    try captureDevice.lockForConfiguration()
    captureDevice.flashMode = .On
    captureDevice.unlockForConfiguration()
} catch let error as NSError {
    print("captureDevice.lockForConfiguration FAILED")
    print(error.code)
}
```

I have tried several flavors of the code, by moving the 2
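For what it's worth, on iOS 10 and later setting `flashMode` directly on the device was deprecated for photo capture; the flash is instead requested per shot through AVCapturePhotoSettings, which removes the lock/unlock timing issues entirely. A hedged sketch, assuming the session already has a configured AVCapturePhotoOutput:

```swift
import AVFoundation

// Requests the flash for one capture via per-photo settings rather
// than mutating the device configuration.
func capturePhotoWithFlash(using photoOutput: AVCapturePhotoOutput,
                           delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    if photoOutput.supportedFlashModes.contains(.on) {
        settings.flashMode = .on
    }
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```

With the older AVCaptureStillImageOutput path, the intermittent behavior is often a race between `unlockForConfiguration()` and the actual capture; keeping the device locked until after the still image is requested is the usual fix there.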

Reverse an audio file Swift/Objective-C

Submitted by ≯℡__Kan透↙ on 2019-12-01 19:57:19
Is there a way to reverse and export an .m4a audio file? I found a solution to reverse an audio track here, but it only seems to work on the .caf file format. If using .caf is the only way, is there a way to convert the .m4a file to .caf first? Update: in another post I found out that AVAssetReader can be used to read audio samples from an audio file, but I have no idea how to write the samples back in reverse order. The code snippet below is an answer taken directly from that post. Any help would be appreciated. Thanks.

```objc
+ (void)reverseAudioTrack:(AVAsset *)audioAsset outputURL:
```
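On the .m4a question: AVAssetReader can decode compressed formats to linear PCM directly, so no intermediate .caf conversion is needed. A hedged sketch of the read-and-reverse half (it assumes 16-bit interleaved stereo, hence 4 bytes per frame; writing the reversed PCM back out would use an AVAssetWriter with AAC or PCM output settings, which is omitted here):

```swift
import AVFoundation

// Decodes an asset's audio track to PCM and returns the samples
// reversed frame by frame.
func readPCMReversed(from asset: AVAsset) throws -> Data {
    let track = asset.tracks(withMediaType: .audio).first!
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
        AVFormatIDKey: kAudioFormatLinearPCM,   // decode .m4a to raw PCM
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ])
    let reader = try AVAssetReader(asset: asset)
    reader.add(output)
    reader.startReading()

    // Accumulate all decoded PCM bytes in memory.
    var pcm = Data()
    while let sample = output.copyNextSampleBuffer(),
          let block = CMSampleBufferGetDataBuffer(sample) {
        var length = 0
        var pointer: UnsafeMutablePointer<CChar>?
        CMBlockBufferGetDataPointer(block, atOffset: 0,
                                    lengthAtOffsetOut: nil,
                                    totalLengthOut: &length,
                                    dataPointerOut: &pointer)
        if let pointer = pointer {
            pcm.append(UnsafeRawPointer(pointer)
                .assumingMemoryBound(to: UInt8.self), count: length)
        }
    }

    // Reverse per frame, not per byte, so channels stay paired.
    let bytesPerFrame = 4   // assumption: 16-bit stereo
    var reversed = Data(capacity: pcm.count)
    var frame = pcm.count - bytesPerFrame
    while frame >= 0 {
        reversed.append(pcm.subdata(in: frame..<frame + bytesPerFrame))
        frame -= bytesPerFrame
    }
    return reversed
}
```

Reversing whole frames rather than raw bytes matters: byte-level reversal would swap channel order and corrupt each 16-bit sample.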
