AVFoundation

AVFoundation - Videos merge but only last video plays

匆匆过客 submitted on 2020-01-03 17:27:33
Question: I have an array of [AVAsset](). Whenever I record several videos of different durations, the code below merges all the durations into one video, but it only plays the last video in a loop. For example: video1 is 1 minute and shows a dog walking, video2 is 1 minute and shows a bird flying, video3 is 1 minute and shows a horse running. The merged video plays for 3 minutes, but it only shows the horse running, three consecutive times for 1 minute each. Where am I going wrong? var …
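The asker's code is cut off, so as a hedged reference point, here is a minimal sketch of a correct merge loop: each clip's own track is inserted at the running insertion time. The reported symptom (total duration is right but only the last clip shows) typically means the loop reuses the same source track or the same insertion point for every clip.

    import AVFoundation

    // Append each asset's video track end-to-end in one mutable track.
    func merge(assets: [AVAsset]) -> AVMutableComposition {
        let composition = AVMutableComposition()
        guard let track = composition.addMutableTrack(
            withMediaType: .video,
            preferredTrackID: kCMPersistentTrackID_Invalid) else { return composition }

        var insertTime = CMTime.zero
        for asset in assets {
            guard let source = asset.tracks(withMediaType: .video).first else { continue }
            let range = CMTimeRange(start: .zero, duration: asset.duration)
            // Inserting at `insertTime` (not .zero) is what keeps clips consecutive.
            try? track.insertTimeRange(range, of: source, at: insertTime)
            insertTime = CMTimeAdd(insertTime, asset.duration)
        }
        return composition
    }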

How to use a custom video resolution with AVFoundation and AVCaptureVideoDataOutput on Mac

元气小坏坏 submitted on 2020-01-03 10:43:50
Question: I need to process each frame of the captured video. Although AVCaptureDevice.formats offers many different frame dimensions, AVCaptureSession seems to support only the frame sizes defined by its presets. I have also tried setting AVCaptureDevice.activeFormat both before and after creating the AVCaptureDeviceInput; no matter what I set, if the AVCaptureSession uses AVCaptureSessionPresetHigh it always gives me 1280x720 frames. Similarly, if I set AVCaptureSessionPreset640x480, then I can …
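A hedged sketch of the usual answer: a concrete session preset overrides activeFormat, so the session has to be switched to the input-priority preset for a custom format to stick.

    import AVFoundation

    // Select a device format by exact dimensions and tell the session to
    // honor it via the .inputPriority preset instead of a fixed preset.
    func configureCustomResolution(session: AVCaptureSession,
                                   device: AVCaptureDevice,
                                   width: Int32, height: Int32) throws {
        session.sessionPreset = .inputPriority
        guard let format = device.formats.first(where: {
            let dims = CMVideoFormatDescriptionGetDimensions($0.formatDescription)
            return dims.width == width && dims.height == height
        }) else { return }
        try device.lockForConfiguration()
        device.activeFormat = format
        device.unlockForConfiguration()
    }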

CATextLayer doesn't appear in an AVMutableComposition when running from a unit test

微笑、不失礼 submitted on 2020-01-03 07:14:09
Question: EDIT: The strangest thing: it seems that when running this code from a full app everything works, but I was always running the movie creation from my unit tests, and only there it didn't work. Trying to figure out why that is... I'm trying to combine video + audio + text using AVMutableComposition and export it to a new video. My code is based on the AVEditDemo from WWDC '10. I added a purple background to the CATextLayer so I can know for a fact that it is exported to the movie, but no text …
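A minimal sketch, assuming the standard AVEditDemo approach of overlaying a CATextLayer with AVVideoCompositionCoreAnimationTool. Core Animation rendering during export relies on app infrastructure (such as a pumping main run loop) that a bare unit-test host may not provide, which would fit the symptom of it working only in the full app.

    import AVFoundation
    import QuartzCore

    // Build a video composition that renders the source video into
    // `videoLayer` and composites a purple-backed text layer on top.
    func makeVideoComposition(for composition: AVMutableComposition,
                              renderSize: CGSize) -> AVMutableVideoComposition {
        let textLayer = CATextLayer()
        textLayer.string = "Hello"
        textLayer.backgroundColor = CGColor(red: 0.5, green: 0, blue: 0.5, alpha: 1)
        textLayer.frame = CGRect(x: 0, y: 0, width: renderSize.width, height: 100)

        let videoLayer = CALayer()
        videoLayer.frame = CGRect(origin: .zero, size: renderSize)
        let parentLayer = CALayer()
        parentLayer.frame = videoLayer.frame
        parentLayer.addSublayer(videoLayer)
        parentLayer.addSublayer(textLayer)

        let videoComposition = AVMutableVideoComposition(propertiesOf: composition)
        videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
            postProcessingAsVideoLayer: videoLayer, in: parentLayer)
        return videoComposition
    }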

Where to place code for audio playback in a SwiftUI app [closed]

﹥>﹥吖頭↗ submitted on 2020-01-03 06:40:16
Question: Closed. This question is opinion-based and is not currently accepting answers. Closed 5 months ago. Where is the best place to put code for audio playback in a SwiftUI-based app, i.e. one without UIViewController classes? The sound I want to play is initiated by a view, so I'm thinking of putting it into the corresponding view model class. But as a model class is about data …
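Since the question is opinion-based, this is only a hedged sketch of one common arrangement: keep the AVAudioPlayer inside an ObservableObject owned by the view, so playback logic stays out of the view body. The resource name "chime" is hypothetical.

    import AVFoundation
    import SwiftUI

    // A small playback object the view can own; views trigger sounds,
    // the object holds the player so it isn't deallocated mid-playback.
    final class SoundPlayer: ObservableObject {
        private var player: AVAudioPlayer?

        func play(resource: String) {
            guard let url = Bundle.main.url(forResource: resource,
                                            withExtension: "mp3") else { return }
            player = try? AVAudioPlayer(contentsOf: url)
            player?.play()
        }
    }

    struct ChimeView: View {
        @StateObject private var sound = SoundPlayer()

        var body: some View {
            Button("Play") { sound.play(resource: "chime") }
        }
    }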

iOS: Receiving video from network

半腔热情 submitted on 2020-01-03 05:50:36
Question: UPDATE - I have fixed some mistakes in the code below and the images are displayed on the other device, but I have another problem. While video capture is open, the "master" device sends data continuously; sometimes this capture appears on the "slave" device, and within a very short time the image "blinks" to blank, repeating this the whole time for short periods. Any idea about this? I'm working on an app that needs to send live camera capture and live microphone capture to another device on the network. I …
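The posted code isn't included here, so this is a hedged sketch of one common pipeline for this kind of app: serialize each captured frame as JPEG and length-prefix it, since "blinking" on the receiver is often what rendering incomplete buffers looks like. The `send` closure is a hypothetical network hook.

    import AVFoundation
    import UIKit

    final class FrameStreamer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        var send: ((Data) -> Void)?          // hypothetical network hook
        private let context = CIContext()    // reuse; creating one per frame is costly

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let image = CIImage(cvPixelBuffer: pixelBuffer)
            guard let cgImage = context.createCGImage(image, from: image.extent),
                  let jpeg = UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.5)
            else { return }

            // Length-prefix each frame so the receiver draws only complete images.
            var length = UInt32(jpeg.count).bigEndian
            var packet = Data(bytes: &length, count: MemoryLayout<UInt32>.size)
            packet.append(jpeg)
            send?(packet)
        }
    }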

iOS 9: AVFoundation Export Session is missing audio

人盡茶涼 submitted on 2020-01-03 03:48:05
Question: I'm using the code snippet below while merging videos with their original audio. It had been working until I upgraded to iOS 9. Has anyone faced the same issue? Any help to resolve it would be greatly appreciated; I couldn't find anything after researching all day.

    AVAssetTrack *videoTrack = nil;
    AVAssetTrack *audioTrack = nil;
    CMTime insertionPoint = kCMTimeZero;
    if ([[url tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        videoTrack = [url tracksWithMediaType:AVMediaTypeVideo][0];
    }
    if ([[url …
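The snippet is cut off before the audio handling, so as a hedged reference point (in Swift, not the asker's Objective-C), here is a minimal sketch of a merge that keeps audio: both the video and the audio track must be inserted into their own composition tracks, otherwise the export is silent regardless of iOS version.

    import AVFoundation

    // Insert both media tracks of `asset` into `composition` at time zero.
    func addTracks(from asset: AVAsset, to composition: AVMutableComposition) throws {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        if let video = asset.tracks(withMediaType: .video).first,
           let track = composition.addMutableTrack(withMediaType: .video,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid) {
            try track.insertTimeRange(range, of: video, at: .zero)
        }
        if let audio = asset.tracks(withMediaType: .audio).first,
           let track = composition.addMutableTrack(withMediaType: .audio,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid) {
            try track.insertTimeRange(range, of: audio, at: .zero)
        }
    }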

AVFoundation play consecutive video fragments

你说的曾经没有我的故事 submitted on 2020-01-03 03:12:40
Question: I am working on an iOS app that involves fetching video fragments that are part of a stream from a web server and playing them consecutively inside the app. After some research, I decided to use an AVQueuePlayer. Every time I fetch an MP4 file from the server and store it in an NSData object, I create an AVPlayerItem and append it to the queue. I also listen for the AVPlayerItemDidPlayToEndTimeNotification notification, where I advance to the next item. The issue I am facing is an annoying small …
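The question is truncated, but based on the setup described, here is a minimal sketch of the enqueueing side, assuming each fragment is written to a temporary file first (AVPlayerItem takes a URL, not raw NSData). AVQueuePlayer advances on its own by default, so manually advancing from AVPlayerItemDidPlayToEndTimeNotification is usually unnecessary; enqueueing the next fragment well before the current one ends is what avoids gaps.

    import AVFoundation

    let player = AVQueuePlayer()

    // Write the downloaded fragment to disk, then append it to the queue.
    func enqueueFragment(_ data: Data) throws {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mp4")
        try data.write(to: url)
        player.insert(AVPlayerItem(url: url), after: nil)  // nil appends at the end
    }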

iOS rotate video AVAsset AVFoundation

谁都会走 submitted on 2020-01-03 02:10:06
Question: Example. Hi, I'm struggling to rotate this video so it shows in the proper orientation and fills the entire screen. I have the AVAsset and a video composition but cannot get it to work correctly.

    let videoAsset: AVAsset = AVAsset(URL: outputFileURL) as AVAsset
    let clipVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo).first! as AVAssetTrack
    let newHeight = CGFloat(clipVideoTrack.naturalSize.height/3*4)
    let composition = AVMutableComposition()
    composition.addMutableTrackWithMediaType …
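A hedged sketch in current Swift (the question uses Swift 2 syntax) of the standard rotation technique: apply a 90-degree transform through a layer instruction and swap the render dimensions, so a clip recorded with a landscape naturalSize displays upright.

    import AVFoundation

    func rotatedComposition(for asset: AVAsset) -> AVMutableVideoComposition? {
        guard let track = asset.tracks(withMediaType: .video).first else { return nil }

        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

        // Rotate 90 degrees and shift so the frame lands inside the render rect.
        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
        let rotate = CGAffineTransform(rotationAngle: .pi / 2)
            .translatedBy(x: 0, y: -track.naturalSize.height)
        layerInstruction.setTransform(rotate, at: .zero)
        instruction.layerInstructions = [layerInstruction]

        let videoComposition = AVMutableVideoComposition()
        videoComposition.instructions = [instruction]
        videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
        videoComposition.renderSize = CGSize(width: track.naturalSize.height,
                                             height: track.naturalSize.width)
        return videoComposition
    }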

How to show two camera previews side by side in iOS Objective-C for a VR app?

半腔热情 submitted on 2020-01-03 02:03:05
Question: I am trying to implement side-by-side camera views for a VR application. I found some useful info at this site: How to show 2 camera preview side by side? [For cardboard apps], but how do I create the same behaviour on iOS with AVFoundation? This is my current code for CameraViewController:

    #import <AVFoundation/AVFoundation.h>
    #import "CameraViewController.h"

    @interface CameraViewController ()
    @property (strong, nonatomic) AVCaptureSession *captureSession;
    …
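The posted code is cut off, so here is a hedged Swift sketch of the core idea (the question itself is Objective-C): a single AVCaptureSession can drive two AVCaptureVideoPreviewLayer instances, one per eye, placed side by side for a Cardboard-style view.

    import AVFoundation
    import UIKit

    // Attach two preview layers to the same running session,
    // each filling one half of the containing view.
    func addStereoPreviews(session: AVCaptureSession, to view: UIView) {
        let half = view.bounds.width / 2

        let left = AVCaptureVideoPreviewLayer(session: session)
        left.frame = CGRect(x: 0, y: 0, width: half, height: view.bounds.height)
        left.videoGravity = .resizeAspectFill

        let right = AVCaptureVideoPreviewLayer(session: session)
        right.frame = CGRect(x: half, y: 0, width: half, height: view.bounds.height)
        right.videoGravity = .resizeAspectFill

        view.layer.addSublayer(left)
        view.layer.addSublayer(right)
    }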

audioPlayerDidFinishPlaying function in Swift

你。 submitted on 2020-01-03 01:22:08
Question: I am writing a simple player in Swift using the AVFoundation framework. Everything seems to be working except that my player keeps playing the same song over and over again. I only have one song in my playlist, so this makes sense. What I am trying to do is check the audioPlayerDidFinishPlaying flag to make sure it is done playing, and then I will make it stop. I am not sure how to implement the call to get the flag; here is my code.

    mp3Player?.play()
    if (mp3Player?.audioPlayerDidFinishPlaying …
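A minimal sketch of the usual pattern: audioPlayerDidFinishPlaying(_:successfully:) is an AVAudioPlayerDelegate callback, not a flag to poll, so set the delegate and react when the player calls it.

    import AVFoundation

    final class MP3Player: NSObject, AVAudioPlayerDelegate {
        private var player: AVAudioPlayer?

        func play(url: URL) {
            player = try? AVAudioPlayer(contentsOf: url)
            player?.delegate = self
            player?.play()
        }

        // Called once per track; stop here instead of letting the same song repeat.
        func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
            player.stop()
        }
    }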