avfoundation

Getting Current Time of a Video (Player - Swift)

送分小仙女□ · Submitted on 2019-12-05 20:57:01
I set up a video player using Player (built on AVFoundation). I am trying to access and manipulate the currentTime of the video; however, the library only exposes the video's duration, not its currentTime. I am tweaking the demo project of Player. Please check the 'Updated' section below instead of here. A friend told me I could achieve this with the following approach, but I couldn't work out how to adapt it to this case:

[NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(refreshCurrentTimeTextField) userInfo:nil repeats:YES];

- (void)refreshCurrentTimeTextField {
    NSTimeInterval
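Rather than polling with an NSTimer, AVFoundation's own periodic time observer reports the playback position directly. A minimal Swift sketch, assuming an AVPlayer instance named `player` (that name is mine, not from the question):

```swift
import AVFoundation

// Ask the player to call back every 0.5 s of playback time.
let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
let observer = player.addPeriodicTimeObserver(forInterval: interval,
                                              queue: .main) { time in
    let seconds = CMTimeGetSeconds(time)
    print("currentTime: \(seconds) s")
}
// Keep a reference to `observer` and pass it to
// player.removeTimeObserver(_:) when you no longer need updates.
```

The observer fires only while the player is actually playing, which avoids the drift a wall-clock NSTimer introduces.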

iOS Detect system volume level change. private API or not? AVSystemController_SystemVolumeDidChangeNotification

不想你离开。 · Submitted on 2019-12-05 20:51:40
Question: Can listening for the AVSystemController_SystemVolumeDidChangeNotification NSNotification be considered (during the App Store review process) as using a private API? In my application I need to display and update the current volume level. The volume indicator should be updated after pressing the hardware volume buttons and after a volume change in an MPVolumeView slider. I've searched for ways to get a notification or event for a hardware volume button press and found a number of solutions. For example,
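A documented alternative that sidesteps the private-API question is key-value observing AVAudioSession's outputVolume, which Apple states is KVO-compliant. A minimal sketch (the class name is mine):

```swift
import AVFoundation

final class VolumeWatcher {
    private let session = AVAudioSession.sharedInstance()
    private var observation: NSKeyValueObservation?

    // Starts reporting volume changes, including hardware button presses.
    func start() throws {
        try session.setActive(true)
        // outputVolume is documented as KVO-observable; no private API involved.
        observation = session.observe(\.outputVolume,
                                      options: [.initial, .new]) { _, change in
            if let volume = change.newValue {
                print("Volume is now \(volume)")   // 0.0 ... 1.0
            }
        }
    }
}
```

The session must be active for the observation to fire, hence the `setActive(true)` call.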

Processing all frames in an AVAsset

我怕爱的太早我们不能终老 · Submitted on 2019-12-05 19:10:54
I am trying to go through each frame in an AVAsset and process each frame as if it were an image. I have not been able to find anything from my searches. The task I am trying to accomplish would look like this in pseudo-code:

for each frame in asset
    take the frame as an image and convert it to a cvMat
    process and store data of center points
    store center points in array

The only part of that pseudo-code I do not know how to write is going through each frame and capturing it as an image. Can anyone help?

One answer is to use AVAssetImageGenerator. 1) Load the movie file into an AVAsset object. 2
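The AVAssetImageGenerator route from the answer can be sketched as below; this pulls one CGImage per second rather than literally every frame, and the helper name is mine:

```swift
import AVFoundation

// Sketch: sample an asset at one-second intervals as CGImages.
func sampledFrames(of asset: AVAsset) throws -> [CGImage] {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    // Zero tolerance forces exact-time frames instead of nearby keyframes.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    let seconds = Int(CMTimeGetSeconds(asset.duration))
    return try (0..<seconds).map { s in
        let time = CMTime(seconds: Double(s), preferredTimescale: 600)
        return try generator.copyCGImage(at: time, actualTime: nil)
    }
}
```

For processing genuinely every frame, AVAssetReader with an AVAssetReaderTrackOutput delivering CVPixelBuffers is considerably cheaper, since it decodes sequentially instead of seeking per image.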

Xcode 8 Swift 3 Pitch-altering sounds

▼魔方 西西 · Submitted on 2019-12-05 19:03:14
So, I asked this question before on the Apple Developer Forums but never got a proper answer, so I thought I'd ask it here. I'm trying to make a simple game with a hit sound that has a different pitch whenever you hit something. I thought it'd be simple, but it ended up as a whole lot of stuff (most of which I completely copied from someone else):

func hitSound(value: Float) {
    let audioPlayerNode = AVAudioPlayerNode()
    audioPlayerNode.stop()
    engine.stop() // This is an AVAudioEngine defined previously
    engine.reset()
    engine.attach(audioPlayerNode)
    let changeAudioUnitTime = AVAudioUnitTimePitch
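Stopping and resetting the engine on every hit is what makes the copied code so heavy. A leaner sketch builds the graph once and only changes the pitch per hit (class and parameter names are mine; `file` is an assumed preloaded AVAudioFile):

```swift
import AVFoundation

// Sketch: one engine + time-pitch unit, built once and reused per hit.
final class HitSoundPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let timePitch = AVAudioUnitTimePitch()
    private let file: AVAudioFile

    init(file: AVAudioFile) throws {
        self.file = file
        engine.attach(player)
        engine.attach(timePitch)
        engine.connect(player, to: timePitch, format: file.processingFormat)
        engine.connect(timePitch, to: engine.mainMixerNode,
                       format: file.processingFormat)
        try engine.start()
    }

    func play(pitchCents: Float) {
        timePitch.pitch = pitchCents   // range is -2400...2400 cents (±2 octaves)
        player.scheduleFile(file, at: nil)
        player.play()
    }
}
```

Each `play(pitchCents:)` call just schedules the file again with a new pitch, so overlapping hits don't restart the engine.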

How to Animate Images while converting to Video

╄→尐↘猪︶ㄣ · Submitted on 2019-12-05 18:58:22
I want to animate images smoothly while converting them to video. Despite searching SO, I am unable to understand how to achieve it. I tried changing the rotation angle (CGAffineTransformMakeRotation), translation, and scaling, but couldn't find a way to get smooth animations. Here is how I am converting an array of photos to video:

- (void)createVideoWithArrayImages:(NSMutableArray *)images size:(CGSize)size time:(float)time output:(NSURL *)output {
    // getting a random path
    NSError *error;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:output
                                                           fileType:AVFileTypeMPEG4
                                                              error:&error];
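Smoothness comes from interpolating the transform across every written frame instead of jumping between poses. One way to sketch that (function and parameter names are mine) is a per-frame interpolation you evaluate before drawing each image into its pixel buffer:

```swift
import CoreGraphics

// Sketch: linearly interpolate rotation and scale across the frames of
// one image's on-screen lifetime, yielding the transform to draw with.
func transformForFrame(_ frame: Int, totalFrames: Int,
                       fromScale: CGFloat, toScale: CGFloat,
                       fromAngle: CGFloat, toAngle: CGFloat) -> CGAffineTransform {
    // Progress t runs 0 -> 1 over the image's frames.
    let t = CGFloat(frame) / CGFloat(max(totalFrames - 1, 1))
    let scale = fromScale + (toScale - fromScale) * t
    let angle = fromAngle + (toAngle - fromAngle) * t
    return CGAffineTransform(rotationAngle: angle).scaledBy(x: scale, y: scale)
}
```

At 30 fps, a one-second zoom from scale 1.0 to 1.2 means 30 calls with `frame` 0...29, each producing a slightly larger transform, which is what reads as a smooth animation in the output video.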

iOS AVFoundation Video Capture Orientation Options

半城伤御伤魂 · Submitted on 2019-12-05 18:51:37
I have an app in which I would like video capture from the front-facing camera only. That's no problem. But I would like the video capture to always be in landscape, even when the phone is being held in portrait. I have a working implementation based on the AVCamDemo code that Apple published, and borrowing from the information in this tech note, I am able to specify the orientation. There's just one trick: while the video frame is oriented correctly, the contents still appear as though shot in portrait. I'm wondering if I'm just getting boned by the physical constraints of the hardware:
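For reference, the orientation from the tech note is set on the capture connection; a minimal sketch, assuming `movieOutput` is an AVCaptureMovieFileOutput already added to a running session:

```swift
import AVFoundation

// Sketch: pin recorded video to landscape regardless of device orientation.
func lockLandscape(on movieOutput: AVCaptureMovieFileOutput) {
    if let connection = movieOutput.connection(with: .video),
       connection.isVideoOrientationSupported {
        connection.videoOrientation = .landscapeRight
    }
}
```

Note what this does and does not do: it rotates the frame that gets written, but the sensor still sees whatever the lens sees, so content framed in portrait remains portrait-framed inside the landscape frame, which matches the symptom described above.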

record video with AVFoundation with custom dimensions? [duplicate]

橙三吉。 · Submitted on 2019-12-05 18:47:23
This question already has answers here: How do I use AVFoundation to crop a video (2 answers). Closed 6 years ago.
I am recording a MOV file using AVFoundation, but I'm having trouble finding out how to change the dimensions of the video. I have the videoGravity property of captureVideoPreviewLayer set to AVLayerVideoGravityResizeAspectFill, and the UIView showing the preview layer has custom dimensions (not the same aspect ratio as the screen). Recording works fine, but the dimensions of the recorded video match the aspect ratio of the screen. How can I record with the aspect ratio of the
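The preview layer's videoGravity only affects display, not what gets recorded. Per the linked duplicate, the usual fix is to crop the recorded asset afterwards with a video composition; a hedged sketch (preset, frame rate, and the identity transform are illustrative placeholders):

```swift
import AVFoundation

// Sketch: re-export an asset cropped to a custom render size.
func crop(asset: AVAsset, to size: CGSize, output: URL,
          completion: @escaping (Error?) -> Void) {
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = size           // the custom dimensions
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)
    let layerInstruction =
        AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    // Translate the track so the region of interest falls inside renderSize.
    layerInstruction.setTransform(.identity, at: .zero)
    instruction.layerInstructions = [layerInstruction]
    videoComposition.instructions = [instruction]

    guard let export = AVAssetExportSession(
        asset: asset, presetName: AVAssetExportPresetHighestQuality) else { return }
    export.videoComposition = videoComposition
    export.outputURL = output
    export.outputFileType = .mp4
    export.exportAsynchronously { completion(export.error) }
}
```

Anything outside `renderSize` is simply not rendered, which is what produces the crop.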

how to add video on another video as overlay

安稳与你 · Submitted on 2019-12-05 18:27:48
I am stuck on overlaying one video onto another: I can successfully add a video on top of another video, but I am not able to make the scale the same for both videos. I am taking reference from this. Note: I have to add the overlay video with transparency so the video below stays visible.
Source: https://stackoverflow.com/questions/39825323/how-to-add-video-on-another-video-as-overlay
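To get both layers to the same scale, the overlay track's natural size (after its preferred transform) has to be mapped onto the composition's render size. A small sketch of that calculation (the helper name is mine), whose result you would pass to the overlay's layer instruction via `setTransform(_:at:)`:

```swift
import AVFoundation

// Sketch: a transform scaling an overlay track to fill the render size,
// so the overlay and base video line up.
func scaleToFill(overlay: AVAssetTrack, renderSize: CGSize) -> CGAffineTransform {
    // naturalSize can come back rotated; apply preferredTransform first.
    let natural = overlay.naturalSize.applying(overlay.preferredTransform)
    let size = CGSize(width: abs(natural.width), height: abs(natural.height))
    return CGAffineTransform(scaleX: renderSize.width / size.width,
                             y: renderSize.height / size.height)
}
```

Taking the absolute value matters because a 90°-rotated preferredTransform can yield negative width or height.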

How to add animation on image change while create video using images

本秂侑毒 · Submitted on 2019-12-05 17:43:02
Question: I have an array of images from which I want to create a video by playing the images one after another in sequence. I want to add different types of animation when the image changes. Please suggest a method or solution to achieve this functionality in Objective-C with the Cocoa framework. Here is working code for making a video from images, but please suggest how to animate the images while making the video:

- (void)createVideoFromImages:(NSString *)path withSize:(CGSize)size {
    NSError
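One route for transitions between images is Core Animation composited into the export through AVVideoCompositionCoreAnimationTool. A hedged sketch of the layer side, building a cross-dissolve per image (function and parameter names are mine):

```swift
import AVFoundation
import QuartzCore

// Sketch: a parent layer whose sublayers fade in one after another,
// for compositing into a video via AVVideoCompositionCoreAnimationTool.
func slideshowLayer(images: [CGImage], perImage: CFTimeInterval,
                    renderSize: CGSize) -> CALayer {
    let parent = CALayer()
    parent.frame = CGRect(origin: .zero, size: renderSize)
    for (i, image) in images.enumerated() {
        let layer = CALayer()
        layer.frame = parent.frame
        layer.contents = image
        layer.opacity = 0

        let fade = CABasicAnimation(keyPath: "opacity")
        fade.fromValue = 0
        fade.toValue = 1
        // Video-composition animations must start at this constant, not 0.
        fade.beginTime = AVCoreAnimationBeginTimeAtZero
                         + CFTimeInterval(i) * perImage
        fade.duration = perImage / 4
        fade.fillMode = .forwards
        fade.isRemovedOnCompletion = false
        layer.add(fade, forKey: nil)
        parent.addSublayer(layer)
    }
    return parent
}
```

The returned layer then goes into `AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer:in:)` on the export's video composition; swapping the opacity keyPath for transform-based animations gives zooms or rotations instead of dissolves.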

How to convert Data of Int16 audio samples to array of float audio samples

心不动则不痛 · Submitted on 2019-12-05 16:17:43
I'm currently working with audio samples. I get them from an AVAssetReader and have a CMSampleBuffer, with something like this:

guard let sampleBuffer = readerOutput.copyNextSampleBuffer() else {
    guard reader.status == .completed else { return nil }
    // Completed
}

// samples is an array of Int16
let samples = sampleData.withUnsafeBytes {
    Array(UnsafeBufferPointer<Int16>(
        start: $0,
        count: sampleData.count / MemoryLayout<Int16>.size))
}

// The only way I found to convert [Int16] -> [Float]...
return samples.map { Float($0) / Float(Int16.max) }

guard let blockBuffer = CMSampleBufferGetDataBuffer
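The Data-to-[Float] step on its own can be isolated into a small pure function; a sketch using the modern `withUnsafeBytes` overload, assuming native-endian Int16 PCM (the function name is mine):

```swift
import Foundation

// Sketch: raw Int16 PCM bytes -> floats normalized to roughly [-1, 1].
func floatSamples(from data: Data) -> [Float] {
    let int16s = data.withUnsafeBytes { raw in
        Array(raw.bindMemory(to: Int16.self))
    }
    return int16s.map { Float($0) / Float(Int16.max) }
}
```

Dividing by `Int16.max` (32767) means `Int16.min` maps slightly below −1.0; dividing by 32768 instead would make the range exactly [−1, 1).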