AVFoundation

How to create video from its frames iPhone

ε祈祈猫儿з submitted on 2019-11-28 17:56:33
Question: I did some R&D and succeeded in extracting frames as images from a video file played in MPMoviePlayerController. I got all the frames with this code and saved the images in one array:

    for (int i = 1; i <= moviePlayerController.duration; i++) {
        UIImage *img = [moviePlayerController thumbnailImageAtTime:i timeOption:MPMovieTimeOptionNearestKeyFrame];
        [arrImages addObject:img];
    }

Now the question is: after changing some of the images, like adding emoticons and filters to them,
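A common way to reassemble edited frames into a movie is AVAssetWriter with a pixel-buffer adaptor. A minimal Swift sketch under assumptions not in the question (one frame per second to match the thumbnail extraction above, H.264/QuickTime output, abbreviated error handling):

```swift
import AVFoundation
import UIKit

// Sketch: render an array of UIImages into a CVPixelBuffer each and
// write them out as an H.264 movie with AVAssetWriter.
func writeVideo(from images: [UIImage], to outputURL: URL, size: CGSize) throws {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for (index, image) in images.enumerated() {
        // One frame per second, matching the thumbnail-per-second extraction.
        let time = CMTime(value: CMTimeValue(index), timescale: 1)
        while !input.isReadyForMoreMediaData { usleep(10_000) }  // crude back-pressure
        if let buffer = pixelBuffer(from: image, size: size) {
            adaptor.append(buffer, withPresentationTime: time)
        }
    }
    input.markAsFinished()
    writer.finishWriting { }
}

// Helper: draw a UIImage into a freshly created CVPixelBuffer.
func pixelBuffer(from image: UIImage, size: CGSize) -> CVPixelBuffer? {
    var pb: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                        kCVPixelFormatType_32ARGB, nil, &pb)
    guard let buffer = pb else { return nil }
    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    let ctx = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                        width: Int(size.width), height: Int(size.height),
                        bitsPerComponent: 8,
                        bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                        space: CGColorSpaceCreateDeviceRGB(),
                        bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    if let cg = image.cgImage {
        ctx?.draw(cg, in: CGRect(origin: .zero, size: size))
    }
    return buffer
}
```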

Using existing system sounds in iOS App [swift]

狂风中的少年 submitted on 2019-11-28 17:52:07
Is it possible to use the existing Apple system sounds in my own app? I would like to write a sample app in Swift that does the following steps:

1. Read/get a list of all available system sounds on the device (I think they are located in /System/Library/Audio/UISounds/)
2. Show the list on the screen
3. If I touch a list item, play that sound

So it's basically the same as when you choose a new ringtone on your iPhone. I think some apps are using these sounds, or have they copied/bought them? Thanks and regards, Jens

You can use this Swift 4 code to play system sounds:

    // import this
    import AVFoundation
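The three steps above can be sketched as follows. Note the caveat: /System/Library/Audio/UISounds/ is an undocumented path that Apple does not guarantee, so treat this as exploratory code rather than something App Store-safe.

```swift
import AVFoundation

// Sketch: enumerate the bundled system sounds and play one on tap.
let soundsDir = URL(fileURLWithPath: "/System/Library/Audio/UISounds")
let files = (try? FileManager.default.contentsOfDirectory(
    at: soundsDir, includingPropertiesForKeys: nil)) ?? []

// Show `files.map { $0.lastPathComponent }` in your table view, then on
// selection create and play a system sound ID for the chosen file:
func play(_ url: URL) {
    var soundID: SystemSoundID = 0
    AudioServicesCreateSystemSoundID(url as CFURL, &soundID)
    AudioServicesPlaySystemSound(soundID)
}

if let first = files.first { play(first) }
```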

Changing the volume without a volume slider on an iPhone

℡╲_俬逩灬. submitted on 2019-11-28 17:39:50
I need your help. How should I proceed to change the sound volume in my app? I don't want to use a volume slider. Instead I have a UIImageView which is a volume knob, which I rotate clockwise to increase and anticlockwise to decrease the sound volume. The rotation is just an animation and I've already done that part. I need your help and advice on how to increase/decrease the volume. Thanks

I would be careful calling setValue on an MPVolumeView, since it probably won't do anything other than update the appearance of the slider, but not the actual device volume level. You would instead
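For context on the MPVolumeView approach the answer is cautioning about: there is no public API to set the system output volume directly, so the usual workaround is an off-screen MPVolumeView whose internal UISlider is adjusted from the knob's rotation angle. A sketch (the class name is an assumption; this relies on MPVolumeView's private subview hierarchy, so treat it as fragile):

```swift
import MediaPlayer
import UIKit

// Sketch: drive the system output volume from a custom rotating-knob control.
final class VolumeController {
    // Keep the view alive; add it off-screen so the system HUD still behaves.
    private let volumeView = MPVolumeView(frame: .zero)

    /// `value` in 0.0 ... 1.0, mapped from the knob's rotation angle.
    func setVolume(_ value: Float) {
        let slider = volumeView.subviews.compactMap { $0 as? UISlider }.first
        DispatchQueue.main.async {
            slider?.value = min(max(value, 0), 1)
        }
    }
}
```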

AVAudioSession setCategory Swift 4.2 iOS 12 - Play Sound on Silent

旧街凉风 submitted on 2019-11-28 17:22:46
Question: To play sound even in Silent mode I used to use the method below, but now it's not working.

    // Works on Swift 3
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
    } catch {
        print(error)
    }

How do I get it to work in Swift 4.2 / iOS 12? In the newer version we need to set mode and options:

    try AVAudioSession.sharedInstance().setCategory( <#T##category:AVAudioSession.Category##AVAudioSession.Category#>, mode: <#T##AVAudioSession.Mode#>, options: <#T##AVAudioSession.CategoryOptions#
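Filling in those Xcode placeholders, the Swift 4.2 / iOS 12 equivalent of the Swift 3 snippet looks like this (categories, modes, and options are now typed values on AVAudioSession):

```swift
import AVFoundation

// Swift 4.2 / iOS 12: use the typed .playback category so audio
// keeps playing with the silent switch on.
do {
    try AVAudioSession.sharedInstance().setCategory(.playback,
                                                    mode: .default,
                                                    options: [])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Audio session error: \(error)")
}
```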

How to play a music file using URL link in iPhone?

ⅰ亾dé卋堺 submitted on 2019-11-28 17:20:33
I want to play a music file using a URL link on the iPhone, but when I use the code below I get an error and I can't see where I am going wrong. Can anyone correct me?

    -(void)viewDidLoad {
        [super viewDidLoad];
        NSString *resourcePath = @"http://192.167.1.104:8888/SHREYA.mp3"; // your url
        NSData *_objectData = [NSData dataWithContentsOfURL:[NSURL URLWithString:resourcePath]];
        NSError *error;
        audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:_objectData error:&error];
        audioPlayer.numberOfLoops = -1;
        if (audioPlayer == nil)
            NSLog([error description]);
        else
            [audioPlayer play];
    }
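The core bug in the snippet above is passing NSData to initWithContentsOfURL:, which expects a local file URL. A Swift sketch of the two common fixes, using the URL from the question (keep a strong reference to the player, or it deallocates and goes silent):

```swift
import AVFoundation

let url = URL(string: "http://192.167.1.104:8888/SHREYA.mp3")!

// Fix 1: keep the synchronous download, but use the data initializer.
// (Blocks the calling thread; only reasonable off the main thread for small files.)
if let data = try? Data(contentsOf: url),
   let player = try? AVAudioPlayer(data: data) {
    player.numberOfLoops = -1
    player.play()
}

// Fix 2: for true streaming, AVPlayer handles remote URLs directly.
let streamPlayer = AVPlayer(url: url)
streamPlayer.play()
```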

Subtitles for AVPlayer/MPMoviePlayerController

生来就可爱ヽ(ⅴ<●) submitted on 2019-11-28 17:20:16
I am using the m3u8 video format for streaming video and now I need to display subtitles for it. I searched the Apple documentation and found that I can achieve this using the closedCaptionDisplayEnabled property of AVPlayer. I am interested to know what the format of the subtitles should be. Will the .srt format do? Also, can I achieve the same using MPMoviePlayerController? Any help is appreciated.

Update 10/30/2018: It's worth checking this answer by an Apple engineer (thanks to @allenlini for pointing it out). He suggests a solution involving AVAssetResourceLoaderDelegate. I haven't
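On the format question: for HLS, subtitles are normally WebVTT renditions declared in the master playlist rather than .srt files. If the stream carries such renditions, they can be listed and enabled through AVFoundation's media selection API. A sketch (the stream URL is a placeholder):

```swift
import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!)
let player = AVPlayer(playerItem: item)

// Load the media selection metadata, then enumerate the legible (subtitle) options.
let key = "availableMediaCharacteristicsWithMediaSelectionOptions"
item.asset.loadValuesAsynchronously(forKeys: [key]) {
    guard let group = item.asset.mediaSelectionGroup(forMediaCharacteristic: .legible)
    else { return }
    for option in group.options {
        print(option.displayName)                // e.g. the subtitle language name
    }
    item.select(group.options.first, in: group)  // enable the first subtitle track
}
```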

UIImage created from CMSampleBufferRef not displayed in UIImageView?

穿精又带淫゛_ submitted on 2019-11-28 17:16:50
I'm trying to display a UIImage in real time coming from the camera, and it seems that my UIImageView is not displaying the image properly. This is the method that an AVCaptureVideoDataOutputSampleBufferDelegate has to implement:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        // Create a UIImage from the sample buffer data
        UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];
        // NSLog(@"Got an image! %f %f", theImage.size.width, theImage.size.height);
        // NSLog(@"The image
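A frequent cause of this exact symptom: the sample-buffer delegate runs on a background queue, and UIKit views may only be touched on the main thread, so the assignment silently fails to render. A Swift sketch of the delegate method, assuming a ViewController with an `imageView` outlet and an `imageFromSampleBuffer` conversion helper like the asker's:

```swift
import AVFoundation
import UIKit

extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Convert on the capture queue...
        guard let image = imageFromSampleBuffer(sampleBuffer) else { return }
        // ...but update UIKit on the main queue.
        DispatchQueue.main.async {
            self.imageView.image = image
        }
    }
}
```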

How to get front camera, back camera and audio with AVCaptureDeviceDiscoverySession

纵然是瞬间 submitted on 2019-11-28 16:50:34
Before iOS 10 came out I was using the following code to get the video and audio capture for my video recorder:

    for device in AVCaptureDevice.devices() {
        if (device as AnyObject).hasMediaType(AVMediaTypeAudio) {
            self.audioCapture = device as? AVCaptureDevice
        } else if (device as AnyObject).hasMediaType(AVMediaTypeVideo) {
            if (device as AnyObject).position == AVCaptureDevicePosition.back {
                self.backCameraVideoCapture = device as? AVCaptureDevice
            } else {
                self.frontCameraVideoCapture = device as? AVCaptureDevice
            }
        }
    }

When iOS 10 finally came out, I received the following warning when I was
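The iOS 10+ replacement for that deprecated loop is AVCaptureDevice.DiscoverySession for the cameras, with the microphone still fetched directly. A sketch (the wide-angle device type is an assumption; add other types if you need telephoto or depth cameras):

```swift
import AVFoundation

// Discover front and back cameras with the iOS 10 API.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera],
    mediaType: .video,
    position: .unspecified)

var backCamera: AVCaptureDevice?
var frontCamera: AVCaptureDevice?
for device in discovery.devices {
    switch device.position {
    case .back:  backCamera  = device
    case .front: frontCamera = device
    default:     break
    }
}

// The microphone doesn't need a discovery session:
let audioDevice = AVCaptureDevice.default(for: .audio)
```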

AVPlayer HLS live stream level meter (Display FFT Data)

时光怂恿深爱的人放手 submitted on 2019-11-28 16:48:05
I'm using AVPlayer for a radio app using HTTP Live Streaming. Now I want to implement a level meter for that audio stream. The very best would be a level meter showing the different frequencies, but a simple left/right solution would be a great starting point. I found several examples using AVAudioPlayer, but I cannot find a solution for getting the required information from AVPlayer. Can someone think of a solution for my problem?

EDIT: I want to create something like this (but nicer).

EDIT II: One suggestion was to use MTAudioProcessingTap to get the raw audio data. The examples I could find
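For readers following the MTAudioProcessingTap suggestion, the general shape is: the tap's process callback hands you raw PCM, from which an RMS level (or an FFT via vDSP) can be computed. The callbacks are C function pointers, so the code is unavoidably low-level. A heavily abridged sketch assuming float PCM, with error handling omitted; note that attaching a tap to an HLS stream's audio track has known limitations, which is part of what makes this question hard:

```swift
import AVFoundation
import MediaToolbox
import Accelerate

var callbacks = MTAudioProcessingTapCallbacks(
    version: kMTAudioProcessingTapCallbacksVersion_0,
    clientInfo: nil,
    init: nil, finalize: nil, prepare: nil, unprepare: nil,
    process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
        // Pull the source audio through the tap...
        MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                           flagsOut, nil, numberFramesOut)
        // ...then compute an RMS level per channel buffer.
        for buffer in UnsafeMutableAudioBufferListPointer(bufferListInOut) {
            guard let samples = buffer.mData?.assumingMemoryBound(to: Float.self)
            else { continue }
            let count = Int(buffer.mDataByteSize) / MemoryLayout<Float>.size
            var rms: Float = 0
            vDSP_rmsqv(samples, 1, &rms, vDSP_Length(count))
            // feed `rms` into the level-meter UI (via a non-UI channel; this
            // callback runs on a real-time thread)
        }
    })

var tap: Unmanaged<MTAudioProcessingTap>?
MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                           kMTAudioProcessingTapCreationFlag_PostEffects, &tap)
// The tap is then attached to the player item's audio track through
// AVMutableAudioMixInputParameters.audioTapProcessor and assigned to
// playerItem.audioMix (not shown).
```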

iOS Video Editing - Is it possible to merge (side by side not one after other) two video files into one using iOS 4 AVFoundation classes?

给你一囗甜甜゛ submitted on 2019-11-28 16:45:39
I know you can merge multiple clips into a single video by appending one after the other using the AVFoundation classes AVURLAsset, AVMutableComposition, AVMutableCompositionTrack, etc. There are apps like 'Video-Joiner' that do that. What I want to do is to juxtapose two videos. My app idea - SelfInterviewer, please don't steal :) First I record video 1 using the front-facing camera, standing to the left of the frame. Then video 2, standing to the right. In video 1 I ask a question and in video 2 I answer. When I merge them, it should appear like I am being interviewed by myself. I am almost sure it's not
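This is in fact possible with the same classes the question names, plus AVMutableVideoComposition: put each clip on its own composition track and place the tracks side by side with per-track transforms. A sketch (the renderSize, the 0.5 scale, and equal half-width panes are assumptions; export via AVAssetExportSession with `videoComposition` set):

```swift
import AVFoundation

// Sketch: compose two clips side by side in one frame.
func sideBySide(left: AVAsset, right: AVAsset) throws
    -> (AVMutableComposition, AVMutableVideoComposition) {
    let composition = AVMutableComposition()
    let duration = min(left.duration, right.duration)
    let range = CMTimeRange(start: .zero, duration: duration)

    let leftTrack = composition.addMutableTrack(
        withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
    let rightTrack = composition.addMutableTrack(
        withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
    try leftTrack.insertTimeRange(range, of: left.tracks(withMediaType: .video)[0], at: .zero)
    try rightTrack.insertTimeRange(range, of: right.tracks(withMediaType: .video)[0], at: .zero)

    let renderSize = CGSize(width: 1280, height: 360)  // two 640x360 panes (assumed)

    // Scale each clip to half size; shift the second one to the right half.
    let leftInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: leftTrack)
    leftInstruction.setTransform(CGAffineTransform(scaleX: 0.5, y: 0.5), at: .zero)

    let rightInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: rightTrack)
    rightInstruction.setTransform(
        CGAffineTransform(scaleX: 0.5, y: 0.5)
            .concatenating(CGAffineTransform(translationX: renderSize.width / 2, y: 0)),
        at: .zero)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = range
    instruction.layerInstructions = [leftInstruction, rightInstruction]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.instructions = [instruction]
    return (composition, videoComposition)
}
```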