AVFoundation

AVAudioPlayer.play() does not play sound

≡放荡痞女 submitted on 2019-11-29 16:33:23
Why doesn't the following code play a sound? It returns true for play(), but I cannot hear anything.

```swift
let path = "/Users/account/Music/sound.mp3";
let fileURL = NSURL(fileURLWithPath: path)
var Player = AVAudioPlayer(contentsOfURL: fileURL, error: nil);
Player.delegate = self;
Player.prepareToPlay();
Player.volume = 1.0;
var res = Player.play();
println(res);
```

If I use the following code instead, I can hear the sound.

```swift
var inFileURL: CFURL = fileURL!;
var mySound = UnsafeMutablePointer<SystemSoundID>.alloc(sizeof(SystemSoundID));
AudioServicesCreateSystemSoundID(inFileURL, mySound);
```
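The usual culprit in this situation (an assumption here, since the post is truncated before any answer) is that the AVAudioPlayer lives in a local variable and is deallocated before playback is heard; play() still returns true. A minimal sketch in current Swift, keeping a strong reference:

```swift
import AVFoundation

final class SoundManager {
    // Keep a strong reference so the player outlives the calling scope;
    // a local AVAudioPlayer can be deallocated before any audio is heard.
    private var player: AVAudioPlayer?

    func playSound(at url: URL) throws {
        let player = try AVAudioPlayer(contentsOf: url)
        player.prepareToPlay()
        player.volume = 1.0
        self.player = player   // retain before calling play()
        player.play()
    }
}
```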

Setting a different volume for each video track using AudioMixInputParameters (AVFoundation) is not working in Swift on iOS

核能气质少年 submitted on 2019-11-29 16:21:58
I am working on a video-based application in Swift. Per the requirements, I have to let the user select multiple videos from the device gallery, apply a different CIFilter effect and volume to each video asset, then merge all the videos and save the final video. When the final video plays, each clip's volume should change accordingly. I have already merged all the selected video assets into one, each with its own CIFilter effect, but when I try to set a volume for each video clip it's not working: I always get the default volume.
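A common cause (an assumption, since the post is truncated before any answer) is building the AVMutableAudioMixInputParameters against the source asset's tracks rather than the composition's tracks; the parameters are matched by trackID, so a mismatch makes them silently ignored. A sketch, where `composition` and `volumes` are hypothetical names:

```swift
import AVFoundation

// Assumes `composition` already contains one audio track per clip and
// `volumes[i]` is the desired volume for clip i.
func makeAudioMix(for composition: AVMutableComposition, volumes: [Float]) -> AVMutableAudioMix {
    let audioMix = AVMutableAudioMix()
    let audioTracks = composition.tracks(withMediaType: .audio)
    audioMix.inputParameters = zip(audioTracks, volumes).map { track, volume in
        // The parameters must reference the composition's track (matching
        // trackID), not the original asset's track, or the volume is ignored.
        let params = AVMutableAudioMixInputParameters(track: track)
        params.setVolume(volume, at: .zero)
        return params
    }
    return audioMix
}

// Usage: exportSession.audioMix = makeAudioMix(for: composition, volumes: [0.2, 1.0, 0.5])
```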

How to overlay one video on another in iOS?

你。 submitted on 2019-11-29 15:46:19
Question: I am trying to crop an already-recorded video into a circle on iOS. How might I go about doing this? I know how I would do it with AVCaptureSession, but I don't know how to pass an already-recorded video in as if it were an AVCaptureDevice. Is there a way to crop a video into a circle? I want to overlay it on top of another video, so it also needs a transparent background. Thanks.

Answer 1: I guess you want to produce something like this: You don't want an AVCaptureSession, because you're not capturing video.
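Since the answer is cut off, here is one common approach as a playback-only sketch (exporting a composited file would need a custom video compositor instead): mask the top AVPlayerLayer with a circular CAShapeLayer so everything outside the circle is transparent.

```swift
import AVFoundation
import UIKit

// Overlay a circular video on top of a background video during playback.
func addCircularOverlay(background: URL, overlay: URL, in view: UIView) -> (AVPlayer, AVPlayer) {
    let backgroundPlayer = AVPlayer(url: background)
    let backgroundLayer = AVPlayerLayer(player: backgroundPlayer)
    backgroundLayer.frame = view.bounds
    view.layer.addSublayer(backgroundLayer)

    let overlayPlayer = AVPlayer(url: overlay)
    let overlayLayer = AVPlayerLayer(player: overlayPlayer)
    overlayLayer.frame = CGRect(x: 20, y: 20, width: 160, height: 160)

    // Circular mask: everything outside the ellipse is transparent.
    let mask = CAShapeLayer()
    mask.path = UIBezierPath(ovalIn: overlayLayer.bounds).cgPath
    overlayLayer.mask = mask
    view.layer.addSublayer(overlayLayer)

    backgroundPlayer.play()
    overlayPlayer.play()
    return (backgroundPlayer, overlayPlayer)
}
```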

Setting AVMutableComposition's frameDuration

泄露秘密 submitted on 2019-11-29 15:01:55
Question: I'm playing with the AVEditDemo project from Apple's WWDC 2010 sample pack, and I'm trying to change the frame rate of the exported video. The video is exported using an AVMutableVideoComposition on which frameDuration is set like this:

```objc
videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
```

For some reason, changing the 30 to 25 does not change the frame rate of the video exported with the AVAssetExportSession. Does anyone have an idea why?

Answer 1: It seems that the AVAssetExportSession
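One thing worth checking (an assumption about where the truncated answer is heading): frameDuration only matters if the video composition is actually attached to the export session, and a passthrough preset ignores compositions entirely. A Swift sketch:

```swift
import AVFoundation

func export(asset: AVAsset, to url: URL, fps: Int32) {
    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)
    videoComposition.frameDuration = CMTimeMake(value: 1, timescale: fps)

    // AVAssetExportPresetPassthrough ignores videoComposition; use a real preset.
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality) else { return }
    session.videoComposition = videoComposition  // without this, frameDuration has no effect
    session.outputURL = url
    session.outputFileType = .mov
    session.exportAsynchronously {
        // Inspect session.status / session.error here.
    }
}
```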

iPhone: AVCaptureSession capture output crashing (AVCaptureVideoDataOutput)

左心房为你撑大大i submitted on 2019-11-29 14:43:40
Question: I'm capturing video and converting it to a CGImage to do processing on it. It works for about 10 seconds, then I get a memory warning and a crash (usually saying that data formatters were temporarily unavailable). Can someone help me solve the problem?

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // CONVERT CMSAMPLEBUFFER INTO A CGIMAGE
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
```
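This crash pattern usually means per-frame allocations aren't released fast enough (an assumption; the post is truncated before any answer). A Swift sketch of the equivalent delegate that drops late frames and drains temporaries every frame:

```swift
import AVFoundation
import CoreImage

final class FrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()  // reuse; creating one per frame exhausts memory quickly

    func configure(output: AVCaptureVideoDataOutput) {
        // Drop frames we can't keep up with instead of queueing them.
        output.alwaysDiscardsLateVideoFrames = true
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Drain per-frame temporaries immediately rather than at the end of the run loop.
        autoreleasepool {
            let ciImage = CIImage(cvPixelBuffer: imageBuffer)
            if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
                // ... process cgImage ...
                _ = cgImage
            }
        }
    }
}
```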

How to get file size and current file size from NSURL for AVPlayer iOS4.0

我的未来我决定 submitted on 2019-11-29 12:59:26
```objc
self.player = [[AVPlayer playerWithURL:[NSURL URLWithString:@"http://myurl.com/track.mp3"]] retain];
```

I am trying to make a UIProgressView for the above track. How do I obtain the total file size and the currently downloaded size from that URL? Please help, thanks!

You need to start observing the loadedTimeRanges property of the current item, like this:

```objc
AVPlayerItem *playerItem = self.player.currentItem;
[playerItem addObserver:self
             forKeyPath:kLoadedTimeRanges
                options:NSKeyValueObservingOptionNew
                context:playerItemTimeRangesObservationContext];
```

Then, in the observation callback, you make sense of the data you're
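A sketch of what that callback typically computes (an assumption, since the answer is cut off): the end of the first loaded time range, divided by the item's duration, gives the buffered fraction for the progress view. In modern Swift KVO:

```swift
import AVFoundation

final class BufferObserver {
    private var observation: NSKeyValueObservation?

    func observe(player: AVPlayer, progress: @escaping (Float) -> Void) {
        observation = player.currentItem?.observe(\.loadedTimeRanges, options: [.new]) { item, _ in
            guard let range = item.loadedTimeRanges.first?.timeRangeValue,
                  item.duration.isNumeric else { return }
            // Seconds buffered so far, as a fraction of the total duration.
            let buffered = CMTimeGetSeconds(range.start) + CMTimeGetSeconds(range.duration)
            progress(Float(buffered / CMTimeGetSeconds(item.duration)))
        }
    }
}
```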

AVAudioSession manipulate sound output

独自空忆成欢 submitted on 2019-11-29 12:30:30
I'm using AVAudioSession to configure sound and AVAudioPlayer to play different sounds. I searched a lot and couldn't find anything. How can I manipulate the output source? I need a method in my SoundManager that switches output between the phone (earpiece) speaker and the loudspeaker.

```objc
success = [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error];
```

Using this I can route sound to the loudspeaker, but there is no method to move it back to the phone speaker. Can anybody help me with it?

Answer (Yevgen THD): So, I found a solution for manipulating the sound output. You could initialize sound settings with
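Since the answer is truncated, here is a sketch of the usual approach (an assumption): with the .playAndRecord category the default route is the earpiece (receiver), so overriding with .none moves audio back off the loudspeaker, and .speaker forces the loudspeaker.

```swift
import AVFoundation

func setLoudspeaker(_ enabled: Bool) throws {
    let session = AVAudioSession.sharedInstance()
    // .playAndRecord routes to the earpiece by default, which makes
    // the .none override meaningful as a "back to phone speaker" switch.
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.overrideOutputAudioPort(enabled ? .speaker : .none)
    try session.setActive(true)
}
```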

Swift 2.0: Type of Expression is ambiguous without more context?

送分小仙女□ submitted on 2019-11-29 11:07:40
Question: The following used to work in Swift 1.2:

```swift
var recordSettings = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVEncoderAudioQualityKey: AVAudioQuality.Max.rawValue,
    AVEncoderBitRateKey: 320000,
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44100.0]
```

Now, it gives the error: "Type of expression is ambiguous without more context".

Answer 1: To comply with the [String : AnyObject] format required by the recordSettings parameter, in addition to @Unheilig's answer, you'll need to convert your ints and floats to NSNumber.
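A sketch of the converted dictionary the answer is describing, written in current Swift syntax (the exact literal is an assumption, reconstructed from the [String : AnyObject] requirement):

```swift
import AVFoundation

let recordSettings: [String: AnyObject] = [
    AVFormatIDKey: NSNumber(value: kAudioFormatMPEG4AAC),
    AVEncoderAudioQualityKey: NSNumber(value: AVAudioQuality.max.rawValue),
    AVEncoderBitRateKey: NSNumber(value: 320_000),
    AVNumberOfChannelsKey: NSNumber(value: 2),
    AVSampleRateKey: NSNumber(value: 44_100.0),
]
```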

iPhone song lyrics access

我怕爱的太早我们不能终老 submitted on 2019-11-29 10:24:55
Question: I'm trying to get the lyrics for a song on an iOS device. The examples I've found on the web and Stack Overflow get the song's MPMediaItem (e.g. using an [MPMediaQuery songsQuery] with MPMediaItemPropertyPersistentID as a predicate) and then retrieve the lyrics using:

```objc
[mediaItem valueForProperty:MPMediaItemPropertyLyrics]
```

The problem is that this only works if you first open the song in the iPod music app and view the lyrics there. Even if you do that, the next time you sync it may
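For reference, a minimal Swift version of the lookup being described (the persistent ID is a hypothetical input; in practice it comes from a previous query):

```swift
import MediaPlayer

func lyrics(forPersistentID id: MPMediaEntityPersistentID) -> String? {
    let predicate = MPMediaPropertyPredicate(value: id,
                                             forProperty: MPMediaItemPropertyPersistentID)
    let query = MPMediaQuery.songs()
    query.addFilterPredicate(predicate)
    // As the question notes, this is often nil unless the Music app
    // has already loaded the lyrics for this item on-device.
    return query.items?.first?.lyrics
}
```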

iOS: error in __connection_block_invoke_2: Connection interrupted [duplicate]

谁说我不能喝 submitted on 2019-11-29 10:05:27
This question already has an answer here: What is "error in __connection_block_invoke_2: Connection interrupted" in iOS? (1 answer)

Xcode / iOS 8 / AVFoundation-related error in the console:

error in __connection_block_invoke_2: Connection interrupted

I am just adding an AVCaptureVideoDataOutput to Apple's sample app 'AVCamManualUsingtheManualCaptureAPI'. What I added was:

```objc
// CoreImage wants BGRA pixel format
NSDictionary *outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
    [NSNumber numberWithInteger:kCVPixelFormatType_32BGRA] };

// create and configure video data output
AVCaptureVideoDataOutput
```
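For comparison, the same configuration in Swift (a sketch; the function shape and parameter names are assumptions, not part of Apple's sample):

```swift
import AVFoundation

func addVideoDataOutput(to session: AVCaptureSession,
                        delegate: AVCaptureVideoDataOutputSampleBufferDelegate,
                        queue: DispatchQueue) {
    let output = AVCaptureVideoDataOutput()
    // Core Image wants BGRA pixel format.
    output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_32BGRA]
    output.setSampleBufferDelegate(delegate, queue: queue)
    if session.canAddOutput(output) {
        session.addOutput(output)
    }
}
```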