avfoundation

iOS swift convert mp3 to aac

淺唱寂寞╮ submitted on 2019-12-06 03:45:44
Question: I'm converting an MP3 to M4A in Swift with code based on this. It works when I generate a PCM file; when I change the export format to M4A, it generates a file, but it won't play. Why is it corrupt? Here is the code so far: import AVFoundation import UIKit class ViewController: UIViewController { var rwAudioSerializationQueue:dispatch_queue_t! var asset:AVAsset! var assetReader:AVAssetReader! var assetReaderAudioOutput:AVAssetReaderTrackOutput! var assetWriter:AVAssetWriter! var
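
A minimal sketch of an alternative path, assuming plain MP3-to-M4A conversion is the goal rather than a custom reader/writer pipeline: AVAssetExportSession with the Apple M4A preset handles the AAC encoding itself. Written in current Swift; the function name and URLs are placeholders.

    import AVFoundation

    // Hypothetical helper: export any playable audio asset to an AAC/M4A file.
    func exportToM4A(from sourceURL: URL, to destinationURL: URL,
                     completion: @escaping (Error?) -> Void) {
        let asset = AVURLAsset(url: sourceURL)
        guard let session = AVAssetExportSession(asset: asset,
                                                 presetName: AVAssetExportPresetAppleM4A) else {
            completion(NSError(domain: "export", code: -1))
            return
        }
        session.outputFileType = .m4a          // must match the preset and the file extension
        session.outputURL = destinationURL
        session.exportAsynchronously {
            completion(session.error)          // nil when the export succeeded
        }
    }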

Reloading keys in AVAsset when status is AVKeyValueStatusFailed

旧街凉风 submitted on 2019-12-06 03:40:25
Question: I'm doing the following: create a new AVAsset with a given URL; that URL points to a video on a remote web server. I attempt to load the tracks property by calling loadValuesAsynchronouslyForKeys:completionHandler:. The initial request fails because no internet connection exists, and I notice the failure by calling statusOfValueForKey:error:. I then wait for the connection to reappear (using some reachability code). As soon as it does, I call loadValuesAsynchronouslyForKeys
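
A hedged Swift sketch of the usual workaround, under the assumption that a key whose status is AVKeyValueStatusFailed cannot be reloaded on the same AVAsset instance: discard the failed asset and load the tracks key on a fresh one once connectivity returns. The AssetLoader type is hypothetical.

    import AVFoundation

    final class AssetLoader {
        private(set) var asset: AVURLAsset
        private let url: URL

        init(url: URL) {
            self.url = url
            self.asset = AVURLAsset(url: url)
        }

        func loadTracks(completion: @escaping (AVKeyValueStatus) -> Void) {
            // If a previous attempt failed, the old asset is thrown away entirely.
            if asset.statusOfValue(forKey: "tracks", error: nil) == .failed {
                asset = AVURLAsset(url: url)
            }
            asset.loadValuesAsynchronously(forKeys: ["tracks"]) { [weak self] in
                guard let asset = self?.asset else { return }
                completion(asset.statusOfValue(forKey: "tracks", error: nil))
            }
        }
    }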

I cannot get a precise CMTime for Generating Still Image from 1.8 second Video

若如初见. submitted on 2019-12-06 03:21:59
Question: Every time I try to generate a still frame from my video asset, it is generated at a time of 0.000.. seconds. I can see this from my log message. The good thing is that I can get the image at time 0.000.. to show up in a UIImageView called "myImageView". I thought the problem was that AVURLAssetPreferPreciseDurationAndTimingKey was not set, but even after I figured out how to do that, it still does not work.. Here is what I have.. time, actualTime, and generate are declared in the header
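
A sketch of what usually fixes the always-zero timestamp, assuming an AVAssetImageGenerator is in use: its time tolerances default to "whatever is fastest", so they need to be zeroed in addition to the precise-timing option on the asset. The helper name and the 600 timescale are only illustrative.

    import AVFoundation
    import UIKit

    func stillImage(from url: URL, at seconds: Double) -> UIImage? {
        let asset = AVURLAsset(url: url,
                               options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
        let generator = AVAssetImageGenerator(asset: asset)
        generator.requestedTimeToleranceBefore = .zero   // without these the generator
        generator.requestedTimeToleranceAfter = .zero    // snaps to the nearest keyframe

        let time = CMTime(seconds: seconds, preferredTimescale: 600)
        var actualTime = CMTime.zero
        guard let cgImage = try? generator.copyCGImage(at: time, actualTime: &actualTime) else {
            return nil
        }
        print("requested \(time.seconds), got \(actualTime.seconds)")
        return UIImage(cgImage: cgImage)
    }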

How to crop a video in iOS

旧巷老猫 submitted on 2019-12-06 03:21:55
I was having a look at the RosyWriter sample code provided by Apple as a starting point, and I'd like to find a way to crop a video. I have the full-resolution video from the iPhone's camera, but I just want to use a cropped part of it (and also rotate this subpart). I figured that in captureOutput:didOutputSampleBuffer:fromConnection: I can modify each frame by modifying the CMSampleBufferRef that I get passed. So my questions now are: Is this the right place to crop my video? Where do I specify that the final video (that gets saved to disk) has a smaller resolution than the full
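
One hedged, file-based alternative to touching each CMSampleBufferRef in the capture callback, assuming it is acceptable to crop after recording: an AVMutableVideoComposition with a reduced renderSize plus a layer-instruction transform crops (and could also rotate) the movie when it is exported. cropRect and the helper function are illustrative.

    import AVFoundation
    import CoreGraphics

    func croppedComposition(for asset: AVAsset, cropRect: CGRect) -> AVMutableVideoComposition? {
        guard let track = asset.tracks(withMediaType: .video).first else { return nil }

        let composition = AVMutableVideoComposition()
        composition.renderSize = cropRect.size
        composition.frameDuration = CMTime(value: 1, timescale: 30)

        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
        // Shift the frame so that cropRect's origin becomes (0, 0); everything
        // outside renderSize is discarded.
        layerInstruction.setTransform(CGAffineTransform(translationX: -cropRect.origin.x,
                                                        y: -cropRect.origin.y), at: .zero)

        instruction.layerInstructions = [layerInstruction]
        composition.instructions = [instruction]
        return composition   // assign to AVAssetExportSession.videoComposition when exporting
    }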

iPhone App - Show AVFoundation video on landscape mode

若如初见. submitted on 2019-12-06 03:05:21
I am using the AVCam example app from Apple. This example uses AVFoundation to show video on a view. I am trying to turn AVCam into a landscape app, with no luck: when the screen orientation changes, the video is shown rotated on the view. Is there a way to handle this problem? Samssonart: When you create your preview layer: captureVideoPreviewLayer.orientation = UIInterfaceOrientationLandscapeLeft; And the methods to manage rotations: -(void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration { [CATransaction
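
The orientation property used above has since been deprecated; a sketch of the modern equivalent, assuming the preview layer's connection is reachable, is to set videoOrientation on that connection whenever the interface rotates. The function name is illustrative.

    import AVFoundation
    import UIKit

    func updatePreviewOrientation(_ previewLayer: AVCaptureVideoPreviewLayer,
                                  for orientation: UIInterfaceOrientation) {
        guard let connection = previewLayer.connection,
              connection.isVideoOrientationSupported else { return }

        switch orientation {
        case .landscapeLeft:      connection.videoOrientation = .landscapeLeft
        case .landscapeRight:     connection.videoOrientation = .landscapeRight
        case .portraitUpsideDown: connection.videoOrientation = .portraitUpsideDown
        default:                  connection.videoOrientation = .portrait
        }
        // Keep the layer filling its container after the rotation.
        previewLayer.frame = previewLayer.superlayer?.bounds ?? previewLayer.frame
    }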

How to get lyrics for Now Playing song in iOS10 (Swift 3)

泄露秘密 submitted on 2019-12-06 02:58:41
Question: I want to display the lyrics of the song that is currently being played by the iOS system player. Here is my custom player: import UIKit import MediaPlayer import AVFoundation class NowPlayingController: NSObject { var musicPlayer: MPMusicPlayerController { if musicPlayer_Lazy == nil { musicPlayer_Lazy = MPMusicPlayerController.systemMusicPlayer() let center = NotificationCenter.default center.addObserver(self, selector: #selector(self.playingItemDidChange), name: NSNotification.Name
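
A minimal sketch, assuming the track actually embeds lyrics: MPMediaItem exposes a lyrics string on the now-playing item, so no separate lookup is needed. (In the Swift 3 / iOS 10 SDK the player is obtained with systemMusicPlayer(); later SDKs expose it as a property, as written here.)

    import MediaPlayer

    // Returns the embedded lyrics of the current system-player track, if any.
    func currentLyrics() -> String? {
        let player = MPMusicPlayerController.systemMusicPlayer
        return player.nowPlayingItem?.lyrics
    }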

GPUImageMovieWriter and avfiletypempeg4 filetype

江枫思渺然 submitted on 2019-12-06 02:55:11
First of all, I would like to congratulate Brad for the amazing work on GPUImage. I'm trying to apply a rotation to a given video file and obtain an MPEG-4 (AVFileTypeMPEG4) file as output. When doing this I get the following message: * -[AVAssetWriterInput appendSampleBuffer:] Input buffer must be in an uncompressed format when outputSettings is not nil. This problem occurs when using the following init method of GPUImageMovieWriter with fileType set to AVFileTypeMPEG4: - (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize fileType:(NSString *)newFileType outputSettings:
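
Not a GPUImage-specific fix, but for context, a sketch of the kind of outputSettings an AVAssetWriterInput expects when it is supposed to do the compression itself; the error above is AVAssetWriter complaining that it received already-compressed buffers while such settings were non-nil. The 1280x720 values are placeholders.

    import AVFoundation

    // Illustrative only: settings like these tell the writer input to perform
    // H.264 compression, which is exactly when it refuses compressed buffers.
    func makeCompressingVideoInput(width: Int, height: Int) -> AVAssetWriterInput {
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ]
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        return input
    }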

Slow presentViewController performance

孤者浪人 submitted on 2019-12-06 02:26:35
Question: I am using UIViewControllerTransitioningDelegate to build a custom transition between two view controllers (from an MKMapView to a custom camera built on AVFoundation). Everything goes well until I call presentViewController, and the phone seems to hang for about 1 second (when I log everything out). This even seems to happen when I am transitioning to a much simpler view (I have a view controller that only displays a UITextView, and even with that there appears to be about a .4 - .5
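
A hedged sketch of one frequent culprit, assuming the destination controller sets up its capture session while loading: AVCaptureSession configuration and startRunning() block for a noticeable fraction of a second, so doing them on a background queue keeps the presentation animation smooth. The class name and queue label are illustrative.

    import AVFoundation
    import UIKit

    final class CameraViewController: UIViewController {
        private let session = AVCaptureSession()
        private let sessionQueue = DispatchQueue(label: "camera.session")

        override func viewDidLoad() {
            super.viewDidLoad()
            sessionQueue.async { [weak self] in
                self?.configureSession()
                self?.session.startRunning()   // blocking call, keep it off the main queue
            }
        }

        private func configureSession() {
            session.beginConfiguration()
            defer { session.commitConfiguration() }
            guard let device = AVCaptureDevice.default(for: .video),
                  let input = try? AVCaptureDeviceInput(device: device),
                  session.canAddInput(input) else { return }
            session.addInput(input)
        }
    }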

CVPixelBuffer to CIImage always returning nil

那年仲夏 submitted on 2019-12-06 02:17:29
Question: I am trying to convert a pixelBuffer extracted from AVPlayerItemVideoOutput to a CIImage but am always getting nil. The code: if ([videoOutput_ hasNewPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime]) { CVPixelBufferRef pixelBuffer = [videoOutput_ copyPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime itemTimeForDisplay:nil]; CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // Always image === nil CIFilter *filter = [FilterCollection
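
A sketch of the usual remedy (shown in Swift), assuming the nil comes from a pixel format Core Image can't wrap: request BGRA buffers when creating the AVPlayerItemVideoOutput, then CIImage(cvPixelBuffer:) has something it understands. The helper functions are illustrative.

    import AVFoundation
    import CoreImage

    // Create the video output asking for 32BGRA frames up front.
    func makeBGRAVideoOutput() -> AVPlayerItemVideoOutput {
        let attributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        return AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
    }

    // Wrap the latest frame for a given item time, if one is available.
    func currentFrame(from output: AVPlayerItemVideoOutput, at time: CMTime) -> CIImage? {
        guard output.hasNewPixelBuffer(forItemTime: time),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: time,
                                                       itemTimeForDisplay: nil) else {
            return nil
        }
        return CIImage(cvPixelBuffer: pixelBuffer)
    }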

AVPlayer doesn't play more than one time

安稳与你 submitted on 2019-12-06 02:02:04
I have a strange problem with playing audio through AVPlayerItem & AVPlayer. I have a recorder and an iPod item picker; streams from both of them go through code like this: AVAudioSession *audioSession = [AVAudioSession sharedInstance]; [audioSession setCategory:AVAudioSessionCategoryPlayback error:nil]; [audioSession setActive:YES error:nil]; _playerItem = [[AVPlayerItem alloc] initWithURL:_soundFileURL]; _player = [[AVPlayer alloc] initWithPlayerItem:_playerItem]; [_player play]; Everything is actually OK except for one thing: I can't play this stream a second (or further) time. It just stops,
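
A minimal sketch of the usual fix (in Swift), assuming the player is simply sitting at the end of its item after the first run: observe AVPlayerItemDidPlayToEndTime and seek back to zero so the next play() starts from the top, instead of recreating the player each time. The wrapper class is illustrative.

    import AVFoundation

    final class ReplayablePlayer: NSObject {
        let player: AVPlayer

        init(url: URL) {
            player = AVPlayer(playerItem: AVPlayerItem(url: url))
            super.init()
            NotificationCenter.default.addObserver(self,
                                                   selector: #selector(didPlayToEnd),
                                                   name: .AVPlayerItemDidPlayToEndTime,
                                                   object: player.currentItem)
        }

        func play() {
            player.play()
        }

        @objc private func didPlayToEnd() {
            player.seek(to: .zero)   // rewind so the next play() starts at the beginning
        }
    }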