AVFoundation

How to get another app's currently playing audio [duplicate]

对着背影说爱祢 submitted on 2019-11-29 21:05:37
Question: This question is an exact duplicate of "Audio Information of Current Track iOS Swift". How can I access another app's currently playing audio (the actual audio itself, though metadata is welcome too)? I can see that this question has been asked a lot, but with few solutions being offered over the years. I understand Apple's philosophy for probably not wanting an app to be able to do this. I also understand that such a request is probably outside of the iOS API. With that being said, I would …
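For what it is worth, there is still no public API that exposes another app's audio stream. The closest public hook is now-playing metadata for the system Music app via the Media Player framework; the sketch below is my own illustration, not code from the linked duplicate.

    import MediaPlayer

    // Reads metadata for whatever the system Music app is playing.
    // This does not expose the audio samples, and it does not cover
    // arbitrary third-party apps.
    func logNowPlayingMetadata() {
        let player = MPMusicPlayerController.systemMusicPlayer
        if let item = player.nowPlayingItem {
            print("Title:  \(item.title ?? "unknown")")
            print("Artist: \(item.artist ?? "unknown")")
            print("Album:  \(item.albumTitle ?? "unknown")")
        } else {
            print("Nothing is currently playing in the Music app.")
        }
    }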

How to save photos taken using AVFoundation to Photo Album?

对着背影说爱祢 submitted on 2019-11-29 20:42:16
AVFoundation is just great for those who want to get their hands dirty, but there is still a lot of basic stuff that is not easy to figure out, like how to save the pictures taken with the device to the Photo Album. Any ideas? Here is a step-by-step tutorial on how to capture an image using AVFoundation and save it to the photo album. Add a UIView object to the NIB (or as a subview), and create a @property in your controller:
    @property(nonatomic, retain) IBOutlet UIView *vImagePreview;
Connect the UIView to the outlet above in IB, or assign it directly if you're using code instead of a NIB. Then edit your UIViewController, and …
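The tutorial above is cut off, and it predates the current capture API. A rough modern sketch of the capture-and-save step, assuming an AVCaptureSession already configured with an AVCapturePhotoOutput (the class name and wiring here are placeholders, not the tutorial's code):

    import AVFoundation
    import UIKit

    // Minimal sketch: receive the captured photo and hand it to the photo library.
    class PhotoSaver: NSObject, AVCapturePhotoCaptureDelegate {
        func capture(with output: AVCapturePhotoOutput) {
            output.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
        }

        func photoOutput(_ output: AVCapturePhotoOutput,
                         didFinishProcessingPhoto photo: AVCapturePhoto,
                         error: Error?) {
            guard error == nil,
                  let data = photo.fileDataRepresentation(),
                  let image = UIImage(data: data) else { return }
            // Writes to the Camera Roll; requires NSPhotoLibraryAddUsageDescription in Info.plist.
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
        }
    }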

AVAudioRecorder Won't Record On Device

此生再无相见时 submitted on 2019-11-29 20:40:28
Question: This is my method:
    -(void)playOrRecord:(UIButton *)sender {
        if (playBool == YES) {
            NSError *error = nil;
            NSString *filePath = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%d", [sender tag]] ofType:@"caf"];
            NSURL *fileUrl = [NSURL fileURLWithPath:filePath];
            AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileUrl error:&error];
            [player setNumberOfLoops:0];
            [player play];
        } else if (playBool == NO) {
            if ([recorder isRecording]) {
                [recorder stop]; …
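One frequent cause of AVAudioRecorder failing on a real device while working in the Simulator is an audio session that was never put into a record-capable category. A minimal sketch of that setup, written in Swift with today's API rather than the Objective-C above, so treat it as illustrative only:

    import AVFoundation

    // Configure the shared audio session before creating the AVAudioRecorder.
    // Recording also requires NSMicrophoneUsageDescription in Info.plist.
    func configureSessionForRecording() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        try session.setActive(true)
    }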

CGBitmapContextCreateImage error

左心房为你撑大大i submitted on 2019-11-29 20:34:48
Question: I am getting errors like this in my console:
    CGBitmapContextCreate: invalid data bytes/row: should be at least 1920 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
    CGBitmapContextCreateImage: invalid context 0x0
I use the code below:
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base …
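The "invalid data bytes/row" message usually means the sample buffers are not in the 32BGRA layout that the standard imageFromSampleBuffer code assumes. One common fix, sketched here under the assumption that a capture session already exists, is to pin the video data output to that pixel format:

    import AVFoundation

    // Force the capture output to deliver 32BGRA buffers so the
    // CGBitmapContext bytes-per-row math in imageFromSampleBuffer holds.
    func addBGRAVideoOutput(to session: AVCaptureSession) {
        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
    }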

Timeline Progress bar for AVPlayer

生来就可爱ヽ(ⅴ<●) submitted on 2019-11-29 19:59:24
AVPlayer is fully customizable; unfortunately there are no convenient methods in AVPlayer for showing a timeline progress bar.
    AVPlayer *player = [AVPlayer playerWithURL:URL];
    AVPlayerLayer *playerLayer = [[AVPlayerLayer playerLayerWithPlayer:player] retain];
    [self.view.layer addSublayer:playerLayer];
I have a progress bar that should indicate how much of the video has been played and how much remains, just like MPMoviePlayer. So how do I get the video's timeline from AVPlayer, and how do I update the progress bar? iOSPawan: Please use the code below, which is from Apple's example code …
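The excerpt stops right before the answer's code. For reference, the usual approach is a periodic time observer on the player; the sketch below is my own (the progressView name is assumed), not the Apple sample the answer refers to:

    import AVFoundation
    import UIKit

    // Update a UIProgressView twice per second as playback advances.
    func observeProgress(of player: AVPlayer, progressView: UIProgressView) -> Any {
        let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
        return player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { time in
            guard let duration = player.currentItem?.duration,
                  duration.isNumeric, duration.seconds > 0 else { return }
            progressView.progress = Float(time.seconds / duration.seconds)
        }
    }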

AVCaptureVideoPreviewLayer doesn't fill up whole iPhone 4S Screen

*爱你&永不变心* submitted on 2019-11-29 19:48:28
    AVCaptureVideoPreviewLayer *avLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    avLayer.frame = self.view.frame;
    [self.view.layer addSublayer:avLayer];
I use AVCaptureVideoPreviewLayer to display video on the view, but the video does not fill the full iPhone 4S screen (there are two grey bars at the left and right sides). I want the video to fill the full screen. How can I deal with it? Thank you very much! Maybe this solves it?
    CGRect bounds = view.layer.bounds;
    avLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    avLayer.bounds = bounds;
    avLayer.position = CGPointMake(CGRectGetMidX(bounds) …
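The cut-off answer above is switching to aspect-fill gravity, which crops instead of letterboxing. A complete Swift version of the same idea (names are assumed) looks roughly like this:

    import AVFoundation
    import UIKit

    // The grey bars come from the default "resize aspect" gravity, which letterboxes
    // the video. Aspect-fill crops instead, so the preview covers the whole view.
    func makeFullScreenPreview(session: AVCaptureSession, in view: UIView) {
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.frame = view.layer.bounds
        view.layer.addSublayer(previewLayer)
    }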

AVFoundation Metadata Object Types

痴心易碎 submitted on 2019-11-29 19:41:19
Question: I'm trying to use AVFoundation to read barcodes with the code below, but I keep getting the error below. Help as to why would be much appreciated. Thanks in advance!
    // Create camera view
    session = AVCaptureSession()
    var layer = self.cameraView.layer
    vidLayer = AVCaptureVideoPreviewLayer.layerWithSession(session) as AVCaptureVideoPreviewLayer
    vidLayer.frame = self.cameraView.bounds
    vidLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    var device = AVCaptureDevice …
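The error this question typically ends with complains that the requested metadata object types are not supported, and it usually comes from setting metadataObjectTypes before the output has been added to the session. A hedged sketch of the safe ordering in current Swift (the delegate wiring and the list of barcode types are just examples):

    import AVFoundation

    // Add the metadata output to the session first, then restrict the types;
    // supported types are only known once the output is attached.
    func configureBarcodeOutput(session: AVCaptureSession,
                                delegate: AVCaptureMetadataOutputObjectsDelegate) {
        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(delegate, queue: .main)
        output.metadataObjectTypes = [.qr, .ean13, .code128]
    }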

Face Detection with Camera

社会主义新天地 submitted on 2019-11-29 19:26:40
How can I do face detection in real time, just as the "Camera" app does? I noticed that AVCaptureStillImageOutput is deprecated after iOS 10.0, so I use AVCapturePhotoOutput instead. However, I found that the image I saved for facial detection is not satisfactory. Any ideas? UPDATE: After giving what @Shravya Boggarapu mentioned a try, I currently use AVCaptureMetadataOutput to detect the face without CIFaceDetector. It works as expected. However, when I try to draw the bounds of the face, they seem mislocated. Any idea?
    let metaDataOutput = AVCaptureMetadataOutput()
    captureSession.sessionPreset = …
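On the mislocated bounds: face metadata objects come back in normalized capture-device coordinates, so they need to be converted through the preview layer before drawing. A sketch of that conversion (previewLayer and overlayView are assumed to exist elsewhere):

    import AVFoundation
    import UIKit

    // Convert each detected face into view coordinates and outline it.
    func drawFaceBounds(_ objects: [AVMetadataObject],
                        previewLayer: AVCaptureVideoPreviewLayer,
                        overlayView: UIView) {
        overlayView.layer.sublayers?.forEach { $0.removeFromSuperlayer() }
        for object in objects where object.type == .face {
            guard let converted = previewLayer.transformedMetadataObject(for: object) else { continue }
            let box = CALayer()
            box.frame = converted.bounds
            box.borderColor = UIColor.yellow.cgColor
            box.borderWidth = 2
            overlayView.layer.addSublayer(box)
        }
    }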

How can I specify the format of AVAudioEngine Mic-Input?

旧街凉风 submitted on 2019-11-29 17:44:06
Question: I'd like to record some audio using AVAudioEngine and the user's microphone. I already have a working sample, but I just can't figure out how to specify the format of the output that I want... My requirement is that I get the AVAudioPCMBuffer as I speak, which it currently does... Would I need to add a separate node that does some transcoding? I can't find much documentation or samples on that problem, and I am also a noob when it comes to audio stuff. I know that I want NSData …
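One way to get buffers in a format you choose is to tap the input node in its native hardware format and run each buffer through an AVAudioConverter. The sketch below targets 16 kHz mono Float32 purely as an example requirement; it is not the asker's working sample.

    import AVFoundation

    // Tap the microphone in its hardware format, then convert each buffer
    // to a chosen PCM format. Error handling is kept minimal for brevity.
    let engine = AVAudioEngine()
    let input = engine.inputNode
    let hardwareFormat = input.outputFormat(forBus: 0)
    let targetFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                     sampleRate: 16_000,
                                     channels: 1,
                                     interleaved: false)!
    let converter = AVAudioConverter(from: hardwareFormat, to: targetFormat)!

    input.installTap(onBus: 0, bufferSize: 4096, format: hardwareFormat) { buffer, _ in
        let ratio = targetFormat.sampleRate / hardwareFormat.sampleRate
        let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio) + 1
        guard let converted = AVAudioPCMBuffer(pcmFormat: targetFormat,
                                               frameCapacity: capacity) else { return }
        var fed = false
        var error: NSError?
        _ = converter.convert(to: converted, error: &error) { _, inputStatus in
            if fed { inputStatus.pointee = .noDataNow; return nil }
            fed = true
            inputStatus.pointee = .haveData
            return buffer
        }
        // `converted` now holds the microphone audio in the target format.
    }

    do {
        engine.prepare()
        try engine.start()
    } catch {
        print("Could not start audio engine: \(error)")
    }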

Dot Product and Luminance / FindMyiCone

帅比萌擦擦* submitted on 2019-11-29 16:38:45
All, I have a basic question that I am struggling with here. When you look at the FindMyiCone sample code from WWDC 2010, you will see this:
    static const uint8_t orangeColor[] = {255, 127, 0};
    uint8_t referenceColor[3];

    // Remove luminance
    static inline void normalize(const uint8_t colorIn[], uint8_t colorOut[]) {
        // Dot product
        int sum = 0;
        for (int i = 0; i < 3; i++)
            sum += colorIn[i] / 3;
        for (int j = 0; j < 3; j++)
            colorOut[j] = (float)((colorIn[j] / (float)sum) * 255);
    }
And then it is called:
    normalize(orangeColor, referenceColor);
Running the debugger, it is converting BGRA: (Red …
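Despite the "dot product" comment, the loop is just averaging the three channels and then rescaling each channel relative to that average, which discards overall brightness while keeping the hue ratios. A Swift restatement of the same arithmetic (my own, with clamping added because the original writes into uint8_t and can overflow), with the orange reference worked through in the comments:

    // Rescale a color so each channel is expressed relative to the channel mean,
    // removing overall brightness (luminance) but keeping the ratios (the hue).
    func normalize(_ colorIn: [Int]) -> [Int] {
        // For orange (255, 127, 0): 255/3 + 127/3 + 0/3 = 85 + 42 + 0 = 127 (integer math).
        let mean = colorIn.reduce(0) { $0 + $1 / 3 }
        // Each channel becomes channel / mean * 255, clamped to 0...255.
        return colorIn.map { min(255, max(0, Int(Float($0) / Float(mean) * 255))) }
    }

    print(normalize([255, 127, 0]))   // [255, 255, 0] once the red channel is clamped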