GPUImage

GPUImage animated Gaussian blur filter

被刻印的时光 ゝ Submitted on 2019-12-06 11:06:09
Question: I was using the GPUImage Gaussian blur filter to blur a still image. I want to tie the blur size to a UI element and re-blur the picture as the user changes the element. The way I am doing it right now is to change the blurSize when there has been a significant (> 0.25) change, reapply the filter, and animate the new image into the image view. Is there a more efficient way for me to be doing this? On the iPhone 5, while performance is not laggy, it is not super smooth either (but perhaps …
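One approach worth sketching (not taken from the original answer): keep a single GPUImagePicture → blur filter → GPUImageView chain alive and only update the blur amount, so no intermediate UIImage has to be created or animated per change. The property is named blurSize in older GPUImage releases and blurRadiusInPixels in later ones; the method and variable names below are illustrative.

    #import <GPUImage/GPUImage.h>

    // Created once and kept alive, e.g. as instance variables.
    GPUImagePicture *sourcePicture;
    GPUImageGaussianBlurFilter *blurFilter;
    GPUImageView *previewView;   // replaces the UIImageView

    - (void)setupBlurChainWithImage:(UIImage *)image {
        sourcePicture = [[GPUImagePicture alloc] initWithImage:image];
        blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
        [sourcePicture addTarget:blurFilter];
        [blurFilter addTarget:previewView];
        [sourcePicture processImage];
    }

    // Called from the UI element (e.g. a slider).
    - (void)blurAmountChanged:(UISlider *)slider {
        blurFilter.blurSize = slider.value;   // blurRadiusInPixels in newer GPUImage versions
        [sourcePicture processImage];         // re-render the existing chain on the GPU
    }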

How to maintain the aspect ratio when using GPUImage?

空扰寡人 Submitted on 2019-12-06 10:15:26
Question: I found that Instagram has a square camera window, roughly 300×300. I'm trying to use GPUImage to get the same camera size, so I wrote this:

    // define a square view
    primaryView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, 300, 300)];
    // define a still camera
    stillCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                       cameraPosition:AVCaptureDevicePositionFront];
    // make it portrait
    stillCamera.outputImageOrientation = …
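A rough sketch of two ways the aspect ratio is commonly preserved (variable names follow the snippet above): either let the square GPUImageView crop the feed via its fill mode, or crop the camera output to a square before it reaches the view. The crop region below assumes portrait orientation of the 640x480 preset.

    // Option 1: keep the camera's aspect ratio and crop to fill the square view.
    primaryView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;

    // Option 2: crop the feed to a centered square before display
    // (normalized crop region: full width, middle 75% of the height).
    GPUImageCropFilter *cropFilter =
        [[GPUImageCropFilter alloc] initWithCropRegion:CGRectMake(0.0, 0.125, 1.0, 0.75)];
    [stillCamera addTarget:cropFilter];
    [cropFilter addTarget:primaryView];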

How to reduce the memory consumption in GPUImageGaussianSelectiveBlurFilter effect?

删除回忆录丶 Submitted on 2019-12-06 07:21:34
I'm using the GPUImage framework to implement a GPUImageGaussianSelectiveBlurFilter effect. The effect works, but it consumes too much memory. I know the forceProcessingAtSize method can reduce some of the memory consumption, but only a little. If the processImage method is not called, memory consumption drops a lot, so what can I use instead of it? How can I reduce the memory consumption?

    #import "ViewController.h"
    #import "GPUImage.h"
    @interface ViewController () {
        UIImageView *captureImageView; …
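A minimal sketch of the usual memory levers, assuming a still-image pipeline and a GPUImage version that exposes the shared framebuffer cache (sizes and variable names are illustrative): process at a reduced size, capture the result once, then purge cached framebuffers.

    GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:sourceImage];
    GPUImageGaussianSelectiveBlurFilter *selectiveBlur =
        [[GPUImageGaussianSelectiveBlurFilter alloc] init];

    // Process at a smaller size than the full-resolution source image.
    [selectiveBlur forceProcessingAtSize:CGSizeMake(640.0, 640.0)];

    [picture addTarget:selectiveBlur];
    [selectiveBlur useNextFrameForImageCapture];
    [picture processImage];
    UIImage *result = [selectiveBlur imageFromCurrentFramebuffer];

    // Release cached framebuffers once the result has been captured.
    [[GPUImageContext sharedFramebufferCache] purgeAllUnassignedFramebuffers];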

Switching between filter chains using GPUImage Framework

跟風遠走 Submitted on 2019-12-06 07:11:29
I would like to switch between two filter chains, shown as case 1 and case 2 in the code below. When I initially select either case, the output appears correct. However, when I switch to the other filter chain, the output flickers between the current and the prior chain. What is the recommended way to switch filter chains?

    - (void)updateFilter:(NSInteger)style {
        switch (style) {
            case 1:
                [kuwahara setRadius:5];
                [videoCamera addTarget:kuwahara];
                [kuwahara addTarget:grayscale];
                [grayscale addTarget:filteredVideoView];
                break;
            case 2:
                [videoCamera addTarget:grayscale];
                [blur setBlurSize:3]; …
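A hedged sketch of the approach usually recommended: detach every node from its targets before wiring up the new chain, so only one chain ever feeds the preview view. The continuation of case 2 below is assumed, since the excerpt is cut off.

    - (void)updateFilter:(NSInteger)style {
        // Tear down whatever chain is currently rendering into the view.
        [videoCamera removeAllTargets];
        [kuwahara removeAllTargets];
        [grayscale removeAllTargets];
        [blur removeAllTargets];

        switch (style) {
            case 1:
                [kuwahara setRadius:5];
                [videoCamera addTarget:kuwahara];
                [kuwahara addTarget:grayscale];
                [grayscale addTarget:filteredVideoView];
                break;
            case 2:
                [blur setBlurSize:3];
                [videoCamera addTarget:grayscale];
                [grayscale addTarget:blur];            // assumed chain order
                [blur addTarget:filteredVideoView];
                break;
        }
    }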

Color Specific Hue/Saturation from Photoshop to iOS

血红的双手。 Submitted on 2019-12-06 05:15:00
I'm trying to use GPUImage and CIFilter to map this filter. Please note, I need help mapping the Reds-specific (note: NOT Master, just Reds) Photoshop Hue/Saturation adjustment to iOS. Does anyone know how to manipulate a CIFilter or GPUImage class to get the Photoshop effect below in iOS? You could use GPUImage with the lookup filter: GPUImageLookupFilter uses an RGB color lookup image to remap the colors in an image. First, use your favourite photo editing application to apply a filter to lookup.png from GPUImage/framework/Resources. For this to work properly, each pixel color must not depend on other …
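A sketch of how the edited lookup table is typically applied, assuming the adjusted copy of lookup.png was saved as lookup_reds.png (a hypothetical file name) and bundled with the app:

    GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:inputImage];
    GPUImagePicture *lookupImage =
        [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"lookup_reds.png"]];
    GPUImageLookupFilter *lookupFilter = [[GPUImageLookupFilter alloc] init];

    [source addTarget:lookupFilter];        // first input: the photo
    [lookupImage addTarget:lookupFilter];   // second input: the edited lookup table

    [lookupFilter useNextFrameForImageCapture];
    [source processImage];
    [lookupImage processImage];
    UIImage *remapped = [lookupFilter imageFromCurrentFramebuffer];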

GPUImageMovieWriter and AVFileTypeMPEG4 file type

江枫思渺然 Submitted on 2019-12-06 02:55:11
First of all, I would like to congratulate Brad for the amazing work on GPUImage. I'm trying to apply a rotation to a given video file and obtain an MPEG-4 (AVFileTypeMPEG4) file as output. When doing this I get the following message:

    * -[AVAssetWriterInput appendSampleBuffer:] Input buffer must be in an uncompressed format when outputSettings is not nil

This problem occurs when using the following init method of GPUImageMovieWriter with fileType set to AVFileTypeMPEG4:

    - (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize fileType:(NSString *)newFileType outputSettings: …
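For illustration only, since the excerpt is cut off before the settings dictionary: the snippet below shows how that initializer is usually called for MPEG-4 output with explicit H.264 video settings (outputURL and the dimensions are assumptions). Whether this resolves the compressed-buffer error also depends on the audio settings, which the excerpt does not show.

    NSMutableDictionary *videoSettings = [NSMutableDictionary dictionary];
    videoSettings[AVVideoCodecKey]  = AVVideoCodecH264;
    videoSettings[AVVideoWidthKey]  = @480;
    videoSettings[AVVideoHeightKey] = @640;

    GPUImageMovieWriter *movieWriter =
        [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL
                                                 size:CGSizeMake(480.0, 640.0)
                                             fileType:AVFileTypeMPEG4
                                       outputSettings:videoSettings];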

GPUImage: Darker iOS 7 Blur Effect

邮差的信 Submitted on 2019-12-05 18:45:53
I need a reliable, efficient method to create the iOS 7 blur effect. I've implemented Apple's applyBlurWithRadius from the WWDC code (UIImage+ImageEffects). It is actually pretty flexible; it also allows changing the tintColor, which makes it possible to create a darker blur effect like this: But it relies on Core Graphics, and it degrades scrolling performance in a table view. Then I've seen Brad Larson's GPUImage library and its GPUImageiOSBlurFilter, which replicates the iOS 7 effect and works much faster than UIImage+ImageEffects, so it seems more usable in my case. But the problem …
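A minimal sketch, assuming the darker look can be approximated by lowering saturation on GPUImageiOSBlurFilter and chaining a brightness reduction after it (the filter itself does not take a tint color in this sketch's assumptions; tableSnapshot is a placeholder for the captured table view image):

    GPUImageiOSBlurFilter *blurFilter = [[GPUImageiOSBlurFilter alloc] init];
    blurFilter.blurRadiusInPixels = 12.0;
    blurFilter.saturation = 0.8;

    GPUImageBrightnessFilter *darken = [[GPUImageBrightnessFilter alloc] init];
    darken.brightness = -0.2;   // negative values darken the blurred output

    GPUImagePicture *snapshot = [[GPUImagePicture alloc] initWithImage:tableSnapshot];
    [snapshot addTarget:blurFilter];
    [blurFilter addTarget:darken];
    [darken useNextFrameForImageCapture];
    [snapshot processImage];
    UIImage *darkBlurred = [darken imageFromCurrentFramebuffer];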

iOS: rotate and filter a video stream

若如初见. Submitted on 2019-12-05 10:53:10
Hello there, I am rotating and applying image filters with GPUImage on a live video stream. The task is consuming more time than expected, resulting in the iPhone overheating. Can anybody help me optimise my code? Here is the code I am using:

    - (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        // return if invalid sample buffer
        if (!CMSampleBufferIsValid(sampleBuffer)) {
            return;
        }
        // Get CGImage from sample buffer
        CGImageRef cgImageFromBuffer = [self cgImageFromSampleBuffer:sampleBuffer];
        if (!cgImageFromBuffer || (cgImageFromBuffer == NULL)) {
            return;
        }
        // We need rotation to perform …
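A hedged sketch of the usual optimisation: let GPUImageVideoCamera feed the filter chain directly, so rotation and filtering stay on the GPU and no per-frame CGImage is built on the CPU. The preset, the sepia filter, and previewView here are placeholders, not the original poster's choices.

    GPUImageVideoCamera *videoCamera =
        [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720
                                            cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait; // rotation handled here

    GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];     // any filter chain
    [videoCamera addTarget:filter];
    [filter addTarget:previewView];   // a GPUImageView in the view hierarchy

    [videoCamera startCameraCapture];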

How to get corners using GPUImageHarrisCornerDetectionFilter

北战南征 Submitted on 2019-12-05 09:54:09
Question: I am trying to get the corner points from a still image using GPUImageHarrisCornerDetectionFilter. I have looked at the example code from the project, I have looked at the documentation, and I have looked at this post about the same thing: GPUImage Harris Corner Detection on an existing UIImage gives a black screen output. But I can't make it work, and I have a hard time understanding how this is supposed to work with still images. What I have at this point is this: func …
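A sketch of the usual still-image setup, written in Objective-C for consistency with the other snippets on this page: the corners are not read from the output image but delivered through the filter's cornersDetectedBlock in normalized coordinates.

    GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:inputImage];
    GPUImageHarrisCornerDetectionFilter *cornerFilter =
        [[GPUImageHarrisCornerDetectionFilter alloc] init];
    cornerFilter.threshold = 0.20;

    cornerFilter.cornersDetectedBlock =
        ^(GLfloat *cornerArray, NSUInteger cornersDetected, CMTime frameTime) {
            for (NSUInteger i = 0; i < cornersDetected; i++) {
                CGFloat x = cornerArray[i * 2];       // 0.0 - 1.0, relative to image width
                CGFloat y = cornerArray[i * 2 + 1];   // 0.0 - 1.0, relative to image height
                NSLog(@"Corner %lu: (%f, %f)", (unsigned long)i, x, y);
            }
        };

    [picture addTarget:cornerFilter];
    [picture processImage];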

iOS: save GPUImage video

强颜欢笑 Submitted on 2019-12-05 07:47:15
Question: I'm adding an overlay to my video using GPUImage. In the preview it all looks great, but how can I write my video to a file? I use GPUImageMovieWriter, but I can't find the output file and I don't even know if it works. Here is my code:

    - (void)setUpCameraWithPosition:(bool)switchToFrontCamera {
        if (videoCamera != nil) {
            [videoCamera stopCameraCapture];
        }
        if (switchToFrontCamera) {
            videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                              cameraPosition:AVCaptureDevicePositionFront]; …
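A hedged sketch of the usual recording setup (the file name, size, and finalFilter are assumptions): write to an explicit URL in the Documents directory, attach the writer to the end of the overlay chain, and stop with a completion handler so the path can be verified.

    NSString *documentsPath =
        [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    NSURL *movieURL =
        [NSURL fileURLWithPath:[documentsPath stringByAppendingPathComponent:@"overlay.m4v"]];
    [[NSFileManager defaultManager] removeItemAtURL:movieURL error:nil]; // writer fails if the file exists

    GPUImageMovieWriter *movieWriter =
        [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
    [finalFilter addTarget:movieWriter];          // finalFilter = last filter in the overlay chain
    videoCamera.audioEncodingTarget = movieWriter;

    [movieWriter startRecording];
    // ... later, when recording should stop:
    [movieWriter finishRecordingWithCompletionHandler:^{
        NSLog(@"Movie written to %@", movieURL.path);
    }];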