GPUImage

How do I modify a GPUImageGaussianSelectiveBlurFilter to operate over a rectangle instead of a circle?

Deadly · submitted on 2019-11-29 12:29:18
I have used the GPUImage framework for a blur effect similar to that of the Instagram application: I made a view for picking a picture from the photo library and then apply an effect to it. One of the effects is a selective blur in which only a small part of the image is sharp and the rest is blurred. The GPUImageGaussianSelectiveBlurFilter keeps a circular region of the image unblurred. How can I alter this so that the sharp region is rectangular instead? Because Gill's answer isn't exactly correct, and since this seems to be getting asked over and over, I'll
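One way to get a rectangular sharp region is a custom two-input blend filter whose fragment shader tests a box instead of a circular distance. The sketch below is illustrative, not the library's shipped shader: the uniform names `rectCenter` and `rectHalfSize` are my own, and the bridged Swift initializer name may differ slightly between GPUImage versions.

```swift
// Hypothetical fragment shader: blends the sharp input (texture 1) with the
// blurred input (texture 2) outside an axis-aligned rectangle.
let rectangularSelectiveBlendShader = """
varying highp vec2 textureCoordinate;
varying highp vec2 textureCoordinate2;
uniform sampler2D inputImageTexture;
uniform sampler2D inputImageTexture2;
uniform lowp vec2 rectCenter;     // center of the sharp region, normalized coords
uniform lowp vec2 rectHalfSize;   // half-width and half-height of the sharp region
uniform lowp float excludeBlurSize;

void main()
{
    lowp vec4 sharpColor = texture2D(inputImageTexture, textureCoordinate);
    lowp vec4 blurredColor = texture2D(inputImageTexture2, textureCoordinate2);
    // Signed "distance" from the rectangle: negative inside, positive outside.
    highp vec2 d = abs(textureCoordinate - rectCenter) - rectHalfSize;
    highp float distFromRect = max(d.x, d.y);
    gl_FragColor = mix(sharpColor, blurredColor,
                       smoothstep(0.0, excludeBlurSize, distFromRect));
}
"""

// Wire it up the way GPUImageGaussianSelectiveBlurFilter works internally:
// the source feeds both a Gaussian blur (into input 2) and this blend (input 1).
let blend = GPUImageTwoInputFilter(fragmentShaderFromString: rectangularSelectiveBlendShader)
blend?.setPoint(CGPoint(x: 0.5, y: 0.5), forUniformName: "rectCenter")
blend?.setPoint(CGPoint(x: 0.25, y: 0.15), forUniformName: "rectHalfSize")
```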

What's the Swift equivalent of declaring `typedef SomeClass<SomeProtocol> MyType`?

a 夏天 · submitted on 2019-11-29 07:35:14
I'm currently writing some Swift code in a project that is predominantly Objective-C. In our ObjC code, we have a header that declares `typedef GPUImageOutput<GPUImageInput> MyFilter;`. We can then declare e.g. a `@property` that can only be a GPUImageOutput subclass that implements GPUImageInput. (Note: GPUImageOutput and GPUImageInput are not defined by me; they are part of the GPUImage library.) Our Swift code doesn't seem to recognize this, even though the header is `#import`ed in our bridging header. I've tried to replicate the declaration in Swift, but neither of these is proper syntax:
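In Swift 4 and later, a protocol composition may include a class type, which maps the ObjC typedef directly. A minimal sketch (`FilterHolder` is an illustrative name, not from the question):

```swift
// Swift 4+ equivalent of `typedef GPUImageOutput<GPUImageInput> MyFilter;`:
// any GPUImageOutput subclass that also conforms to GPUImageInput.
typealias MyFilter = GPUImageOutput & GPUImageInput

final class FilterHolder {
    // Accepts e.g. a concrete GPUImage filter; rejects a bare GPUImageOutput
    // that does not conform to GPUImageInput.
    var filter: MyFilter?
}
```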

GPUImageMovie playback controls

北战南征 · submitted on 2019-11-29 05:15:00
I'm trying to control the playback of a GPUImageMovie. What I would like to achieve is jumping from one frame to another (seekToTime, back and forward, swipe-controlled) rather than playing the video, but I can't tell whether the component is designed for this use. GPUImageMovie -> filters -> GPUImageView. I've tried to use an AVPlayer on a playerItem, but apparently it's null. ghkaren: You can use AVPlayer to control the video: create the GPUImageMovie with an AVPlayerItem, and you can then control the video with AVPlayer methods. Example: `NSURL *mediaURL = ... AVPlayer *mainPlayer = [[AVPlayer alloc] init];`
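A Swift sketch of the AVPlayerItem-backed setup the answer describes (`mediaURL`, `filter`, and `gpuImageView` are placeholders); seeking the AVPlayer then drives which frame the filter chain renders:

```swift
let item = AVPlayerItem(url: mediaURL)
let player = AVPlayer(playerItem: item)

// Back the GPUImageMovie with the same AVPlayerItem instead of a plain URL.
let movie = GPUImageMovie(playerItem: item)
let filter = GPUImageSepiaFilter()          // any filter chain
movie.addTarget(filter)
filter.addTarget(gpuImageView)              // your GPUImageView
movie.startProcessing()

// Jump frame-to-frame instead of playing, e.g. on a swipe:
let target = CMTime(seconds: 2.0, preferredTimescale: 600)
player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)
```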

GPUImageMovie pause while applying filter

旧街凉风 · submitted on 2019-11-29 05:06:08
I am using Brad Larson's great GPUImage library for my application. Currently I am stuck on an issue. My application captures 10-second videos and then allows filters to be applied. While applying filters in GPUImageMovie, I am not able to pause playback and apply a new filter so that the video keeps playing without starting from the beginning. I saw an open GitHub issue here. If anyone has faced a similar issue and found a solution, please post your answers. Thanks in advance. QUserS: I finally fixed this after many searches and attempts. We need to initialize GPUImageMovie with an AVPlayerItem
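Since the answer is cut off: the AVPlayerItem-backed approach lets the AVPlayer own the playback position, so you can pause, retarget the filter chain, and resume in place. A hedged sketch (`player`, `movie`, and `filterView` stand for objects set up as in the previous entry):

```swift
// Pause without losing position:
player.pause()

// Swap the filter without restarting the movie:
movie.removeAllTargets()
let newFilter = GPUImageBrightnessFilter()  // placeholder for the chosen filter
movie.addTarget(newFilter)
newFilter.addTarget(filterView)

player.play()                               // resumes from the paused time
```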

GPUImage: blending two images

﹥>﹥吖頭↗ · submitted on 2019-11-29 03:33:07
Question: I was using the GPUImage framework (some old version) to blend two images (adding a border overlay to a certain image). After updating to the latest framework version, applying such a blend produces an empty black image. I'm using the following method: - (void)addBorder { if (currentBorder != kBorderInitialValue) { GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init]; GPUImagePicture *imageToProcess = [[GPUImagePicture alloc] initWithImage:self.imageToWorkWithView.image];
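In newer GPUImage versions a still-image blend comes out black unless the blend filter is told to keep its next framebuffer before the inputs are processed. A sketch of the usually suggested order (Swift, illustrative variable names):

```swift
let base = GPUImagePicture(image: sourceImage)
let border = GPUImagePicture(image: borderImage)
let blend = GPUImageAlphaBlendFilter()
blend.mix = 1.0

base.addTarget(blend)     // input 1
border.addTarget(blend)   // input 2 (the overlay)

// Without this call, imageFromCurrentFramebuffer returns nil/black
// on current framework versions because the framebuffer is recycled.
blend.useNextFrameForImageCapture()
base.processImage()
border.processImage()

let result = blend.imageFromCurrentFramebuffer()
```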

How do I animate in/out a gaussian blur effect in iOS?

跟風遠走 · submitted on 2019-11-28 17:32:33
For the whole iOS 7 feel, I want to apply a blur effect to a specific portion of the screen to obfuscate it, but I don't want to just throw the blur on instantly. I want to animate it in and out so the user almost sees the blur effect being applied, as if you changed the Gaussian blur value in Photoshop bit by bit from 0 to 10 instead of 0 to 10 in one go. I've tried a few solutions, the most popular suggestion being to simply put the blurred view on top of a non-blurred view and then lower the alpha value of the blurred view. This works okay, but not very eye
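Rather than cross-fading two snapshots, you can re-render while ramping the blur radius itself, which matches the Photoshop analogy above. A sketch using a timer and GPUImageGaussianBlurFilter (`sourcePicture` is assumed to be a GPUImagePicture already targeting `blurFilter`, which targets the on-screen GPUImageView):

```swift
let blurFilter = GPUImageGaussianBlurFilter()
var radius: CGFloat = 0.0

// Step the radius from 0 to 10 over roughly two-thirds of a second.
let timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 60.0, repeats: true) { timer in
    radius += 0.25
    if radius >= 10.0 {
        radius = 10.0
        timer.invalidate()
    }
    blurFilter.blurRadiusInPixels = radius
    sourcePicture.processImage()   // re-filter the source each tick
}
```

Reversing the ramp (10 back down to 0) animates the blur out the same way.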

Using GPUImage to Recreate iOS 7 Glass Effect

你离开我真会死。 · submitted on 2019-11-28 17:06:45
I am trying to recreate the iOS 7 style glass effect by applying image effects to a screenshot of an MKMapView. This UIImage category, provided by Apple, is what I am using as a baseline. The method desaturates the source image, applies a tint color, and blurs heavily using these input values: [image applyBlurWithRadius:10.0 tintColor:[UIColor colorWithRed:229/255.0f green:246/255.0f blue:255/255.0f alpha:0.33] saturationDeltaFactor:0.66 maskImage:nil]; This produces the effect I am looking for, but takes far too long, between 0.3 and 0.5 seconds to render on an iPhone 4. I would like to
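GPUImage ships a filter built for exactly this look, GPUImageiOSBlurFilter, which runs on the GPU and is typically much faster than Apple's CPU-based category. A minimal sketch (`mapSnapshot` stands in for the MKMapView screenshot from the question; the tint would need a separate blend step):

```swift
let glassFilter = GPUImageiOSBlurFilter()
glassFilter.blurRadiusInPixels = 10.0
glassFilter.saturation = 0.66   // roughly matches saturationDeltaFactor above

let frosted = glassFilter.image(byFilteringImage: mapSnapshot)
```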

Video with GPUImageChromaKeyFilter has tint when played in transparent GPUImageView?

时间秒杀一切 · submitted on 2019-11-28 11:47:52
I have a video with a solid green background that I am trying to make transparent with GPUImageChromaKeyFilter. When I use a clear color for the player view, the video is not really transparent; instead it tints the background. When I change the background to white, the green background is gone, but obviously the view is not transparent. What am I doing wrong? My code is: let movie = GPUImageMovie(URL: url) let filter = GPUImageChromaKeyFilter() filter.setColorToReplaceRed(0, green: 1, blue: 0) movie.addTarget(filter) let playerView = GPUImageView() playerView.backgroundColor =
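The tint usually comes from the GPUImageView's own fill rather than the filter: setting `backgroundColor` on a GPUImageView does not make its OpenGL surface transparent. A hedged sketch of the commonly suggested fix, clearing the GL background with zero alpha and tuning the key thresholds:

```swift
let playerView = GPUImageView()
playerView.isOpaque = false
// Clears the OpenGL surface itself, not just the UIView layer.
playerView.setBackgroundColorRed(0.0, green: 0.0, blue: 0.0, alpha: 0.0)

let filter = GPUImageChromaKeyFilter()
filter.setColorToReplaceRed(0, green: 1, blue: 0)
filter.thresholdSensitivity = 0.4   // widen the band of greens that get keyed out
filter.smoothing = 0.1              // soften the key edge
```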

GPUImage video filter set brightness not working

爷,独闯天下 · submitted on 2019-11-28 06:39:22
Question: The code looks fine overall, but the movieWriter setCompletionBlock lines do not compile. I don't know what the problem is; I've been trying to fix it for the last three days without success. -(IBAction)setBrightness:(id)sender { sleep(1); NSURL *sampleURL = [NSURL URLWithString:_videoURLPath]; movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL]; movieFile.runBenchmark = YES; movieFile.playAtActualSpeed = NO; filterView = (GPUImageView *)gpuView; filter
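For reference, GPUImageMovieWriter's completion hook is a plain property assignment, and a block that compiles against it looks roughly like this (Swift sketch; `filter` and `movieWriter` stand for the question's own instance variables):

```swift
movieWriter.completionBlock = { [weak self] in
    guard let self = self else { return }
    // Detach the writer and close the file once processing finishes.
    self.filter.removeTarget(self.movieWriter)
    self.movieWriter.finishRecording()
}
```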

GPUImageMovie Not Support Alpha Channel?

纵饮孤独 · submitted on 2019-11-28 06:12:43
Question: I create a video effect with GPUImage like this: self.overlayerView = [[GPUImageView alloc] init]; self.overlayerView.frame = self.view.frame; dispatch_queue_t queue = dispatch_queue_create("queue", NULL); dispatch_async(queue, ^{ NSURL *sourceURL = [[NSBundle mainBundle] URLForResource:@"212121" withExtension:@"mp4"]; GPUImageMovie *sourceMovie = [[GPUImageMovie alloc] initWithURL:sourceURL]; sourceMovie.playAtActualSpeed = YES; sourceMovie.shouldRepeat = YES; sourceMovie
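Standard mp4/H.264 assets decoded through AVFoundation carry no alpha channel, so GPUImageMovie cannot deliver transparency directly. The usual workaround is to shoot the asset on a green background and key it out at playback with GPUImageChromaKeyBlendFilter, compositing over whatever should show through. A hedged sketch (`backgroundImage` and `overlayerView` are placeholders):

```swift
let sourceMovie = GPUImageMovie(url: sourceURL)
sourceMovie.playAtActualSpeed = true
sourceMovie.shouldRepeat = true

let chromaBlend = GPUImageChromaKeyBlendFilter()
chromaBlend.setColorToReplaceRed(0, green: 1, blue: 0)

let background = GPUImagePicture(image: backgroundImage)

sourceMovie.addTarget(chromaBlend)   // input 1: the movie being keyed
background.addTarget(chromaBlend)    // input 2: what replaces the green
chromaBlend.addTarget(overlayerView)

background.processImage()
sourceMovie.startProcessing()
```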