I am trying to achieve the effect below; you can see it in the musical.ly app (the ripple effect):
https://drive.google.com/open?id=1uXExnmWQ7OfSGLFXdH7-5imay8tW87vO
Here is my approach.
I render a semi-transparent, scaled copy of a frame into a pixel buffer, and I show all of the sample buffers with an AVSampleBufferDisplayLayer. I want to show this animation for 3-5 seconds; once the user is done, I will convert it to an MP4 using AVAssetWriter (a rough sketch of that export step follows).
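For reference, the export step I have in mind is roughly the sketch below. It is not part of my working code yet; outputURL, frameDuration and processedPixelBuffers are placeholders for values I would produce elsewhere, and videoTrack is the same track used further down.

// Export sketch: write the processed frames out as an MP4 with AVAssetWriter.
NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error];
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                                         AVVideoWidthKey  : @(videoTrack.naturalSize.width),
                                                         AVVideoHeightKey : @(videoTrack.naturalSize.height) }];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                     sourcePixelBufferAttributes:nil];
[writer addInput:writerInput];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

// Append every processed frame with an increasing presentation time.
CMTime presentationTime = kCMTimeZero;
for (id buffer in processedPixelBuffers) {
    while (!writerInput.isReadyForMoreMediaData) { /* in real code, use requestMediaDataWhenReadyOnQueue: */ }
    [adaptor appendPixelBuffer:(__bridge CVPixelBufferRef)buffer withPresentationTime:presentationTime];
    presentationTime = CMTimeAdd(presentationTime, frameDuration);
}
[writerInput markAsFinished];
[writer finishWritingWithCompletionHandler:^{
    NSLog(@"Finished writing %@", outputURL);
}];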
My problem is that I am unable to blend the semi-transparent (alpha) image into the CVPixelBuffer.
If there is a better approach to building this animation, please point me to it.
I get all the sample buffers using AVAssetReaderTrackOutput:
// Decode the video track to 32BGRA so the frames can be handed to Core Image later.
NSDictionary *readerOutputSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                                                    outputSettings:readerOutputSettings];
[reader addOutput:readerOutput];
[reader startReading];

// Keep every decoded CMSampleBuffer so the frames can be reused for the animation.
NSMutableArray *samples = [NSMutableArray array];
CMSampleBufferRef sample = NULL;
while ((sample = [readerOutput copyNextSampleBuffer]))
{
    [samples addObject:(__bridge id)sample];
    CFRelease(sample);
}
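To show the collected buffers for the 3-5 second preview, my plan is roughly the sketch below (simplified: it assumes this runs in a view controller, and in real code I would feed the layer with requestMediaDataWhenReadyOnQueue: instead of enqueueing everything at once):

// Preview sketch: enqueue the decoded sample buffers on an AVSampleBufferDisplayLayer.
AVSampleBufferDisplayLayer *displayLayer = [AVSampleBufferDisplayLayer layer];
displayLayer.frame = self.view.bounds;
displayLayer.videoGravity = AVLayerVideoGravityResizeAspect;
[self.view.layer addSublayer:displayLayer];

// Drive the layer from its own timebase so each buffer appears at its presentation time.
CMTimebaseRef timebase = NULL;
CMTimebaseCreateWithMasterClock(kCFAllocatorDefault, CMClockGetHostTimeClock(), &timebase);
CMTimebaseSetTime(timebase, CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)[samples firstObject]));
CMTimebaseSetRate(timebase, 1.0);
displayLayer.controlTimebase = timebase;

for (id sampleObject in samples) {
    [displayLayer enqueueSampleBuffer:(__bridge CMSampleBufferRef)sampleObject];
}
CFRelease(timebase);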
// Take the last decoded frame and fade it by scaling its alpha channel with CIColorMatrix.
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)[samples lastObject]);
CIImage *filteredImage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil];

CIFilter *theFilter = [CIFilter filterWithName:@"CIColorMatrix"];
[theFilter setDefaults];
[theFilter setValue:filteredImage forKey:kCIInputImageKey];

// Identity for R, G and B; multiply alpha by 0.5 so the copy becomes semi-transparent.
CIVector *theRVector = [CIVector vectorWithX:1 Y:0 Z:0 W:0];
[theFilter setValue:theRVector forKey:@"inputRVector"];
CIVector *theGVector = [CIVector vectorWithX:0 Y:1 Z:0 W:0];
[theFilter setValue:theGVector forKey:@"inputGVector"];
CIVector *theBVector = [CIVector vectorWithX:0 Y:0 Z:1 W:0];
[theFilter setValue:theBVector forKey:@"inputBVector"];
CIVector *theAVector = [CIVector vectorWithX:0 Y:0 Z:0 W:0.5];
[theFilter setValue:theAVector forKey:@"inputAVector"];
CIVector *theBiasVector = [CIVector vectorWithX:0 Y:0 Z:0 W:0];
[theFilter setValue:theBiasVector forKey:@"inputBiasVector"];

CIImage *result = theFilter.outputImage;
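The effect also needs this faded copy scaled up (the zoom part of the ripple). I have not written that part yet; a minimal sketch of what I have in mind, where scale is a placeholder value I would animate per frame over the 3-5 second window:

// Scale the faded image up around its centre and crop back to the original extent.
CGFloat scale = 1.2; // placeholder; interpolate this per frame
CGRect extent = [result extent];
CGAffineTransform zoom = CGAffineTransformMakeTranslation(CGRectGetMidX(extent), CGRectGetMidY(extent));
zoom = CGAffineTransformScale(zoom, scale, scale);
zoom = CGAffineTransformTranslate(zoom, -CGRectGetMidX(extent), -CGRectGetMidY(extent));
CIImage *scaled = [[result imageByApplyingTransform:zoom] imageByCroppingToRect:extent];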
At this point I have decreased the alpha of the CIImage, and I render it into the first frame's pixel buffer:
// Render the faded image into the first frame's pixel buffer.
EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
CIContext *cicontext = [CIContext contextWithEAGLContext:eaglContext options:@{kCIContextWorkingColorSpace : [NSNull null]}];
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
[cicontext render:result toCVPixelBuffer:CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)[samples firstObject]) bounds:CGRectMake(0, 0, [result extent].size.width, [result extent].size.height) colorSpace:rgbColorSpace];
CGColorSpaceRelease(rgbColorSpace);
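As far as I understand, render:toCVPixelBuffer: overwrites the destination pixels rather than alpha-blending over what is already in the buffer, which would explain why the transparency is lost. What I am considering instead is compositing in Core Image first and then rendering the flattened result; a sketch, where scaled is the faded, scaled CIImage from the step above:

// Composite the semi-transparent, scaled layer over the base frame, then render the flat result.
CVPixelBufferRef baseBuffer = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)[samples firstObject]);
CIImage *baseImage = [CIImage imageWithCVPixelBuffer:baseBuffer options:nil];
CIImage *composited = [scaled imageByCompositingOverImage:baseImage]; // source-over

CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
[cicontext render:composited toCVPixelBuffer:baseBuffer bounds:[baseImage extent] colorSpace:rgb];
CGColorSpaceRelease(rgb);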
I got this result:
Expected result:
Thanks in advance.