
I want to create a square video using AVAssetWriter (GPUImageMovieWriter) in the iOS SDK. If I set the video output settings to 640x640, the resulting file reports a size of 640x640 but its actual resolution is 480x640. How can I change the preset to get a true square video output? Please help.

Ramkumar Paulraj

2 Answers


There is probably a better way to do this, but the following code should work. I'm assuming that you are using the GPUImage framework by Brad Larson. Note that this code will save a 480x480 square video.

//Square cropping
UIImage *testImage = [filter imageFromCurrentlyProcessedOutput];
double camWidth = testImage.size.width;
double camHeight = testImage.size.height;
double offset;
double normalizedOffset;
CGRect normalizedCropFrame;

if(camWidth > camHeight) {
    offset = (camWidth - camHeight) / 2.0;
    normalizedOffset = offset / camWidth;
    normalizedCropFrame = CGRectMake(normalizedOffset, 0.0, camHeight/camWidth, 1.0);
}
else {
    offset = (camHeight - camWidth) / 2.0;
    normalizedOffset = offset / camHeight;
    normalizedCropFrame = CGRectMake(0.0, normalizedOffset, 1.0, camWidth/camHeight);
}
squareCrop = [[GPUImageCropFilter alloc] initWithCropRegion:normalizedCropFrame];

//pause camera while setting up movie writer
[videoCamera pauseCameraCapture];

//set up movie writer
pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]);
NSURL *movieURL = [[NSURL alloc] initFileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 480.0)];

//start recording
[filter addTarget:squareCrop];
[squareCrop addTarget:movieWriter];
videoCamera.audioEncodingTarget = movieWriter;
[movieWriter startRecording];
[videoCamera resumeCameraCapture];
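To sanity-check the normalized crop math above in isolation, here is a minimal C sketch of the same centering logic (the `NormalizedRect` struct and `centeredSquareCrop` function are illustrative helpers, not part of GPUImage):

```c
#include <assert.h>

// Normalized crop rectangle, in the form GPUImageCropFilter expects:
// every field is a fraction of the source dimension, in [0, 1].
typedef struct { double x, y, width, height; } NormalizedRect;

// Center the largest possible square inside a camWidth x camHeight frame.
NormalizedRect centeredSquareCrop(double camWidth, double camHeight) {
    NormalizedRect r;
    if (camWidth > camHeight) {
        // Landscape: trim equal amounts from the left and right edges.
        double offset = (camWidth - camHeight) / 2.0;
        r.x = offset / camWidth;
        r.y = 0.0;
        r.width = camHeight / camWidth;
        r.height = 1.0;
    } else {
        // Portrait (or already square): trim from the top and bottom.
        double offset = (camHeight - camWidth) / 2.0;
        r.x = 0.0;
        r.y = offset / camHeight;
        r.width = 1.0;
        r.height = camWidth / camHeight;
    }
    return r;
}
```

For a 640x480 landscape frame this yields an x offset of 0.125 and a width of 0.75, i.e. a centered 480x480 square, which is the crop region the GPUImageCropFilter above ends up with.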
David Schwartz
  • is 2.0 supposed to be the device scale? – Ryan Romanchuk Oct 06 '14 at 15:58
  • The offset calculation centers the crop region in the frame of the image. You could calculate the offset differently to crop a square from the top or the bottom of a tall rectangular image, for example. – David Schwartz Oct 08 '14 at 03:44
  • ahhh, right. It's a bummer there isn't an easier way to get resolution of any capture preset at run time. – Ryan Romanchuk Oct 08 '14 at 08:05
  • I wish there was a better way to do this. I really want a robust way to setup a crop region for any session preset, and for any target sized frame without having capture a live image to determine the resolution. – Ryan Romanchuk Oct 14 '14 at 05:30

There is an easier way to do this, without using GPUImageCropFilter.

In GPUImageVideoCamera.m:

- (void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    if (capturePaused) return;

    CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Override the buffer dimensions with the square size you want,
    // instead of reading them from the camera frame.
    int bufferWidth = 480;
    int bufferHeight = 480;

    CMTime currentTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    // ... rest of the original method unchanged ...
}

In your view controller's .m file:

GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];

In GPUImageMovieWriter.m:

outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                  AVVideoCodecH264, AVVideoCodecKey,
                  [NSNumber numberWithInt:640], AVVideoWidthKey,
                  [NSNumber numberWithInt:640], AVVideoHeightKey,
                  AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
                  nil];



assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
assetWriterVideoInput.expectsMediaDataInRealTime = _encodingLiveVideo;

// You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                       [NSNumber numberWithInt:640], kCVPixelBufferWidthKey,
                                                       [NSNumber numberWithInt:640], kCVPixelBufferHeightKey,
                                                       nil];

assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

[assetWriter addInput:assetWriterVideoInput];
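This works because AVVideoScalingModeResizeAspectFill uniformly scales the source frame up until it covers the square target and lets the writer crop the overflow. A rough C sketch of that geometry (`aspectFillSize` is an illustrative helper, not an AVFoundation API):

```c
#include <assert.h>

// Dimensions a source frame takes on after aspect-fill scaling:
// scale by the larger of the two axis ratios so the scaled frame
// fully covers the target; whatever extends past the target edges
// is cropped away by the encoder.
typedef struct { double width, height; } Size2D;

Size2D aspectFillSize(double srcW, double srcH, double dstW, double dstH) {
    double scaleW = dstW / srcW;
    double scaleH = dstH / srcH;
    double scale = (scaleW > scaleH) ? scaleW : scaleH;
    Size2D s = { srcW * scale, srcH * scale };
    return s;
}
```

For a 640x480 camera frame and a 640x640 target this scales by 4/3, giving roughly 853x640; the extra ~213 points of width are cropped, leaving a centered 640x640 square.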
Ramkumar Paulraj