
I have read and gone over: UIImage from CALayer - iPhone SDK

But when I try to use UIGraphicsGetImageFromCurrentImageContext(), the returned UIImage only contains the sublayer I added to my UIView. I would like the returned UIImage to contain both the camera feed shown in my UIView and the CALayer sublayer I added on top of it.

I set up the camera feed in my UIView, draw my sublayer, add the sublayer to the view, and then set up my context with:

UIGraphicsBeginImageContextWithOptions(myView.layer.frame.size, myView.layer.isOpaque, UIScreen.main.scale)

all in viewDidLoad().
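For reference, here is a minimal sketch of that setup, with hypothetical names (captureSession, drawingLayer) for the pieces the question doesn't show. Note that UIGraphicsBeginImageContextWithOptions only creates an empty bitmap context; nothing appears in it until something is rendered into it:

import UIKit
import AVFoundation

class ViewController: UIViewController {
    // Hypothetical names; the actual setup code isn't shown in the question.
    let captureSession = AVCaptureSession()
    let drawingLayer = CAShapeLayer()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Camera feed: a preview layer backed by the capture session.
        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)

        // The drawing sublayer added on top of the preview.
        drawingLayer.frame = view.bounds
        view.layer.addSublayer(drawingLayer)

        // Creates an empty bitmap context; at this point nothing has
        // been rendered into it.
        UIGraphicsBeginImageContextWithOptions(view.layer.frame.size,
                                               view.layer.isOpaque,
                                               UIScreen.main.scale)
    }
}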

Then in a later button action I try:

let img = UIGraphicsGetImageFromCurrentImageContext()
UIImageWriteToSavedPhotosAlbum(img!, nil, nil, nil)

But the saved image contains only the sublayer with my drawing, not the camera feed. Any ideas?
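For what it's worth, the usual snapshot pattern begins the context, renders the view's layer tree into it, and reads the image back all inside the action itself; begun in viewDidLoad() and read later, the context is simply never drawn into. A minimal sketch, assuming the view is named myView as above and the method is wired to the button:

@IBAction func saveTapped(_ sender: UIButton) {
    UIGraphicsBeginImageContextWithOptions(myView.bounds.size,
                                           myView.layer.isOpaque,
                                           UIScreen.main.scale)
    defer { UIGraphicsEndImageContext() }

    guard let context = UIGraphicsGetCurrentContext() else { return }

    // Draw the whole layer tree (view plus sublayers) into the context.
    myView.layer.render(in: context)

    if let img = UIGraphicsGetImageFromCurrentImageContext() {
        UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil)
    }
}

Even with this, an AVCaptureVideoPreviewLayer usually renders blank, because its video frames are composited on the GPU and never pass through the software rendering path used by render(in:); that matches the symptom here and is why the comments below turn to AVCaptureVideoDataOutput.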

  • Instead of describing what you do, show actual relevant code. – rmaddy Jan 21 '18 at 18:01
  • I am pretty sure the answer to this question lies in using AVCaptureVideoDataOutput(), but all of the example code blocks I see create functions that take a CMSampleBuffer to create a UIImage, and I don't know where to get this buffer from. – Paul K. Jan 22 '18 at 18:23
  • After some more digging I found that if your class extends AVCaptureVideoDataOutputSampleBufferDelegate and you create a func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection), the framework will call this function for you and place the data in the sampleBuffer parameter (see the sketch after these comments). Hope this helps others. Now to figure out how to lay the sublayers I drew on top of this image... – Paul K. Jan 22 '18 at 19:05
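Following up on that last comment, here is a minimal sketch of the delegate approach it describes, using Core Image as one common way to turn the sample buffer into a UIImage (the class name, queue label, and latestFrame property are hypothetical):

import UIKit
import AVFoundation
import CoreImage

class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let ciContext = CIContext()
    var latestFrame: UIImage?

    // Wiring, done once during session setup:
    // let output = AVCaptureVideoDataOutput()
    // output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "video.frames"))
    // captureSession.addOutput(output)

    // AVFoundation calls this for every captured video frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        latestFrame = UIImage(cgImage: cgImage)
    }
}

The remaining step the comment mentions, compositing the drawn sublayer over latestFrame, could then be done in an ordinary image context: draw the frame with UIImage.draw(in:), render the sublayer on top with CALayer.render(in:), and read the combined image back.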

0 Answers