
I'm programming a Cocoa app that can edit the streaming video from different QTDeviceInputs. At the moment I can easily display 2 QTDeviceInputs in 2 different QTCaptureViews, but I wonder if it's possible to overlap (maybe with transparency) the streams from the external cams in a single view. What kind of view do I have to use to overlap the video streams? How can I overlap these 2 video streams?
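For reference, each stream is currently set up more or less like this (a simplified sketch: error handling is trimmed, and the second session is identical except that it opens the other camera and feeds its own QTCaptureView):

#import <QTKit/QTKit.h>

// One capture session per camera; captureView1 is a QTCaptureView created in Interface Builder.
QTCaptureSession *session1 = [[QTCaptureSession alloc] init];

NSError *error = nil;
QTCaptureDevice *device1 = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
[device1 open:&error];

QTCaptureDeviceInput *input1 = [[QTCaptureDeviceInput alloc] initWithDevice:device1];
[session1 addInput:input1 error:&error];

[captureView1 setCaptureSession:session1];
[session1 startRunning];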

I think I have to use an OpenGLView, but I've never used it and I wonder if there is an easier way to do it.


I tried to create 2 QTCaptureLayers as follows:

layer1 = [[QTCaptureLayer alloc] initWithSession:session1];
layer2 = [[QTCaptureLayer alloc] initWithSession:session2];

where session1 and session2 are the 2 QTCaptureSessions that I'm using to display the 2 QTDeviceInputs. Then I added the layers to an NSView:

 [[myView layer] addSublayer:layer1];

but nothing changed. I created the NSView "myView" in Interface Builder and linked it to the File's Owner. I also tried to draw a simple CALayer and add it to myView:

CALayer *layer = [CALayer layer];

layer.backgroundColor = CGColorCreateGenericRGB(0, 0, 0, 1.0f);
layer.borderColor = CGColorCreateGenericRGB(100, 100, 100, 1.0f);
layer.borderWidth = 4.0;

NSRect rect = NSMakeRect(0, 0, 1000, 1000);

layer.frame = NSRectToCGRect(rect);
layer.cornerRadius = rect.size.height / 5;

// Insert the layer into the root layer
[[myView layer] addSublayer:layer];

but nothing happened. The NSView is unchanged! What am I doing wrong?

Jacopo Berta

1 Answer


If I understand your problem correctly, there is no need to resort to an OpenGLView:

There is a class called QTCaptureLayer that should enable you to achieve what you want. All you need, then, is a layer-backed NSView to which you can add your capture layers as sublayers.
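A minimal sketch of that, assuming session1 and session2 are your already-running QTCaptureSessions and myView is the NSView from your nib (the method name setUpOverlappingLayers is only for illustration):

- (void)setUpOverlappingLayers
{
    // The view has to be layer-backed first; otherwise [myView layer] is nil
    // and addSublayer: silently does nothing.
    [myView setWantsLayer:YES];

    QTCaptureLayer *layer1 = [QTCaptureLayer layerWithSession:session1];
    QTCaptureLayer *layer2 = [QTCaptureLayer layerWithSession:session2];

    // Give both capture layers the full bounds of the view so they overlap.
    layer1.frame = NSRectToCGRect([myView bounds]);
    layer2.frame = NSRectToCGRect([myView bounds]);

    [[myView layer] addSublayer:layer1];
    [[myView layer] addSublayer:layer2]; // drawn on top of layer1
}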

For blending and similar purposes you can then use the capture layers' opacity, mask, or compositingFilter properties, depending on the effect you want to achieve.
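For example, to fade the top layer to 50% opacity, or to blend it with a Core Image filter (CIAdditionCompositing is just one of the composite-operation filters you could try):

// Let the stream underneath show through the top layer.
layer2.opacity = 0.5f;

// Or composite the top layer onto the content below it with a CIFilter.
layer2.compositingFilter = [CIFilter filterWithName:@"CIAdditionCompositing"];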

danyowdee