I'm using AVFoundation to grab the live camera feed, process the pixels into an alpha-mask image, and then use that image to create collision boundaries in my SKView. Balls bounce off those boundaries. It's kind of an augmented reality thing.
All of the AVFoundation stuff works fine; it's the collisions I'm having trouble with. Since setting up a new SKTexture for every frame is expensive, I've opted for SKMutableTexture for updating the physics body. I can't seem to get it to work (no collisions occur), but using SKTexture with a static image works great. For example, I have a tree node (an irregular shape to bounce balls off of) elsewhere in the code, which I wrote as a development step. The balls bounce off of it just fine, but they don't bounce off of my live video node. The SKTexture/SKMutableTexture and SKPhysicsBody objects are different between the two nodes and are created differently.
The tree example is simple and creates its SKPhysicsBody from an image:
SKTexture *objTexture = [SKTexture textureWithImageNamed:objImageName];
self.obj.physicsBody = [SKPhysicsBody bodyWithTexture:objTexture size:self.obj.size];
The live video node (self.env) is set up like this in an SKScene:
-(id)initWithSize:(CGSize)size {
    if (self = [super initWithSize:size]) {
        self.dots = [@[] mutableCopy];
        self.physicsWorld.contactDelegate = self;
        self.physicsWorld.gravity = CGVectorMake(0.0f, 0.0f);

        // Edge loop so the balls stay on screen.
        SKPhysicsBody *borderBody = [SKPhysicsBody bodyWithEdgeLoopFromRect:self.frame];
        self.physicsBody = borderBody;
        self.physicsBody.friction = 0.0f;

        // Node that displays the live video and will own the mask-based physics body.
        self.env = [[SKSpriteNode alloc] init];
        self.env.name = @"env";
        self.env.position = CGPointMake(self.frame.size.width/2, self.frame.size.height/2);
        self.env.size = self.size;
        [self addChild:self.env];
    }
    return self;
}
This is called from the AVFoundation delegate whenever a new video frame has been processed:
-(void)updateTextureWithPixels:(uint8_t *)pixel length:(size_t)size {
    // Lazily create the mutable texture and the physics body derived from it
    // the first time a frame arrives.
    if (self.envTexture == nil) {
        self.envTexture = [SKMutableTexture mutableTextureWithSize:self.env.size];
        self.env.physicsBody = [SKPhysicsBody bodyWithTexture:self.envTexture
                                               alphaThreshold:0.5
                                                         size:self.env.size];
        self.env.physicsBody.friction = 0.0f;
        self.env.physicsBody.restitution = 1.0f;
        self.env.physicsBody.linearDamping = 0.0f;
        self.env.physicsBody.allowsRotation = NO;
        self.env.physicsBody.dynamic = NO;
    }

    [self.envTexture modifyPixelDataWithBlock:^(void *pixelData, size_t lengthInBytes) {
        // Copy the incoming frame into the texture's backing store rather than
        // reassigning the pixelData pointer (which would only change the local
        // variable and never touch the texture contents).
        memcpy(pixelData, pixel, MIN(size, lengthInBytes));
        NSLog(@"updated texture");
    }];
}
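For context, the pixels come from a standard AVCaptureVideoDataOutput sample buffer delegate. The sketch below is a simplified stand-in for my real capture class (which also does the alpha-mask thresholding before handing the bytes over); self.gameScene is just a placeholder for however the delegate reaches the scene:

#import <AVFoundation/AVFoundation.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    // The capture session is configured for kCVPixelFormatType_32BGRA,
    // so baseAddress points at the BGRA pixel bytes.
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t length = CVPixelBufferGetDataSize(imageBuffer);

    // Hand the frame to the scene synchronously, while the buffer is still locked.
    [self.gameScene updateTextureWithPixels:baseAddress length:length];

    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
}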
Adding a ball:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInNode:self];
    self.startPoint = touchLocation;

    // Drop a new ball at the touch point and give it a bouncy circular body.
    self.ball = [SKSpriteNode spriteNodeWithImageNamed:@"ball.png"];
    self.ball.name = ballCategoryName;
    self.ball.position = self.startPoint;
    [self.dots addObject:self.ball];
    [self addChild:self.ball];

    self.ball.physicsBody = [SKPhysicsBody bodyWithCircleOfRadius:self.ball.frame.size.width/2];
    self.ball.physicsBody.friction = 0.0f;
    self.ball.physicsBody.restitution = 1.0f;
    self.ball.physicsBody.linearDamping = 0.0f;
    self.ball.physicsBody.allowsRotation = NO;
}
I've verified that the pixels being passed into updateTextureWithPixels:length: are what's required: they are in BGRA format, and each BGRA pixel is either all 0x00 or all 0xFF, as produced by my AVFoundation class. I just can't get this mutable texture to do anything. I'm really hoping I'm just overlooking a small setting.
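In case the masking step matters, it's conceptually along these lines. This is only an illustrative sketch (the brightness test on the green channel is a stand-in for whatever my detector actually does); the point is the output format of all-0x00 or all-0xFF BGRA pixels:

#include <stdint.h>
#include <stddef.h>

// Illustrative only: turn each BGRA pixel into either all 0x00 (hole in the
// mask, no collision) or all 0xFF (solid, should produce a collision edge).
static void maskPixelsBGRA(uint8_t *pixels, size_t lengthInBytes) {
    for (size_t i = 0; i + 3 < lengthInBytes; i += 4) {
        // Stand-in foreground test; the real class uses its own criteria.
        uint8_t value = (pixels[i + 1] > 0x80) ? 0xFF : 0x00;
        pixels[i]     = value; // B
        pixels[i + 1] = value; // G
        pixels[i + 2] = value; // R
        pixels[i + 3] = value; // A
    }
}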
Please let me know if you'd like to see more code.