
My goal is to stream images captured by AVCaptureInput from one iOS device to another via Bonjour.

Here is my current method:

1) Capture frame from video input

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    /* code to convert sampleBuffer into UIImage */
    NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
    [connection sendImage:image];
}
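
(For completeness, the conversion hidden behind that comment is essentially Apple's standard pixel-buffer approach. This is just a sketch, and it assumes the AVCaptureVideoDataOutput is configured to deliver kCVPixelFormatType_32BGRA pixel buffers:)

// Sketch: convert a BGRA CMSampleBufferRef into a UIImage
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    void   *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t  bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t  width       = CVPixelBufferGetWidth(pixelBuffer);
    size_t  height      = CVPixelBufferGetHeight(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}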

2) Send over TCP connection (from http://mobileorchard.com/tutorial-networking-and-bonjour-on-iphone/)

// Send raw image over network
- (void)sendRawImagePacket:(UIImage *)image {
    // Encode packet
    NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
    NSData *rawPacket = [NSKeyedArchiver archivedDataWithRootObject:imageData];

    // Write header: length of raw packet
    int packetLength = [rawPacket length];
    [outgoingDataBuffer appendBytes:&packetLength length:sizeof(int)];
    [outgoingDataBuffer appendData:rawPacket];

    // Try to write to stream
    [self writeOutgoingBufferToStream];
}
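
(Side note: the header above is a plain int in host byte order. Between two iOS devices that happens to match, but a fixed-width, network-byte-order header is more robust. A sketch, where sendRawImageData: is just an illustrative name and htonl() comes from <arpa/inet.h>:)

// Sketch: write a fixed-width, big-endian length header before the payload
- (void)sendRawImageData:(NSData *)imageData {
    uint32_t packetLength = htonl((uint32_t)[imageData length]);

    [outgoingDataBuffer appendBytes:&packetLength length:sizeof(uint32_t)];
    [outgoingDataBuffer appendData:imageData];

    [self writeOutgoingBufferToStream];
}

The receiving side would then run the copied header value through ntohl() before using it as packetBodySize.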

3) Read the data and convert it back into an image to be displayed on the receiving device's UIImageView

// Read as many bytes from the stream as possible and try to extract meaningful packets
- (void)readFromStreamIntoIncomingBuffer {
    // Temporary buffer to read data into
    UInt8 buf[1024];

    // Try reading while there is data
    while ( CFReadStreamHasBytesAvailable(readStream) ) {
        CFIndex len = CFReadStreamRead(readStream, buf, sizeof(buf));
        if ( len <= 0 ) {
            // Either stream was closed or error occurred. Close everything up and treat this as "connection terminated"
            [self close];
            [delegate connectionTerminated:self];
            return;
        }

        [incomingDataBuffer appendBytes:buf length:len];
    }

    // Try to extract packets from the buffer.
    //
    // Protocol: header + body
    //  header: an integer that indicates the length of the body
    //  body: the bytes produced by NSKeyedArchiver (the archived image data)

    // We might have more than one message in the buffer - that's why we'll be reading it inside the while loop
    while ( YES ) {
        // Did we read the header yet?
        if ( packetBodySize == -1 ) {
            // Do we have enough bytes in the buffer to read the header?
            if ( [incomingDataBuffer length] >= sizeof(int) ) {
                // Extract the length
                memcpy(&packetBodySize, [incomingDataBuffer bytes], sizeof(int));

                // Remove that chunk from the buffer
                NSRange rangeToDelete = {0, sizeof(int)};
                [incomingDataBuffer replaceBytesInRange:rangeToDelete withBytes:NULL length:0];
            }
            else {
                // We don't have enough yet. Will wait for more data.
                break;
            }
        }

        // We should now have the header. Time to extract the body.
        if ( [incomingDataBuffer length] >= packetBodySize ) {
            // We now have enough data to extract a meaningful packet.
            NSData *raw = [NSData dataWithBytes:[incomingDataBuffer bytes] length:packetBodySize];

            // Unarchive the image data and tell our delegate about it
            NSData *imageData = [NSKeyedUnarchiver unarchiveObjectWithData:raw];
            UIImage *image = [UIImage imageWithData:imageData];
            [delegate receivedNetworkRawImage:image viaConnection:self];

            // Remove that chunk from the buffer
            NSRange rangeToDelete = {0, packetBodySize};
            [incomingDataBuffer replaceBytesInRange:rangeToDelete withBytes:NULL length:0];

            // We have processed the packet. Resetting the state.
            packetBodySize = -1;
        }
        else {
            // Not enough data yet. Will wait.
            break;
        }
    }
}

However, when the connection gets choppy, UIImage logs an error that it cannot render the JPEG.

How should I pass images over Wi-Fi?

It's okay if some frames get skipped; I just need a way to skip that "batch" of bad data instead of handing it to UIImage.
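
(To make the requirement concrete, something like the sketch below is what I have in mind; looksLikeCompleteJPEG is a made-up helper, and +[UIImage imageWithData:] already returns nil for bytes it cannot decode, so the nil check alone would let me drop a frame:)

// Sketch: drop frames whose bytes did not survive the trip intact.
// A JPEG payload starts with 0xFF 0xD8 and ends with 0xFF 0xD9.
static BOOL looksLikeCompleteJPEG(NSData *data) {
    if ([data length] < 4) return NO;
    const uint8_t *bytes = [data bytes];
    NSUInteger length = [data length];
    return bytes[0] == 0xFF && bytes[1] == 0xD8 &&
           bytes[length - 2] == 0xFF && bytes[length - 1] == 0xD9;
}

// ...where the packet body is handled:
UIImage *image = looksLikeCompleteJPEG(imageData) ? [UIImage imageWithData:imageData] : nil;
if (image != nil) {
    [delegate receivedNetworkRawImage:image viaConnection:self];
}
// else: skip this frame and wait for the next one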

Thanks!

phmagic
  • Hello, out of curiosity, did you manage to do a good-enough live video stream from one device to another? – Arthur Nov 14 '14 at 11:32

1 Answer


UIImage does not conform to NSCoding --> NSKeyedArchiver fails.

You'll have to use UIImagePNGRepresentation() to get the data of the image. Or use UIImageJPEGRepresentation() for compressed data.
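
For illustration, the round trip without the archiver could look roughly like this (a sketch only; sendImage: and handlePacketBody: are made-up names, while the buffer and delegate names follow the question's code):

// Sending side: the JPEG data is already NSData, so write it directly (no NSKeyedArchiver)
- (void)sendImage:(UIImage *)image {
    NSData *imageData = UIImageJPEGRepresentation(image, 0.5); // lower quality keeps packets smaller

    int packetLength = (int)[imageData length];
    [outgoingDataBuffer appendBytes:&packetLength length:sizeof(int)];
    [outgoingDataBuffer appendData:imageData];

    [self writeOutgoingBufferToStream];
}

// Receiving side: no unarchiving step, just decode the raw bytes
- (void)handlePacketBody:(NSData *)imageData {
    UIImage *image = [UIImage imageWithData:imageData]; // nil if the bytes cannot be decoded
    if (image) {
        [delegate receivedNetworkRawImage:image viaConnection:self];
    }
}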

Fabian Kreiser
  • Yes, I have this when sending the data: `NSData * imageData = UIImageJPEGRepresentation(image, 1.0)`. The problem for me is receiving the data. Am I missing something? – phmagic Sep 02 '11 at 04:34
    Oh I see now. I would remove that NSKeyedArchiver thing completely. You already have NSData. – Fabian Kreiser Sep 02 '11 at 04:46
  • You should verify your network code with simple NSString objects first. Does that work? – Fabian Kreiser Sep 02 '11 at 04:54
  • Yes, I have removed NSKeyedArchiver. So the missing-bits problem comes down to rendering the UIImage once the data reaches the other side. The network code works for strings, arrays, images, and dictionaries. However, streaming is an issue because sometimes there are missed frames. – phmagic Sep 02 '11 at 05:13
  • I have sent images over the network without problems; I have no clue where the error lies. You could test whether it works with Base64-encoded images. – Fabian Kreiser Sep 02 '11 at 05:19
  • I used GameKit, but GameKit uses Bonjour under the hood. – Fabian Kreiser Sep 02 '11 at 06:37
  • Seems like sending images is fine. However, sending about 10 images per second, every time the video frame refreshes, seems to overload the buffer. Still no solution to this. – phmagic Sep 02 '11 at 09:38
  • @phmagic were you able to work around this somehow? Have you been able to stream ~10 images per second to another iPhone? – SirRupertIII Oct 09 '13 at 19:35