
Right now I'm investigating the possibility of implementing video streaming through the MultipeerConnectivity framework. For that purpose I'm using NSInputStream and NSOutputStream.

The problem is: I can't receive any picture so far. Right now I'm just trying to pass a simple picture and show it on the receiver. Here's a little snippet of my code:

Sending the picture via NSOutputStream:

- (void)sendMessageToStream
{
    NSData *imgData = UIImagePNGRepresentation(_testImage);

    int img_length = (int)[imgData length];
    NSMutableData *msgData = [[NSMutableData alloc] initWithBytes:&img_length length:sizeof(img_length)];
    [msgData appendData:imgData];
    int msg_length = (int)[msgData length];

    uint8_t *readBytes = (uint8_t *)[msgData bytes];

    uint8_t buf[msg_length];
    (void)memcpy(buf, readBytes, msg_length);

    NSInteger stream_len = [_stream write:(const uint8_t *)buf maxLength:msg_length];
    NSLog(@"stream_len = %ld", (long)stream_len);

    _tmpCounter++;
    dispatch_async(dispatch_get_main_queue(), ^{

        _lblOperationsCounter.text = [NSString stringWithFormat:@"Sent: %ld", (long)_tmpCounter];
    });
}

The code above works totally fine. After writing, stream_len equals 29627 bytes, which is the expected value because the image is around 29 KB.

Receiving the picture via NSInputStream:

- (void)readDataFromStream
{
    UInt32 length;
    if (_currentFrameSize == 0) {

        // Read the 4-byte length header and reassemble it (little-endian).
        uint8_t frameSize[4];
        length = [_stream read:frameSize maxLength:sizeof(frameSize)];
        unsigned int b = frameSize[3];
        b <<= 8;
        b |= frameSize[2];
        b <<= 8;
        b |= frameSize[1];
        b <<= 8;
        b |= frameSize[0];

        _currentFrameSize = b;
    }
    uint8_t bytes[1024];

    // read:maxLength: returns only what is currently buffered, so this
    // method usually has to run several times before a whole frame arrives.
    length = [_stream read:bytes maxLength:1024];

    [_frameData appendBytes:bytes length:length];
    if ([_frameData length] >= _currentFrameSize) {
        UIImage *img = [UIImage imageWithData:_frameData];

        NSLog(@"SETUP IMAGE!");
        _imgView.image = img;

        _currentFrameSize = 0;
        [_frameData setLength:0];
    }

    _tmpCounter++;
    dispatch_async(dispatch_get_main_queue(), ^{
        _lblOperationsCounter.text = [NSString stringWithFormat:@"Received: %ld", (long)_tmpCounter];
    });
}

As you can see, I'm trying to receive the picture in several steps, and here's why. When I try to read data from the stream, it reads at most 1095 bytes no matter what number I put in the maxLength: parameter. But when I send the picture in the first snippet of code, it sends absolutely fine (29627 bytes). Btw, the image's size is around 29 KB.

That's where my question comes up: why is that? Why does sending 29 KB via NSOutputStream work totally fine when receiving causes problems? And is there a solid way to make video streaming work through NSInputStream and NSOutputStream? I just didn't find much information about this; all I found were simple things I already knew.

Eugene Alexeev
  • Btw, the streams are working with each other; I forgot to mention that, so a bad connection can be excluded. I've tested the streams with simple strings and it worked absolutely fine – Eugene Alexeev Feb 21 '17 at 14:28

2 Answers


Here's an app I wrote that shows you how:

https://app.box.com/s/94dcm9qjk8giuar08305qspdbe0pc784

Build the project with Xcode 9 and run the app on two iOS 11 devices.

To stream live video, touch the Camera icon on one of two devices.

If you don't have two devices, you can run one app in the Simulator; however, you can only use the camera on a real device (the Simulator will display the broadcast video).

Just so you know: this is not the ideal way to stream real-time video between devices (it should probably be your last choice). Data packets (versus streaming) are way more efficient and faster.
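
For comparison, here's a rough sketch of what the data-packet approach can look like, built on MCSession's sendData:toPeers:withMode:error: and the didReceiveData: delegate callback; the session, frameData, and imageView names are placeholders, not code from the project above:

#import <UIKit/UIKit.h>
#import <MultipeerConnectivity/MultipeerConnectivity.h>

// Send one encoded frame (e.g. JPEG data) as a discrete packet.
- (void)sendFrame:(NSData *)frameData overSession:(MCSession *)session
{
    NSError *error = nil;
    // Unreliable mode usually fits live video better: late frames are
    // dropped instead of delaying newer ones.
    BOOL ok = [session sendData:frameData
                        toPeers:session.connectedPeers
                       withMode:MCSessionSendDataUnreliable
                          error:&error];
    if (!ok) {
        NSLog(@"Failed to send frame: %@", error);
    }
}

// On the receiving peer each frame arrives in one piece:
- (void)session:(MCSession *)session didReceiveData:(NSData *)data fromPeer:(MCPeerID *)peerID
{
    UIImage *frame = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = frame;
    });
}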

Regardless, I'm really confused by your NSInputStream-related code. Here's something that makes a little more sense, I think:

case NSStreamEventHasBytesAvailable: {
    // len is a global variable set to a non-zero value;
    // mdata is a NSMutableData object that is reset when a new input 
    // stream is created.
    // displayImage is a block that accepts the image data and a reference
    // to the layer on which the image will be rendered
    uint8_t buf[len];
    len = [aStream read:buf maxLength:len];
    if (len > 0) {
        [mdata appendBytes:(const void *)buf length:len];
    } else {
        displayImage(mdata, wLayer);
    }
    break;
}
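
Note the pattern: each time NSStreamEventHasBytesAvailable fires, it appends whatever read:maxLength: hands back, and only when the read comes back empty (the sender has closed the stream) does it pass the accumulated data to displayImage. That works because, as the comments say, a fresh stream and a fresh mdata are used per frame, so the end of the stream itself delimits the image and no length header is needed.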

The output stream code should look something like this:

// data is an NSData object that contains the image data from the video 
// camera;
// len is a global variable set to a non-zero value
// byteIndex is a global variable set to zero each time a new output
// stream is created

if (data.length > 0 && len >= 0 && (byteIndex <= data.length)) {
    len = (data.length - byteIndex) < DATA_LENGTH ? (data.length - byteIndex) : DATA_LENGTH;
    uint8_t bytes[len];
    [data getBytes:bytes range:NSMakeRange(byteIndex, len)];
    byteIndex += [oStream write:(const uint8_t *)bytes maxLength:len];
}
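
One more thing to keep in mind: writes like this are normally driven from the stream delegate when the output stream reports NSStreamEventHasSpaceAvailable. Here's a sketch of that wiring, keeping the oStream, data, byteIndex, and DATA_LENGTH names from above (assumed to be properties or constants of the class):

- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    if (aStream == oStream && eventCode == NSStreamEventHasSpaceAvailable) {
        if (byteIndex >= data.length) {
            // Whole frame written; closing the stream lets the receiver's
            // read:maxLength: eventually return 0 so it knows the frame ended.
            [oStream close];
            return;
        }
        NSUInteger chunk = MIN(DATA_LENGTH, data.length - byteIndex);
        uint8_t bytes[chunk];
        [data getBytes:bytes range:NSMakeRange(byteIndex, chunk)];
        NSInteger written = [oStream write:bytes maxLength:chunk];
        if (written > 0) {
            byteIndex += written;
        }
    }
}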

There's a lot more to streaming video than setting up the NSStream classes correctly, a lot more. You'll notice that in my app I created a cache for the input and output streams; this solved a myriad of issues you would likely encounter if you don't do the same.

I have never seen anyone else successfully use NSStreams for video streaming, ever. For one thing, it's highly complex.

There are many different (and better) ways to stream video; I wouldn't go this route. I just took it on because no one else has been able to do it successfully.

James Bush

I think the problem is in your assumption that all of the data will be available in the NSInputStream the whole time you are reading it. An NSInputStream created from an NSURL object is asynchronous in nature and should be accessed accordingly through NSStreamDelegate. You can look at the example in the README of POSInputStreamLibrary.
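
For what it's worth, with MultipeerConnectivity the input stream is handed to you in MCSessionDelegate's session:didReceiveStream:withName:fromPeer:, and it still has to be scheduled on a run loop and opened before NSStreamEventHasBytesAvailable starts firing. A minimal sketch of that setup, with illustrative property names (inputStream, frameData):

- (void)session:(MCSession *)session didReceiveStream:(NSInputStream *)stream withName:(NSString *)streamName fromPeer:(MCPeerID *)peerID
{
    self.inputStream = stream;
    self.inputStream.delegate = self;
    [self.inputStream scheduleInRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
    [self.inputStream open];
}

- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable: {
            uint8_t buf[1024];
            // read:maxLength: returns only what is buffered right now,
            // which is why a 29 KB image arrives over many small reads.
            NSInteger len = [(NSInputStream *)aStream read:buf maxLength:sizeof(buf)];
            if (len > 0) {
                [self.frameData appendBytes:buf length:len];
            }
            break;
        }
        case NSStreamEventEndEncountered:
            [aStream close];
            break;
        default:
            break;
    }
}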

Pavel Osipov
  • Thank you for your reply! I'm working with my NSInputStream strictly through NSStreamDelegate methods; I don't do any workarounds here at all. So when I read the NSInputStream, it definitely has some data to give. And the main question is about the length of the data being passed: why can I pass a 29 KB picture to the NSOutputStream in one call, but can't do the same with the NSInputStream when I read it? – Eugene Alexeev Feb 22 '17 at 09:15
  • Because you don't need to. See the NSStreamEventHasBytesAvailable snippet in my answer above – James Bush Sep 14 '17 at 01:58