
I am making an application that senses iBeacons. When you get within immediate range of an iBeacon, the application sends the beacon's major and minor numbers to a server, and the server sends back an image stored in a MySQL database; different images are sent back based on the major and minor numbers.

The application sends the major and minor number to a Python (Twisted sockets) script via an NSStream, the script uses these numbers to get an image from the database and send it back to the application.
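For context, the wire format is simple: the app writes a plain-text "major:minor" pair, and the script replies with the base64-encoded image text followed by a newline. Here is a minimal Python sketch of the client side of that exchange (the function name `fetch_image_b64` is mine, and I'm assuming the Twisted script below listening on port 8080; the real app uses an NSStream instead):

```python
import socket

# Minimal sketch of the client side of the exchange. The wire format is
# taken from the Twisted script below: the request is a plain-text
# "major:minor" pair and the reply is the base64 image text terminated
# by '\n'. fetch_image_b64 is an illustrative name, not part of the app.
def fetch_image_b64(major, minor, host="localhost", port=8080):
    with socket.create_connection((host, port)) as s:
        s.sendall(("%s:%s" % (major, minor)).encode("ascii"))
        reply = b""
        while not reply.endswith(b"\n"):   # read until the terminator
            data = s.recv(4096)
            if not data:                   # server closed the connection
                break
            reply += data
        return reply.rstrip(b"\n")         # base64 text of the image
```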

This setup works great when I use it to get simple text messages back from the database, but I am running into problems when trying to receive and display images inside the application.

First I will post the code of the stream:handleEvent that receives the data from the input stream.

The code is only a slight modification of this tutorial http://www.raywenderlich.com/3932/networking-tutorial-for-ios-how-to-create-a-socket-based-iphone-app-and-server

// input stream event that receives the data from the server
//
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{

switch (eventCode)
{
    case NSStreamEventOpenCompleted:
        NSLog(@"stream opened");
        break;


    case NSStreamEventHasBytesAvailable: // event for receiving data

        NSLog(@"Received Data");

        if (aStream == _inputStream)
        {
            uint8_t buffer[500000];
            int len;

            // loop gets bytes from input stream
            //
            while ([_inputStream hasBytesAvailable])
            {
                len = [_inputStream read:buffer maxLength:sizeof(buffer)];

                if (len > 0)
                {

                    NSString *str = @"data:image/jpg;base64,";
                    NSString *img = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
                    str = [str stringByAppendingString:img];
                    NSData *ImgOut = [NSData dataWithContentsOfURL:[NSURL URLWithString:str]];


                    if (nil != ImgOut)
                    {

                        self.ImageView.image = [UIImage imageWithData:ImgOut];
                        NSLog(@"show image");

                    }
                }
            }

        }

        break;



    case NSStreamEventErrorOccurred:
        NSLog(@"can not connect to host");
        [self initNetworkComms];

        break;

    case NSStreamEventEndEncountered:
        NSLog(@"Connection Lost");

        [_outputStream close];
        [_inputStream close];
        [self initNetworkComms];

        break;

    default:
        NSLog(@"unknown event");

        break;
}
}

Just for good measure, I will post the code of the Python script:

from twisted.internet.protocol import Protocol, Factory
from twisted.internet import reactor

import mysql.connector

db = mysql.connector.connect(user='NotImportant', password='WouldntYouLikeToKnow', host='localhost', database='retailbeacons')
cursor = db.cursor()

class MessageServer(Protocol):
    def connectionMade(self):

        self.factory.clients.append(self)
        print "clients are ", self.factory.clients



    def connectionLost(self, reason):

        self.factory.clients.remove(self)
        print "client has disconnected"



    def dataReceived(self, data):

        a = data.split(':')

        if len(a) > 1:
            Major = a[0]
            Minor = a[1]

            msg = ""

            print "Received query " + Major + ":" + Minor

            sql = "SELECT Picture FROM beaconinfo WHERE major=%s AND minor=%s"

            cursor.execute(sql, (Major, Minor))

            for row in cursor.fetchall():

                mess = row[0]
                msg = mess.encode('utf-8')


            self.message(msg)

    def message(self, message):
        self.transport.write(message + '\n')


factory = Factory()
factory.protocol = MessageServer
factory.clients = []

reactor.listenTCP(8080, factory)
print "Python message test server started"
reactor.run()

What happens with this code is that when the app queries the server, the server sends back the image data (in base64 format), the application receives this data, and the EventHasBytesAvailable case of the switch statement is triggered. But only a small portion of the image is displayed and I get an error log saying:

<Error>: ImageIO: JPEG Corrupt JPEG data: premature end of data segment

This led me to believe that not all the data came across the stream. You'll see in the code that I have an NSLog say 'Received Data' every time the EventHasBytesAvailable case is called and 'show image' when the UIImageView is set with the image data.

The thing I find odd, and what I feel is the source of this problem, is that when EventHasBytesAvailable is called, the 'Received Data' message is logged, then the 'show image' message is logged, then once again the 'Received Data' message is logged, and the error listed above follows.

So it looks like a small portion of the data comes in through the stream, the loop gathers up those bytes and sticks them in the UIImageView, then more bytes come in through the stream and an attempt to put them into the UIImageView is made but the 'premature end of data segment' error occurs.

I am very confused as to why this is happening. Shouldn't the whole image be sent through the stream with one call of the EventHasBytesAvailable case? Have I possibly overlooked the buffer in my code? Can my buffer take an image of 60 KB? That is the only thing I can think of that might be wrong with the application code; otherwise, all I can think is that maybe the Python script is sending the data in two chunks instead of one.
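For what it's worth, TCP gives no such guarantee: a single transport.write on the server can arrive as several NSStreamEventHasBytesAvailable events, split at arbitrary byte boundaries. The effect, and the accumulate-until-terminator fix, can be shown in a few self-contained lines of Python (the payload and chunk size here are made up):

```python
import base64

# Hypothetical payload standing in for the image bytes the server sends.
original = bytes(range(256)) * 10          # ~2.5 KB of binary "image" data
wire = base64.b64encode(original) + b"\n"  # server appends '\n' as a terminator

# TCP is a byte stream: one write on the server may arrive as several
# reads on the client. Simulate arbitrary chunk boundaries.
chunks = [wire[i:i + 700] for i in range(0, len(wire), 700)]

buffer = b""
image = None
for chunk in chunks:                       # one iteration per HasBytesAvailable event
    buffer += chunk
    if buffer.endswith(b"\n"):             # terminator seen: message is complete
        image = base64.b64decode(buffer.rstrip(b"\n"))

# Only the fully accumulated data decodes back to the original bytes.
```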

Thank you for your time. I am an intern who has hit a bit of a wall with this one! Any help will be greatly appreciated!

ThriceGood
  • No, you cannot assume the stream is complete until the end of stream is received (in the case of closing the stream on the server when finished) or some other marker is received if you are keeping the stream open. – Paulw11 Aug 05 '14 at 20:11

1 Answer


Fixed this problem. The stream was sending the data over in more than one call of the 'HasBytes' case, so I created a string property that gets appended with each chunk of the data when 'HasBytes' is called. I also used a different method for converting the base64 image string to an NSData object.

NSString *ImgStr = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];

// string property for appending
//            
_str = [_str stringByAppendingString:ImgStr];

NSData *ImgData = [[NSData alloc] initWithBase64EncodedString:_str options:NSDataBase64DecodingIgnoreUnknownCharacters];


if (nil != ImgData)
{

     self.ImageView.image = [UIImage imageWithData:ImgData];

}
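Why this append-then-decode pattern eventually succeeds can be sketched in Python: strict base64 decoding of a partial accumulation either fails outright or yields truncated bytes (the partially drawn image from the question), and only the fully accumulated string reproduces the original data. The payload and chunk size below are made up for illustration:

```python
import base64, binascii

original = b"\xff\xd8\xff\xe0" + bytes(50)   # hypothetical JPEG-ish bytes
encoded = base64.b64encode(original)         # what the server streams as text

# Mirror the answer's pattern: append each chunk, then try to decode.
acc = ""
decoded = None
for i in range(0, len(encoded), 10):         # 10-char chunks, like partial reads
    acc += encoded[i:i + 10].decode("ascii")
    try:
        # At lengths that are not a multiple of 4 this raises; at
        # intermediate multiples of 4 it "succeeds" but yields truncated
        # bytes; only the final accumulation reproduces the original.
        decoded = base64.b64decode(acc, validate=True)
    except binascii.Error:
        decoded = None                       # incomplete: keep accumulating
```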

Thanks very much!

ThriceGood