My Question:
How do I properly convert and send images from an iOS Swift client app to a Java server (without relying heavily on external SDKs)?
Which type of socket should I use in Swift? (I am new to Swift and can't really find any suitable socket API.)
Please give me example code, as I am not at all well versed in Swift syntax and libraries.
Expected result from my program: the iOS Swift app should efficiently connect to my Java server and stream live video frame images to it. The images should then be converted to BufferedImage on the Java server machine and played back as video.
Regarding previously asked questions: I found only one similar question, but the answer was not very informative.
Details
I have written a Java server program on my Mac, and I want to add a feature that lets the user send a live video feed from an iPhone (iOS device) to that server.
The iOS app is written in Swift in Xcode.
To do that, I grab a CGImage from each video frame in the Swift program and convert it into a UIImage; then I convert this UIImage to byte data as follows:
    let cgImage: CGImage = context.createCGImage(cameraImage, from: cameraImage.extent)! // cameraImage is grabbed from the video frame
    image = UIImage.init(cgImage: cgImage)
    let data = UIImageJPEGRepresentation(image, 1.0)
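For context, this conversion runs inside the video-output delegate callback (the capture setup follows the project linked at the bottom). Below is a rough, untested sketch of that callback; names such as context, image and sendFrame are placeholders for my own properties and methods rather than exact code:

    import UIKit
    import CoreImage
    import AVFoundation

    class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

        let context = CIContext()
        var image = UIImage()

        // Called once per captured frame: convert it to JPEG data and hand it to the socket code.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)
            guard let cgImage = context.createCGImage(cameraImage, from: cameraImage.extent) else { return }
            image = UIImage(cgImage: cgImage)
            if let data = UIImageJPEGRepresentation(image, 1.0) {
                sendFrame(data) // placeholder; the actual sending code is shown below
            }
        }

        func sendFrame(_ jpegData: Data) {
            // see the TCPClient code below
        }
    }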
This data is then sent to the IP address and port where my Java server is listening, using SwiftSocket's TCPClient (https://github.com/swiftsocket/SwiftSocket):
    client?.send(data: data!)
Here client is an object of type TCPClient (https://github.com/swiftsocket/SwiftSocket/blob/master/Sources/TCPClient.swift), declared and connected like this:
    client = TCPClient(address: host, port: Int32(port))
    client?.connect(timeout: 10)
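Put together, the client-side socket code amounts to roughly the following sketch (host and port are assumed to come from elsewhere; unlike my actual code, this version checks the Result values that SwiftSocket returns):

    import SwiftSocket

    var client: TCPClient?

    func connectToServer(host: String, port: Int) {
        client = TCPClient(address: host, port: Int32(port))
        switch client!.connect(timeout: 10) {
        case .success:
            print("connected to \(host):\(port)")
        case .failure(let error):
            print("connect failed: \(error)")
        }
    }

    // Called once per captured frame with the JPEG data produced above.
    func sendFrame(_ jpegData: Data) {
        if let result = client?.send(data: jpegData), case .failure(let error) = result {
            print("send failed: \(error)")
        }
    }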
The connection succeeds, and the Java server program spawns a MobileServer thread to handle this client. A DataInputStream and DataOutputStream are opened on the socket accepted from the ServerSocket. This is the run() method of the MobileServer thread spawned by the Java server (where "in" is the DataInputStream for that socket):
    public void run() {
        try {
            while (!stop) {
                int count = -1;
                count = in.available();
                if (count > 0) System.out.println("LENGTH=" + count);
                byte[] arr = new byte[count];
                System.out.println("byte=" + arr);
                in.read(arr);
                BufferedImage image = null;
                try {
                    InputStream inn = new ByteArrayInputStream(arr);
                    image = ImageIO.read(inn);
                    inn.close();
                } catch (Exception f) {
                    f.printStackTrace();
                }
                System.out.println("IMaGE=" + image);
                if (image != null) appendToFile(image);
            }
        } catch (Exception l) {
            l.printStackTrace();
        }
    }
The problem is that my Java server receives strange byte sequences that apparently cannot be properly converted to a BufferedImage, so when I view the "video" stored in the file I only see a thin strip of "image", even though the iPhone is capturing fine. (Basically, the images are not being transferred intact from the iOS app to my server.)
The entire Swift program's viewController.swift for video capture is derived from this GitHub project: https://github.com/FlexMonkey/LiveCameraFiltering
Edit: I have figured out the problem and posted it as an answer, but that is still just a workaround, because the server-side video feed still hangs a lot and I had to reduce the quality of the JPEG data sent by the Swift client. There is definitely a better way to do this, and I'd ask people to share their knowledge.
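To make what I'm asking more concrete: one direction I'm wondering about (shown only as an untested sketch, not something from my current code) is to prefix every JPEG with its byte count, so the server knows exactly how many bytes belong to each frame; the matching readInt()/readFully() logic would then be needed on the Java side:

    // Untested sketch: length-prefix each JPEG so the server can read whole frames.
    // `client` is the same TCPClient as above.
    func sendFramePrefixed(_ jpegData: Data) {
        var length = UInt32(jpegData.count).bigEndian // big-endian matches Java's DataInputStream.readInt()
        var packet = Data(bytes: &length, count: MemoryLayout<UInt32>.size)
        packet.append(jpegData)
        _ = client?.send(data: packet)
    }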