I am trying to convert the image of an ImageView (sentObject) into a byte array. I found a way to do this in [1], so I wrote the code below:
public void sendImage(ImageView sentObject)
{
    PixelReader pixelReader = sentObject.getImage().getPixelReader();
    int width = (int) sentObject.getImage().getWidth();
    int height = (int) sentObject.getImage().getHeight();
    byte[] buffer = new byte[width * height * 4];
    System.out.println("Byte1: " + buffer);
    try
    {
        pixelReader.getPixels(0, 0, width, height, PixelFormat.getByteBgraInstance(), buffer, 0, width * 4);
        System.out.println("Byte2: " + buffer);
        //pixel[i][j] = sentObject.getImage().getPixelReader().getColor(i, j);
        DataOutputStream out = new DataOutputStream(client.getOutputStream());
        out.write(buffer);
        out.flush();
    }
    catch (IOException IO)
    {
        IO.printStackTrace();
    }
}
The main point I want you to look at is the getPixels method. As I understand it, this method reads the image's pixels as bytes and stores them in a byte array (in my case, byte[] buffer). However, when I compiled and ran the program, it did not seem to store the bytes for me. I added the debug statements System.out.println("Byte1: " + buffer); and System.out.println("Byte2: " + buffer); and got the output below:

The output for buffer before and after the call to getPixels is the same, which suggests to me that no pixels were stored in the buffer. What should I do so that getPixels stores the pixels, or am I misunderstanding something?
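For context, this is how I imagine I could inspect the buffer contents directly rather than just printing it. This is only a rough sketch of mine: dumpBufferStart is a hypothetical helper, the 16-byte cutoff is arbitrary, and Arrays.toString / Arrays.copyOfRange come from java.util:

    import java.util.Arrays;

    // Hypothetical helper: print the first few bytes of the buffer so the
    // element values themselves can be compared before and after getPixels.
    private static void dumpBufferStart(String label, byte[] buffer)
    {
        int n = Math.min(16, buffer.length); // arbitrary cutoff for readability
        System.out.println(label + Arrays.toString(Arrays.copyOfRange(buffer, 0, n)));
    }

I would call it as dumpBufferStart("Byte1: ", buffer); and dumpBufferStart("Byte2: ", buffer); in place of the println statements above.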
Reference
[1] Pure JavaFX: convert Image to bytes Array (+ opposit operation). What's wrong?