
I'm using a Python program to acquire an image from a scientific camera. This part is okay: I can get the 16-bit image in an array. The problem comes when I want to display the image in the Qt window (I'm using a QGraphicsWindow); the image is displayed in a very strange way. To display it, I convert the 2D array to a pixmap, which is then displayed. I tried different things, but the best results are obtained with the two following pieces of code:

import numpy
from PIL import Image, ImageQt
from PyQt5 import QtGui

def array2Pixmap(arr):
    arr_uint8 = arr.view(dtype=numpy.uint8)
    im8 = Image.fromarray(arr_uint8)
    imQt = QtGui.QImage(ImageQt.ImageQt(im8))
    pix = QtGui.QPixmap.fromImage(imQt)
    return pix

which gives the following result: [screenshot of the result]

and this one:

def array2Pixmap(arr):
    arr_uint8 = arr.astype(numpy.uint8)
    im8 = Image.fromarray(arr_uint8)
    imQt = QtGui.QImage(ImageQt.ImageQt(im8))
    pix = QtGui.QPixmap.fromImage(imQt)
    return pix

which gives this for the exact same capture conditions (camera exposure time, light intensity, etc.): [screenshot of the result]

So now I'm looking for a way to display the image correctly. Do you have any idea what I'm doing wrong?

Thanks

EDIT

Here is an example of what arr looks like. The command print(arr) returns

[[100  94  94 ...  97  98  98]
[ 97 100  98 ...  98 101  99]
[100  95  98 ... 104  98 102]
...
[ 98  98  98 ...  96  98 100]
[ 94 100 102 ...  92  98 104]
[ 97  90  96 ...  96  97 100]]

and a print(type(arr)) returns

<class 'numpy.ndarray'>

EDIT

OK, I have some news. I changed my code so that the conversion to an 8-bit array is now done like this:

arr = numpy.around(arr*(2**8-1)/(2**16-1))
arr_uint8 = arr.astype(numpy.uint8)

If I display the image using matplotlib.pyplot.imshow(arr, cmap='gray'), it works and the image is displayed like this in the editor:

[screenshot of the matplotlib display]

but when I convert it into a QPixmap, the result is the same as before.

What is strange is that when I use arr_uint8 = arr.view(dtype=numpy.uint8) to convert to 8 bits, the result is a 2048*4096 array instead of 2048*2048. I don't understand why...
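
For reference, a minimal sketch with a dummy 2048*2048 uint16 array (standing in for the camera frame) reproduces the two shapes:

    import numpy

    # dummy frame standing in for the camera image
    arr = numpy.full((2048, 2048), 100, dtype=numpy.uint16)

    # .view() reinterprets the underlying bytes, so each 16-bit value
    # shows up as two separate bytes
    print(arr.view(dtype=numpy.uint8).shape)    # (2048, 4096)

    # .astype() converts each value and keeps the shape
    print(arr.astype(numpy.uint8).shape)        # (2048, 2048)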

  • What is the image format? In other words how do the values in the array correspond to color? Are they greyscale? RGB? You say 16 bits in the question but then convert to an 8 bit array. – user545424 Apr 03 '19 at 20:30
  • The image is returned as a numpy 2d array with values between 0 and 65535. The conversion from 16 to 8 bits is done because, to my knowledge, Pixmap cannot handle 16-bit images – William Magrini Apr 03 '19 at 20:50
  • Pixmap can handle various image encodings, but you first need to figure out what those values mean. Perhaps you can look it up in the documentation for whatever software is creating the numpy array. – user545424 Apr 03 '19 at 21:00
  • Could you add the raw `arr` or byte data for a single frame to the question? I have an application that takes raw bytes from a scientific camera (hyperspectral imaging) and displays them in a PyQt window using QPixmap. I think I can help you out, but I need an example raw image of the frame you're trying to display. – nathancy Apr 04 '19 at 01:15
  • @user545424 To capture the data, I use Micro-Manager and the documentation says that _Images returned as numpy array by calls to an instance of the pythonized Micro-Manager CMMCore class. The array dtype depends on property named PixelType (see below)._ and the _PixelType_ property of my camera is 16bits – William Magrini Apr 04 '19 at 07:26
  • @nathancy I edited my post with the relevant information – William Magrini Apr 04 '19 at 07:27
  • @WilliamMagrini that still doesn't tell you what the encoding is. I've added an answer where I assume it's a 16 bit greyscale image based on what you posted. – user545424 Apr 04 '19 at 18:19

2 Answers


So, although you don't say it in the question, I am going to assume that the format of your image is 16 bit greyscale.

Looking at the format types here: https://doc.qt.io/Qt-5/qimage.html#Format-enum, that isn't a supported format, so you'll have to change it to something that can be displayed.

The RGBA64 format allows for 16 bits per color channel, which is enough resolution for the values you have:

import numpy as np
from PySide2 import QtGui  # QImage.Format_RGBA64 requires Qt >= 5.12

def array_to_pixmap(arr):
    """Returns a QPixmap from a 16 bit greyscale image `arr`."""

    # work on an unsigned 64 bit copy so the left shifts below cannot
    # overflow the 16 bit original array
    arr = arr.astype(np.uint64)

    # pack the 16 bit values of arr into the red, green and blue channels and
    # make the alpha channel fully opaque; this channel order assumes a
    # little-endian machine
    rgba = arr | arr << 16 | arr << 32 | np.uint64(0xffff) << np.uint64(48)

    # keep a reference to the raw buffer until the pixmap copy has been made
    height, width = rgba.shape
    data = rgba.tobytes()
    im = QtGui.QImage(data, width, height, QtGui.QImage.Format_RGBA64)
    return QtGui.QPixmap.fromImage(im)

I haven't tested this, but it should give you enough info to go on.
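
For completeness, here is a minimal, untested sketch of how the returned pixmap could be shown in a QGraphicsView; the dummy frame and the 2048*2048 size are just placeholders for your camera data, and array_to_pixmap is the function above:

    import sys
    import numpy as np
    from PySide2 import QtWidgets

    app = QtWidgets.QApplication(sys.argv)

    # dummy 16 bit frame standing in for the camera image
    arr = np.full((2048, 2048), 100, dtype=np.uint16)

    # put the pixmap into a scene and show it in a view
    scene = QtWidgets.QGraphicsScene()
    scene.addPixmap(array_to_pixmap(arr))

    view = QtWidgets.QGraphicsView(scene)
    view.show()
    sys.exit(app.exec_())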

user545424
  • OK, I think I just don't understand what you mean by image format, I'm sorry... But your solution should work because I do have a 16-bit image array at the input of my function. I'll try that today and let you know. Thanks – William Magrini Apr 05 '19 at 07:18

I found the solution. In fact, the solution of @user545424 didn't work because I am using PyQt5 and the image format Format_RGBA64 is not supported. I tried to install PySide2 but it didn't work, so after some research I found this post: Convert 16-bit grayscale to QImage. The solution proposed in the answer works perfectly. Here is the code that I use to display my 16-bit image:

from PyQt5 import QtGui
import numpy as np

def array2Pixmap(img):
    # scale the 16-bit values down to 8 bits, then stretch the contrast
    # over the full 0-255 range
    img8 = (img/256.0).astype(np.uint8)
    img8 = ((img8 - img8.min()) / (img8.ptp() / 255.0)).astype(np.uint8)

    # repeat each grey value four times to provide the four bytes per pixel
    # that Format_RGB32 expects (the image is 2048*2048 pixels)
    img = QtGui.QImage(img8.repeat(4), 2048, 2048, QtGui.QImage.Format_RGB32)

    pix = QtGui.QPixmap(img.scaledToWidth(img.width()*2))
    return pix

This code works and I have a nice image, but now I have to handle 32-bit images of 2048*2048 pixels, so the execution is getting slow after some time. I will try to find out why.