
I am downloading Sentinel-2 images with the following code

import boto3

s3 = boto3.resource('s3', region_name='us-east-2')
bucket = s3.Bucket('sentinel-s2-l1c')
path = 'tiles/36/R/UU/2017/5/14/0/'

object = bucket.Object(path + 'B02.jp2')
object.download_file('B02.jp2')
object = bucket.Object(path + 'B03.jp2')
object.download_file('B03.jp2')
object = bucket.Object(path + 'B04.jp2')
object.download_file('B04.jp2')

and I get 3 grayscale JP2 images on disk.

Then I am trying to mix color layers with the following code

import matplotlib.image as mpimg
import numpy as np
from PIL import Image

Image.MAX_IMAGE_PIXELS = 1000000000

print('Reading B04.jp2...')
img_red = mpimg.imread('B04.jp2')

print('Reading B03.jp2...')
img_green = mpimg.imread('B03.jp2')

print('Reading B02.jp2...')
img_blue = mpimg.imread('B02.jp2')

img = np.dstack((img_red, img_green, img_blue))

img = np.divide(img, 256)
img = img.astype(np.uint8)

mpimg.imsave('MIX.jpeg', img, format='jpg')

The result looks very poor: dim and nearly black and white.

I would like something like this:

[desired true-color composite]

or like the preview:

[Sentinel-2 tile preview]

while my version is

[my dim, nearly grayscale result]

(sorry)

UPDATE

I found that the images are probably 12-bit. But when I tried scaling for 12 bits, I saw overexposure. Experimentally, I found the best quality assuming 14 bits.
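Rather than guessing the bit depth, one option is to scale by the maximum actually present in the data (a minimal sketch, assuming the bands load as `uint16` NumPy arrays; the small array here is a stand-in for a real band):

```python
import numpy as np

# Stand-in for a band read from a JP2 file: 12-bit values in a uint16 array
band = np.array([[0, 1000], [2500, 4095]], dtype=np.uint16)

# Scale by the observed maximum instead of a fixed bit depth,
# so the brightest pixel maps to 255 regardless of the true bitness
scaled = (band.astype(np.float64) * 255.0 / band.max()).astype(np.uint8)

print(scaled.max())  # 255: the full 8-bit range is used
```

This avoids the 12-vs-14-bit guesswork, though a single very bright pixel (a cloud, for instance) will then push everything else toward black.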

UPDATE 2

Although even with 14 bits I still have small areas of overexposure. Here are the Bahamas:

[overexposed areas over the Bahamas]

  • Just a hunch: skip one or both of the `img = np.divide(img, 256)` and `img = img.astype(np.uint8)` lines. – MB-F May 24 '17 at 11:36
  • Normal JPEG can't handle HDR. `imsave` can't handle JPEG2000. – Dims May 24 '17 at 11:47
  • 1
    Fair enough. Are you certain the input images' pixel values are in range 0 - 65535? Otherwise try to normalize so the output makes use of the full 8 bit range. – MB-F May 24 '17 at 11:56
  • @kazemakase you are right, I found that images are probably 14-bit. – Dims May 24 '17 at 12:08
  • You could normalize the images so the output has the highest possible contrast you can get out of 8 bits. Instead of dividing by 256, try this: `img = img * 255.0 / img.max()`. This will scale the image so that the most intense pixel has a value of 255 - using the full range while avoiding over-exposure. – MB-F May 24 '17 at 12:25

1 Answer


I looked into this problem today and I think your problem is the division. As kazemakase said, it is wise to multiply by 255 and divide by the max value, but the matrix is very big (as in too big to handle), so I did it in separate steps:

# Scale so the brightest pixel maps to 255, using the full 8-bit range
max_pixel_value = rgb_image.max()
rgb_image = np.multiply(rgb_image, 255.0)
rgb_image = np.divide(rgb_image, max_pixel_value)
rgb_image = rgb_image.astype(np.uint8)

Also, you may have used the wrong bands (I think, but I am not sure!). Red should be band 5, and blue band 1 or 2. See https://en.wikipedia.org/wiki/Sentinel-2#Instruments for more details.

Unfortunately, the data of bands 1 and 5 is much smaller than that of band 3, so I resized it with cv2 (PIL had trouble with the size). This introduced quite a few interpolation errors, and the new picture does not look much better.
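For the small overexposed patches mentioned in UPDATE 2, scaling by the absolute maximum lets a handful of very bright pixels (clouds, sun glint) dominate the stretch. A common alternative is to clip at a high percentile instead (a sketch, not the asker's code; the 98th percentile is an arbitrary choice):

```python
import numpy as np

def percentile_stretch(img, pct=98.0):
    """Scale an array to uint8, clipping values above the given percentile.

    Bright outliers saturate to 255 instead of forcing the rest of the
    scene toward black.
    """
    hi = np.percentile(img, pct)
    out = np.clip(img.astype(np.float64) / hi, 0.0, 1.0)
    return (out * 255.0).astype(np.uint8)

# Toy example: one bright outlier (10000) among modest values
band = np.array([100, 200, 300, 10000], dtype=np.uint16)
print(percentile_stretch(band))
```

With max-scaling, the three modest pixels above would all land near zero; with the percentile clip they keep usable contrast while the outlier simply saturates.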