
I have two different sources of satellite images of the same area, and they have different pixel value distributions. How can I process or transform them so that they have a similar distribution (i.e. so that they are visually similar)?

For instance: (please ignore the difference in image sizes)

[screenshot: crop from the first image source]

[screenshot: crop from the second image source]

  • What are the sizes of these images? That is an important detail – Abhi25t Dec 05 '21 at 08:08
  • Both images are 1024x1024. The snapshots added here are crops of a small area of each image. – NKR Dec 05 '21 at 08:34
  • What does "visually similar" mean? Imho they look quite similar, but color saturation and scene geometry is a bit different. – Micka Dec 05 '21 at 08:59
  • Your goal is a look-up table (LUT) that is 3D (R,G,B -> R,G,B). It will have to be sparse, i.e. say just 16 values per dimension, and you will need to interpolate when looking up values in it. To calculate this LUT, you first have to align the images (feature matching + findHomography or a lesser transformation), then calculate a huge histogram from all pairs of pixels (R,G,B -> *set of* RGB values), and then compress that down again by picking the median or mode for each bin (a sketch of this appears after these comments). – Christoph Rackwitz Dec 05 '21 at 11:00
  • @Micka by "visually similar" I meant similar color or color saturation. The scene geometry is not a concern. – NKR Dec 05 '21 at 17:41
  • @Navjot when the geometry changes, how do you want to make sure that the same elements are normalized in color? Do you have some kind of mask for pixels that should or shouldn't match? A modern approach (and still a research topic) for switching the distribution would be to train a GAN for style transfer, if you need complex changes like changing the season of an image. A simple approach could be white balancing / contrast stretching or histogram equalization in BOTH images (see the sketch right after these comments). – Micka Dec 05 '21 at 19:05
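
A minimal sketch of the simpler route mentioned in the comments, using histogram matching (rather than independent equalization) so that one image is pulled toward the other's color distribution. This assumes both images are loaded as RGB arrays; the file names are placeholders:

```python
# Rough sketch, not tested on the OP's data: per-channel histogram matching
# with scikit-image. File names are placeholders; images are assumed RGB.
import numpy as np
from skimage import io
from skimage.exposure import match_histograms

source = io.imread("source_a.png")     # image whose colors will be adjusted
reference = io.imread("source_b.png")  # image whose distribution is the target

# channel_axis=-1 matches each RGB channel independently
# (scikit-image >= 0.19; older releases use multichannel=True instead)
matched = match_histograms(source, reference, channel_axis=-1)

io.imsave("source_a_matched.png", np.clip(matched, 0, 255).astype(np.uint8))
```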

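And a rough sketch of the LUT idea from Christoph Rackwitz's comment: align the reference onto the source with ORB feature matching and a RANSAC homography, then learn a sparse 16-bins-per-channel color LUT from corresponding pixels. To stay short, this version uses the per-bin mean and a nearest-bin lookup, whereas the comment suggests the median or mode plus interpolation. The file names, bin count, and match/RANSAC thresholds are placeholder choices:

```python
# Rough sketch of the 3D-LUT approach: align, collect corresponding colors,
# build a sparse (BINS x BINS x BINS) LUT, apply by nearest-bin lookup.
import cv2
import numpy as np

BINS = 16                               # LUT resolution per channel
STEP = 256 // BINS                      # width of one color bin

src = cv2.imread("source_a.png")        # image to be recolored (BGR)
ref = cv2.imread("source_b.png")        # image providing the target colors (BGR)

# --- 1. align ref onto src with ORB features + RANSAC homography ---
orb = cv2.ORB_create(5000)
k1, d1 = orb.detectAndCompute(cv2.cvtColor(src, cv2.COLOR_BGR2GRAY), None)
k2, d2 = orb.detectAndCompute(cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY), None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
matches = sorted(matches, key=lambda m: m.distance)[:500]
pts_src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
pts_ref = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(pts_ref, pts_src, cv2.RANSAC, 5.0)
ref_aligned = cv2.warpPerspective(ref, H, (src.shape[1], src.shape[0]))

# --- 2. build the LUT: for every (B,G,R) bin of src, average the colors the
#        aligned reference shows at the same pixel positions ---
valid = ref_aligned.sum(axis=2) > 0     # crude mask for pixels warped from outside
src_bins = (src[valid] // STEP).astype(np.int64)      # (N, 3) bin indices
ref_vals = ref_aligned[valid].astype(np.float64)      # (N, 3) target colors

flat = src_bins[:, 0] * BINS * BINS + src_bins[:, 1] * BINS + src_bins[:, 2]
counts = np.bincount(flat, minlength=BINS ** 3)
sums = np.stack([np.bincount(flat, weights=ref_vals[:, c], minlength=BINS ** 3)
                 for c in range(3)], axis=1)

lut = np.zeros((BINS ** 3, 3), dtype=np.uint8)
filled = counts > 0
lut[filled] = (sums[filled] / counts[filled][:, None]).astype(np.uint8)
# empty bins fall back to their own bin-center color (identity mapping)
centers = np.stack(np.meshgrid(*[np.arange(BINS)] * 3, indexing="ij"), axis=-1)
lut[~filled] = (centers.reshape(-1, 3)[~filled] * STEP + STEP // 2).astype(np.uint8)

# --- 3. apply the LUT to the whole source image (nearest-bin lookup) ---
idx = (src // STEP).astype(np.int64)
recolored = lut[idx[..., 0] * BINS * BINS + idx[..., 1] * BINS + idx[..., 2]]
cv2.imwrite("source_a_recolored.png", recolored)
```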