I am trying to implement my own image stitcher to get a more robust result. This is what I have so far.
The result of the panorama stitcher that OpenCV provides is as follows.
Apart from the obvious blending issues, I am wondering how they distribute the warping across both images. It looks to me as if they project the images onto some cylinder before the actual stitching. Is this part of the calculated homography, or do they warp the images before the feature matching? I had a look at the high-level overview of the stitching pipeline, at the actual code, and at the landmark paper for the pipeline, but I couldn't figure out where exactly this warping happens and what kind of warping it is.
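For context, here is my own minimal sketch (not OpenCV's actual code) of the cylindrical warp I suspect is happening, implemented as an inverse mapping with nearest-neighbour sampling. The focal length `f` (in pixels) and the centred principal point are assumptions on my part:

```python
import numpy as np

def cylindrical_project(img, f):
    """Warp an image onto a cylinder of focal length f (in pixels).

    Uses inverse mapping: for each output pixel, compute where it
    came from on the flat image plane and sample nearest-neighbour.
    Assumes the principal point is at the image centre.
    """
    h, w = img.shape[:2]
    cx, cy = w / 2.0, h / 2.0

    # Output pixel grid
    ys, xs = np.indices((h, w)).astype(np.float64)

    # Angle and height on the cylinder for each output pixel
    theta = (xs - cx) / f
    hcyl = (ys - cy) / f

    # Back-project cylinder coordinates onto the flat image plane
    x_src = f * np.tan(theta) + cx
    y_src = hcyl * f / np.cos(theta) + cy

    # Keep only samples that fall inside the source image
    valid = (x_src > -0.5) & (x_src < w - 0.5) \
          & (y_src > -0.5) & (y_src < h - 0.5)

    xi = np.clip(np.round(x_src).astype(int), 0, w - 1)
    yi = np.clip(np.round(y_src).astype(int), 0, h - 1)

    out = np.zeros_like(img)
    out[valid] = img[yi[valid], xi[valid]]
    return out
```

As `f` grows the warp approaches the identity, which matches the intuition that a long focal length makes the cylinder locally flat. (If this is indeed what the OpenCV pipeline does, it would explain the curved image borders in the result above.)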