I am trying to find the real-world distance from the camera to ORB feature points using Monocular ORB SLAM2.
I calculated the Euclidean distance between the world coordinates of each ORB feature point and the world coordinates of the current key frame's camera centre. This is repeated for every frame, so a distance is obtained for each ORB point in the current frame.
In Viewer.cc:
std::vector<MapPoint*> MP = mpTracker->mCurrentFrame.mvpMapPoints;   // map points associated with the current frame's keypoints
cv::Mat CC = mpTracker->mCurrentFrame.GetCameraCenter();             // current camera centre in world coordinates
mpFrameDrawer->DrawFrame(MP,CC);
In FrameDrawer.cc:
cv::Mat FrameDrawer::DrawFrame(std::vector<MapPoint*> DistMapPt, cv::Mat CameraPt)
{
    ...
    else if(state==Tracking::OK) // TRACKING
    {
        mnTracked=0;
        mnTrackedVO=0;
        const float r = 5;
        for(int i=0;i<n;i++)
        {
            // skip keypoints that have no associated MapPoint
            if((vbVO[i] || vbMap[i]) && DistMapPt[i])
            {
                cv::Point2f pt1,pt2;
                pt1.x=vCurrentKeys[i].pt.x-r;
                pt1.y=vCurrentKeys[i].pt.y-r;
                pt2.x=vCurrentKeys[i].pt.x+r;
                pt2.y=vCurrentKeys[i].pt.y+r;

                // Euclidean distance between the map point's world position
                // and the camera centre, both in the SLAM world frame
                cv::Mat Pw = DistMapPt[i]->GetWorldPos();
                float dx = Pw.at<float>(0) - CameraPt.at<float>(0);
                float dy = Pw.at<float>(1) - CameraPt.at<float>(1);
                float dz = Pw.at<float>(2) - CameraPt.at<float>(2);
                float MapPointDist = std::sqrt(dx*dx + dy*dy + dz*dz);
            }
            ...
        }
    }
}
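For reference, the value computed above is just the L2 norm of the difference between the map point's world position and the camera centre. The same thing can be written more compactly with cv::norm; this is a minimal sketch, and the helper name PointCameraDistance is mine, not part of ORB SLAM2:

#include <opencv2/core/core.hpp>

// Euclidean (L2) distance between a 3x1 world position and a 3x1 camera centre,
// both expressed in the SLAM world frame (for monocular SLAM this is only defined
// up to an unknown global scale).
static float PointCameraDistance(const cv::Mat &worldPos, const cv::Mat &cameraCenter)
{
    return static_cast<float>(cv::norm(worldPos, cameraCenter, cv::NORM_L2));
}

Inside the loop this would be MapPointDist = PointCameraDistance(DistMapPt[i]->GetWorldPos(), CameraPt);.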
However, this calculated distance is neither equal to the real distance nor can it be scaled to it. The same method gives relatively accurate distances in RGB-D ORB SLAM2.
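By "scaled" I mean applying one global factor to all points, roughly as in this sketch (known_real_dist and referencePointDist are hypothetical values for a single measured reference point, not something ORB SLAM2 provides):

// Hypothetical: the true distance to one reference map point is known from a measurement.
float scale = known_real_dist / referencePointDist;   // global scale factor
float correctedDist = scale * MapPointDist;           // expected to match reality for the other points, but it does not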
Is there any method to scale distances in Monocular ORB SLAM?