
As a novice, I need advice on how to solve the following problem. Say that, using photogrammetry, I have obtained a point cloud of part of my room. I then upload this point cloud to an Android phone, and I want the phone to track its camera pose relative to this point cloud in real time.
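To make the goal concrete, here is a minimal sketch of the pose step as I understand it: once 2D features in the live camera image have been matched to 3D points in the prebuilt cloud, the pose itself comes from a PnP solve, e.g. OpenCV's solvePnPRansac. The point values and intrinsics below are placeholders, and the feature matching is assumed to happen elsewhere.

    // Minimal sketch: camera pose from 2D-3D correspondences via solvePnPRansac.
    // Assumes image keypoints have already been matched to map points
    // (e.g. via feature descriptors stored alongside the cloud).
    #include <opencv2/calib3d.hpp>
    #include <opencv2/core.hpp>
    #include <iostream>
    #include <vector>

    int main() {
        // 3D points from the photogrammetry cloud (map frame) -- placeholder values.
        std::vector<cv::Point3f> mapPoints = {
            {0.0f, 0.0f, 0.0f}, {1.0f, 0.0f, 0.0f},
            {0.0f, 1.0f, 0.0f}, {1.0f, 1.0f, 0.5f}};
        // Their matched pixel locations in the current frame -- placeholders.
        std::vector<cv::Point2f> imagePoints = {
            {320.f, 240.f}, {420.f, 238.f}, {322.f, 140.f}, {424.f, 132.f}};

        // Intrinsics of *this* phone's camera (fx, fy, cx, cy) -- must be calibrated.
        cv::Mat K = (cv::Mat_<double>(3, 3) << 800, 0, 320,
                                               0, 800, 240,
                                               0,   0,   1);
        cv::Mat dist = cv::Mat::zeros(4, 1, CV_64F);  // assume negligible distortion

        cv::Mat rvec, tvec;
        // The RANSAC variant tolerates the outlier matches that real data produces.
        bool ok = cv::solvePnPRansac(mapPoints, imagePoints, K, dist, rvec, tvec);
        if (ok) {
            cv::Mat R;
            cv::Rodrigues(rvec, R);  // rvec/tvec map map-frame points into the camera frame
            cv::Mat C = -R.t() * tvec;  // camera position in the map frame
            std::cout << "camera position:\n" << C << std::endl;
        }
        return 0;
    }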

  1. As far as I know, differences in camera intrinsics (a simple camera or another phone's camera vs. my phone's camera) can affect the precision of localization, right? (See the calibration sketch after this list.)

  2. This is intended to be an AR app, so I have tried existing SDKs: Vuforia, Wikitude, and Placenote (I haven't tried ARCore yet because my device most likely won't support it). The problem is that they all rely on their own cloud services, and I don't want to depend on them. Ideally, I would perform the 3D reconstruction on my own PC, and my phone would download the point cloud from there.

  3. I need SLAM (with IMU fusion) or VIO on my phone, don't I? Are there any ready-to-go implementations in libraries like ARToolKit or perhaps PCL? Will an existing SLAM system relocalize against a map reconstructed with a different algorithm, or must I use the same SLAM system for both mapping and localization? (See the ICP sketch after this list.)
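Regarding item 1, my understanding is that the usual fix is to calibrate each camera individually and feed its own intrinsics into the pose solver. Below is a minimal OpenCV chessboard-calibration sketch; the board dimensions, square size, and image file names are assumptions to adjust for your own setup.

    // Sketch: estimating a phone camera's intrinsics from chessboard photos.
    #include <opencv2/calib3d.hpp>
    #include <opencv2/imgcodecs.hpp>
    #include <opencv2/imgproc.hpp>
    #include <string>
    #include <vector>

    int main() {
        const cv::Size boardSize(9, 6);   // inner corner count of the printed board
        const float squareSize = 0.025f;  // 25 mm squares (assumption)

        // Reference corner positions on the planar board, z = 0.
        std::vector<cv::Point3f> boardModel;
        for (int y = 0; y < boardSize.height; ++y)
            for (int x = 0; x < boardSize.width; ++x)
                boardModel.emplace_back(x * squareSize, y * squareSize, 0.f);

        std::vector<std::vector<cv::Point3f>> objectPoints;
        std::vector<std::vector<cv::Point2f>> imagePoints;
        cv::Size imageSize;

        // Hypothetical calibration photos taken with the phone.
        for (const std::string& path : {"calib_01.jpg", "calib_02.jpg", "calib_03.jpg"}) {
            cv::Mat gray = cv::imread(path, cv::IMREAD_GRAYSCALE);
            if (gray.empty()) continue;
            imageSize = gray.size();
            std::vector<cv::Point2f> corners;
            if (cv::findChessboardCorners(gray, boardSize, corners)) {
                cv::cornerSubPix(gray, corners, cv::Size(11, 11), cv::Size(-1, -1),
                    cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.001));
                imagePoints.push_back(corners);
                objectPoints.push_back(boardModel);
            }
        }

        cv::Mat K, dist;  // intrinsic matrix and distortion coefficients
        std::vector<cv::Mat> rvecs, tvecs;
        double rms = cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                         K, dist, rvecs, tvecs);
        // K and dist are what the PnP solve above needs; rms is the reprojection error.
        return 0;
    }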
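Regarding item 3, PCL does not ship a complete SLAM system, but it can at least align a small, locally reconstructed cloud against the prebuilt map with ICP, which yields the device pose in the map frame. The sketch below assumes both clouds are stored as PLY files (the names are hypothetical) and that a rough initial alignment is available, since plain ICP only converges locally.

    // Sketch: relocalizing against the prebuilt map by ICP alignment in PCL.
    #include <pcl/io/ply_io.h>
    #include <pcl/point_types.h>
    #include <pcl/registration/icp.h>

    int main() {
        pcl::PointCloud<pcl::PointXYZ>::Ptr map(new pcl::PointCloud<pcl::PointXYZ>);
        pcl::PointCloud<pcl::PointXYZ>::Ptr scan(new pcl::PointCloud<pcl::PointXYZ>);
        pcl::io::loadPLYFile("room_map.ply", *map);     // cloud downloaded from the PC
        pcl::io::loadPLYFile("local_scan.ply", *scan);  // partial cloud from the phone

        pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
        icp.setInputSource(scan);
        icp.setInputTarget(map);
        icp.setMaxCorrespondenceDistance(0.1);  // 10 cm gate (assumption)
        icp.setMaximumIterations(50);

        pcl::PointCloud<pcl::PointXYZ> aligned;
        icp.align(aligned);  // optionally pass an initial guess, e.g. from the IMU

        if (icp.hasConverged()) {
            // 4x4 transform taking scan coordinates into the map frame,
            // i.e. the camera pose relative to the point cloud.
            Eigen::Matrix4f pose = icp.getFinalTransformation();
        }
        return 0;
    }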

So, the main question is how to do everything ARCore and Vuforia do without using third-party servers. (I suspect the answer is to devise the same underlying layer that Vuforia and the other SDKs use to exploit all the available hardware.)

Sophour
  • Hi! As far as I can tell, these aren't signal processing questions, but questions about how to use specific libraries (none of which I've heard of). So this reads like a programming question more than a signal processing question; in that case, your best bet would be to ask on Stack Overflow instead of here. – Marcus Müller Aug 09 '18 at 09:39
  • Well, thank you, I'll try my luck there! – Sophour Aug 09 '18 at 12:58
  • You can take a look at this: https://fantasmo.io/home – Ali Kanat Oct 11 '18 at 13:19

0 Answers