
Can you use a USB camera for the Spectator View rig by modifying one of the scripts that use OpenCV to get the camera feed?

I believe this is the first StackOverflow question about the Spectator View feature that the Microsoft HoloLens supports; I searched and could not find any other questions about it here.

Anyway, according to the documentation here, to enable Spectator View on a Unity-based UWP app that is deployed to more than one Microsoft HoloLens, I need to choose one of four ways to capture the live video feed from a camera:

  • OpenCV 3.2
  • DeckLink Capture Card
  • Elgato Capture Card
  • Canon SDK

In the Spectator View setup I have for a project under a non-disclosure agreement, I am using OpenCV 3.2, with a Lenovo ThinkPad laptop as the hub for Spectator View.

In detail, the laptop runs the Unity Editor that holds the Spectator View Manager component I need in the Inspector in order to build, install, and launch the app that my two HoloLens headsets will use to see a shared, spatially anchored hologram. The Editor also hosts the Compositor interface I need to overlay what the physical camera sees with the output of a virtual camera in the Unity scene, producing a combined video feed that goes out to a projector or TV set. Lastly, I run an executable from Microsoft's Mixed Reality Toolkit called the Sharing Service, which is essentially a server program that exchanges the transforms of holograms on the fly, so that they appear fixed in place in the real environment.

Now, the Lenovo ThinkPad cannot take any capture cards, because it has no internal expansion ports, and its only HDMI port is an output, not an input. As a result, when I run the app from the Unity Editor, I do get video input and Unity view input in the Compositor interface, but the video feed comes from the ThinkPad's built-in webcam. What I want to do is use a different camera instead, preferably a DSLR camera that connects to my laptop over USB.

Since OpenCV 3.2 is the main dependency among the libraries I need, can I modify one of the scripts so that it accepts a video stream from a USB camera?
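To make the question concrete, this is roughly what I mean by "accepts a video stream from a USB camera" (a minimal, standalone OpenCV 3.2 sketch, not the Spectator View code itself; the device index 1 is only my guess for where the external camera would enumerate, and it assumes the DSLR, or its vendor software, exposes itself as an ordinary capture device):

```cpp
// Standalone preview, assuming the USB camera shows up as a normal
// capture device that OpenCV can open by index (0 is usually the
// built-in webcam, so 1 is a guess for the external camera).
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    const int usbCameraIndex = 1; // assumption, not a Spectator View constant

    cv::VideoCapture capture(usbCameraIndex);
    if (!capture.isOpened())
    {
        std::cerr << "Could not open camera at index " << usbCameraIndex << std::endl;
        return 1;
    }

    cv::Mat frame;
    while (capture.read(frame))
    {
        cv::imshow("USB camera preview", frame);
        if (cv::waitKey(1) == 27) // Esc to quit
            break;
    }
    return 0;
}
```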

  • You will probably have better luck avoiding the Sharing Service in your case. I would set it up using UNET instead. That way you don't need an external server and you can just use your two HoloLens as the host/server and client and just run your feed to a TV, computer, etc. – Dtb49 Apr 17 '18 at 22:04
  • This is a valid comment, but I need to stick with what I have right now, so I'll have to leave this for the team later on. The end goal here is simply to deploy Spectator View in the project and worry about potential upgrades and further setup later. This is a rough start for us. – user9236834 Apr 18 '18 at 13:51
  • Actually, @Dtb49, I don't think that's how Spectator View works. Spectator View involves a computer as the server, because only the Unity Editor can collect the video feed and the holograms from the Spectator View rig through the Compositor interface. I need *that* interface in order to *project* it onto the big screen for an audience to see. – user9236834 Apr 18 '18 at 13:59
  • The very first line of the README says: "These instructions are for the UNET implementation of spectator view For the HoloToolkit sharing service implementation, see the Legacy Documentation." – Dtb49 Apr 18 '18 at 14:13
  • Yes, you have to run the compositor through the editor and then to your projector or whatever else, I thought that part was understood. But with UNET you don't need an external server in addition to that because it is all done through the HoloLens instead. – Dtb49 Apr 18 '18 at 14:25
  • The documentation is quite confusing. See, except for the calibration data, which I'll be getting later on once the Spectator View rig is put together and set up, I do have the prefabs as well as the Compositor interface in place. What I'm not getting is the session that a client HoloLens can join, but I think that is out of scope for this question; I would have to post another one. The point of this question is to see whether, in the Spectator View rig, I can use a camera that's *different* from what the documentation says, one connected to the laptop by USB. – user9236834 Apr 18 '18 at 14:32
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/169253/discussion-between-user9236834-and-dtb49). – user9236834 Apr 18 '18 at 14:34

1 Answer


@Dtb49 said in the StackOverflow chat linked above:

"I don't think you are limited to those four choices I think those are just the ones that they tested with. I do remember something about the USB port needed to be a 3.0 for it to work properly. I do remember coming across that problem when I was initially setting it up."

I don't know yet whether I need to change a script to have the Compositor interface take camera input from the external camera connected over USB, or whether temporarily disabling the laptop's built-in webcam would be enough, since something in the OpenCV assembly, or in how the machine enumerates its cameras, determines which device the interface loads. But it looks like using a DSLR camera connected by USB for the Microsoft HoloLens Spectator View rig is possible.
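A quick way to check how the two cameras enumerate would be to probe the device indices with OpenCV directly (a rough sketch, assuming OpenCV 3.2 on Windows with its default capture backend; this is not part of the Spectator View or Compositor code):

```cpp
// Probe the first few capture indices and report which ones open,
// to find out where the built-in webcam and the USB camera landed.
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    for (int index = 0; index < 10; ++index)
    {
        cv::VideoCapture capture(index);
        if (!capture.isOpened())
            continue;

        double width  = capture.get(cv::CAP_PROP_FRAME_WIDTH);
        double height = capture.get(cv::CAP_PROP_FRAME_HEIGHT);
        std::cout << "Device " << index << ": " << width << "x" << height << std::endl;
        capture.release();
    }
    return 0;
}
```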

As a university intern, I can say that the documentation for Spectator View in its current state is quite confusing, as I am not familiar with UNET and some other Microsoft technologies.
