I'm trying to get started with the remote camera API, and have hit a wall.
It's not clear to me what the initial setup for the camera (an A6000 in my case) should be. The docs seem to imply that I configure "Remote Control" (which I've done) and that, as a result, the camera becomes an "access point". Now, to my understanding, an access point is like a Wi-Fi router, so I expect to see a new SSID advertised when I browse local networks from my computer. I would also expect to need credentials (a password) to complete the connection. However, when I turn remote control on in the menu, I do not see any new SSID advertised, and I see nothing that tells me what SSID to expect, nor what password to use if I did see such a network.
So I wondered whether my understanding of "access point" was wrong. I then connected the camera to my regular home Wi-Fi (it did this successfully, and I can see the IP address handed to it by my DHCP server). Having done this, I can still turn remote control on in the menus. With that configuration, I attempted a crude connection: I took the core of the code from the example (I don't work with Android, though I'm perfectly comfortable with Java) and hacked together something that I hoped would send the initial multicast UDP packet and listen for the UDP response. Well, "nothing happens": it sends (ten times, actually) while concurrently waiting for a response, but it never sees any reply, and eventually times out. I'm not attempting to parse the response, just to receive a packet.
Of course, I don't know whether there's a bug in my code (though it's very simple, and largely lifted from the example), because I don't know if this is even how it's supposed to work.
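For concreteness, here's a stripped-down sketch of what my code is doing: a plain-Java (no Android) SSDP M-SEARCH sent to the standard UPnP multicast address, then a blocking wait for any unicast reply. The service-type string is what I believe the Sony docs specify for the Camera Remote API; treat that, and the 2-second timeout, as my assumptions.

```java
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.SocketTimeoutException;
import java.nio.charset.StandardCharsets;

public class SsdpDiscover {

    // Build the SSDP M-SEARCH request. The ST value is my reading of the
    // Sony Camera Remote API docs; treat it as an assumption.
    static String buildMSearch() {
        return "M-SEARCH * HTTP/1.1\r\n"
             + "HOST: 239.255.255.250:1900\r\n"
             + "MAN: \"ssdp:discover\"\r\n"
             + "MX: 1\r\n"
             + "ST: urn:schemas-sony-com:service:ScalarWebAPI:1\r\n"
             + "\r\n"; // blank line terminates the request
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = buildMSearch().getBytes(StandardCharsets.US_ASCII);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setSoTimeout(2000); // give each reply up to 2 s to arrive
            InetAddress group = InetAddress.getByName("239.255.255.250");
            socket.send(new DatagramPacket(payload, payload.length, group, 1900));

            byte[] buf = new byte[2048];
            try {
                while (true) {
                    DatagramPacket reply = new DatagramPacket(buf, buf.length);
                    socket.receive(reply); // unicast reply to our ephemeral port
                    System.out.println(new String(reply.getData(), 0,
                            reply.getLength(), StandardCharsets.US_ASCII));
                }
            } catch (SocketTimeoutException e) {
                System.out.println("Timed out; no (more) responses.");
            }
        }
    }
}
```

Any SSDP-capable device on the LAN (a router, a smart TV) should answer this if the ST is relaxed to `ssdp:all`, which is how I convinced myself the socket plumbing itself works.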
I am interested in any pointers, particularly regarding the initial camera configuration.
I should note that I have updated the firmware on my A6000 to version 3.20, which appears to be the latest. I know the update worked because (a) the camera reports version 3.20, and (b) the new movie container format (XAVC S) is now offered.