I have an idea for an app that I'd like to develop, but before I invest a lot of time learning Objective-C and the iOS APIs, I'd like to make sure that what I want to do is feasible.
The app I want to make is a purely auditory (sound-only) version of Google Glass. I'm visually impaired, so spending a lot of money on something visual, even if it can read content aloud, wouldn't be worth it. But if I could use an iPhone to provide many of the same features as Google Glass, that would be great.
Many times I've wanted a piece of information while walking down the street but couldn't easily get to my iPhone, because I have my cane in one hand and something else in the other. In those situations, it would be great if I could say a command and get a spoken response.
I'd use the microphone built into the Apple earphones for audio input, but I'm not sure whether it's possible to listen for audio input while the screen is locked. I suspect it may not be possible on a non-jailbroken iPhone.
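From skimming the AVFoundation documentation, I gather the setup would look roughly like the sketch below: put the audio session into a record category and declare the "audio" background mode in Info.plist so the session isn't suspended when the phone locks. I haven't written any iOS code yet, so this is only my rough understanding of how the pieces fit together, not something I've tested.

```objc
#import <AVFoundation/AVFoundation.h>

// Rough sketch only -- assumes the app's Info.plist declares
// UIBackgroundModes with an "audio" entry, which is what I understand
// is needed for the session to keep running after the screen locks.
- (void)startListeningForCommands {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error = nil;

    // Record-only category: input from the headset microphone,
    // no playback route needed while just listening.
    if (![session setCategory:AVAudioSessionCategoryRecord error:&error]) {
        NSLog(@"Could not set audio session category: %@", error);
        return;
    }

    // Activating the session is what actually opens the microphone.
    if (![session setActive:YES error:&error]) {
        NSLog(@"Could not activate audio session: %@", error);
        return;
    }

    // Speech recognition / recording would start here.
}
```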
Can anyone tell me if this is possible?