3

During a search you can pass some data to the search manager by using the APP_DATA bundle. This mechanism works great for a normal search, but how can you do the same for a voice search and get back some context information when the voice search returns?

Snicolas
  • 37,840
  • 15
  • 114
  • 173

1 Answer

2

From what I understand, it goes through the same mechanism. Simply create your normal onSearchRequested override in the backend, then annotate your dialog or widget with voice search functionality as described here.

Using their example, something like this should go into your searchable XML configuration:

<?xml version="1.0" encoding="utf-8"?>
<searchable xmlns:android="http://schemas.android.com/apk/res/android"
    android:label="@string/search_label"
    android:hint="@string/search_hint"
    android:voiceSearchMode="showVoiceSearchButton|launchRecognizer" >
</searchable>

When voice search is requested, its data will be passed through the search mechanism and on to your onSearchRequested callback, allowing you to manipulate the data as needed.
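As a sketch of the round trip described above (the activity names and the `"caller"` key are illustrative assumptions, not from the original): the calling activity can attach an `APP_DATA` bundle when it triggers a search via `startSearch()`, and the searchable activity can read that bundle back from the delivered `ACTION_SEARCH` intent. Note that, as the edit and comments below discuss, this bundle is not reliably delivered when the search is voice-initiated.

```java
import android.app.Activity;
import android.app.SearchManager;
import android.content.Intent;
import android.os.Bundle;

// Hypothetical caller: attaches context data when a search is requested.
public class CallerActivity extends Activity {
    @Override
    public boolean onSearchRequested() {
        Bundle appData = new Bundle();
        appData.putString("caller", "CallerActivity"); // context to round-trip
        // no initial query, don't pre-select it, attach appData, app-local search
        startSearch(null, false, appData, false);
        return true;
    }
}

// Hypothetical searchable activity: reads the bundle back from the intent.
class MySearchableActivity extends Activity {
    @Override
    protected void onNewIntent(Intent intent) {
        super.onNewIntent(intent);
        if (Intent.ACTION_SEARCH.equals(intent.getAction())) {
            String query = intent.getStringExtra(SearchManager.QUERY);
            Bundle appData = intent.getBundleExtra(SearchManager.APP_DATA);
            if (appData != null) {
                String caller = appData.getString("caller");
                // ... branch on the originating activity here
            }
        }
    }
}
```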


Edit: The actual problem being addressed here was to differentiate when voice search is used in a search widget versus when the standard text input has been invoked.

Unfortunately, it appears Google does not provide for these facilities unless you roll your own Recognizer or attempt to retrieve properties from the search bundle that are shaped like voice data. The latter case is undocumented and, at least apparently, is unsupported as well.

MrGomez
  • 23,788
  • 45
  • 72
  • Sorry, but you explain how to implement voice search. Actually, what I want is to pass some data to the search and get it back. Imagine I have 2 activities using search (sharing the same search widget); how can I pass some context data to voice search and get it back to distinguish the 2 calling activities? – Snicolas Mar 27 '12 at 08:03
  • @Snicolas Ah! So, if I understand correctly, you are looking for data to differentiate _between_ when someone uses voice search versus direct input in the same widget. It appears that Android goes out of its way to provide no context here, since you're basically getting a UI shim. So, if nothing usable exists in your `Bundle`, your only recourse is to create a [`RecognizerIntent`](http://developer.android.com/reference/android/speech/RecognizerIntent.html) directly. I can find nothing else pertinent in the docs. – MrGomez Mar 27 '12 at 08:47
  • Not exactly. Although the documentation says almost nothing, it is possible to distinguish a voice search from a normal search (by looking at some properties in the query intent's bundle, like language, etc.). I am not looking for that, but for a way to differentiate two distinct uses of voice search in the same app. I want, when the search returns, to be able to say whether it was called from here or from there. – Snicolas Mar 27 '12 at 08:49
  • @Snicolas Assuming that's true, would it not be possible to define these properties in your [Searchable configuration](http://developer.android.com/guide/topics/search/searchable-config.html), then simply detect for them in your callback? It's taking the garden path to the correct solution, but then, so is creating the recognizer directly as at the top of [this question](http://stackoverflow.com/questions/4718643/recognizerintent-not-working-missing-extra-calling-package). – MrGomez Mar 27 '12 at 08:53
  • Using the recognizer intent could maybe help, but there is no real reason for extras passed to that intent to be included when the activity receives the voice search result. I don't think they are, though I may be wrong; I explained this to highlight that I am interested in the return of the voice search, and its intent is distinct from the intent used to launch voice search. As for the config file, I couldn't find any place to include context information there, and even then, how would I distinguish those 2 contexts/activities from a single config file? – Snicolas Mar 27 '12 at 09:47
  • 1
    @Snicolas I understand. The difficulty is voice search appears to be a shim into the text input properties of a search widget. Google does not seem to have provided for the ability to detect the shim versus when standard text input has been used, except by way of the garden path (by detecting for the properties of a voice search) or rolling your own Recognizer. The next question is why this was their apparent design decision. As for the config files, I can't find a way to provide a split context, either. – MrGomez Mar 27 '12 at 18:03