
I am using OkHttp v3.6.0 on Android to communicate with AVS v20160207. I can successfully communicate with AVS on the events channel, both sending a SpeechRecognizer event and receiving the matching SpeechSynthesizer directive.

When establishing a connection to the downchannel, I receive a successful HTTP 200 response and then block on the stream waiting for inbound data. When I ask Alexa to set a "timer for 5 seconds", I hear her prompt saying she will start the timer, but I never receive any directives on the downchannel telling me to set up the timer.

What's also interesting, as noted above, is that I receive the HTTP 200 from the downchannel and can then block on response.body().source().exhausted(). But after about 10 minutes of being blocked and not receiving anything, the stream is CLOSED and I receive the following exception:

Response with Error
okhttp3.internal.http2.StreamResetException: stream was reset: CANCEL
    at okhttp3.internal.http2.Http2Stream$FramingSource.checkNotClosed(Http2Stream.java:436)
    at okhttp3.internal.http2.Http2Stream$FramingSource.read(Http2Stream.java:338)
    at okio.ForwardingSource.read(ForwardingSource.java:35)
    at okio.RealBufferedSource$1.read(RealBufferedSource.java:409)
    at java.io.InputStream.read(InputStream.java:101)
    at com.example.demo.alexaassistant.AlexaVoiceServices.interfaces.DownChannelRunnable.run(DownChannelRunnable.java:192)
    at java.lang.Thread.run(Thread.java:761)

Note that I have tried all of the suggestions found in this thread: Establishing a downchannel with Okhttp?

private static final long CONNECTION_POOL_TIMEOUT_MILLISECONDS = 60 * 60 * 1000;

ConnectionPool connectionPool = new ConnectionPool(5,
        CONNECTION_POOL_TIMEOUT_MILLISECONDS, TimeUnit.MILLISECONDS);

/**
 * Create a customized HTTP/2 interface.
 *
 * For the AVS downchannel, we need to
 * disable the timeout on the read.
 */
OkHttpClient downChannelClient = httpClient.newBuilder()
        .connectTimeout(0, TimeUnit.MILLISECONDS)  // 0 => no timeout.
        .readTimeout(0, TimeUnit.MILLISECONDS)
        .connectionPool(connectionPool)
        .build();

final Request request = new Request.Builder()
        .url(url)
        .get()
        .addHeader("Authorization", "Bearer " + this.accessToken)
        .build();
Log.d(TAG, "downchannel URL ==> " + request.url().toString());
Log.d(TAG, "downchannel headers ==> " + request.headers().toString());

Response response = null;
try
{
    currentCall = downChannelClient.newCall(request);
    response = currentCall.execute();
    BufferedSource bufferedSource = response.body().source();

    Log.i(TAG, "Downchannel ==> HTTP response code: " + response.code());

    Buffer buffer = new Buffer();

    while (!bufferedSource.exhausted())
    {
        Log.w(TAG, "downchannel received data!!!");
        bufferedSource.read(buffer, 8192);
        Log.d(TAG, "Size of data read: " + buffer.size());
    }

    Log.d(TAG, "Response: " + buffer.toString());
}
catch (IOException e)
{
    Log.d(TAG, "Response with Error", e);
}
finally
{
    if (response != null)
    {
        response.close();
    }
}

EDIT:

Amazon's documentation says that the client needs to maintain ONE connection to the server, so the POST and GET streams are made over that single connection, along with the one downchannel stream kept in a half-closed state. Does OkHttp support this?
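In other words, something like the sketch below, where both the downchannel GET and the events POST are created from the same client instance so that OkHttp can coalesce them onto a single HTTP/2 connection. This is only an illustration of what I mean, not my actual code; AVS_BASE_URL, accessToken and eventBody are placeholders.

// Sketch only: one shared OkHttpClient so the downchannel GET and the
// events POSTs can be multiplexed over the same HTTP/2 connection.
// AVS_BASE_URL, accessToken and eventBody are placeholders.
OkHttpClient sharedClient = new OkHttpClient.Builder()
        .readTimeout(0, TimeUnit.MILLISECONDS)   // the downchannel must never time out on reads
        .build();

Request downchannelRequest = new Request.Builder()
        .url(AVS_BASE_URL + "/v20160207/directives")
        .get()
        .addHeader("Authorization", "Bearer " + accessToken)
        .build();

Request eventRequest = new Request.Builder()
        .url(AVS_BASE_URL + "/v20160207/events")
        .post(eventBody)   // multipart event body built elsewhere
        .addHeader("Authorization", "Bearer " + accessToken)
        .build();

// Both calls come from the same client; with HTTP/2 they should share one
// physical connection instead of each opening their own.
Call downchannelCall = sharedClient.newCall(downchannelRequest);
Call eventCall = sharedClient.newCall(eventRequest);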

Paul
  • Hi @Paul, I have the same issue, did you find any solution? – leobelizquierdo Feb 22 '17 at 18:39
  • @leobelizquierdo, I'm still searching for one. I've tried implementing each of the suggestions I could find but nothing has worked. – Paul Feb 22 '17 at 19:40
  • Could you tell me for which events I should expect something through the downchannel? Is SynchronizeState one of them? Does the context I pass to the server determine whether the response comes on the same stream or on the downchannel? I'm getting a 204 for the SynchronizeState event but nothing on the downchannel, any idea? – leobelizquierdo Feb 22 '17 at 20:00
  • Sure, the 204 from SynchronizeState seems to be correct, as that's what I'm getting. If you say something like "set a timer for 3 seconds", that should send stuff down through the downchannel connection. I have yet to receive any data through this connection. Also, setting the SpeechRecognizer.Recognize profile parameter to "NEAR_FIELD" or "FAR_FIELD" while streaming audio to the cloud should result in a StopCapture directive when end-of-speech is detected. – Paul Feb 22 '17 at 20:16
  • The result of SpeechRecognizer.Recognize is a SpeechSynthesizer.Speak Event. The format can be found here: https://developer.amazon.com/public/solutions/alexa/alexa-voice-service/reference/speechsynthesizer – Paul Feb 22 '17 at 21:33

1 Answer


Finally, I'm now receiving data from the server on the downchannel. What I did was send a SynchronizeState event after getting the 200 response when creating the downchannel. The docs say:

  1. To establish a downchannel stream your client must make a GET request to /{{API version}}/directives within 10 seconds of opening the connection with AVS. The request should look like this:

  2. After establishing the downchannel stream, your client must synchronize its components' states with AVS. This requires making a POST request to /{{API version}}/events on a new event stream on the existing connection (Note: Do not open a new connection). This event stream should be closed when your client receives a response (directive). The following is an example SynchronizeState event:

Hope this helps. I'm now struggling to parse the result on the downchannel. Your code should be:

Log.i(TAG, "Downchannel ==> HTTP response code: " + response.code());

...

synchronizeState();

....

Buffer buffer = new Buffer();

while (!bufferedSource.exhausted())
{
    Log.w(TAG, "downchannel received data!!!");
    bufferedSource.read(buffer, 8192);
    Log.d(TAG, "Size of data read: " + buffer.size());
}
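
Roughly, the synchronizeState() above is just a POST of a multipart event made on the same OkHttpClient (and therefore the same connection) that holds the downchannel. The following is only a sketch of the idea, not a drop-in implementation: downChannelClient, accessToken, BASE_URL and TAG are assumed to be fields of your class, and the context array is left empty, so check the AVS docs for the state your client must actually report.

// Sketch: send the System.SynchronizeState event as a multipart POST on the
// SAME client that opened the downchannel. Expect a 204 No Content back.
private void synchronizeState() throws IOException
{
    String eventJson =
        "{\"context\": [],"                      // your device state goes here
        + "\"event\": {"
        +   "\"header\": {"
        +     "\"namespace\": \"System\","
        +     "\"name\": \"SynchronizeState\","
        +     "\"messageId\": \"" + UUID.randomUUID() + "\""
        +   "},"
        +   "\"payload\": {}"
        + "}}";

    RequestBody metadata = RequestBody.create(
            MediaType.parse("application/json; charset=UTF-8"), eventJson);

    RequestBody body = new MultipartBody.Builder()
            .setType(MultipartBody.FORM)
            .addFormDataPart("metadata", null, metadata)
            .build();

    Request request = new Request.Builder()
            .url(BASE_URL + "/v20160207/events")
            .post(body)
            .addHeader("Authorization", "Bearer " + accessToken)
            .build();

    Response response = downChannelClient.newCall(request).execute();
    try
    {
        Log.i(TAG, "SynchronizeState ==> HTTP response code: " + response.code());
    }
    finally
    {
        response.close();
    }
}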
leobelizquierdo
  • thanks so much. My problem turned out to be not having a singleton OkHttpClient. Your suggestion pointed the way! – Paul Feb 22 '17 at 22:10
  • I was using a singleton OkHttpClient, but after `SynchronizeState` I don't receive any data. Can you suggest any code for parsing the server response using OkHttpClient on Android? Please accept my answer if it helped you, thanks. – leobelizquierdo Feb 22 '17 at 23:07
  • To parse the information sent back from AVS, I'm using Apache FileUpload (http://commons.apache.org/proper/commons-fileupload/) and IOUtils. – Paul Feb 23 '17 at 14:30
  • @Paul, I'm now parsing the response sent by AVS on the downchannel, but I have a problem. It seems the response isn't read in its entirety. Shouldn't `bufferedSource.read(buffer, 8192);` read at most 8192 bytes? And the `while`, on the other hand, is intended to keep the stream half-closed, right? I'm getting a StopCapture, but the parser throws an exception because it doesn't have a closing boundary. Do you have any idea how to read the entire response from the server? – leobelizquierdo Feb 24 '17 at 21:59
  • this is what I get from the server: `\r\n--------abcde123 Content-Type: application/json {"directive":{"header":{"namespace":"SpeechRecognizer","name":"StopCapture","messageId":"4e984ae1-d465-465a-baac-ee5040f11d7a"},"payload":{}}} --------abcde123\r\n` – leobelizquierdo Feb 24 '17 at 22:02
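
For the parsing question in the comments above, a minimal sketch of one approach, assuming the multipart boundary string has been pulled from the downchannel response's Content-Type header and handleDirective() is a hypothetical handler for the JSON:

// Minimal sketch: block until a full multipart part has arrived, then hand the
// JSON directive inside it to a handler. "boundary" comes from the downchannel
// response's Content-Type header; handleDirective() is a placeholder.
ByteString delimiter = ByteString.encodeUtf8("--" + boundary);

while (true)
{
    // indexOf() buffers data until the next boundary marker shows up, so this
    // blocks until a complete part (or the initial preamble) is available.
    long index = bufferedSource.indexOf(delimiter);
    if (index == -1)
    {
        break;  // stream closed by the server
    }

    String part = bufferedSource.readUtf8(index);
    bufferedSource.skip(delimiter.size());

    // Each non-empty part is: MIME headers, a blank line, then the JSON directive.
    int jsonStart = part.indexOf('{');
    if (jsonStart >= 0)
    {
        handleDirective(part.substring(jsonStart));
    }
}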