I'm currently working on a Google Assistant project with Dialogflow, Firebase and Google Cloud Storage. So far I have a working conversational agent, but after spending the whole day searching for a way to play .mp3 files stored in my Google Storage bucket, I'm still stuck.
Here's what the intent is supposed to do:
conv.ask(
  `<speak>
    <audio src="https://storage.cloud.google.com/path_to_my_bucket/mp3_file_name">
      Couldn't read the mp3 file !
    </audio>
  </speak>`);
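For context, this conv.ask call lives in my webhook fulfillment; simplified, the wiring looks roughly like this (a sketch assuming the actions-on-google v2 Dialogflow client deployed as a Firebase Cloud Function; the intent name is a placeholder):

// Simplified sketch of the fulfillment, assuming the actions-on-google v2
// Dialogflow client deployed as a Firebase Cloud Function.
// The intent name 'play_mp3' is a placeholder.
const { dialogflow } = require('actions-on-google');
const functions = require('firebase-functions');

const app = dialogflow();

app.intent('play_mp3', (conv) => {
  // The SSML <audio> tag should play the file; the inner text is the
  // fallback spoken when the audio cannot be fetched.
  conv.ask(
    `<speak>
      <audio src="https://storage.cloud.google.com/path_to_my_bucket/mp3_file_name">
        Couldn't read the mp3 file !
      </audio>
    </speak>`);
});

exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);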
Unfortunately, the sound is not played, and I get the "Couldn't read the mp3 file !" fallback message instead. The mp3 file conforms to the requirements in the Dialogflow documentation.
Here is the response:
{
  "payload": {
    "google": {
      "expectUserResponse": true,
      "richResponse": {
        "items": [
          {
            "simpleResponse": {
              "textToSpeech": "<speak><audio src=\"https://storage.cloud.google.com/path_to_my_bucket/mp3_file_name\">Couldn't read the mp3 file !</audio></speak>"
            }
          }
        ]
      }
    }
  },
...
I tested it with the Actions console simulator at https://console.actions.google.com/, on all the available devices.
This is not an authorization problem: I set all my files as public in my Google Storage bucket (which is also why I obviously didn't paste the real audio file link...).
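For reference, making an object public boils down to something like this (a sketch using the @google-cloud/storage Node.js client; bucket and object names are placeholders, and the same thing can be done from the Cloud Console UI):

// Sketch: grant public read access to one object, using the
// @google-cloud/storage Node.js client.
// Bucket and object names are placeholders.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();

async function makeFilePublic() {
  // Adds allUsers as a READER on the object's ACL, so the file can be
  // fetched without authentication.
  await storage
    .bucket('path_to_my_bucket')
    .file('mp3_file_name')
    .makePublic();
}

makeFilePublic().catch(console.error);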