
I have a basic model set up, but it appears that the Alexa beta builder expects the model to be designed up front, with explicit lists of possible answers to requests. However, the Jeopardy game on that platform appears to allow new answers from day to day, and I assume they don't need to update their model each day. Can someone shed light on this process?

I would like to be able to:

  • Launch intent to start conversation
  • Handle the back and forth within the web service request and response (sketched below)
  • Handle basic stop/start from Alexa

How does the middle portion (response/request) get modeled up front?
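
For concreteness, here is roughly how I picture the service side of that conversation, sketched in plain Python against the standard custom-skill request/response JSON (the intent names are just placeholders):

    def handle_alexa_request(event):
        """Dispatch one parsed Alexa request and return the response JSON (as a dict)."""
        request = event["request"]

        if request["type"] == "LaunchRequest":
            # Launch: open the session and start the conversation
            return build_response("Welcome. Say begin when you are ready.", end_session=False)

        if request["type"] == "IntentRequest":
            intent = request["intent"]["name"]
            if intent in ("AMAZON.StopIntent", "AMAZON.CancelIntent"):
                # Basic stop handling
                return build_response("Goodbye.", end_session=True)
            # Every further turn of the back and forth arrives as another IntentRequest
            return build_response("Here is the next question.", end_session=False)

        # SessionEndedRequest and anything else: no speech is returned
        return {"version": "1.0", "response": {}}

    def build_response(text, end_session):
        # shouldEndSession=False is what keeps the conversation going from turn to turn
        return {
            "version": "1.0",
            "response": {
                "outputSpeech": {"type": "PlainText", "text": text},
                "shouldEndSession": end_session,
            },
        }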

Matt Ray

1 Answer


Currently, you need to provide a list of expected values for each of your slot types (a slot type may be question_response in this case).
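
For example, in the skill-builder JSON a slot type is just an enumerated list of values that your sample utterances reference. A rough sketch, written as a Python dict mirroring that JSON (the invocation name, intent name, and values are placeholders):

    import json

    interaction_model = {
        "interactionModel": {
            "languageModel": {
                "invocationName": "my trivia game",
                "intents": [
                    {
                        "name": "AnswerIntent",
                        "slots": [{"name": "answer", "type": "question_response"}],
                        "samples": [
                            "the answer is {answer}",
                            "what is {answer}",
                        ],
                    },
                    {"name": "AMAZON.StopIntent", "samples": []},
                ],
                "types": [
                    {
                        "name": "question_response",
                        "values": [
                            # every answer you expect the skill to need to recognize
                            {"name": {"value": "george washington"}},
                            {"name": {"value": "the mississippi river"}},
                        ],
                    }
                ],
            }
        }
    }

    print(json.dumps(interaction_model, indent=2))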

I'm not sure how Jeopardy works, but I have a couple of ideas:

  • They provide huge lists of possible answers (and update them from time to time). I bet no one is writing new questions every day; they have a pool of questions and can anticipate the answers to those (checking the recognized answer then happens in the web service, as sketched after this list).
  • They have access to an API that is not open to the general public (as do people participating in the social bot challenge), or they are allowed to use the deprecated LITERAL slot type.
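
In either case, whatever value Alexa actually recognizes for the slot arrives in the IntentRequest, so comparing it against the day's answer happens in the web service rather than in the model. A rough sketch of that check, reusing the placeholder intent and slot names from above:

    def check_answer(event, correct_answer):
        """Compare the recognized slot value against the current expected answer."""
        request = event["request"]
        if request["type"] != "IntentRequest" or request["intent"]["name"] != "AnswerIntent":
            return None  # not an answer turn

        slots = request["intent"].get("slots", {})
        heard = slots.get("answer", {}).get("value", "")  # e.g. "george washington"

        if heard.strip().lower() == correct_answer.strip().lower():
            return "Correct!"
        return "Sorry, we were looking for {}.".format(correct_answer)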
Josep Valls