
I currently have follow-up prompts on my QnA pairs, but those prompts do not show up when running the bot locally or using the webchat. Is there a way to get this enabled using the Virtual Assistant template?

The follow-up prompts work when using a QnA bot but not on the Virtual Assistant.

  • Something is missing from this issue - can you please fix this? – DFBerry Aug 30 '19 at 16:22
  • @DFBerry can you be more explicit about what is missing for the new SO user? – tdurnford Aug 30 '19 at 17:05
  • @samueljohnpaul The QnA code from any of the samples should be able to be extracted and used in a different bot. Can you show your code that isn't working? – DFBerry Aug 31 '19 at 21:54
  • You can reference this C# .NET Core [sample](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/qnamaker-prompting/csharp_dotnetcore) that demonstrates ingesting QnA Maker follow-up (a.k.a. multi-turn) prompts. This may help in determining if something in your VA code is incorrect. – Steven Kanberg Sep 03 '19 at 21:41

1 Answer


The C# .NET Core sample posted by Steven Kanberg is a great resource. If you prefer tutorial-style guides, this one may help: https://www.joji.me/en-us/blog/implement-follow-up-prompt-for-qna-bot/.

The steps outlined there are:

  1. Edit your Bots\YourBotName.cs and add the following namespaces, which are needed to support follow-up prompts (System.Net.Http is required for the HttpClient used below):
using System.Collections.Generic;
using System.Net.Http;
using System.Text;
using Newtonsoft.Json;
  2. Add the following classes to match the JSON response shape:
class FollowUpCheckResult
{
    [JsonProperty("answers")]
    public FollowUpCheckQnAAnswer[] Answers { get; set; }
}

class FollowUpCheckQnAAnswer
{
    [JsonProperty("context")]
    public FollowUpCheckContext Context { get; set; }
}

class FollowUpCheckContext
{
    [JsonProperty("prompts")]
    public FollowUpCheckPrompt[] Prompts { get; set; }
}

class FollowUpCheckPrompt
{
    [JsonProperty("displayText")]
    public string DisplayText { get; set; }
}
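For reference, the generateAnswer response these classes deserialize looks roughly like the fragment below. This is an illustrative sketch with made-up values, not the full response (the real payload carries additional fields such as score and qnaId that the classes above simply ignore):

```json
{
  "answers": [
    {
      "answer": "Here is the answer text.",
      "context": {
        "prompts": [
          { "displayOrder": 0, "displayText": "First follow-up" },
          { "displayOrder": 1, "displayText": "Second follow-up" }
        ]
      }
    }
  ]
}
```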
  3. After qnaMaker.GetAnswersAsync succeeds and there is a valid answer, perform an additional HTTP request to retrieve the follow-up prompts:
// The actual call to the QnA Maker service.
var response = await qnaMaker.GetAnswersAsync(turnContext);
if (response != null && response.Length > 0)
{
    // Create an HTTP client to query the QnA Maker service directly.
    // (In production, reuse a single HttpClient instance rather than creating one per turn.)
    var followUpCheckHttpClient = new HttpClient();

    // Add the QnA Maker endpoint key to the Authorization header.
    followUpCheckHttpClient.DefaultRequestHeaders.Add("Authorization", _configuration["QnAAuthKey"]);

    // Construct the generateAnswer query URL.
    var url = $"{GetHostname()}/knowledgebases/{_configuration["QnAKnowledgebaseId"]}/generateAnswer";

    // Post the user's question and read the raw JSON response.
    // Await both calls; mixing .Result with await risks deadlocks.
    var httpResponse = await followUpCheckHttpClient.PostAsync(url, new StringContent("{\"question\":\"" + turnContext.Activity.Text + "\"}", Encoding.UTF8, "application/json"));
    var checkFollowUpJsonResponse = await httpResponse.Content.ReadAsStringAsync();

    // Parse the result.
    var followUpCheckResult = JsonConvert.DeserializeObject<FollowUpCheckResult>(checkFollowUpJsonResponse);

    // Initialize the reply message containing the default answer.
    var reply = MessageFactory.Text(response[0].Answer);

    // Null-conditional operators guard against answers without a context/prompts section.
    if (followUpCheckResult.Answers.Length > 0 && followUpCheckResult.Answers[0].Context?.Prompts?.Length > 0)
    {
        // The follow-up check returned at least one prompt: add each prompt's
        // text to SuggestedActions as a CardAction.
        reply.SuggestedActions = new SuggestedActions();
        reply.SuggestedActions.Actions = new List<CardAction>();
        for (int i = 0; i < followUpCheckResult.Answers[0].Context.Prompts.Length; i++)
        {
            var promptText = followUpCheckResult.Answers[0].Context.Prompts[i].DisplayText;
            reply.SuggestedActions.Actions.Add(new CardAction() { Title = promptText, Type = ActionTypes.ImBack, Value = promptText });
        }
    }
    await turnContext.SendActivityAsync(reply, cancellationToken);
}
else
{
    await turnContext.SendActivityAsync(MessageFactory.Text("No QnA Maker answers were found."), cancellationToken);
}

  4. Test it in the Bot Framework Emulator; it should now display the follow-up prompts as expected.

Notes:

Be sure to create an IConfiguration _configuration field, pass IConfiguration configuration into your constructor, and update your appsettings.json with the appropriate QnAKnowledgebaseId and QnAAuthKey values.
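A minimal appsettings.json sketch, assuming the key names used in the code above (QnAEndpointHostName is the name the QnA Maker bot samples typically use for the host URL; the placeholder values are yours to fill in):

```json
{
  "QnAKnowledgebaseId": "<knowledge-base-id>",
  "QnAAuthKey": "<endpoint-key>",
  "QnAEndpointHostName": "https://<your-qna-service>.azurewebsites.net/qnamaker"
}
```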

If you used one of the Bot Samples as a starting point, note that QnAAuthKey in appsettings.json will probably be named QnAEndpointKey instead.

You will also need a GetHostname() function, or you can simply replace that call with the URL of your bot's QnA Maker hostname.
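If you do write the helper yourself, a minimal sketch could look like the following. This assumes the host URL is stored under a QnAEndpointHostName key in appsettings.json, as in the bot samples; adjust the key name to whatever your configuration actually uses:

```csharp
// Hypothetical helper: normalizes the QnA Maker host URL from configuration.
// Assumes a "QnAEndpointHostName" entry such as
// "https://<your-qna-service>.azurewebsites.net/qnamaker".
private string GetHostname()
{
    var hostname = _configuration["QnAEndpointHostName"];
    if (!hostname.StartsWith("https://"))
    {
        hostname = string.Concat("https://", hostname);
    }
    if (!hostname.EndsWith("/qnamaker"))
    {
        hostname = string.Concat(hostname, "/qnamaker");
    }
    return hostname;
}
```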

Mbuotidem Isaac
  • Thank you so much! Worked perfectly! – samueljohnpaul Sep 05 '19 at 17:28
  • This is a great solution however if using MS Teams as your channel the prompts won't work as Teams doesn't support SuggestedActions, [this post](https://stackoverflow.com/questions/58178054/prompts-not-showing-in-in-microsoft-teams-integrated-bot) says to use cards instead for the prompts. – J_L Mar 11 '20 at 23:47