
Is there a way to display the dialog inputs (human inputs) and outputs (robot answers) on Pepper's tablet? I have seen an example of it at https://softbankroboticstraining.github.io/pepper-chatbot-api/#pepper-chat, but it doesn't work directly in QiChat syntax.

I have also seen some examples in the ALTabletService documentation for images, but not for interactive dialog. The motivation behind this is to have multi-modal interaction instead of purely audio-based interaction. Note: Python implementations would be preferable to Choregraphe.

grizzthedj
rajput
2 Answers


Do the following to get the human input and the robot's answer:

  1. Subscribe to the event Dialog/CurrentString — this gives you the currently processed human input.
  2. Subscribe to the event Dialog/Answered — raised each time the robot answers; it contains the last answer.

Display the output of these events on the tablet.
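The two subscriptions above can be sketched in Python with the qi SDK and ALMemory. This is a minimal sketch, not a complete application: the robot IP is a placeholder, and `format_dialog_line` / `watch_dialog` are hypothetical helper names.

```python
from __future__ import print_function  # NAOqi ships with Python 2

def format_dialog_line(speaker, text):
    """Format one dialog turn for display on the tablet or console."""
    return "{}: {}".format(speaker, text)

def on_human_input(value):
    # Raised with the currently processed human input
    print(format_dialog_line("Human", value))

def on_robot_answer(value):
    # Raised each time the robot answers; carries the last answer
    print(format_dialog_line("Robot", value))

def watch_dialog(robot_ip):
    """Connect to the robot and print every dialog turn as it happens."""
    import qi  # NAOqi Python SDK (requires the SDK to be installed)
    app = qi.Application(url="tcp://{}:9559".format(robot_ip))
    app.start()
    memory = app.session.service("ALMemory")
    # Keep references to the subscribers: if they are garbage-collected,
    # the callbacks stop firing.
    human_sub = memory.subscriber("Dialog/CurrentString")
    human_sub.signal.connect(on_human_input)
    robot_sub = memory.subscriber("Dialog/Answered")
    robot_sub.signal.connect(on_robot_answer)
    app.run()  # block, processing events until interrupted

# Example (requires a reachable Pepper):
# watch_dialog("192.168.1.23")
```

Instead of printing, the callbacks could push each formatted line to the tablet, for example via a webview as described in the other answer.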

Zeeshan

Yes, it is possible, and recommended!

You will need to create a webpage alongside your application. This webpage should be called index.html and be located in an "html" directory in your project. It will automatically be hosted on the robot and made accessible for the tablet to display when you deploy your application to the robot.

In the webpage's JavaScript, you can subscribe to events (see http://doc.aldebaran.com/2-5/dev/js/index.html) to display what the robot understands, and push what the robot says as well.

In Python, you just need to call ALTabletService.loadApplication and then ALTabletService.showWebview to display that webpage on the tablet.
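A minimal Python sketch of that last step, assuming a deployed application package (the package id "my-dialog-app" and the helper names are placeholders, not part of the NAOqi API):

```python
NAOQI_PORT = 9559  # default NAOqi service port

def robot_url(ip, port=NAOQI_PORT):
    """Build the qi connection URL for a robot."""
    return "tcp://{}:{}".format(ip, port)

def show_dialog_page(session, app_name):
    """Load a deployed package's html/index.html and show it on the tablet."""
    tablet = session.service("ALTabletService")
    tablet.loadApplication(app_name)  # points the webview at the package's html/
    tablet.showWebview()              # display it on Pepper's tablet

# Example usage (requires a reachable Pepper and a deployed package):
# import qi
# session = qi.Session()
# session.connect(robot_url("192.168.1.23"))
# show_dialog_page(session, "my-dialog-app")  # placeholder package id
```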

JLS