
I've recently started building a JavaScript program for Pepper. My goal is to make Pepper listen to what people say and either say Hello or perform an animation, depending on which keyword ('Hello' or 'Animation') is caught by the WordRecognized event in JavaScript.

As of now, I'm able to show two buttons on the tablet using JavaScript and make Pepper say Hello on one button press and perform animations on the other. Clicking the buttons works, but I'm not able to get the same working for WordRecognized events using the Qi JavaScript SDK (http://doc.aldebaran.com/2-4/dev/js/index.html). I went through the documentation linked there and came up with the snippet below, which should make Pepper say that a keyword was detected on hearing one of the recognized words. What else am I missing in the code to make Pepper listen for the words and act accordingly?

    //Start the Speech Recognition
    var asr = session.service('ALSpeechRecognition');

    //Define the Vocabulary
    var vocabulary = ["hello", "dance"];

    //Set the Language to English and set the Vocabulary
    asr = asr.then(function (asr) { return asr.setLanguage('English'); })
             .then(function (asr) { return asr.setVocabulary(vocabulary, false); });
    console.log("Set the Language to English!");

    //Register the callback function for the Speech Recognition
    asr.unsubscribe(); //De-register if existing from before
    asr.subscribe();

    session.service("ALMemory").then(function (ALMemory) {
        ALMemory.subscriber("wordRecognized").then(function (subscriber) {
            // subscriber.signal is a signal associated to "wordRecognized"
            subscriber.signal.connect(function (state) {
                word = state.getData("wordRecognized")[1];
                word.then(function () { session.service('ALTextToSpeech').say("A Keyword is Detected!"); });
                asr.unsubscribe();
            }); //connect
        }); //subscriber
    }); //ALMemory
srinu634

1 Answer


Your code snippet as given won't work, because this:

    var asr = session.service('ALSpeechRecognition');

means the `asr` variable is a future, so you can't call `asr.unsubscribe()` on it.

You'd have to wrap everything in `session.service(...).then(function(asr) { ... })` for it to work right, like you do with ALMemory.
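
For example, something along these lines should be closer (a rough, untested sketch; it assumes the ALMemory event key is `WordRecognized` and that ALSpeechRecognition's `subscribe`/`unsubscribe` take a subscriber name, "MyApp" here):

    session.service("ALSpeechRecognition").then(function (asr) {
        var vocabulary = ["hello", "dance"];
        asr.setLanguage("English")
            .then(function () { return asr.setVocabulary(vocabulary, false); })
            .then(function () { return asr.subscribe("MyApp"); }) // start the engine
            .then(function () { return session.service("ALMemory"); })
            .then(function (ALMemory) { return ALMemory.subscriber("WordRecognized"); })
            .then(function (subscriber) {
                subscriber.signal.connect(function (data) {
                    // data is typically ["<word>", confidence, ...]
                    var word = data[0];
                    session.service("ALTextToSpeech").then(function (tts) {
                        tts.say("I heard " + word);
                    });
                });
            });
    });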

The syntax can be a bit awkward; I usually use a small helper library, robotutils.qim.js, which makes the code a bit more readable and has a helper for subscribing to ALMemory.

Emile
  • Thanks for the information. It helps a lot. I have a question. I'm successful in setting up a callback function for FrontTactileTouched. The Robot says 'Someone touched my head' whenever I touch its head. However, I'm not able to make the Robot detect a face/respond to speech events (Making the Robot say I saw a face on FaceDetected for example) although I subscribed to the callbacks in the same way as FrontTactileTouched ( https://github.com/srinu634/Pepper-Javascript-Sample/blob/master/script.js ). Shouldn't the behavior be the same for all the event callbacks? – srinu634 Jun 11 '18 at 21:44
  • no; FrontTactileTouched is always "active" but other events like face and word detection need to be activated because they require special processing - see http://doc.aldebaran.com/2-5/naoqi/peopleperception/alfacedetection-tuto.html#alfacedetection-tuto for example, especially the `faceProxy.subscribe("Test_Face", period, 0.0)` part. – Emile Jun 13 '18 at 07:29
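
For reference, activating face detection from the Qi JavaScript SDK would look roughly like the sketch below (untested; it assumes a connected `session` object, uses "MyApp" as the subscriber name, and relies on the `FaceDetected` ALMemory event carrying an empty value when no face is visible):

    session.service("ALFaceDetection").then(function (faceDetection) {
        // Face detection only raises events while at least one client is subscribed
        return faceDetection.subscribe("MyApp");
    }).then(function () {
        return session.service("ALMemory");
    }).then(function (ALMemory) {
        return ALMemory.subscriber("FaceDetected");
    }).then(function (subscriber) {
        subscriber.signal.connect(function (value) {
            // The event value is an empty array when no face is visible
            if (value && value.length > 0) {
                session.service("ALTextToSpeech").then(function (tts) {
                    tts.say("I saw a face");
                });
            }
        });
    });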