
Can chatbots like [Rasa] learn from a trusted user (new employees, product IDs, product categories or properties) or unlearn these entities when they are no longer current? Or do I have to go through formal data collection, training sessions, and testing (confidence rates above a given threshold) before the new version can be made operational?

peter.cyc
  • My question is not code-related but more about the capabilities of chatbot technologies currently available. – peter.cyc Aug 02 '20 at 08:48

2 Answers


If you have entity values that are being checked against a shifting list of valid values, it's more scalable to check those values against a database that is always up to date (e.g. your backend systems probably have a queryable list of current employees). Then if a user provides a value that used to be valid and now isn't, it will act the same as if a user provided an invalid value in the first place.

This way, the entity extraction can stay the same even if some training examples become outdated -- though of course it's always good to keep your data up to date!
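A minimal sketch of this pattern in plain Python, using an in-memory SQLite database as a stand-in for the always-up-to-date backend (the `employees` table and the `is_current_employee` helper are illustrative, not part of any Rasa API):

```python
import sqlite3

# In-memory stand-in for a backend database of current employees.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO employees VALUES (?)", [("alice",), ("bob",)])
conn.commit()

def is_current_employee(name: str) -> bool:
    """Check an extracted entity value against the live employee list."""
    row = conn.execute(
        "SELECT 1 FROM employees WHERE name = ?", (name.lower(),)
    ).fetchone()
    return row is not None

# An employee who leaves is simply deleted from the table;
# no retraining of the NLU model is needed.
conn.execute("DELETE FROM employees WHERE name = ?", ("bob",))
conn.commit()
```

After the deletion, `is_current_employee("bob")` returns `False`, so a once-valid value now behaves exactly like an invalid one.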

  • Do you know if Rasa or other chatbots have a function to "check values against a DB" like you suggest? AFAIK Machine Learning algorithms store their "knowledge" in obscure coefficients of the underlying network w/o (human) logical counterparts. I think chatbot knowledge is neither scalable nor extensible (like the extra facts mentioned in the question), at least as of today. Maybe in 5 years' time? – peter.cyc Jul 24 '20 at 19:01
  • That's something you would write a function for. So for example if you have an action called `check_employee_value` that takes in the entity and checks it against the database, you can return the `"employee": "value"` slot if it is valid, and `"employee": None` if invalid. This is a way to change the conversation based on whether or not a value is valid, taking into account a changing list of valid values, without the ML algorithm needing to embed the information of which values are valid. – Ella Rohm-Ensing Jul 27 '20 at 10:56
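The comment's idea can be sketched outside of Rasa as a plain function (the name `check_employee_value` and the slot keys follow the comment; the `valid_employees` set is a stand-in for the live database query):

```python
def check_employee_value(entity_value, valid_employees):
    """Return the slot update for an extracted employee entity.

    Mirrors the `check_employee_value` custom action described in the
    comment above: the slot gets the value if it is currently valid,
    and None otherwise, so the conversation can branch accordingly.
    `valid_employees` stands in for a live database lookup.
    """
    if entity_value in valid_employees:
        return {"employee": entity_value}
    return {"employee": None}
```

Because validity is checked at conversation time, updating the set (or database) immediately changes which values are accepted, with no model retraining.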

Many chatbots do not have such a function, except advanced ones like Alexa, which has offered a "Remember" keyword since around 2017 that lets the user ask Alexa to commit certain facts to memory.

IMHO such a feature is a mark of "intelligence". It is not trivial to implement in ML systems, where the coefficients of the neural network model are updated by back-propagation over training examples. Rule-based systems (such as CHAT-80, a question-answering system about geography) store their knowledge in relations that can be updated much more transparently.
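The contrast can be illustrated with a toy rule-based fact store (a simplification for illustration, not CHAT-80 itself): facts are explicit, inspectable tuples that can be asserted or retracted one at a time, unlike knowledge diffused across trained network weights.

```python
class FactStore:
    """Toy rule-based knowledge store: facts are explicit relation tuples."""

    def __init__(self):
        self.facts = set()

    def assert_fact(self, relation, *args):
        """Add a fact such as ("employee", "alice") to the store."""
        self.facts.add((relation, *args))

    def retract_fact(self, relation, *args):
        """Remove a fact when it is no longer current."""
        self.facts.discard((relation, *args))

    def holds(self, relation, *args):
        """Query whether a fact is currently known."""
        return (relation, *args) in self.facts

kb = FactStore()
kb.assert_fact("employee", "alice")    # learn a new fact instantly
kb.retract_fact("employee", "alice")   # unlearn it just as easily
```

Learning and unlearning here are single set operations, which is exactly the transparency that back-propagated weight updates lack.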
