A visual conversation flow is the first thing to create when you want to build a chatbot. Such a flow helps to define the proper set of intents along with the dialog path. Without it, it is very easy to get lost in conversation transitions, and that leads to a failed chatbot implementation. Our chatbot for the medical system doesn't make any decisions; instead it helps the user work with the enterprise system. It takes user input and, over the course of the conversation, leads to a certain API call, which in the end triggers the enterprise system to execute one action or another. If the user is looking for a patient's blood pressure results, the chatbot opens the blood pressure module with the patient ID. If the user wants to edit or review blood pressure results in general, the chatbot loads the blood pressure results module without parameters. This kind of chatbot is very helpful in large and complex enterprise systems - it helps onboard new users much more quickly, without extra training in system usage. Example of a visual conversation flow for the chatbot:
Conversation intents can be defined in a JSON file, where you list conversation patterns mapped to tags, responses and contextual information. A chatbot is not only about machine learning and user input processing - it is just as important to handle the conversation's contextual flow, and this is usually done outside of the machine learning area, in a separate module. We will look into that later. Machine learning with a neural network allows the chatbot to calculate tag probability based on user input. In other words, machine learning brings the best matching tag for the current sentence, based on the predefined intent patterns. Once we get the probability for the intent tag, we know what the user wants; we can set the conversation context and, on the next user request, react based on the current context:
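Roughly, such an intents file could look like the snippet below. The tags match the examples discussed later in this post, but the entries are shortened and illustrative, and the field names follow the convention from the Contextual Chatbots article linked under useful resources (the full file is in the GitHub repository):

```json
{
  "intents": [
    {
      "tag": "blood_pressure",
      "patterns": ["Blood pressure results", "Review blood pressure"],
      "responses": ["Opening blood pressure module"],
      "context_set": ""
    },
    {
      "tag": "blood_pressure_search",
      "patterns": ["Checking blood pressure results for patient",
                   "Blood pressure for patient"],
      "responses": ["Please provide patient ID"],
      "context_set": "search_blood_pressure_by_patient_id"
    },
    {
      "tag": "adverse_drug",
      "patterns": ["Any recommendations for adverse drugs?"],
      "responses": ["Opening adverse drugs module"],
      "context_set": ""
    }
  ]
}
```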
TensorFlow runs the neural network, which trains on the supplied list of intents. Each training run may produce different learning results, so you should check the total loss value - the lower the value, the better the learning result. You will probably run training multiple times to get an optimal learning model:
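The actual notebooks are in the GitHub repository listed under useful resources; below is a condensed sketch of the training step in the spirit of the tflearn-based approach from the Contextual Chatbots article. The file names, layer sizes and epoch count here are my assumptions:

```python
# training notebook sketch (file names and network size are assumptions)
import json
import pickle
import random
import numpy as np
import nltk
from nltk.stem.lancaster import LancasterStemmer
import tflearn

stemmer = LancasterStemmer()

# load intents and build the vocabulary of stemmed words and the list of tags
with open('intents.json') as f:
    intents = json.load(f)

words, classes, documents = [], [], []
for intent in intents['intents']:
    for pattern in intent['patterns']:
        tokens = nltk.word_tokenize(pattern)
        words.extend(tokens)
        documents.append((tokens, intent['tag']))
        if intent['tag'] not in classes:
            classes.append(intent['tag'])

words = sorted(set(stemmer.stem(w.lower()) for w in words if w not in ('?', '!')))
classes = sorted(classes)

# training data: bag-of-words input row and one-hot output row per pattern
training = []
for tokens, tag in documents:
    stems = [stemmer.stem(t.lower()) for t in tokens]
    bag = [1 if w in stems else 0 for w in words]
    output = [0] * len(classes)
    output[classes.index(tag)] = 1
    training.append((bag, output))

random.shuffle(training)
train_x = np.array([row[0] for row in training])
train_y = np.array([row[1] for row in training])

# two hidden layers, softmax output over the intent tags
net = tflearn.input_data(shape=[None, len(train_x[0])])
net = tflearn.fully_connected(net, 8)
net = tflearn.fully_connected(net, 8)
net = tflearn.fully_connected(net, len(train_y[0]), activation='softmax')
net = tflearn.regression(net)

model = tflearn.DNN(net)
model.fit(train_x, train_y, n_epoch=1000, batch_size=8, show_metric=True)

# persist the model and the data needed to rebuild the network later
model.save('model.tflearn')
pickle.dump({'words': words, 'classes': classes,
             'train_x': train_x, 'train_y': train_y},
            open('training_data', 'wb'))
```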
TensorFlow can save the learned model so it can be reused by the classification API. The REST interface which calls the classification API is developed as a separate TensorFlow module. REST is handled by the Flask library, installed into the TensorFlow runtime:
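A minimal sketch of such a Flask module is below. The module name, endpoint path, request format and port are my assumptions, and the classify function it imports is sketched after the next paragraph:

```python
# classification_rest.py -- hypothetical module name
from flask import Flask, request, jsonify

from classification import classify  # classify() is sketched in the next snippet

app = Flask(__name__)

@app.route('/classify', methods=['POST'])
def classify_endpoint():
    # expects a JSON body like {"sentence": "Checking blood pressure results for patient"}
    sentence = request.json.get('sentence', '')
    results = classify(sentence)
    # convert numpy floats to plain floats so the result serializes to JSON
    return jsonify([{'intent': tag, 'probability': float(p)} for tag, p in results])

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```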
The classification function gets user input from the REST call and runs it through the TensorFlow model. Results with a probability higher than the defined threshold are collected into an ordered array and returned. There is also a classification function without the REST annotation, for local tests within the TensorFlow runtime:
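A sketch of the classification module is below, again following the tflearn-style approach; the module name, file names and threshold value are assumptions:

```python
# classification.py -- hypothetical module name
import pickle
import numpy as np
import nltk
from nltk.stem.lancaster import LancasterStemmer
import tflearn

stemmer = LancasterStemmer()

# vocabulary, tags and training shapes persisted by the training step
data = pickle.load(open('training_data', 'rb'))
words, classes = data['words'], data['classes']
train_x, train_y = data['train_x'], data['train_y']

# rebuild the same network structure and load the trained model
net = tflearn.input_data(shape=[None, len(train_x[0])])
net = tflearn.fully_connected(net, 8)
net = tflearn.fully_connected(net, 8)
net = tflearn.fully_connected(net, len(train_y[0]), activation='softmax')
net = tflearn.regression(net)
model = tflearn.DNN(net)
model.load('model.tflearn')

ERROR_THRESHOLD = 0.25  # assumed threshold value

def bow(sentence, words):
    # bag of words: 1 for every known stem present in the sentence
    stems = [stemmer.stem(t.lower()) for t in nltk.word_tokenize(sentence)]
    return np.array([1 if w in stems else 0 for w in words])

def classify(sentence):
    # run the sentence through the model and keep tags above the threshold
    results = model.predict([bow(sentence, words)])[0]
    results = [(classes[i], p) for i, p in enumerate(results) if p > ERROR_THRESHOLD]
    # ordered array: best matching tag first
    return sorted(results, key=lambda r: r[1], reverse=True)
```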
Let's see how classification works; the classification result drives the chatbot's next action. Each classification request returns the matched tag and its probability. User input is not identical to the patterns defined in the intents, which is why the matching probability may differ - this is the core part of machine learning. The neural network constructed with TensorFlow, based on the learned model, picks the best tag for the current user input.
User input "Checking blood pressure results for patient". This input can be related to both tags blood_pressure_search and blood_pressure, but classification decides higher probability for the first option, and this is correct. Similar for user input "Any recommendations for adverse drugs?":
Through the REST endpoint we can call the classification function from outside the TensorFlow environment. This allows us to maintain the conversation context outside TensorFlow:
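For example, a client outside the TensorFlow environment could call the endpoint like this; the URL and payload format match the Flask sketch above, so they are assumptions as well:

```python
import requests

# hypothetical endpoint from the Flask sketch above
resp = requests.post('http://localhost:5000/classify',
                     json={'sentence': 'Checking blood pressure results for patient'})
print(resp.json())
# e.g. [{"intent": "blood_pressure_search", "probability": 0.63}]  -- illustrative value
```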
Useful resources:
- TensorFlow notebooks and the intents JSON are available in the GitHub repository.
- Excellent article about Contextual Chatbots with TensorFlow
- My previous post about Red Samurai chatbot