Universal Sentence Embeddings (USE) promise to address the shortcomings of word embeddings by capturing the semantic meaning of a whole sentence, including word order. They are considered "universal" because, having been trained on very large datasets, they promise to encode general properties of sentences. In theory, they can then be applied to any downstream NLP task without requiring domain-specific knowledge.
Queries such as "How can I change my password", "How can I set up an account", or "Where do I set up my email" may represent common issues customers encounter and enquire about. There is no need for someone to come up with a new response here; there is probably documentation and a clear series of steps to take to resolve these issues. Instead of your customer support agent spending time looking for a previous similar query, or creating a new response, you would like them to be able to quickly retrieve the responses that were already provided.
Sentence embeddings are interesting since they offer a number of ways in which you could potentially use them in your business; they are versatile in how they can be deployed. For example, you could build a neural network to identify question-and-answer pairings. This is similar to the Amazon example we mentioned earlier, where questions and answers are encoded and the network learns, via a supervised approach, to match correct pairings. It can then automatically identify an answer for a new incoming query.
In this way, you use the embeddings to identify the best match between the new query and your saved queries. The saved queries are linked to a known answer. Your recommender would be a question-to-question recommender as opposed to a question-to-answer system.
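The question-to-question matching described above can be sketched as a nearest-neighbour lookup over embedding vectors. The sketch below uses a toy bag-of-words "embedding" purely for illustration; in practice you would swap in a real sentence encoder such as the Universal Sentence Encoder. The saved queries, answers, and helper names here are hypothetical, not taken from any real system.

```python
import numpy as np

# Saved customer queries, each linked to a known answer (toy data).
saved_queries = [
    "How can I change my password",
    "How can I set up an account",
    "Where do I set up my email",
]
answers = [
    "Follow the password-reset steps in Settings.",
    "Use the sign-up form to create an account.",
    "Add your email account under Settings > Mail.",
]

# Vocabulary built from the saved queries.
vocab = {w: i for i, w in enumerate(
    sorted({w for q in saved_queries for w in q.lower().split()}))}

def embed(texts):
    # Toy bag-of-words vectors standing in for real sentence embeddings.
    vecs = np.zeros((len(texts), len(vocab)))
    for i, t in enumerate(texts):
        for w in t.lower().split():
            if w in vocab:
                vecs[i, vocab[w]] += 1.0
    return vecs

saved_vecs = embed(saved_queries)

def best_match(new_query):
    # Cosine similarity between the new query and every saved query;
    # the most similar saved query points us to its known answer.
    q = embed([new_query])[0]
    sims = saved_vecs @ q / (
        np.linalg.norm(saved_vecs, axis=1) * np.linalg.norm(q) + 1e-9)
    return int(np.argmax(sims))

idx = best_match("I need to change my password")
print(answers[idx])  # the answer linked to the closest saved query
```

Note that the lookup never matches the query to an answer directly; it matches question to question, and the answer comes along for free via the stored pairing.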