AI Translation for Omnichannel

How to start off this post? I’ve been trying to work out how exactly I can express my excitement around this new feature for Omnichannel. Included in the Wave 2 2020 release, it’s just AMAZING. That, however, doesn’t really do it justice. So let’s see how I can describe it properly, to give it due respect.

Previously I’ve mentioned the ability to use skills within Omnichannel (see https://thecrm.ninja/omnichannel-for-dynamics-365-queues-users-skills/). This can be used to indicate, for example, agents who can communicate in a certain language. That’s useful of course, but what happens when you don’t have anyone who can speak the language that the customer wants to use? It’s a problem, and one that’s really not easily solved. At least, not until now.

So, what exactly does this new translation feature do? Simple – it translates from one language to another. OK, it’s actually a little more awesome than just that. Having delved into it quite a bit over the last week or so, there are (in my view) three main benefits (with a bonus one as well!):

  1. It translates incoming text from the customer (through chat) from the language that they’re using to the language that the agent is using
  2. It translates outgoing text from the agent (through chat) from the language that the agent is using to the language that the customer is using
  3. It translates text between agents from one language to the other & vice versa (eg on an internal consult)

Now for the bonus. It doesn’t just translate text from one language to another. It follows the languages being used! So if the customer switches mid-conversation to a different language, the system picks it up. Not only is the new incoming language translated into the agent’s language, but the replies from the agent are shown in the (new) language being used by the customer. It’ll automatically show text in the ‘last used’ language, which is really quite incredible (at least in my opinion).
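
In other words, the ‘last used’ language effectively drives the direction of translation for the agent’s replies. As a rough sketch of that idea (my own illustration of the behaviour, not Microsoft’s code; detectLanguage() here is a hypothetical stand-in for the actual detection service):

```javascript
// Assumed logic only: agent replies get translated into whichever
// language the customer used most recently.
function targetLanguageForReply(customerMessages, detectLanguage) {
  const lastMessage = customerMessages[customerMessages.length - 1];
  // Whatever the customer last wrote in becomes the outgoing language.
  return detectLanguage(lastMessage);
}
```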

There’s no fiddling around with agents needing to select the language that they require, or anything else. It’s a simple click to turn it on, and then another click to turn it off. I’m going to go through the setup of it below, as there are a few fiddly bits that did confuse me for a bit.

It’s also possible to use different translation tools. At the time of writing this post, it’s possible to use the Bing, Google or Azure translation models. I’m sure that there will be other options available to use in the future as well, which really opens up possibilities for clients with differing digital estates.

Translation happens in real time, so there’s no waiting around for it to actually get on with it. It’s displayed immediately on the screen for the agent to see.

Setup for translation

I found the general guides to be alright, but they weren’t too clear on a few items. I’m therefore sharing below how I went about it, in order to get things working properly. Please be aware that this isn’t in the order specified in the documentation, but in retrospect it means less switching between screens:

  1. Ensure that you have the latest updates to your Omnichannel environment (this is always a good idea, regardless of anything else!)
  2. Go to https://github.com/microsoft/Dynamics365-Apps-Samples/tree/master/customer-service/omnichannel/real-time-translation & download the ‘webResourceV2.js’ file there (if you’re unfamiliar with how to do this, click to open the file, click the ‘Raw’ button, and then save the page, ensuring it’s got the ‘.js’ extension when you save it!)
  3. Ensure you have an API key to enter into the web resource file! This is what tripped me up at first. You can use any text editor (I use Notepad++) to open it up. How you get the API key will depend on the provider. For example, to set up a free account in Azure, take a look at https://docs.microsoft.com/en-us/azure/cognitive-services/translator/translator-how-to-signup (there’s also a quick snippet after this list for checking that your key actually works). There are also some additional things that you can configure in the web resource file, but I’m not going to go into them here
  4. Go to your solutions (this can either be through the Classic interface, or through http://make.powerapps.com). You can either create a new solution to hold the web resource file, or alternatively, if you have an existing solution that you’re deploying, you can add the web resource file to that. Either:
    1. In the classic interface, navigate to Web Resources, click to create a new web resource, and upload the file (ensure you select the type to be ‘Script (JScript)’), or
    2. In the modern interface, click the ‘New’ button, select ‘Web Resource’ from the ‘Other’ section, and then follow the steps above

Once it’s saved, it’ll give you a URL. Copy that, and publish the solution.

  5. Go to the Omnichannel Administration Hub, find ‘Real Time Translation’ under Settings, and set this to Yes. You can also select a default input language from the selection. Also enter the URL that you copied above. Save it
  6. You’re all done!
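
As a side note on step 3: before pasting the key into the web resource, it’s worth checking that it actually works. Here’s a minimal sketch using the public Azure Translator Text REST API (v3.0), runnable under Node 18+ (fetch is built in); the key and region values are placeholders for your own:

```javascript
// A minimal sanity check for an Azure Translator key, run before pasting
// the key into the web resource file. It calls the public Translator Text
// REST API (v3.0) directly.
const apiKey = "<YOUR_TRANSLATOR_KEY>";  // from the Azure portal
const region = "<YOUR_RESOURCE_REGION>"; // e.g. "westeurope"

async function testTranslate(text, toLang) {
  const url =
    "https://api.cognitive.microsofttranslator.com/translate" +
    `?api-version=3.0&to=${toLang}`;
  const response = await fetch(url, {
    method: "POST",
    headers: {
      "Ocp-Apim-Subscription-Key": apiKey,
      "Ocp-Apim-Subscription-Region": region,
      "Content-Type": "application/json",
    },
    body: JSON.stringify([{ Text: text }]),
  });
  if (!response.ok) {
    throw new Error(`Translator returned HTTP ${response.status}`);
  }
  const [result] = await response.json();
  // result.detectedLanguage holds the auto-detected source language;
  // result.translations[0].text holds the translated string.
  return result;
}

testTranslate("Bonjour, j'ai un problème avec ma commande", "en")
  .then((r) => console.log(r.detectedLanguage.language, "->", r.translations[0].text))
  .catch(console.error);
```

If this comes back with a translation rather than an HTTP error, your key is good to go into the web resource file.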

Agent Experience

Depending on how you’ve configured your web resource, auto translation will either be on by default, or off. If it’s not on by default, the agent can simply click within their chat window to set it to be active:

Once active, it’ll then start to translate everything, in both directions. Below are side by side screens of the customer & agent experiences. You’ll note that the customer is seeing the initial agent response in English, as the agent was the first to respond in the conversation:

From the agent side of things, both the original text and the translated text are shown. The customer is only shown the language that they’re actually using:

If the agent isn’t sure what language the customer is using (as it’s being auto-translated for them), they can hover over the text, and it’ll show the details for it:

If the agent consults with, or transfers the session to, another agent, the second agent will see the conversation in the language that they themselves are using (with the original text as well). This allows for the possibility of passing a customer to a specialist to assist them, even if they don’t speak the same language! It’s really cool to see this in action.

Even more wonderfully, this is even stored down to the transcript level:

This really opens up major new scenarios that Omnichannel can be used for, supported entirely by this feature. As I said at the beginning of this post, I’m absolutely excited about it, and we’re already envisioning how this will be able to empower our clients even more.

Do you have any questions around this? Can you think of any scenarios that this could solve for you? Drop a comment below – I’d love to hear!

Omnichannel & Sentiment Analysis (II)

I’ve previously touched upon sentiment analysis within Omnichannel in several articles (https://thecrm.ninja/omnichannel-sentiment-analysis/ and https://thecrm.ninja/omnichannel-supervisor-tools/). It’s really a great feature that allows agents to quickly & easily see how the customer is interacting. It also allows for supervisors to see at a glance how interactions are going overall.

With all of that, I thought it would be helpful to take a further look into how sentiment analysis actually works, so that we can understand it a little better.

Now, the actual nuts & bolts of sentiment analysis are provided by Azure Cognitive Services. There’s a wide range of tools available through this, but we have no need to go into Azure to configure it. It’s a simple setting within Omnichannel to get it working, rather than needing to fiddle around with many different things:

However, what’s actually going on during a conversation, and how is the sentiment analysis worked out/calculated? We see the pretty little face icons (with the different colours), but how are these actually being set?

Well, there are two ways in which algorithms are used to calculate the sentiment that’s shown:

  • Natural language processing (NLP)
  • Machine learning (ML) algorithms

With these two methods, it’s possible to not only see what the current interactions are showing, but also to enhance the model to understand sentiment better.

Note: In a session that I presented recently, one of the attendees asked if it’s possible to train the model, to result in a custom algorithm. Unfortunately this isn’t possible to do – the machine learning that takes place is the general Azure one, rather than one for a single company or customer.

The following diagram shows the sentiments that are used. They’re nicely colour-coded, for ease of reference as well:

When a customer interacts through Omnichannel, the sentiment shown is based on the last 6 messages received from the customer. As a result, the sentiment shown can very well fluctuate & change during the conversation, based on how it’s going.
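To picture how that window behaves, here’s a small sketch. To be clear, the 6-message window is the documented part; the scoreMessage() function and the thresholds below are purely illustrative assumptions on my part (the real scoring happens inside Azure Cognitive Services):

```javascript
// Illustrative sketch only: the 6-message window is documented behaviour,
// but scoreMessage() and the thresholds are my own assumptions.
function currentSentiment(customerMessages, scoreMessage) {
  // Only the 6 most recent customer messages feed the displayed sentiment.
  const recent = customerMessages.slice(-6);
  // Assume scoreMessage() returns -1 (very negative) to +1 (very positive).
  const avg =
    recent.reduce((sum, msg) => sum + scoreMessage(msg), 0) / recent.length;
  if (avg >= 0.3) return "Positive";
  if (avg <= -0.3) return "Negative";
  return "Neutral";
}
```

Because older messages drop out of the window as new ones arrive, an apology or a resolved issue can pull the shown sentiment back up fairly quickly.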


Obviously, customers aren’t just going to use English to communicate. Companies are based around the world, and will use their native/local language when providing support. Omnichannel allows for this without an issue, utilising the Azure Text Translator API behind the scenes to provide it. If you’re interested in seeing which languages are supported for this, head to https://docs.microsoft.com/en-us/azure/cognitive-services/translator/language-support which is the latest source of information for this.

There are some interesting things to know around how this actually works:

  • When a language other than English is used, the Text Translator API translates the text to English, and then it’s analysed & scored for sentiment (there’s a sketch of this pipeline after this list)
  • If a language isn’t supported by the Text Translator API, it won’t be scored
  • If profanity (eg a swearword) is detected, the sentiment will automatically be shown as Negative or Very Negative, regardless of the rest of the last 6 lines of conversation
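
To make that first bullet more concrete, here’s a hedged sketch of what a translate-then-score pipeline looks like against the public Azure APIs. The resource name and key are placeholders, and this mirrors the documented order of operations rather than Omnichannel’s internal code (it reuses the testTranslate() helper from the setup section earlier):

```javascript
// Sketch of the translate-then-score pipeline described above, built
// against the public Azure Text Analytics v3.0 sentiment endpoint.
const taEndpoint = "https://<YOUR_RESOURCE>.cognitiveservices.azure.com";
const taKey = "<YOUR_TEXT_ANALYTICS_KEY>";

async function scoreSentiment(englishText) {
  const response = await fetch(`${taEndpoint}/text/analytics/v3.0/sentiment`, {
    method: "POST",
    headers: {
      "Ocp-Apim-Subscription-Key": taKey,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      documents: [{ id: "1", language: "en", text: englishText }],
    }),
  });
  if (!response.ok) {
    throw new Error(`Text Analytics returned HTTP ${response.status}`);
  }
  const result = await response.json();
  // "positive" | "neutral" | "negative" | "mixed", plus confidence scores
  return result.documents[0].sentiment;
}

async function scoreCustomerMessage(text) {
  // Step 1: translate to English, reusing testTranslate() from the
  // setup section, mirroring the first bullet above.
  const translated = await testTranslate(text, "en");
  // Step 2: score the English text for sentiment.
  return scoreSentiment(translated.translations[0].text);
}
```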

Some people have expressed their concern to me around how accurate the Azure translation actually is, but to date I haven’t seen any major concerns resulting from it. As with the other Azure services, Microsoft is continually refining & improving it. That being said, there are several languages with very nuanced terms, and I’d like to think that these would be supported without issues.

There is, however, somewhat of an interesting behaviour when starting off the analysis at the beginning of the conversation:

  • If the initial language is detected as English, it’s assumed that all of the subsequent conversation will be in English. As a result, if the customer switches away from English, the system won’t recognise this, and a Neutral sentiment score will be shown
  • If the initial conversation is not in English, then the system will check every conversation line & re-detect the language as necessary.
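
Expressed as a rough sketch, that behaviour comes out something like this (my own reconstruction of the logic described above, not Microsoft’s actual code; detectLanguage() is a hypothetical stand-in for the detection service):

```javascript
// Assumed logic only, reconstructed from the documented behaviour.
function detectLineLanguages(messages, detectLanguage) {
  const firstLanguage = detectLanguage(messages[0]);
  if (firstLanguage === "en") {
    // An English opener pins the whole conversation to English, so a
    // mid-conversation switch isn't picked up (sentiment shows Neutral).
    return messages.map(() => "en");
  }
  // A non-English opener means every line is re-detected individually.
  return messages.map(detectLanguage);
}
```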

This seems somewhat strange to me, as I’d have thought that the system would automatically check the language for each conversation line. I can think of plenty of scenarios where different languages are used in a single conversation, even if it does start with English being used. I’d like to think that this will be updated at some point, to make the experience better.