MB-280: Microsoft Dynamics 365 Customer Experience Analyst

It’s been a while since I last took a Microsoft certification exam, but with the new MB-280 exam launching in the last few days, I’ve obviously needed to take a look at it! It felt a little strange, as I’m now used to the certification renewal process (which is why I haven’t taken any exams in a while), but thankfully things went alright with the overall exam.

For those who haven’t been following the news, Microsoft made an announcement a few months back that some exams would be retiring, with the new MB-280 exam as their replacement. In short, it’s supposed to replace the MB-210 (Sales), MB-220 (Customer Insights – Journeys) & MB-260 (Customer Insights – Data). Malin Martnes wrote a good blog post in June – I’d suggest taking a look at it for more general information around it.

Now, I’m all for new certifications being created & made available. However, and I know this could be considered controversial, I have ABSOLUTELY NO IDEA as to why this exam was created in THIS specific way. If an exam had been created, for example, to bring together the two sides of Customer Insights (ie to cover both Data & Journeys in a single exam), I think that would have been quite good.

But having taken this, my thoughts (& feedback to Microsoft directly) are that they should un-deprecate (if that’s a word/phrase?) the MB-210 exam, and continue it forward. There’s no reason that I can see for having Marketing & Sales together in a single exam – it feels like two (or technically 3?) Lego bricks lumped together without any rhyme or reason.

The learning path for the exam was also launched in the last few days, and can be found at Study guide for Exam MB-280: Microsoft Dynamics 365 Customer Experience Analyst | Microsoft Learn

The official description of the exam is:

As a candidate for this exam, you’re a Microsoft Dynamics 365 customer experience analyst who has:

  • Participated in or plans to participate in Dynamics 365 Sales implementations.
  • An understanding of an organization’s sales process.
  • An understanding of the seller’s perspective (user experience).
  • The ability to demonstrate Dynamics 365 Customer Insights – Data and Customer Insights – Journeys capabilities.

You’re responsible for configuring, customizing, and expanding the functionality of Dynamics 365 Sales to create business solutions that support, automate, and accelerate the company’s sales process. You use your knowledge of customer experience capabilities in Dynamics 365 Sales and Microsoft Power Platform to inform the following design and implementation tasks:

  • Configure Dynamics 365 Sales standard and premium features.
  • Implement collaboration features.
  • Configure the security model.
  • Perform Dynamics 365 Sales customizations.
  • Extend Dynamics 365 Sales with Microsoft Power Platform.
  • Deploy the Dynamics 365 App for Outlook.

As a candidate, you need:

  • An understanding of the Dataverse security model and features, including business units, security roles, and row ownership and sharing.
  • Experience configuring model-driven apps in Microsoft Power Apps.
  • An understanding of accounts, contacts, and activities.
  • An understanding of leads and opportunities.
  • An understanding of the components of model-driven apps, including forms, views, charts, and dashboards.
  • An understanding of model-driven app personal settings.
  • Experience working with Dataverse solutions.
  • An understanding of Dataverse, including tables, columns, and relationships.
  • Familiarity with Power Automate cloud flow concepts, such as connectors, triggers, and actions.

More can be found at the exam page itself, which is located at Exam MB-280: Microsoft Dynamics 365 Customer Experience Analyst (beta) – Certifications | Microsoft Learn

Now during my exam, I was looking forward to seeing the ‘new’ capability around being able to use Microsoft Learn during the exam (new to me – as I haven’t taken any other exams in the last year or so since it was announced!). However there didn’t seem to be any capability to launch Microsoft Learn – I’m not sure why it wasn’t available, as this isn’t a Fundamentals level exam.

Questions also used the older terms of reference rather than the newer/accepted terms – ie using ‘field’ instead of ‘column’, and ‘entity’ instead of ‘table’. Again, I have no idea why this is – all other exams (including the renewals for them) use these properly (in my summary below I have ensured I use the correct terms).

So, as I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change.

I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

  • Sales Apps
    • Configuring forms, columns & tables
    • Configuring security roles & access to records
    • Configuring relationships between records (including deletion properties)
    • Sales Mobile App – security & deployment
    • Forecasting – setting up & configuring
    • Configuring Goals
    • Configuring Opportunities
    • Handling currencies
  • Copilot for Sales
    • Setting up & deploying to users
    • Configuring access
  • Outlook App
    • Deploying & setting up
    • Configuring forms & information
  • Exchange
    • Connecting to mailboxes
    • Configuring folder permissions
    • Configuring multiple domains
  • Product Families & Catalogue
    • Creating & setting up
    • Configuring options
    • Adding items to be used
  • Price Lists
    • Creating & setting up
    • Configuring options, including discounts
    • Using time-restricted price lists
    • Handling currencies
  • Document Management
    • Different document management capabilities
    • Usage of SharePoint in different ways
  • Data Import
    • Usage of Power Query
    • Data manipulation
    • Handling duplicate records
  • SMS
    • Setting up & configuring SMS provider
  • Journeys
    • Different triggers to use based on scenarios & requirements
    • How to trigger journeys
    • How to set up emails to be used within a journey
  • Segments
    • Different types of segments
    • Creating & modifying segments
  • Searching/Filtering
    • Using Advanced Find
    • Setting up/modifying queries to include/exclude records based on conditions
  • Business Process Flows
    • Modifying business process flows
    • Handling conditions within business process flows

As a Sales exam, it seemed alright. But as mentioned above, the Customer Insights questions just seemed strange to me – I’d expect a consultant to be very technically skilled in Customer Insights, but not in Sales (& vice versa), so I don’t understand bringing these two sides together.

I’m going to be quite interested in seeing how the exam is actually launched (as it’s currently in beta, of course). I’ve chatted with a few others who have taken the exam (whilst obviously respecting the NDA!), and they also can’t really understand the approach taken. Personally, I think that if it continues like this, Microsoft is going to hear quite a few complaints around it.

I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

Calculated columns not working with data migration

Interesting title, isn’t it? I thought I’d do something that might grab people’s attention, and this was the best that I could come up with! So, let’s get into the scenario, the issue experienced, and how we managed to resolve it.

The scenario on this project was as follows. We’ve been implementing a customer service solution for a sales company that manufactures multiple products, under multiple brands. Currently there are multiple systems used for order entry, which at some point will be consolidated into a single system.

However, for the moment they want to carry out holistic customer service across all brands, enabling all customer service agents to have access to the same data, with customers serviced in the same way regardless of brand, etc.


As a result, Dynamics 365 Customer Service was the ticket, and has many standard capabilities that address the needs of the customer.

Now, whilst sales (aka orders) will not be handled within Dynamics 365 itself, we didn’t want the customer service agents to have to look up order information in the ordering systems. Instead, we wanted to be able to bring the sales/order information into Dynamics 365 for reference (at some point it’s likely that the customer will actually use Dynamics 365 capabilities for sales as well).

In order to do this, we’ve had some amazing data architects bringing the data together into Azure Data Factory (ADF) from the multiple order systems, and then pushing the data into Dynamics 365 (users have a read-only view of it).

With bringing in the data, we were looking to capitalise on the native functionality of Dynamics 365, namely the ability for columns to be automatically calculated. An example of this would be bringing in the order line amount, the tax amount, and then having the total order line amount automatically calculated. This is standard system functionality, and when working in Dynamics 365, has many different uses across the system.

Now, it’s important to note here that as we’re not actually handling orders within Dynamics 365, we’re also not holding a ‘proper’ product list within Dynamics 365 itself. However, orders need to show product information on them (bit useless otherwise!), so we’re using the capability of ‘write-in products’.

Note: If you haven’t come across write-in products before, it’s actually a really great feature. Essentially, it allows products to be entered on opportunities, quotes, orders, etc (wherever products are used) when the product/s aren’t in the system product catalogue. Write-in products allow you to simply type the name of a product or service, & then type in the price. This is very useful if, for instance, a product isn’t yet available in the product catalogue, but you still want to be able to quote it. In our scenario, we’re using write-in products to avoid the need to manage the product catalogue itself. It’s also helpful when you don’t want to use price lists, as all catalogue products need to be associated with a price list.

So we start off the data migration, and it’s looking good. No issues being reported by the integration…

But, then users go in to the UAT system to check through things, and find that when looking at orders, the totals aren’t being calculated:

Order line not calculating
Order not calculating either!

Hmm. That’s strange. So we started to look at what could have caused this problem…

  • Is the environment in ‘admin mode’? If an environment is in admin mode, then auto-calculations won’t work at all. Well, the environment wasn’t in admin mode, so it wasn’t that
  • Is there a plugin not firing correctly? Well, this is standard Microsoft functionality within the platform, so unlikely, but we double-checked to make sure. No, there wasn’t anything causing issues on that front
  • Does it work for users, when it’s created manually within the system? Yes, it DOES work when users enter an order/order line with a product. Hmm…this was getting VERY confusing

For clarification, we didn’t want to auto-calculate the information within ADF, and then push it into the relevant Dynamics 365 columns. We wanted to be able to rely on the system working in the way that it should!

Finally, we found out why the calculated columns weren’t working. There’s actually a system setting that governs how this works:

With this set, the auto calculations are now working in the system:

So, thankfully we managed to get this working, and everything went smoothly from that point.

Have you ever been caught out by something similar? I’d love to hear – please drop a comment below!

Working with the Opportunity Close table

I’ve recently had the experience of working with the Opportunity Close functionality within Dynamics 365, and given what occurred, thought it would be useful to document it so that others can benefit as well. There are many scenarios in which we’d use this, and being able to give a comprehensive solution to clients does make all the difference!

There are four areas that I’d like to cover:

  • Working with the Opportunity Close table
  • Challenges with data
  • Power Automate to the rescue!
  • Caveats

So let’s get started then!

Thanks to various members of the community, such as Matt Collins-Jones, Andrew Bibby & others, who helped me along the way.

Working with Opportunity Close

The Opportunity Close functionality within Dynamics 365 (& yes, I’m going to refer to it as this, rather than Power Platform) is used to provide information around why an opportunity is being closed. This is regardless of whether the opportunity has been won, or it’s been lost. It’s still quite important to track the information around it, so that companies can understand better how the market views the products it offers, how it stacks up against others, etc.

The default path in the system is to create a lead, and then qualify it. Qualifying a lead then automatically creates an opportunity record, which further information (quotes, etc) can be entered against. An account record (if company information is specified) is also created:


On the opportunity record, users are able to show if it’s been won or lost by clicking an appropriate button on the toolbar:

Doing this brings up the Opportunity Close pane on the right hand side of the screen:

Now it’s possible to customise this screen. In fact, the screenshot above shows 3 custom columns that have been added to it already in the system I was in.

To do this, we go to customise the solution (in the Maker Experience), and add the column/s that we’re wanting to:

Next, we need to remember to add it to the form! Otherwise it’s not going to show up. If we want it to appear on the side bar, it’s important to customise the ‘Quick Create’ version of the form, to make our customisations show up.

Note: We’re able to set conditional visibility on the column/s, based on whether the opportunity is won or lost, using Business Rules. I haven’t done so in this scenario, but you’re obviously able to if needed.

Remember to save & publish the form, and then it’ll display within the system for users. Brilliant!

Challenges with data

So we’ve gone ahead & created the custom columns, and users are actually using them to record data. Wonderful – that’s exactly what we’ve been wanting to achieve.

OK – let’s now review the data so that we can see overall what’s happened with our opportunities. Of course we’re wanting to do this simply & easily, so we’ll open an Advanced Find window, go to the Opportunity Close table, add columns from the associated Opportunity, and….hold on. Opportunity Close ISN’T displaying in the Advanced Find????

It’s just NOT there. In case you’re wondering if you saved/published things correctly, or forgot some system setting, stop worrying. It’s not you – it’s the system.

See, Opportunity Close, though a table in its own right, is a SPECIAL sort of table. It doesn’t show up, and can’t be directly queried. I know – frustrating. I felt exactly the same way.

On digging deeper into things, I found out that there’s actually an activity record saved. It’s possible to query against this:

However, and this is the BIG catch, it’s NOT possible to return custom columns when carrying out this query. The search will ONLY return the (system) columns that are present for activities. So this leaves us with a problem.
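To make the limitation concrete, here’s a minimal sketch of the sort of query that does work, expressed in Python against the Dataverse Web API (the environment URL and bearer token are placeholders – you’d normally obtain the token via Azure AD):

```python
import requests

# Placeholders - substitute your own environment URL & OAuth bearer token.
ORG_URL = "https://yourorg.crm.dynamics.com"
TOKEN = "<bearer token>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

# Query the activity feed for Opportunity Close records. Only the standard
# activity columns (subject, description, regarding, etc) come back -
# custom columns added to Opportunity Close aren't available here.
resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/activitypointers",
    headers=headers,
    params={
        "$select": "subject,description,_regardingobjectid_value",
        "$filter": "activitytypecode eq 'opportunityclose'",
    },
)
resp.raise_for_status()
for activity in resp.json()["value"]:
    print(activity["subject"], activity["_regardingobjectid_value"])
```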

Essentially, though we can set up custom columns to track the data that we’re needing to, it’s not possible (through the front end) to query it. This sort of negates what we’re trying to achieve here overall, and is a pain.

So what’s the way round it? Well, it’s actually going to be Power Automate!

Power Automate to the rescue

In order to handle our issue, what we need to do is the following:

  • Add custom columns to the Opportunity table (these should mimic the custom columns that we’ve added to the Opportunity Close table)
  • Use Power Automate for automation purposes!

The first step is easy. We need to go & create custom columns on the Opportunity table. These WILL show up in the Advanced Find search. They obviously need to be the same as the custom columns on the Opportunity Close table. If we’ve used Choice or Choices there, point the Opportunity column to the same source (it’s a good argument for using Global, rather than Local, choice/s).

We can then go and create a Power Automate flow. This should trigger when an Opportunity Close record is created.

Note: For this, I’ve made it so that it runs under the user triggering the action, rather than a system account. This is to keep in line with licensing limits etc

You’ll then need to add a ‘Get Dataverse row’ step, and get the Opportunity Close record that has just been created. This is annoying, but for some strange reason the trigger doesn’t present the custom columns/values in the JSON that it returns. Hopefully Microsoft fixes this at some point, but for the moment, we need to work around it.

The last step is to add an ‘Update Dataverse row’ step. This should point to the Opportunity table, & we can simply map the values across (from the SECOND step, NOT the first one – VERY IMPORTANT).
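To illustrate what the flow is doing under the hood, here’s a rough sketch of the equivalent Dataverse Web API calls in Python. The ‘new_’ column names are hypothetical examples (use whichever custom columns you’ve created), and the URL/token are placeholders:

```python
import requests

# Placeholders - substitute your environment URL & OAuth bearer token.
ORG_URL = "https://yourorg.crm.dynamics.com"
TOKEN = "<bearer token>"
headers = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

def sync_close_to_opportunity(opportunity_close_id: str) -> None:
    # Step 2 ('Get Dataverse row'): re-fetch the Opportunity Close record,
    # as the trigger output doesn't include the custom column values.
    # 'new_closereason' & 'new_competitordetail' are hypothetical columns.
    close = requests.get(
        f"{ORG_URL}/api/data/v9.2/opportunitycloses({opportunity_close_id})",
        headers=headers,
        params={"$select": "new_closereason,new_competitordetail,_opportunityid_value"},
    ).json()

    # Step 3 ('Update Dataverse row'): copy the values across to the mirror
    # columns on the parent Opportunity - from the GET above, NOT the trigger.
    requests.patch(
        f"{ORG_URL}/api/data/v9.2/opportunities({close['_opportunityid_value']})",
        headers=headers,
        json={
            "new_closereason": close["new_closereason"],
            "new_competitordetail": close["new_competitordetail"],
        },
    ).raise_for_status()
```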

Once this is all done, save & test it, and you should see it working. I generally don’t add the Opportunity custom columns to the form, but rather leave them for querying against.

Caveats

It’s important to keep in mind that when an opportunity is marked as either won or lost, it’s then closed, and changed to a read-only state. That’s how the system is designed to be, and makes sense.

However it’s ALSO possible to re-activate a closed opportunity, and then close it again – ie a single Opportunity record could have multiple Opportunity Close records against it. This solution won’t handle that (it would need to be built out further) – the Opportunity record itself will only show the values from the latest Opportunity Close action, so please do keep this in mind!

Have you ever come up against something like this? How have you handled it? I’d love to hear – please drop a comment!

Data Export Service Connection Issues

This is a slightly different post from the usual stuff that I talk about. It’s much more ‘techy/developer’ focused, but I thought it would still be quite useful for people to keep in mind.

The background to this comes from a project that I’ve been working on with some colleagues. Part of the project involves setting up an Azure SQL database, and replicating CDS data to it. Why, I hear you ask? Well, there are some downstream systems that may be heavy users of the data, and as we well know, CDS isn’t specifically built to handle a large number of queries against it. In fact, if you start hammering the CDS layer, Microsoft is likely to reach out to ask what exactly you’re trying to do!

Therefore (as most people would do), we’re putting in database layer/s within Azure to handle the volume of data requests that we’re expecting to occur.


So with setting up things like databases, we need to create names for them, along with access credentials. All regular ‘run of the mill’ stuff – no surprises there. For adequate security, we usually use one of a handful of password generators that we keep to hand. These have many advantages, such as ensuring that the password isn’t something we (as humans) dream up, which might be easier to guess. I’ve used password generators over the years for many different professional & personal projects, and they really are quite good overall.

Example of a password generation tool

Once we had the credentials & everything set up, we then logged in (using SQL Server Management Studio), and all was good. Everything that we needed was in place, and it was looking superb (from the front end, at least).

OK – on to getting the data actually loaded in. To do this, we’re using the Data Export Service (see https://docs.microsoft.com/en-us/power-platform/admin/replicate-data-microsoft-azure-sql-database for further information around this). The reason for using this is that the Data Export Service intelligently synchronises the entire database initially, and thereafter synchronises on a continuous basis as changes occur (delta changes) in the system. This is really good, and means we don’t need to build anything custom to handle it. Wonderful!
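As a rough illustration of the pattern (with entirely made-up connection details), downstream consumers then point their queries at the replicated Azure SQL database rather than at CDS itself. The Data Export Service typically creates one table per entity, named after the entity’s logical name:

```python
import pyodbc

# Made-up connection details for the replica that the Data Export Service
# keeps in sync with CDS (full initial load, then delta changes).
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:yourserver.database.windows.net,1433;"
    "Database=YourReplicaDb;"
    "Uid=youruser;Pwd=yourpassword;"
    "Encrypt=yes;"
)

# Heavy read workloads hit the replicated copy of the data, not CDS itself.
cursor = conn.cursor()
cursor.execute("SELECT TOP 10 accountid, name FROM dbo.account")
for row in cursor.fetchall():
    print(row.accountid, row.name)
```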

Setting up the Data Export Service takes a little bit of time. I’m not going to go into the details of how to set it up – instead there’s a wonderful walkthrough by the AMAZING Scott Durow at http://develop1.net/public/post/2016/12/09/Dynamic365-Data-Export-Service. Go take a look at it if you’re needing to find out how to do it.

So we were going through the process. Part of this is needing to copy the Azure connection string into a script that you run. When you do this, you need to re-insert the password (as Azure doesn’t include it in the string). For our purposes (as we had generated this), we copied/pasted the password, and ran things.

However all we were getting was a red star, and the error message ‘Unable to validate profile’.

As you’d expect, this was HIGHLY frustrating. We started to dig down to see what actual error log/s were available (with hopefully more information on them), but didn’t make much progress there. We logged in through the front end again – yes, no problems there, all was working fine. Back to the Export Service & scripts, but again the error. As you can imagine, we weren’t very positive about this, and were really trying to find out what could possibly be causing this. Was it a system error? Was there something that we had forgotten to do, somewhere, during the initial setup process?

It’s at these sorts of times that self-doubt can start to creep in. Did we miss something small & minor, but that was actually really important? We went over the deployment steps again & again. Each time, we couldn’t find anything that we had missed out. It was getting absolutely exasperating!

Finally, after much trial & error, we narrowed the issue down to one source. It’s something we hadn’t really expected, but had indeed caused all of this to happen!

What happened was that the password that we had auto-generated had a semi-colon (‘;’) in it. In & of itself, that’s not an issue (usually). As we had seen, we were able to log into SSMS (the ‘front-end’) successfully, with no issues at all.

However, when put into the connection string, the semi-colon is treated as a special character (the separator between the connection string’s parameters). The full password was therefore not being recognised, which was causing the entire thing to fail! To resolve this was simple – we regenerated the password to ensure that it didn’t include a semi-colon character!
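Here’s a tiny Python sketch of the failure mode (with made-up credentials). Connection strings are a list of key=value pairs separated by semi-colons, so a naive parse chops the password off at the ‘;’:

```python
# Made-up credentials, purely to illustrate the parsing problem.
conn_str = (
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;User ID=myuser;"
    "Password=Abc123;xyz!;"  # the generated password was 'Abc123;xyz!'
    "Encrypt=True;"
)

# A simplistic key=value parse, as connection string handling does it.
pairs = dict(part.split("=", 1) for part in conn_str.split(";") if "=" in part)
print(pairs["Password"])  # prints 'Abc123' - everything after the ';' is lost
```

(ADO.NET-style connection strings do allow a value to be wrapped in quotes to escape special characters, but regenerating the password was the simpler route for us.)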

Now, this is indeed something that’s quite simple, and should be at the core of programming knowledge. Most password generators have an option to avoid this happening, but not all of them do. Unfortunately we had fallen subject to this, but thankfully all was resolved in the end.

The setup then carried on successfully, and we were able (after all of the effort above) to achieve what we had set out to do initially.

Have you ever had a similar issue? Either with passwords, or where something worked through a front-end system, but not in code? Drop a comment below – I’d love to hear!

Omnichannel – Data Masking Rules

There are various scenarios in which companies would be quite keen to utilise data masking rules. Examples of these include:

  • Masking credit card details, so that firstly the company isn’t required to comply with credit card information handling requirements (eg PCI DSS), and secondly (and potentially more importantly!) there’s no risk of an agent copying down a credit card number, and using it fraudulently
  • Masking personal information – if a customer mistakenly types in their Social Security Number, UK Tax Reference, etc, then the company is likely to want to avoid having their support agents see this sensitive information
  • Socially offensive language – companies will be extremely keen to avoid having their staff exposed to offensive language and behaviour

There are obviously other examples of these as well – you can feel free to let your mind run free as to the possibilities.

Omnichannel for Dynamics includes what are referred to as ‘Data Masking Rules’. Using these, you can create rule/s that will then be used to identify the undesired words (or other types of data) within a conversation, and these will then be automatically masked with the asterisk (*) character. Data masking works for chat and async channels.

Now, a few things to keep in mind. Data masking is done through the use of regular expressions, also known as ‘regex’ (when I heard this, I needed to go and look up what a ‘regular expression’ is, as I had no idea! If you have no idea as well, take a look at https://en.wikipedia.org/wiki/Regular_expression , which has a good summary of things).
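If regex is new to you too, here’s a minimal Python sketch of what a masking rule conceptually does – match a pattern, then replace every matched character with an asterisk (the SSN pattern here is illustrative, not Microsoft’s actual rule):

```python
import re

# Illustrative SSN pattern - not the actual rule Microsoft ships.
ssn_pattern = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def mask(text: str) -> str:
    # Replace each matched span with asterisks of the same length.
    return ssn_pattern.sub(lambda m: "*" * len(m.group()), text)

print(mask("My SSN is 123-45-6789, please help!"))
# -> My SSN is ***********, please help!
```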

Currently, there is a HARD LIMIT of 10 data masking rules. Yes, you read that correctly. You are ONLY able to have 10 rules saved. However, if you play it cool, there’s a way around this, thanks to the way that regex expressions can work.

So, let’s look at how to set up and configure these data masking rules. As with practically everything else that’s set up in Omnichannel, you’ll need to be in the Omnichannel Administration Hub. Scroll down in the left hand menu, and you’ll find it under the Settings section:

Click on it to open, and you’ll be presented with a slightly different looking screen than you’re used to. It’s not a general view/list of records – it’s a static screen, with a grid of rules on the right:

The two settings on the left hand of the screen are really quite important. They are:

  • ‘Mask private agent data from the customer’. What this does is mask data that the agent is sending, for both the agent and the customer (ie the agent won’t be able to see the data either, even though they’re typing it). This applies for both live chat and async channel messages.
  • ‘Mask private customer data from the agent’. This will mask data that the customer is sending to the agent, for both the customer and the agent. This is only masked for live chats for both; when using async channels, it’s only masked for the agent interface (ie the customer will still be able to see the data)

Note: If only the first option is enabled, then if the customer types in data matching the masking rule, it will not be hidden, and vice-versa for the second option. It’s therefore vitally important to consider all of the relevant scenarios for the company, and apply these settings appropriately.

Note: Data masking isn’t just through the chat interfaces – it’s also how the data is stored. So if you open up a transcript, or check the data within the database, it’s masked there as well, which means that it’ll allow you to be fully compliant! A side effect of this is that the sentiment analysis will also be running against the masked text.

There are 3 rules provided for you out of the box (in an inactive state). They are:

  • Credit Card: Masks the credit card number, if provided in a message.
  • Email: Masks the email address, if provided in a message.
  • SSN: Masks the SSN, if provided in a message.

Opening up one of these shows us how Microsoft is implementing these:

There’s the name (which you can set to whatever you want), and a description (which is always helpful and useful!). Then there’s the regular expression (I’m not going to go into details as to how to actually put these together – there are plenty of resources out there; take a look at https://docs.microsoft.com/en-us/dotnet/standard/base-types/regular-expression-language-quick-reference as a starting point). The character used for masking can’t be changed (at this point in time – perhaps in the future they’ll allow this to be changed).

The really great thing from my perspective is the testing area on the right hand side of the form. Here you can input your text, and see if it actually matches the conditions for the regex (or not, as the case may be). This will allow tweaking of the regex etc:

Just please be aware that you need to click or tab out of the ‘enter test data’ field in order for the ‘masked test data’ value to update.

Once you have a rule in place, save it, and click ‘Activate’! Otherwise the rule will be saved, but won’t actually work!

Now, remember that I mentioned above about the limit of only TEN data masking rules? Well, here’s a great little tip as to how to work around this, as there are many legitimate examples of needing many more than ten!

So how do you go about doing this? Well, it goes something like this:

  1. The ‘Regular Expression’ field is actually of type ‘Text’ (with it being ‘Single line of text’). This means that it can actually hold up to 4000 characters in it (according to https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/customize/types-of-fields )
  2. There’s a cute little character that exists – this is the ‘|’ character (also known as the pipe character)
  3. Using the pipe character, you can separate regex expressions, which will be evaluated separately
  4. Therefore you can ACTUALLY have multiple regex expressions in a single data masking rule – as shown in the sketch below!
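As a minimal sketch of the principle (with illustrative patterns, not necessarily ones you’d use), here’s how a single pipe-separated expression masks two completely different types of data:

```python
import re

# Two unrelated patterns (a US-style SSN & a UK-style mobile number),
# pipe-separated so that they count as ONE data masking rule.
combined = re.compile(r"\b\d{3}-\d{2}-\d{4}\b|\b07\d{9}\b")

text = "SSN 123-45-6789 and mobile 07123456789"
print(combined.sub(lambda m: "*" * len(m.group()), text))
# -> SSN *********** and mobile ***********
```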

Let’s see how this is set up!

And now to see it through the actual interface:

So with this, though it might take a bit of time and testing (and double-checking, to make sure it’s working absolutely correctly, of course), it’s possible to have quite a lot of regex expressions for you to use!