Changes in the FTRSA Program

Firstly, for those who are not aware, the acronym 'FTRSA' stands for 'Fast Track Recognised Solution Architect'. This is an award that Microsoft bestows on people working for Microsoft Partners who have demonstrated clear technical expertise & understanding of the Microsoft Business Applications Platform at (enterprise) scale.

To quote from the Microsoft documentation for the program:

The FTRSA designation is awarded by Microsoft’s Business Industry & Copilot (BIC) engineering team to enterprise solution architects who exhibit outstanding expertise in architecture and deliver high-quality solutions. Recipients are typically nominated based on their exceptional skills, extensive experience with Microsoft products, relevant certifications, and leadership in projects.

The award covers two main areas – Power Platform & Dynamics 365, with different capabilities under each area.

The program has been around for 6 years now (since 2019), with people needing to submit for annual (re)award & recognition. On average, approx. 120 people are recognised with this award globally. It is definitely something that Microsoft Partners can place a large emphasis on if they have people who hold it!

Generally over the last few years, the categories for being awarded have included:

  • Power Apps
  • Power Automate
  • Power BI
  • Dynamics 365 (CE)
  • Dynamics 365 (ERP)

Changes over the last few years have included the Power BI category being retired. This is to be expected, I guess, given that Microsoft programs tend to flex/pivot over time.

The process for application is simple. By this, I mean that nominees need to fill in a form (located at https://aka.ms/FTRSANomination). In this form, they then need to provide various pieces of information, such as their personal information, the partner that they work for (including the Microsoft Partner ID), as well as submitting proofs to show that they currently fulfil the necessary requirements for the program. These requirements can vary based on the technology, and over the last few years I’ve seen a few different versions (based on the year).

The form is usually open for around 3 months or so, opening at some point in October, and closing at some point in January.

Once submitted, the information is then sent for review to the relevant Microsoft team who oversee & run the program. There are several stages to the review:

  1. The team carry out an initial review of the information provided, ensuring that it meets the program requirements. Applicants who have not provided the information needed to meet the program requirements/criteria, or who do not pass the initial review threshold as evaluated by the team (this is why applicants are recommended to focus on the quality of the information being submitted), are not progressed, and are notified.
  2. Applicants who pass the first stage are then invited to an interview. This is carried out with one of the wider team members, based on region & availability. The interview usually lasts around one hour, and is an evaluation of the technical skills & expertise of the applicant. During this interview, candidates are required to present on a project that they have implemented, and to demonstrate their in-depth knowledge of it & the role that they played on the project.
  3. Finally, the team reviews the interviews, and decides which applicants have successfully shown their skills & expertise. Applicants who have not met the level required are notified, along with feedback on areas that they could look to work on for a future nomination.
  4. Successful applicants are also notified directly, though the news is not publicised until around May, when the public announcement takes place and the relevant FTRSA websites are updated with their information.

Business Contributions

Having taken a look at the nomination form for this year, there are some new changes coming in that will be quite important (in my opinion) to pay attention to. These are being referred to as ‘Business Contributions’. Specifically, applicants will not only need to demonstrate technical/project expertise, but will also need to demonstrate one or more business contributions.

Depending on the technical area being selected for the application (Power Apps or Dynamics 365), these are the areas that contributions can be submitted for:

Power Apps

  • Published Microsoft Customer Stories or Microsoft Partner Stories, or evidence of nomination to be published
  • Contribution of product feedback to engineering teams, advisory boards, focus groups, communication forums or private preview programs
  • Published technical samples (e.g. code snippets, data migration templates, integration samples, etc) in the PowerCAT GitHub channel
  • Proof of escalation reduction in customer implementations
  • Reference architecture article(s) used with a customer that leverage the Power Platform Well-Architected framework

Dynamics 365

  • Onboarded customer implementation project(s) in the Dynamics 365 implementation portal, leveraging Dynamics 365 guidance hub frameworks
  • Published Microsoft Customer Stories or Microsoft Partner Stories, or evidence of nomination to be published
  • Contribution of product feedback to engineering teams, advisory boards, focus groups, communication forums or private preview programs
  • Published technical samples (e.g. code snippets, data migration templates, integration samples, etc) in the Dynamics 365 guidance hub
  • Published contributions to the Business Process Guide Catalogue
  • Proof of escalation reduction in customer implementations (either partner led or FastTrack led implementation)
  • Submit additional reference architecture articles for review and potential publication

This is a significant change for the program – for the last 6 years, it's been purely about expertise demonstrated on client engagements. Now (in the 7th year, and I'd think very likely going forward), people considering nominating for FTRSA will need to prove that they're giving back to Microsoft in some way, beyond just running client engagements.

Overall, I think this is an interesting concept, and generally a good one. Let's face it – being able to talk about technology (at scale) is something quite a few people can do, but it doesn't mean that they're necessarily good at it. I know of several over-architected projects that I was brought in on where, just because lots of technology components were used, it didn't mean the project was going well. Part of the skillset of an experienced/knowledgeable architect is knowing when less is more!

Additionally, being technically competent is of course important, but personally I believe that being clear & communicative is also a very important part of the solution architect role. Essentially, having that functional view, as well as being able to engage appropriately with customers (as the owner of the project), is vital as well.

I also think that Microsoft wants to see that the program in which they're investing time, effort & resources (yes, FTRSAs get a wonderful SWAG box – THANK YOU TEAM!) is providing ROI back to Microsoft in terms of feedback, input & other information. This way products can (hopefully!) get better, product visions can be informed by customer information, and others can be helped as well.

Some people may say that this is becoming more like the Microsoft MVP program. Given how much MVPs are required to do, in terms of community (& Microsoft) engagement, I can understand the thought, but I really don't think that it's anywhere near that. My only note on this would be that I hope contributions remain business/technical focused, which to me seems in line with the stated goals of the program, rather than also including (other) community contributions.

Of course, there are those people who may choose not to do such things, and to just focus on the project/s that they're working on. This is a valid scenario, and there is of course absolutely NOTHING wrong with it. Not all of us may wish to engage with Microsoft engineering teams, or provide information publicly. And that's all fine. However, I would politely point out that nothing remains static, and if you're wanting to receive (or continue to receive) the FTRSA award, you may need to do some thinking around how you approach it, given the change that's come in this year.

I’d also encourage people who are considering applying for the FTRSA award recognition to reach out to an existing FTRSA, who could possibly help mentor, review & guide you. They’ve already been through the process and are recognised as such, and therefore have a pretty good idea of what ‘hits the bar’ and what may not.

So if you’re thinking of going for it – I wish you the best of luck!

Error in Customer Insights – Data

Not a long blog post, but something that may come in handy for some people!

I was recently playing around with Customer Insights – both the Data & Journeys sides of things – for a Proof of Concept I was creating for a customer. It's definitely interesting to see how Microsoft have been evolving the product over the last year or so (which was the last time I played around with it).

One of the components that we were specifically very interested to play around with is the 'Discovery' part of Customer Insights – Data. As shown in the screenshot below, this is where you're able to use natural language to query your data, and then get results using AI. This means that you don't have to understand any specific query language (SQL, R, M, etc.), but rather can just 'converse' with it as you would with another person.

You’ll perhaps note that there’s NO mention of Copilot here, though perhaps Microsoft may at some point decide to call this Copilot functionality as well?

The team had loaded in the data – we had a fair number of rows (multiple millions of them!) – and gone through the unification process, enrichment process, etc. All of this was set up & working properly.

However, when I tried to go to the ‘Discovery’ tab in my own browser, I was getting an extremely strange error:

As you can see, it’s incredibly informative…NOT!!! I mean, what does ‘Value cannot be null. (Parameter ‘key’)’ actually mean to the average person?

At first, I thought it was something to do with the underlying data, so went back to check that. However the data seemed fine. Furthermore, other people on the team were able to access Discovery in their own browsers without any issues.

Having no other option (turning it off & on again didn’t work), I raised a support ticket with Microsoft. This was responded to in a timely fashion, and I found myself working with Rohan, the Microsoft Support Representative.

In my initial ticket submission, I had included details of what was going on, what I had clicked on, the fact that others in the team didn't have the problem, the Organisation ID, URLs – you name it!

Rohan jumped on a call with me, and it turned out to be the shortest support session I have EVER had. He asked me to change the system language to another language (I had been using English, so decided to change it to German). Once the language change had been applied, we navigated back to the 'Discovery' tab (all in German, I may add), and when the screen loaded, there was no error! Holding our breath, we then changed the language back to English (more complicated than I had imagined, having to navigate in a different language).

Once this was done, and everything was back in familiar English, the ‘Discovery’ tab then loaded without issues (again!), and I was able to go ahead and start running queries in it using natural language. It was great!

In fact, it’s actually taken me longer to type out the above than the length of the support call – it was indeed that quick! Obviously lots of praise to Rohan (he did mention he had seen this once before, which is why he knew how to fix the issue).

The bigger question in my mind is what exactly was going wrong underneath – I have asked Microsoft about this, but haven't received a response. My guess is that something in the user/language settings hadn't been populated properly, and therefore resulted in that error message. Updating/changing the language forced it to populate properly, and it then worked.
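
If you ever want to check this theory in your own environment, the per-user language values live on the usersettings table in Dataverse, so they can be inspected via the Web API. Here's a minimal sketch, assuming you already have a bearer token for the environment (the org URL and token below are placeholders):

```python
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
TOKEN = "<bearer token obtained via MSAL or similar>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Accept": "application/json",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
}

# User settings live in the (oddly named) 'usersettingscollection' entity set.
# uilanguageid is the UI language LCID (1033 = English, 1031 = German), localeid the regional format.
resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/usersettingscollection"
    "?$select=systemuserid,uilanguageid,localeid,helplanguageid",
    headers=headers,
)
resp.raise_for_status()

for user in resp.json()["value"]:
    # A missing/zero language value here would fit the 'not populated properly' theory.
    print(user["systemuserid"], user.get("uilanguageid"), user.get("localeid"))
```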

Have you ever seen something like this, where changing a system setting (such as language) helped resolve an issue? I'd love to hear more about it – drop a comment below!

MB-280: Microsoft Dynamics 365 Customer Experience Analyst

It’s been a while since taking a Microsoft certification exam, but with the new MB-280 exam being launched in the last few days, I’ve obviously needed to take a look at it! It felt a little strange, as I’m now used to the certification renewal process (which is why I haven’t taken any exams in a while), but thankfully things went alright with the overall exam.

For those who haven’t been following the news, Microsoft made an announcement a few months back that some exams would be retiring, and the new MB-280 exam would be the replacement for this. In short, this is supposed to replace the MB-210 (Sales), MB-220 (Customer Insights – Journeys) & MB-260 (Customer Insights – Data). Malin Martnes wrote a good blog post in June – I’d suggest to take a look at it at for more general information around it.

Now I’m all up for new certifications being created & made available. However, and I know this could be considered controversial, I have ABSOLUTELY NO IDEA as to why this exam was created in THIS specific way. If an exam had been created, for example, to bring together the two sides of Customer Insights (ie to cover both Data & Journeys in a single exam), I think that would have been quite good.

But having taken this, my thoughts (& feedback to Microsoft directly) are that they should un-deprecate (if that's a word/phrase?) the MB-210 exam, and continue it forward. There's no reason that I can see for having Marketing & Sales together in a single exam – it feels like two (or technically three?) Lego bricks lumped together without any rhyme or reason.

The learning path for the exam was also launched in the last few days, and can be found at Study guide for Exam MB-280: Microsoft Dynamics 365 Customer Experience Analyst | Microsoft Learn

The official description of the exam is:

As a candidate for this exam, you’re a Microsoft Dynamics 365 customer experience analyst who has:

  • Participated in or plans to participate in Dynamics 365 Sales implementations.
  • An understanding of an organization’s sales process.
  • An understanding of the seller’s perspective (user experience).
  • The ability to demonstrate Dynamics 365 Customer Insights – Data and Customer Insights – Journeys capabilities.

You’re responsible for configuring, customizing, and expanding the functionality of Dynamics 365 Sales to create business solutions that support, automate, and accelerate the company’s sales process. You use your knowledge of customer experience capabilities in Dynamics 365 Sales and Microsoft Power Platform to inform the following design and implementation tasks:

  • Configure Dynamics 365 Sales standard and premium features.
  • Implement collaboration features.
  • Configure the security model.
  • Perform Dynamics 365 Sales customizations.
  • Extend Dynamics 365 Sales with Microsoft Power Platform.
  • Deploy the Dynamics 365 App for Outlook.

As a candidate, you need:

  • An understanding of the Dataverse security model and features, including business units, security roles, and row ownership and sharing.
  • Experience configuring model-driven apps in Microsoft Power Apps.
  • An understanding of accounts, contacts, and activities.
  • An understanding of leads and opportunities.
  • An understanding of the components of model-driven apps, including forms, views, charts, and dashboards.
  • An understanding of model-driven app personal settings.
  • Experience working with Dataverse solutions.
  • An understanding of Dataverse, including tables, columns, and relationships.
  • Familiarity with Power Automate cloud flow concepts, such as connectors, triggers, and actions.

More can be found at the exam page itself, which is located at Exam MB-280: Microsoft Dynamics 365 Customer Experience Analyst (beta) – Certifications | Microsoft Learn

Now, during my exam, I was looking forward to seeing the 'new' capability of being able to use Microsoft Learn during the exam (new to me – I haven't taken any other exams in the year or so since it was announced!). However, there didn't seem to be any capability to launch Microsoft Learn – I'm not sure why it wasn't available, as this isn't a Fundamentals-level exam.

Questions also used the older terminology rather than the newer/accepted terms – ie using 'field' instead of 'column', and 'entity' instead of 'table'. Again, I have no idea why this is – all other exams (including the renewals for them) are using these properly (in my summary below I have ensured I use the correct terms).

So, as I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change.

I’ve tried to group things as best together as I feel (in my recollection), to make it easier to revise.

  • Sales Apps
    • Configuring forms, columns & tables
    • Configuring security roles & access to records
    • Configuring relationships between records (including deletion properties)
    • Sales Mobile App – security & deployment
    • Forecasting – setting up & configuring
    • Configuring Goals
    • Configuring Opportunities
    • Handling currencies
  • Copilot for Sales
    • Setting up & deploying to users
    • Configuring access
  • Outlook App
    • Deploying & setting up
    • Configuring forms & information
  • Exchange
    • Connecting to mailboxes
    • Configuring folder permissions
    • Configuring multiple domains
  • Product Families & Catalogue
    • Creating & setting up
    • Configuring options
    • Adding items to be used
  • Price Lists
    • Creating & setting up
    • Configuring options, including discounts
    • Using time-restricted price lists
    • Handling currencies
  • Document Management
    • Different document management capabilities
    • Usage of SharePoint in different ways
  • Data Import
    • Usage of Power Query
    • Data manipulation
    • Handling duplicate records
  • SMS
    • Setting up & configuring SMS provider
  • Journeys
    • Different triggers to use based on scenarios & requirements
    • How to trigger journeys
    • How to set up emails to be used within a journey
  • Segments
    • Different types of segments
    • Creating & modifying segments
  • Searching/Filtering
    • Using Advanced Find
    • Setting up/modifying queries to include/exclude records based on conditions
  • Business Process Flows
    • Modifying business process flows
    • Handling conditions within business process flows

As a Sales exam, it seemed alright. But as mentioned above, the Customer Insights questions just seemed strange to me – I’d expect a consultant to be very technically skilled in Customer Insights, but not in Sales (& vice versa), so I’m not understanding bringing these two sides together.

I’m going to be quite interested in seeing how the exam is actually launched (as it’s currently in Beta of course). Having chatted with a few others who have taken the exam (whilst obviously respecting the NDA!), they also can’t really understand the landscape. Personally, I think that if it continues like this, Microsoft is going to hear quite a few complaints around it.

I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

New Power Platform Security Role Editor

We’ve all been there. Security role wise, that is. It’s the point in any project where we start looking at configuring user security. To do this, we’ve used the Security Role section in the Settings area (once it’s actually loaded, of course):

Ah, the joys of this – dating back to CRM 3.0 (to my recollection – though it possibly might be 4.0). All of these lovely little circles, which fill up more & more as we click on them, whilst trying to work out what each one actually does:

And that’s not to mention the ‘Missing Entities’ tab (did anyone ever figure out what this was supposed to be used for), or the ‘Custom Entities’ tab which seemed like a catch all place. Plus the fact that non-table permissions (eg Export to Excel) were placed on random tabs that meant we needed to hunt through each tab to find the appropriate item.

Now, many of us have spent hours in here (then further hours once we started troubleshooting user issues that were down to security role misconfiguration). The absolute 'JOYS' of the header title row not being scrollable (though it was possible to hover over each permission, and it would tell you what it was). The power of clicking on a line item, and seeing ALL of the little circles fill up – if you haven't ever done it, you'll not have experienced the bliss that this could bring!

But all things come to an end (or, as the Wheel of Time series says: 'The Wheel of Time turns, and Ages come and pass, leaving memories that become legend'), and now we have a NEW security role experience.

First of all, the UI has changed. It's cleaner, responsive, gives more information to users upfront… and the headers SCROLL!!!

We’re able to show just the tables that have permissions assigned to them (rather than wading through dozens or hundreds of entries that have no relevance), or show everything:

Oh, and those random non-table privileges that we had to try to find beforehand – these are now grouped very nicely. This is SO much easier to manage!

We can also take permissions that have been set on a specific table, and then copy them to another table (it prompts us to select – and we can select MULTIPLE tables to copy to!):

But best of all is the way that we can now set permissions for items. There are several different ways of doing this.

Firstly, Microsoft has now provided us with the ability to select standard pre-defined options. Using these will set permissions across all categories for the item appropriately:

This is really neat, and is likely to save quite a bit of time overall. However if we’re needing to tweak security permissions to custom settings, we can do this as well. Instead of clicking on circles, we now have lovely dropdowns to use:

In short, I’m absolutely loving this. The interface is quick to load, intuitive, and works well without fuss.

Given how much time I’ve spent over the years in wrestling with security roles, I think this is going to be a definite timesaver for so many people (though we’ll still need to troubleshoot interesting error messages at times that testing will throw up, and work out how/what we’re needing to tweak for security access to work).

There are still some tweaks that I think Microsoft could make to get this experience even better. My top three suggestions would include:

  • The ability to select multiple lines, and then set a permission across all of them (sort of like bulk editing)
  • Making this area solution-aware. When we have various different projects going on, it would be great to be able to filter the permissions grid by solution. This would be a timesaver, rather than having to wade through items that aren't relevant
  • Export to Excel. Having a report generated to save digitally or print off is amazing for documentation purposes. There are 3rd party tools (thank you XrmToolBox!), but it would be great for this to be built in here – see the sketch below for a stopgap
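
In the meantime, a rough workaround sketch for that documentation point: pull a role's privileges out via the Dataverse Web API and dump them to CSV. I'm assuming here that the RetrieveRolePrivilegesRole function and the privileges table return the shapes I'd expect, so treat this as a starting point rather than a finished tool:

```python
import csv
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
TOKEN = "<bearer token>"
ROLE_ID = "<security role GUID>"              # placeholder role id

headers = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"}
api = f"{ORG_URL}/api/data/v9.2"

# 1. Get the privileges (and their depths) assigned to the role.
role_privs = requests.get(
    f"{api}/roles({ROLE_ID})/Microsoft.Dynamics.CRM.RetrieveRolePrivilegesRole()",
    headers=headers,
).json()["RolePrivileges"]

# 2. Build a lookup of privilege id -> name (e.g. prvReadAccount).
privs = requests.get(f"{api}/privileges?$select=privilegeid,name", headers=headers).json()["value"]
names = {p["privilegeid"]: p["name"] for p in privs}

# 3. Write a simple CSV report that can be saved or printed for documentation.
with open("role_privileges.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Privilege", "Depth"])
    for rp in role_privs:
        # Depth reflects the access level (user / business unit / parent:child / organisation).
        writer.writerow([names.get(rp["PrivilegeId"], rp["PrivilegeId"]), rp["Depth"]])
```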

Overall, I’m really quite happy and impressed with it (it’s definitely taken enough time for Microsoft to pay attention to this, and get it out), and hope that it’ll continue to improve!

What have your bugbears with the legacy security editor been over the years, and how are you liking the new experience? Drop a comment below – I’d love to hear!

Active or inactive, that is the question?!?!

Catchy title, right? Well I was wondering what exactly I should use for this blog post, and as you’ll see as we go through things, this is probably quite a good paraphrase to use.

So, where to start? Well, with a customer, of course! Now, this customer has been running live with a custom Dynamics 365 solution for a little while. Importantly for this story, there have not been ANY releases in quite a few months. This is of course good to bear in mind, given that we can all, um, occasionally find that a release could cause an issue, somewhere, sometimes…

Part of the capabilities that they're using is bringing in Leads, and qualifying them appropriately. As part of this process, there are various custom attributes (aka columns) that have been added to the Lead table, along with corresponding columns added to the Contact table. There's also some custom logic that, when a lead is qualified, copies the values from the Lead to the Contact record, updating it (essentially extending the standard capabilities of the system).
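
As a side note, the copy step itself is conceptually simple. The sketch below is NOT the customer's actual implementation (theirs runs as custom logic within the platform) – it's just an illustrative Web API version of the same qualify-then-update pattern, using made-up column names (new_source, new_preference) and assuming the QualifyLead action response lists the records it created:

```python
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
TOKEN = "<bearer token>"
LEAD_ID = "<lead GUID>"                       # placeholder lead id

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
    "Accept": "application/json",
}
api = f"{ORG_URL}/api/data/v9.2"

# Read the custom columns off the lead (new_source / new_preference are hypothetical).
lead = requests.get(
    f"{api}/leads({LEAD_ID})?$select=new_source,new_preference", headers=headers
).json()

# Qualify the lead, asking the platform to create a contact (Status 3 = Qualified).
qualify = requests.post(
    f"{api}/leads({LEAD_ID})/Microsoft.Dynamics.CRM.QualifyLead",
    headers=headers,
    json={"CreateAccount": False, "CreateContact": True, "CreateOpportunity": False, "Status": 3},
).json()

# Pick the newly created contact out of the response.
contact_id = next(e["contactid"] for e in qualify["value"] if "contactid" in e)

# Copy the custom values across to the matching columns on the contact.
requests.patch(
    f"{api}/contacts({contact_id})",
    headers=headers,
    json={"new_source": lead.get("new_source"), "new_preference": lead.get("new_preference")},
)
```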

This has all been working well to date, and the customer team has been very happy with their system. Until it stopped working, last week. Which was strange, as nothing seemed to have changed at all?

When trying to qualify leads in the system, they were getting the following error message:

Cryptic, right? This seemed a little more interesting as well, given that when only basic information was entered on a Lead record (eg First Name, Last Name, Phone Number), it didn't matter how many leads existed with the same information – they qualified without a problem.

However, using any custom columns that had been added to the table caused this error to occur.

The first thing that I did was to check that there had been no updates released to Production. This was confirmed as being the case. I then also checked that there had been no OTHER solutions released to Production (as this could have impacted it). Thankfully there hadn't – the system looked to be in as fine a shape as it had been for a while.

OK – on to the next step. What updates have been released by Microsoft? Well, since we were able to pinpoint the date that the functionality had stopped working, we went to find the corresponding Learn article about the release (Update 22102 – Release Notes | Microsoft Learn). Don't worry about clicking through to read it – there's essentially not much in it, and there's nothing at all around the Lead table or its functionality!

Continuing to dig around, I really wasn't sure of what was causing this, but obviously had to work it out & figure out a fix! It was quite a dilemma.

This is where the amazing Microsoft community came into play. I noticed a post by Jeroen Scheper on one of the channels that I'm on. It turned out that he was having the same issues, so we started to collaborate on it. This both reassured me (that it wasn't just me), but also increased the confusion, as we couldn't work out what was going on underneath to cause this!

Raising it with Microsoft (we both actually raised support incidents), I had an amazing support call almost immediately. Demonstrating the problem, I was told that it was due to Duplicate Detection rules.

Now I’ll admit that this confused me somewhat. See, I had already checked the Duplicate Detection rules, but nothing had been changed, and no new rules had been implemented.

Getting the support agent to walk me through things, they told me that I had to unpublish the rules, modify a setting on them, and then re-publish the rules. This was the setting (on each one) that had to be updated:

This again caused me to be confused. Why was the system having issues with inactive records? Surely qualified leads are active records, but just qualified (& then being locked down as a result)?

Well, it turns out that my perspective of how this works is actually incorrect. As we (hopefully) all know, whilst all records have a Status value (eg Active, Inactive), there are some records that also have a Status Reason value.

In fact, the ‘State Code’ choice value in Dataverse is restricted (we can’t access it), and seems to have some quite interesting functionality running behind it. Depending on which table is accessed, there are different options available within it.

For example, the Lead table shows:

Whereas the Contact table shows:

And the Task table shows:

Anyhow – it turns out that when a Lead record is qualified or disqualified, though it's not obvious in the user interface, behind the scenes the record is actually being deactivated!

More information on this can be found at Qualify and convert leads to opportunity | Microsoft Learn.
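
If you want to check those statecode options (or the qualify behaviour) for yourself rather than relying on screenshots, they can be pulled from the table metadata via the Web API. A rough sketch – the metadata 'cast' syntax is fiddly, so verify it against the docs before relying on it:

```python
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
TOKEN = "<bearer token>"
headers = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"}
api = f"{ORG_URL}/api/data/v9.2"

def state_options(table: str):
    """Return the (value, label) pairs defined for a table's statecode column."""
    url = (
        f"{api}/EntityDefinitions(LogicalName='{table}')"
        "/Attributes(LogicalName='statecode')"
        "/Microsoft.Dynamics.CRM.StateAttributeMetadata"
        "?$select=LogicalName&$expand=OptionSet($select=Options)"
    )
    data = requests.get(url, headers=headers).json()
    return [
        (o["Value"], o["Label"]["UserLocalizedLabel"]["Label"])
        for o in data["OptionSet"]["Options"]
    ]

# Lead has Open/Qualified/Disqualified, contact has Active/Inactive, and task has its own set –
# which is why a qualified lead counts as a deactivated record.
for table in ("lead", "contact", "task"):
    print(table, state_options(table))
```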

So, this was the underlying reason behind the error message. Obviously Microsoft had updated something, which then caused this to fail. I don’t know how many different customers may have been (or still be?) experiencing the issue, but I think that the error message at least could be a little clearer? Perhaps including a link to the relevant Microsoft documentation page, for a start.
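
And if you hit the same thing with a lot of duplicate detection rules to cycle through, the unpublish / update / republish steps can be scripted against the Web API rather than clicked through one by one. A hedged sketch only – I'm assuming the setting in question maps to the rule's excludeinactiverecords column and that the publish/unpublish actions behave as documented, so test this in a sandbox first:

```python
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
TOKEN = "<bearer token>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
    "Accept": "application/json",
}
api = f"{ORG_URL}/api/data/v9.2"

# Fetch the duplicate detection rules (you may want to filter to just the published ones).
rules = requests.get(
    f"{api}/duplicaterules?$select=name,excludeinactiverecords", headers=headers
).json()["value"]

for rule in rules:
    rule_id = rule["duplicateruleid"]
    print(f"Re-publishing rule: {rule['name']}")

    # 1. Unpublish the rule.
    requests.post(
        f"{api}/duplicaterules({rule_id})/Microsoft.Dynamics.CRM.UnpublishDuplicateRule",
        headers=headers,
    )

    # 2. Update the setting (assumed column & value – match it to whatever your screenshot shows).
    requests.patch(
        f"{api}/duplicaterules({rule_id})",
        headers=headers,
        json={"excludeinactiverecords": True},
    )

    # 3. Publish the rule again (publishing runs as a background job, so allow it time to complete).
    requests.post(
        f"{api}/duplicaterules({rule_id})/Microsoft.Dynamics.CRM.PublishDuplicateRule",
        headers=headers,
    )
```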

Well, this was thankfully put to bed, and I was quite relieved (as was the customer). And this is how I came up with the title of this blog post!

Have you ever had something similar happen to you? Drop a comment below – I’d love to hear!

Power Platform Capacity Monitoring

If I look back at customer engagements around Power Platform over the last few years, whether for a new capability or an existing one, there was ONE thing that stood out above all. This was the ability to track capacity usage over time – and to be honest, most organisations weren't really doing very well at it.

For those who are unaware, there are actually three different types of capacity present within Power Platform environments. These are:

  • Data
  • File
  • Log

Each one is used for a specific purpose – broadly speaking, File holds all attachments that are uploaded directly into Dataverse, Log is used for auditing purposes, and Data holds everything else (hence the name)!

Now this data is shown within the Power Platform Admin Centre, under the 'Resources/Capacity' section. An example of this is:

There’s also a nice little breakdown of capacity allocation through licenses etc, which essentially shows where the available capacity has come from:

If we drill down a bit further, we can open up a specific environment, and see not only the overall usage per capacity type, but also which tables are consuming the most amount of data:

All of this is well & good so far, for someone wanting to take a look at what is currently happening. But this is a manual action – it is possible to manually export the data, but again, this isn’t automated.

It’s also not possible (at least not at this point in time) to query the underlying records that hold these values. So we’re a little stuck. If an organisation wanted to see historical data usage, and/or predict data trends (such as ‘how much capacity would we need to have in 6 months if we continued our scaling’), there’s no way to do this. At least not automatically – someone would need to store the values down manually, then report on it. A hassle, to say the least.

Now, when it comes to looking at Power Platform overall, the Centre of Excellence Starter Toolkit is really quite amazing. The Microsoft PowerCAT team continue to iterate on existing functionality within it, as well as bringing in new functionality.

At this point in time, however, it doesn’t have any capacity monitoring in it. Well, it sort of does – we can implement notifications to alert us when capacity reaches a certain value. But this doesn’t solve the challenge as laid out above.

So with this in mind, I set out to create a solution to handle it. I’ve always wanted to create some sort of tool for giving back to the community & helping others, and I saw this as my chance to do so (I’m in awe of the various XrmToolBox tool creators, for the record).

So, I’m releasing a capacity monitoring tool. I’m using GitHub as the host, and the repo can be accessed at https://github.com/thecrmninja/Power-Platform-Capacity-Monitoring (it was a learning experience as well as how to use GitHub as a source repository, as I’ve not done that before!).

Model-Driven App:

Reporting Dashboard:

This is just the first version – I have various ideas about how to iterate on it, and tweak functionality. Each release will include release notes & important information to be aware of (such as the security needed to run it). Also, importantly, thanks to the amazing Matt Collins-Jones for reviewing some of my work around this.

This tool is aimed at IT/Power Platform admins who are already familiar with the Microsoft CoE toolkit solution, and who have appropriate access to it.

If you find any issues, please raise an appropriate GitHub Issue item, and I’ll look into it. Also, if you have any ideas that you think could be worthwhile, please feel free to suggest them!

Finally, I’d be interested in hearing how you think this could support you or your organisation – feel free to drop a comment below!

Wave 2 2022 – Omnichannel

I know it’s taken me a week (or two) to get round to this, but I’ve had other things on the go (such as starting my new job, for instance). However it wouldn’t be this time of year without doing a summary of new features for the Wave 2 2022 release.

As with previous posts in this area, I’ll be focusing on the Customer Service side of things, and also more precisely with a focus on the Omnichannel capabilities. However, whilst previously I’ve tended to focus just on the Omnichannel items, Customer Service is now being much more tied together with the Omnichannel offering, so it makes sense to broaden things out a bit.

So let’s start taking a look at the wonders that will (hopefully!) be in store for us within a few months:

Customer Service Workspace – enhanced layout

Public Preview – August 2022. GA – October 2022

I’ve previously taken a look at some of the capabilities of the Customer Service Workspace (see Omnichannel vs Customer Service Workspace), and how they compare to Omnichannel. With Microsoft now rollowing out the ability to have multi-session capabilities within it, it’s sometimes a hard decision for organisations to decide which one to use (there are some key differences though).

With the upcoming release, there are going to be new layouts for the site map (navigation menu), sessions & tabs. Some of the key changes coming are:

  • Sessions and child tabs are displayed horizontally
  • Improved handling of overflowing tabs and sessions
  • Tab bar is visible only if multiple tabs are present in a session
  • Improved site map that’s accessed from the hamburger icon with support for grouping and areas
  • Improved accessibility with 400% zoom mode
  • Increased predictability of session closure in multisession apps
  • In-app notifications aligned with the multisession navigation

These look to be quite good (I definitely wouldn’t have thought of all of them!), and I can’t wait to try them out for myself.

Single sign-on capabilities

GA – October 2022

One of the things that can be quite frustrating for customers is that if they’re interacting through live chat capabilities, and then switch over to a Power Virtual Agent, they need to re-authenticate. This is of course not quite optimal for a seamless customer experience.

Microsoft are therefore enabling single sign-on capabilities. What this means in practice is the following:

  • Authentication contexts are shared between Power Virtual Agents and Omnichannel live chat sessions. If a user authenticates in one of them, then they become authenticated across all of the capabilities. There's no need to authenticate per communication type
  • Customers can start with an unauthenticated conversation, and then authenticate at a later point in the conversation. This will then continue as authenticated across the different channels that they’re communicating through

Voice channel – expansion of availability

GA – October 2022

The voice channel (which I still need to do a write up on!) is really amazing, allowing customers to call in directly via phone etc. It’s been rolled out already to several regions, but customers in other regions have been asking for it.

Microsoft has now confirmed that the voice channel will be available in the following countries:

  • United Kingdom
  • Canada
  • India
  • Switzerland

This is a great move – it still doesn't mean that every country has the voice channel available, so I expect that Microsoft will keep adding more countries over time (I know that there's a decent amount of back-end infrastructure needed, which is why it's taking this long to get in place).

Voicemails

GA – January 2023

This one is getting me really excited. Obviously, being able to connect to a customer service agent is important. But what if the agent isn’t around? We could of course send an email, but if we’re already connected through a specific method of contact, ideally we’d like to continue with that method.

Especially when it comes to actually calling into an organisation, it can be quite frustrating to not reach the person we’re trying to get hold of, and then need to send an email.

Voicemail capabilities, coming in early 2023, will mean that customers will be able to leave voicemails for customer service agents to pick up. The agents will be able to set up welcome messages, as well as manage & play back voicemails that have been left.

This is really cool – I'm wondering if there are going to be AI capabilities included in this in the future, so as to automatically transcribe voicemails for the agents, for instance. I don't think that it would take a LOT more technical capability – we already have Azure Cognitive Services that audio can be fed through for a written transcription to be produced.
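
As a rough illustration of how little extra would be needed, this is roughly what transcribing a saved voicemail recording with the Azure Speech SDK looks like today (to be clear, this is me speculating about the approach – it's not anything Microsoft has confirmed for the voicemail feature):

```python
import azure.cognitiveservices.speech as speechsdk

# Placeholder values – use your own Speech resource key/region and voicemail recording.
speech_config = speechsdk.SpeechConfig(subscription="<speech-key>", region="westeurope")
audio_config = speechsdk.audio.AudioConfig(filename="voicemail.wav")

recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config, audio_config=audio_config)

# Recognise a single utterance from the recording and print the transcript.
result = recognizer.recognize_once()
if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    print("Transcript:", result.text)
else:
    print("No speech recognised:", result.reason)
```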

Customer Callbacks

GA – January 2023

One of the frustrations that I think is shared universally is when contacting an organisation, and being told that you’re in a queue. Not only are you in a queue, but there may be dozens/hundreds/thousands of people ahead of you…and the number doesn’t seem to be decreasing at a rapid rate.

Some organisations offer the ability to 'reserve' your spot in the queue, and will call you back when you're next in line. To date, this hasn't been a feature of Omnichannel.

However, coming in early 2023, this feature will be rolling out! It will give customers the ability to keep their queue position, and to choose whether they'd like a callback when they reach the front of the queue. Note that this requires a phone number to be provided, for the customer service agent to use to contact the customer.

I think that this is a nice feature, but I will be curious to see how it plays out 'in the real world'. I know that when my local doctor's surgery implemented this, it was supposed to be great, but in practice it actually didn't work well.

I’ll be looking deeper into the different functionalities when they land, and will share them here. If there’s anything you think would be helpful to focus on, drop a comment & let me know!

Environment types, capabilities & backups

Interesting title to start a blog post with, right? I can't tell you how long I spent trying to work out what to call this, but then I figured that I'd just set out at a high level what I'm going to be talking about!

So let’s start at the beginning. Environments within Dataverse. An environment is essentially a container for all sorts of different components, such as data models, apps, code, etc.

Examples of what an environment can contain

Within the Power Platform, there are different types of environments. As a quick recap, currently we have the following:

  • Default. Every Power Platform tenant has a default environment. We of course shouldn’t be using this for any proper development!
  • Production. Used for any Line of Business application
  • Sandbox. A sandbox environment is any non-production Dataverse environment. Isolated from production, a sandbox environment is the place to safely develop and test application changes with low risk.
  • Trial. Used to take out a trial
  • Trial (Subscription Based). Used to take out a trial when there’s subscription licensing in place
  • Developer. Personal environment, limited to one user. Previously called the Community plan.
  • Teams. Used when an app is created within Teams, to use a Dataverse for Teams environment. Doesn’t have full Dataverse capabilities, and has various limitations
  • Support. Only able to be created by Microsoft support during a support case. Is essentially a clone of an existing environment, used for diagnosis purposes.

Now, sandbox & production environments are automatically backed up – backups occur continuously, using Azure SQL Databases underneath. It's also possible to create a manual backup of an environment, which usually takes a few seconds to carry out (restoring a backup, on the other hand, takes quite a bit longer…).

When restoring an environment, it’s not possible to restore to a production environment (though the backup could be from a production environment). It’s only possible to restore the backup to a sandbox environment – you’d then need to promote the environment from sandbox to production.

Let’s move away from backups for a moment. When we create an environment, we have the ability to select that the environment should be enabled for Dynamics 365:

This is actually a REALLY IMPORTANT CONSIDERATION! At this point in time, it's not possible to take an existing Power Platform Dataverse environment and later bring in Dynamics 365 capabilities. What this means is that if an organisation starts with just Power Apps, and then wants to expand into using Dynamics 365, IT'S NOT POSSIBLE TO DO THIS NATIVELY. Even Microsoft Support can't do anything around this – you'd need to create a new environment, enable it for Dynamics 365, and then restore a backup to it.

It’s something that a lot of us would like be in place, but we’re not sure if it’ll ever come about. This is a tweet of mine from 2019 that Charles Lamanna responded to (I was SO thrilled that he actually responded to me!!):

https://twitter.com/clamanna/status/1176629306484637696

However, it’s still not in place. As a result, we recommend to all clients that when they deploy a Dataverse environment, they toggle the switch above (Note: A Dynamics 365 license is NOT needed to toggle this). Once this has been toggled (without deploying any of the Dynamics 365 apps), the Dynamics 365 apps and functionality can be installed/deployed at a later point in time.

There are various capabilities, such as the Data Export Service (yes, I know it's now been deprecated), that actually relied on having the environment enabled as a Dynamics 365 environment in order to work. We found this out the hard way at a client, and had to do an overnight environment re-build to get the capabilities in place.

But there’s one other thing to consider around the differences between a native Dataverse environment, and an environment which has been enabled for Dynamics 365. This is around backups.

Now, backups are of course very important (thankfully they now occur automatically, as mentioned above – I remember my on-premise days of needing to run these manually!). But there are also some important differences when it comes to environment types. See, it turns out that environments aren't actually equal in backup behaviour. This is what actually happens:

  • Sandbox environments (all types) – backups retained for 7 days
  • Dataverse production environment (not enabled for Dynamics 365) – backups retained for 7 days
  • Dataverse production environment (enabled for Dynamics 365) – backups retained for 28 days

See that? Having Dynamics 365 enabled for an environment gives you FOUR TIMES as much backup retention time! That’s incredible!

Dataverse Environment enabled for Dynamics 365 – 28 days of backups available!

So not only are you able to upgrade to Dynamics 365 applications at a later date, you also have more peace of mind (hopefully you don't need to use it though!) around keeping backups for longer.

This is really cool – I hope it helps you plan your environment implementation strategy! Have you ever come up against issues when using environments, or the type/s of environment? Drop a comment below – I’d love to hear!

Sharing multiple knowledge articles

Most of us are aware of the knowledge article functionality within Dynamics 365. For those who aren’t familiar with it, knowledge articles can empower users within any organisation with access to existing information.

Types of knowledge articles can include solutions to common issues, product or feature documentation, answers to frequently asked questions (FAQs), product briefs, and more. Being able to have access to this means that customer service agents can easily answer queries, without needing to spend (lots of) time investigating what’s happening, and find resolutions.

Note: At this point in time, the Knowledge Article functionality is still a restricted table within Dataverse. It requires either a Dynamics 365 plan, Customer Engagement plan, or Customer Service Enterprise plan

It’s great to be able to share information around within an organisation. There is native functionality for this, with the ability to share a knowledge article record directly with other users by clicking the ‘Email a link’ button:

Note: This is performed by going to the Knowledge Article table, opening up a record, and carrying out the functionality from there. It cannot be done through access to Knowledge Articles on the Case form.

This will create an email (in Outlook) with a URL to the specific record:

This is of course very helpful, but is only internally facing. It’s not possible to send this to a customer who’s having an issue, as the customer wouldn’t be able to access the URL!

It’s also not particularly useful if we think about how customer service agents work, as they’d need to be moving through different areas of Dynamics 365.

Thankfully knowledge management is built into the customer service capability within the system. So for example, when we open a case record, we have the ability to search for knowledge articles directly in here:

This of course works much better from a customer service agent perspective – they have all of the functionality that they need in just one area.

So how can we share information directly with customers? Copying and pasting the information into a chat or email interaction seems quite manual and bothersome.

But there’s no need to do this manually, thankfully! Again, we have in-built functionality to handle this:

Clicking the little email icon on a knowledge article creates an email within Dynamics 365 (so you’ll need to have email enabled for users, to make this work!) with the information copied into it:

OK – this is much easier. We can send customers the exact information, so that they then have it to hand.

But here’s a new scenario – what if we wanted to send MULTIPLE knowledge articles to a customer at once? We could of course click the email icon each time, but that results in a separate email being created for each one, which means the customer will be receiving multiple emails. Not the most ideal scenario, surely?

Well, thanks to my amazing colleague Ryan Hunter-Stott, there is actually a way to do this! In fact, technically you could say that there are two approaches, but holistically it’s the same thing – it involves an email.

So, you can either:

  • Select to email the knowledge article to the customer, or
  • Create a blank new email

Within the email message, we have the following option to insert a knowledge article:

Clicking this brings up an interface to be able to search for knowledge articles. Clicking the envelope icon will then insert the information in the email:

Now it’s not possible to select multiple knowledge articles in this window. BUT, it IS possible to click the button to open it up again, and insert a second one. And then a third one!

This can then be sent out to the customer, with all of the information contained in it!
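
As an aside, if you ever needed to do the same thing outside of the UI (from an integration, say), the equivalent would be to pull the article content and build a single email record yourself. A hedged Web API sketch with placeholder IDs – check the activity party pattern against your own environment before using it:

```python
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
TOKEN = "<bearer token>"
headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
    "Accept": "application/json",
}
api = f"{ORG_URL}/api/data/v9.2"

article_ids = ["<knowledge article GUID 1>", "<knowledge article GUID 2>"]  # placeholders
contact_id = "<customer contact GUID>"                                      # placeholder

# Pull the title & content of each knowledge article.
sections = []
for ka_id in article_ids:
    ka = requests.get(
        f"{api}/knowledgearticles({ka_id})?$select=title,content", headers=headers
    ).json()
    sections.append(f"<h2>{ka['title']}</h2>{ka['content']}")

# Create ONE email containing all of the articles, addressed to the customer.
email = {
    "subject": "The information you asked about",
    "description": "<br/>".join(sections),
    "email_activity_parties": [
        # participationtypemask 2 = 'To' recipient
        {"partyid_contact@odata.bind": f"/contacts({contact_id})", "participationtypemask": 2}
    ],
}
requests.post(f"{api}/emails", headers=headers, json=email)
```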

It’s a nice little touch, and I think it definitely beats copying & pasting information into an email manually!

Have you ever thought about this scenario? Did you find this functionality, or end up doing it in a different way? Drop a comment below – I’d love to hear!