PL-500: Microsoft Power Automate RPA Developer

RPA (or Robotic Process Automation) is a capability that Microsoft has been developing for a while within the Power Platform space. Whilst cloud flows can be used to interact with any system that has an API in place, many organisations have (legacy) systems with no API, so interacting with them can be challenging. RPA capabilities allow organisations to interact with practically any system, thereby enabling & empowering businesses holistically.

I’ve been aware for a while that there’s been an exam coming out for RPA, though it’s taken a bit of time to land. That’s fine though – I can’t really think of any absolute rush to have it in place. I do think that over time, just as with some of the other certifications, it will become a requirement for solution or specialisation status.

The official page for it is at https://docs.microsoft.com/en-us/certifications/exams/pl-500. The specification for it is:

Candidates for this exam automate time-consuming and repetitive tasks by using Microsoft Power Automate. They review solution requirements, create process documentation, and design, develop, troubleshoot, and evaluate solutions.

Candidates work with business stakeholders to improve and automate business workflows. They collaborate with administrators to deploy solutions to production environments, and they support solutions.

Additionally, candidates should have experience with JSON, cloud flows and desktop flows, integrating solutions with REST and SOAP services, analyzing data by using Microsoft Excel, VBScript, Visual Basic for Applications (VBA), HTML, JavaScript, one or more programming languages, and the Microsoft Power Platform suite of tools (AI Builder, Power Apps, Dataverse, and Power Virtual Agents).

Now here’s the thing. I occasionally work in the automation space, either on customer projects, or when training users in the technologies. I wouldn’t describe myself as an advanced automation developer (whether cloud or RPA capabilities). I’m most definitely NOWHERE near the level of legends such as Matt Collins-Jones, for example (go check him out if you don’t know about him!).

So I knew that I may be a bit challenged when taking the exam, especially in the more ‘pro dev’ space (aka JSON etc). In fact, I didn’t actually realise that the exam specification included that sort of thing. I know, I should have – it’s aimed at developers overall…shows that I need to brush up on reading things properly!

Also, there’s still quite a bit of a focus on Power Automate cloud flows – it’s not JUST about RPA capabilities.

Now, really nicely, there are already Microsoft Learn pathways available (which have been around for a while, and updated appropriately). This really is a big help, I feel, especially for people who are new’ish to RPA.

Of course, there’s a lovely shiny two star badge awarded when passing the exam, along with the title of ‘Microsoft Certified: Power Automate RPA Developer Associate’:

As with previous exams, I sat it from home (the proctored experience). Learning from previous times that I’ve taken exams, I ensured that my workspace was entirely clear from everything. As a result, the check-in process happened automatically, and I didn’t need to engage with any proctors at all (which was quite nice actually).

As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) I’ve tried to group things together as best as possible for the different subject areas.

  • Cloud flows vs RPA flows
    • Capabilities of each
    • When to use each (ie how to handle different scenarios)
    • How to trigger each one
  • Cloud flows
    • Different types of triggers, & when each type should be used
    • Different types of actions, and the capabilities of them (at a high’ish level – expected to know common Microsoft actions, but not need to know all of the hundreds of different ones!)
    • Controls/operators. What they are, how they can be used to accomplish different requirements
    • JSON formatting & syntax
  • Business Process Flows vs Business Rules
    • What each is
    • When to use each one
    • Capabilities
  • RPA flows
    • Common actions, how they work, capabilities of them
    • How expression syntax works within them
    • Debugging capabilities, and what to use when
    • How to interact with desktop applications
    • How to interact with websites
      • How data values can be used
      • How data tables can be used
      • How to use data that’s extracted from a website
    • Troubleshooting functionality
  • Usage of automation capabilities from Office 365 applications such as Excel & Visio
  • Loops
    • How they work for cloud & RPA flows
    • Troubleshooting
    • Implementing success/fail criteria
    • Error handling
  • Process Advisor
    • What it is
    • What it does
    • How it can help organisations
    • Limitations
    • What it cannot do
    • Process Mining vs Task Mining, & the important differences between them
  • Variables
    • How to handle variables across different environments
    • How to declare them (cloud flow vs RPA flow)
  • Runtime operations
    • How flows are triggered (async vs sync)
    • How flows are queued (cloud vs RPA)
    • How RPA flows are carried out when using machine groups
  • Artificial Intelligence (AI) capabilities
    • How AI can be used within flows
    • Different AI capability types (what each one can be used for)
    • AI within Power Platform, & AI within Azure Cognitive Services
  • Sharing flows
    • Different ways to share cloud flows
    • Different ways to share RPA flows
  • Application Lifecycle Management (ALM)
    • Solutions (managed vs unmanaged). Capabilities of each, when to use each type
    • Azure DevOps (ADO). What it is, when/how to use it, capabilities
    • Solution imports
    • Solution layers. What these are, troubleshooting functionality
    • Upgrade/Stage for Upgrade/Update. What each is, what each does, how/when to use each one
    • Moving desktop flows between users
  • Security
    • Security roles needed to create flows
    • Security roles needed to share/modify flows
    • Security roles needed to register a machine for RPA
    • Security roles needed to register machine groups for RPA
    • Security requirements to run different types of RPA flows (how it interacts with desktop/s)
    • Data Loss Prevention (DLP) – how it affects creation & runtime of flows

Overall, I had 46 questions, with a single case study. I’m used to having at least two case studies, so it was nice to have just one of them this time.

So….it’s a lot of stuff. Definitely targeted much more at the ‘pro-developer’ end of the scale than someone who might occasionally automate things. It’s absolutely necessary to understand coding conventions, ALM, etc.

It’s definitely an exam where, if you’re not already hands-on with the skills needed, I’d highly recommend getting a decent amount of experience before taking it! Make sure that you have an environment in which you’re able to be hands-on with all types of automation (cloud & desktop flows), and that you really understand how they can be handled with an eye on the enterprise scale!

If you’re aiming to take it – I wish you the very best of luck, and let me know your experience!

Recognition as Microsoft Partner for Business Application Solutions

It’s been a little while since I last blogged about developing customer solutions and the Microsoft specialisations. Since I spoke about it last year (Apps & Microsoft Partner Specialisations) the landscape has moved on a little, and I thought that it would be good to take another look at it.

Currently in the Business Applications space, there’s a single specialisation. This is the ‘Microsoft Low Code Application Development Advanced Specialisation’, which is covered in detail at the Microsoft page for it (Microsoft Low Code Application Development Advanced Specialization).

In essence, this specialisation is aimed at partners who are developing Power Apps (yes, this is specifically aimed at Power Apps), and has been around for a year or so.

In order for Microsoft to track the qualifying metrics against this specialisation, it’s very important to carry out the PAL (Partner Admin Link) process. The details of how to do this are in my earlier post, which includes some of my thoughts at the time around how a partner should best implement the procedure.

Since then, my blog post has gained a good amount of traction, and several Microsoft partners have engaged with me directly to understand this better, and to implement the process into their project playbook. I’m really delighted at having been able to help others understand the process, and the reasoning behind it.

Now that’s all good for a partner who is staying in place at a customer. However there are multiple scenarios that can differ from this. Examples of this are:

  1. Multiple partners developing a single application together
  2. One partner handing over the application to a second partner for further development
  3. One partner implementing a solution, with a second partner providing support

Now, there’s really a single answer to all of the above scenarios, but it’s a matter of how to go about implementing this properly. Let me explain.

Originally, all developers would register PAL, and this would then be tracked through the environment cadence, and associated appropriately to the partner. This worked because the developers were the creators of the apps.

This has now changed a little bit. Microsoft now recognises PAL using both the Owner of the app, as well as any Co-Owners of the app. This is a little more subtle, so let’s go through it in some detail.

It is possible, of course, to change the owner of an app. More common, however, is the practice of adding co-owner/s to an app (I always recommend this as best practice actually, to remove key-person responsibility risks).

Note: Changing the actual owner of an app requires the usage of a PowerShell command
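
For reference, this is roughly what that looks like, using the Microsoft.PowerApps.Administration.PowerShell module & the Set-AdminPowerAppOwner cmdlet. Treat it as a minimal sketch – the GUIDs are placeholders, and it’s worth checking the current module documentation for the exact parameters:

```powershell
# Sketch only: reassign the owner of a canvas app (all GUIDs below are placeholders).
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell -Scope CurrentUser
Add-PowerAppsAccount   # sign in as a Power Platform admin

$params = @{
    AppName         = "<app guid>"
    EnvironmentName = "<environment guid>"
    AppOwner        = "<Azure AD object id of the new owner>"
}
Set-AdminPowerAppOwner @params
```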

So what happens now is that Microsoft will track the owners/co-owners of any app that’s deployed, and PAL association will flow through this. But there are a couple of caveats which it’s important to be very aware of!

  1. All owners/co-owners must have registered PAL with their user accounts (if using a service principal/service account as an owner, there’s a way of doing this using PowerShell – see the sketch after this list)
  2. Microsoft will recognise the LATEST owner/co-owner association with the app as the partner organisation that will receive PAL recognition
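
As a rough outline of that PowerShell route, PAL registration is handled through the Az.ManagementPartner module. The tenant details, credentials & Partner ID below are placeholders, so treat this as a sketch rather than a definitive script:

```powershell
# Rough sketch: register PAL for a service principal (placeholders throughout).
Install-Module -Name Az.ManagementPartner -Scope CurrentUser

# Sign in as the service principal / service account that owns or co-owns the app
$cred = Get-Credential   # application (client) ID as username, client secret as password
Connect-AzAccount -ServicePrincipal -Tenant "<tenant id>" -Credential $cred

# Link this identity to the partner's Partner ID
New-AzManagementPartner -PartnerId "1234567"

# Verify (or later update/remove) the association
Get-AzManagementPartner
```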

Now if a customer adds co-owners to an app, this shouldn’t be an issue (as none of the users would have registered PAL). But if there are multiple partners in place, ONLY THE LATEST ONE WILL BE RECOGNISED.

Therefore to take the three scenarios above, let’s see how this would apply.

  1. Multiple partners developing a single app. Recognition would not work for all partners involved, just the latest one to associate with the app
  2. Partner 1 handing over app to Partner 2. Recognition would stop for Partner 1, and would then start for Partner 2
  3. Partner 1 implementing solution, Partner 2 providing support. Care would need to be taken that the appropriate partner is associated as owner/co-owner to the app, for PAL recognition.

It’s also important for both partners & customers to understand this, in the wider context of being careful about app ownership, and the recognition that it brings from Microsoft for partners delivering solutions. If a partner were to go into a customer and suddenly start taking ownership of apps that it’s not involved in, I don’t think that Microsoft would be very approving of it.

Now, all of the above is in relation to Power Apps specifically, as I’ve noted. However, the PAL article was updated last week (located at Link a partner ID to your Power Platform and Dynamics Customer Insights accounts with your Azure credentials | Microsoft Docs) and also interestingly talks about:

Note the differences between each item

Reading between the lines here, I think that we’re going to be seeing more advanced specialisations coming out at some point. Either that, or partner status will start to include these as well, as I can’t think of any other reason why PAL would need to be tracked for them! I’m also wondering if other capabilities (eg Power Virtual Agents, Power Pages, etc) will be added at some point too…

Have you had any challenges with the PAL process? Is there anything more you’d like to find out about it? Drop a comment below, and I’ll do my best to respond!

New Platform DLP Capabilities

DLP (or Data Loss Prevention) is a very important capability in the Power Platform. Being able to bring together multiple data sources, both within the Microsoft technology stack as well as from other providers, gives users amazing capabilities.

However, with such great capabilities comes great responsibility. Of course, we trust users to make proper judgements as to how different data sources can be used together. But certain industries require proper auditing around this, and so being able to specify DLP policies is extremely important to any governance team.

Being able to set how data connectors can be used together (or, in the reverse, not used together) across both Power Apps as well as Power Automate flows is imperative in any modern organisation.

To date, Power Platform DLP capabilities have allowed us to categorise connectors (whether Microsoft-provided or custom) into three categories. These categories specify how the connectors are able to function – they’re able to work with other connectors that are in the same category group, but cannot work with connectors that are in a different category group.

So for example, it’s been possible to allow a user to create a Power App or a Power Automate flow that interacts with data from Dataverse, but cannot interact with Twitter (in the same app or flow).

With this approach, it’s possible to create multiple DLP policies, and ‘layer’ them as needed (much like baking a 7 layer cake!) to give the functionality required per environment (or also at the tenant level).

Now this has been great, but what has been missing is the ability to be more granular in the approach. What about if we need to read data from Twitter, but don’t want to allow pushing data out to it?

Well, Microsoft has now iterated on the DLP functionality available! It’s important to note that this is per connector, and will depend on the capabilities of the connector. What we’re now able to do is control the specific actions that are contained within a connector, and either allow or block them from being used.

Let’s take the Twitter connector as an example:

We’re able to see all of the actions that the connector is capable of (the scroll bar on the side is a nice touch for connectors that have too many actions to fit on a single screen!). We’re then able to toggle each one to either allow or disallow it.

What’s also really nice are the options for new connector capabilities.

This follows in the footsteps of handling connectors overall – we’re able to specify which grouping they should come under (ie Business, Non-Business, or Blocked). As new connectors are released by Microsoft, we don’t need to worry that users will automatically get access to them.

So too with new actions being released for existing connectors (that we’ve already classified). We’re able to set whether we want them to be automatically allowed, or automatically blocked. This means that we don’t need to worry that a new connector action will suddenly become available that users perhaps should not be using.

From my perspective, I think that any organisation that’s blocking one or more action capabilities for a connector will want new actions to be blocked by default, just to ensure that everything remains secure until they confirm whether each action should be allowed or not.
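
If you’d rather script that posture, the Power Platform admin PowerShell module also exposes connector action rules on DLP policies. The sketch below is an outline only – the cmdlet name, the payload property names & the Twitter action ID are assumptions from memory, so verify them against the Microsoft.PowerApps.Administration.PowerShell documentation before relying on it:

```powershell
# Outline only: deny-by-default action rules for one connector on an existing DLP policy.
# The cmdlet/property names & the action ID are assumptions - verify against the module docs.
Add-PowerAppsAccount   # sign in as a Power Platform admin

$connectorConfigurations = @{
    connectorActionConfigurations = @(
        @{
            connectorId                        = "/providers/Microsoft.PowerApps/apis/shared_twitter"
            defaultConnectorActionRuleBehavior = "Deny"    # block future/unlisted actions
            actionRules                        = @(
                @{ actionId = "SearchTweets"; behavior = "Allow" }   # hypothetical action ID
            )
        }
    )
}

New-PowerAppDlpPolicyConnectorConfigurations -TenantId "<tenant id>" `
    -PolicyName "<DLP policy guid>" `
    -NewDlpPolicyConnectorConfigurations $connectorConfigurations
```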

So I’m really pleased about this. The question did cross my mind as to whether it would be nice to be able to specify this on a per environment basis when creating a tenant-level policy, but I guess that this would be handled by creating multiple policies. The only issue I could see around this would be the number of policies that could need to be handled, and ensuring that they’re named properly!

Have you ever wanted these capabilities? How have you managed until now, and how do you think you’ll roll this out going forward? Drop a comment below – I’d love to hear!

Environment types, capabilities & backups

Interesting title to start a blog post with, right? I can’t tell you how much I tried to work out what to call this, but then I figured that I’d just put at a high level what I’m going to be talking about!

So let’s start at the beginning. Environments within Dataverse. An environment is essentially a container for all sorts of different components, such as data models, apps, code, etc.

Examples of what an environment can contain

Within the Power Platform, there are different types of environments. As a quick recap, currently we have the following:

  • Default. Every Power Platform tenant has a default environment. We of course shouldn’t be using this for any proper development!
  • Production. Used for any Line of Business application
  • Sandbox. A sandbox environment is any non-production Dataverse environment. Isolated from production, a sandbox environment is the place to safely develop and test application changes with low risk.
  • Trial. Used to take out a trial
  • Trial (Subscription Based). Used to take out a trial when there’s subscription licensing in place
  • Developer. Personal environment, limited to one user. Previously called the Community plan.
  • Teams. Used when an app is created within Teams, to use a Dataverse for Teams environment. Doesn’t have full Dataverse capabilities, and has various limitations
  • Support. Only able to be created by Microsoft support during a support case. Is essentially a clone of an existing environment, used for diagnosis purposes.
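
As a quick way of checking what’s actually in a tenant, the admin PowerShell module can list environments along with their type. A minimal sketch (the property names selected are from memory, so do check them against the actual output of the cmdlet):

```powershell
# Minimal sketch: list all environments in the tenant, with their type & default flag.
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell -Scope CurrentUser
Add-PowerAppsAccount   # sign in as a Power Platform admin

Get-AdminPowerAppEnvironment |
    Select-Object DisplayName, EnvironmentName, EnvironmentType, IsDefault |
    Sort-Object DisplayName
```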

Now, sandbox & production environments are automatically backed up – backups occur continuously, using Azure SQL Databases underneath. It’s also possible to create a manual backup instance of an environment as well, which usually takes a few seconds to carry out (restoring a backup, on the other hand, takes quite a bit longer…).

When restoring an environment, it’s not possible to restore to a production environment (though the backup could be from a production environment). It’s only possible to restore the backup to a sandbox environment – you’d then need to promote the environment from sandbox to production.

Let’s move away from backups for a moment. When we create an environment, we have the ability to select that the environment should be enabled for Dynamics 365:

This is actually a REALLY IMPORTANT CONSIDERATION! At this point in time, it’s not possible to take an existing Power Platform Dataverse environment and update it to bring in Dynamics 365 capabilities. What this means is that if an organisation starts with just Power Apps, and then wants to expand into using Dynamics 365, IT’S NOT POSSIBLE TO DO THIS NATIVELY. Even Microsoft Support can’t do anything around this – you’d need to create a new environment, enable it for Dynamics 365, and then restore a backup to it.

It’s something that a lot of us would like to be in place, but we’re not sure if it’ll ever come about. This is a tweet of mine from 2019 that Charles Lamanna responded to (I was SO thrilled that he actually responded to me!!):

https://twitter.com/clamanna/status/1176629306484637696

However, it’s still not in place. As a result, we recommend to all clients that when they deploy a Dataverse environment, they toggle the switch above (Note: A Dynamics 365 license is NOT needed to toggle this). Once this has been toggled (without deploying any of the Dynamics 365 apps), the Dynamics 365 apps and functionality can be installed/deployed at a later point in time.

There are various capabilities, such as the Data Export Service (yes, I know it’s now been deprecated), that relied on having the environment enabled for Dynamics 365 in order to work. We found this out the hard way at a client, and had to do an overnight environment rebuild to get the capabilities in place.

But there’s one other thing to consider around the differences between a native Dataverse environment, and an environment which has been enabled for Dynamics 365. This is around backups.

Now, backups are of course very important (thankfully they now occur automatically, as mentioned above – I remember my on-premise days, when I needed to run these manually!). But there are also some important differences in backup behaviour when it comes to environment types. See, it turns out that environments aren’t actually equal in this regard. This is what actually happens:

  • Sandbox environments (all types) – backups retained for 7 days
  • Dataverse production environment (not enabled for Dynamics 365) – backups retained for 7 days
  • Dataverse production environment (enabled for Dynamics 365) – backups retained for 28 days

See that? Having Dynamics 365 enabled for an environment gives you FOUR TIMES as much backup retention time! That’s incredible!

Dataverse Environment enabled for Dynamics 365 – 28 days of backups available!

So not only are you able to upgrade to Dynamics 365 applications at a later date, you also have more peace of mind (hopefully you won’t need to use it though!) around keeping backups for longer.

This is really cool – I hope it helps you plan your environment implementation strategy! Have you ever come up against issues when using environments, or the type/s of environment? Drop a comment below – I’d love to hear!

Staying up to date with release information

Microsoft releasing new functionality can be an interesting experience, to say the least. With a cloud platform (SaaS – Software as a Service), functionality is released all the time. A user could log off on Friday for the weekend, and come back on Monday morning to find that something has changed slightly, or a new button is present in the interface. Over time, most of us have come to accept this.

However this is for the ‘smaller’ functionality parts within the system, whether that’s Dynamics 365 or Power Platform related. There are of course two MAIN release announcements each year. These are the Wave 1 (Spring) and Wave 2 (Autumn) release windows, with information announced publicly about what is included in each one. This information usually starts to be available around 4-6 weeks or so before the release starts to hit.

Now that’s not to say that everything within a Wave release is released in a ‘Big Bang’ moment. Far from it actually, based on my experience. Microsoft will announce what is coming as part of the Wave release, along with projected timeframes as to when it will be available. Obviously, just because something has been announced for Day X doesn’t mean that it actually happens then, at least some of the time.

But there’s an inherent time-sink to being on top of all of this information. Firstly, people need to download the Wave release information (there’s one for Dynamics 365, and a second one for Power Platform), wade through all of the information, and somehow then remember it. Let’s just say that this can be challenging for a lot of people…

But what if there was somewhere we could track all of this? Well, until now there hasn’t been.

Microsoft have created & made available the ‘Dynamics 365 & Power Platform Release Planner’, which can be found at https://experience.dynamics.com/releaseplans:

So just as a start, this is already MUCH better than the downloadable PDF documents for wave release information (admittedly the information is also available online as a Microsoft document, but still it’s lacking in certain areas).

But there’s more to this functionality than simply presenting a list of areas. Let’s take a look into some of these.

To begin with, there’s the sitemap on the left hand side. This allows us to select a specific area of interest, whether it’s Dynamics 365 or Power Platform (amusingly this reminds me a little of a model-driven app!).

Once in an area, we can then select between Planned features, Coming Soon features, and Try Now features by using the options in the menu bar. This is a nice little piece of functionality, in my opinion, allowing us to see what falls under each ‘category’:

By default, the items are displayed in a list format. However, we’re also able to toggle the view from the menu bar to a release date format, which shows us all items grouped by release month:

There’s also some filtering functionality, allowing us to narrow down the results even further:

Opening a line item (regardless of whether it’s being displayed as a list, or arranged by date) will give further information around the specific item. It also includes a lovely little timeline widget, showing the release date information, as well as where it’s actually up to currently (which I think is great to have as a visual reference!):

In here, links are included to documentation around the release overview, as well as specific documentation around the selected functionality item.

Now if this was all that there was, I think that truthfully I would be quite satisfied. It’s a much more modern interface, and really looks nice. I know that various colleagues of mine would be quite satisfied as well.

But….it doesn’t stop there. There’s something else, which is really the cherry on top of the cake! So what is it? Well, it’s the ability to create a PERSONALISED release plan overview.

So on each item of functionality, there’s a button called ‘+ To my plan’:

Note: You do need to be signed into the portal to have this option available to you

Clicking this will add it to a personalised release plan, which you can access from the left-side menu. Here, all of the items that you’ve selected will show up. This is really cool, I think, as it allows you to see the overall picture, but also then focus on just the areas that you’re interested in:

It’s still got all of the functionality available for filtering, date/item sorting, etc. It’s also possible to toggle back to the ‘main’ view of all release information.

So in summary, I think that this is really cool. Admittedly (as it says on the site), it’s in BETA currently. I’m hoping that it’ll stick around, and come out of Beta pretty soon! Regardless, I’m definitely starting to make use of this already in tracking the upcoming features that I’m interested in.

Updates to the Power Platform Admin Center

There’s a saying amongst seasoned IT professionals who deal with Microsoft software. It goes something like this – ‘Why make do with one admin centre, when you could just have MULTIPLE admin centres to carry out functions!’.

It’s a bit of a tongue-in-cheek response to the numerous different admin centres that Microsoft technology seems to have. Now, I/we totally understand that over time, different (standalone) products have come together to co-exist, but their administration centres still differ.

Over time, Microsoft has been applying efforts to make them work better together, but it can still sometimes be quite frustrating not to know exactly where to go to in order to carry out specific function/s, or not to be able to see capabilities holistically overall in a single place.

So for example, we have:

  • Microsoft 365 Admin Centre
  • Power BI Admin Centre
  • Power Platform Admin Centre (which, for Dynamics 365 deployments, still leads users to the Classic Advanced Settings for some of the functionality…)
  • etc….

Now when it comes to Power Platform related items, admins would usually go to the Power Platform Admin Centre (which, though it has a URL of admin.powerplatform.com, auto-resolves to admin.powerplatform.microsoft.com – I have no idea why this is, given that no other admin centre seems to have this structure in place….another mystery…)

From here, we’d be presented with a list of environments, similar to the screenshot below:

The menu on the left-hand side gave us a few of the different admin centres that we could switch to. Alternatively, we could expand the overall menu to show us more capabilities, including other apps that we might wish to access:

So this is what we’ve been used to for the last few years. Essentially, information in different areas, and we’d need to go to each admin centre to find out what’s happening. So for example, if a Power Platform Admin user wanted to see any health advisories, they’d need to go to the Microsoft 365 Admin Centre to view the Service Health area there.

Not anymore! As part of the focus on unifying information across admin centres, Microsoft has now updated the functionality for this!

Now, with the new functionality, there’s a Home screen. On this, information can be presented to users, and it’s also possible to apply one of several themes to the interface, such as a rainbow:

Now, in terms of information available to users, these are presented as ‘cards’. Within each card, information is shown, based on the card type:

At the moment, there are three cards to choose from:

Service Health

This section outlines any service health issues, such as outages or advisory information that users should be aware of. Clicking through it will bring users to the Service Health section of the Microsoft 365 Admin Centre:

From here, users can choose to switch across to other categories, such as Incidents, History & Reported Issues.

It’s (at least) one less click from the previous method, and I’m quite liking this. In my mind, it’s about making the information as accessible as possible (leaving aside that I think that Power Platform specific alerts should actually show within the Power Platform Admin Centre…)

Message Center

The second section is the Message Center. Here we’re able to see specific messages (yes, I know I have a LOT of messages sitting here!), and clicking on them will bring up the corresponding information directly within the same interface (which again, I’m really liking). So for example:

Nicely for messages, we also have options to filter the types of services that we want to see here. This, in my mind, is quite important, as we wouldn’t want Power Platform admins to be overwhelmed by messages that have absolutely no (usual) interest for them:

We also have the ability to specify which email notifications we want to be receiving. Again, we may be interested in some non-Power Platform notifications, but not want to see them directly within the Power Platform Admin Centre. Instead, we can specify to receive these via email – another nice touch!

Documentation

Finally, we have links out to various Power Platform (& Dynamics 365) related resources on the Microsoft website. These are all static (ie they’re provided by Microsoft), but hopefully in the future admins will have the capability to add custom links to other resources as well.

What is nice about the documentation section though is that it includes links to the various Community forums. Microsoft has recently started to promote these within the products, and they can be a very helpful resource at times!

There are also links to the Microsoft Centre of Excellence toolkit, which is a great resource that organisations should look to implement.

All in all, I think that this is a VERY good start to things. I’m hopeful that with Microsoft implementing this ‘home screen’ functionality with the ability to add cards to it, there will be additional cards that are released, bringing more information & functionality into the interface. I’m also hopeful that Microsoft will allow admins to add custom functionality here as well.

It’s a good first step – now let’s wait to see how this functionality iterates over time, and hopefully enables admin users in better ways!

Searching tables within the Modern Advanced Find

Well for a start, I know that the title of this blog post is somewhat of a mouthful. It’s definitely longer than my usual titles! However I felt it important to do so, given the functionality that I’m actually going to talk about…

So here goes!

As part of the Wave 1 2022 release, both for Power Platform as well as Dynamics 365, we have the new ‘Modern Advanced Find’ capability. This replaces the (legacy) Advanced Find interface, which has been around since almost the beginning of Microsoft CRM…that’s quite a few years!

So within a model-driven app (as this covers both Power Apps as well as Dynamics 365), the classic Advanced Find was a good friend. Though it used the legacy interface (& was sometimes VERY slow to load initially), we could create powerful queries through it. Being able to specify conditions and span multiple tables (whilst needing to understand the data model), we were able to show & filter data as we needed to.

When loading the Advanced Find interface, we could select from any of the tables within the system, with a LONG list presented to us for this purpose:

Now, just because we could see all tables (system & custom) within the list didn’t mean we could view all data within the tables. Oh no – the security roles applied to users limited what we could do.

In fact, users having security roles with NO permissions on certain tables would NOT see those tables appearing in the Advanced Find interface. Even when users had permissions on tables, but these permissions were limited (such as only being able to view their own records), the data results would be filtered based on their security role access to the records within the table.

OK – all good so far. Well, in general – there have been various complaints over the years about the Advanced Find functionality. So finally, Microsoft updated it to the ‘Modern Advanced Find’.

This needs to be enabled by a system administrator in the environment settings:

So in order to access the Modern Advanced Find, we need to do the following:

  1. Click in the search box at the top of the screen
  2. At the bottom, click the ‘Search for rows in a table using advanced filters’ (that’s a mouthful as well, isn’t it!)

After clicking this, we then get presented with the following interface:

Once we select a table (we can only select one table, as this will be the primary table used), we then switch screens to set the filters that we want to use:

Now here’s where things got a little strange. On the filter screen, we can select tables related to the primary table (ie connected through a relationship), and we get EVERY table that’s available for this. So if we’re starting with the Accounts table, we can then select from the following:

So in this list, I can see tables such as Emails, Invoices, and various others as well. In fact, it’s actually a very extensive list (limited, of course, to all tables that have a relationship in place with the Accounts table, and which the user has access to through their security role).

But if I look back at the initial list of tables, I’m MUCH more limited in my choice:

This, to me, was quite confusing. After all, what if I wanted to start the search from a different table – one that isn’t shown in this initial list?

So I started doing some digging. Initially, I thought that these tables are the ones defined in the sitemap (ie the app navigation). This could mean that I’d need to somehow create a section that shows all tables within it, just to be able to have them searchable.

Thankfully, it turns out that this isn’t actually the case. What’s happening is that with the new Modern Advanced Find, tables need to be directly associated to the APP in order to show up and be usable for search purposes.

Actually, there’s some more granularity around this. The tables available to search on (as the primary table) need to meet ALL of the following criteria:

  1. Table is part of the model-driven app
  2. Table is enabled for unified interface
  3. Table is valid for advanced find (set on the table settings)
  4. User has read access to the table (handled through security roles)

So essentially, the ability to search tables within an app is now limited to the tables that have been associated to the app itself! This could be very helpful in various scenarios, when users can be quite confused with seeing the entire list of tables.

To do this, we’d edit the app, and add the table to the list of tables available through the app designer (note – we don’t have to include them in the sitemap, if we don’t want to display them in the app navigation):

So this now makes sense, and I think it’s a good step forward.

Also thanks to my colleague Bill (who’s an AMAZING Customer Success Manager!) for his collaboration on this.

What are your thoughts on the Modern Advanced Find? Are you finding it better for functionality? Is there something that you feel is missing, or that you’d like to see in it? Drop a comment below – I’d love to hear!

Calculated columns not working with data migration

Interesting title, isn’t it? I thought I’d do something that might grab people’s attention, and this was the best that I could come up with! So, let’s get into the scenario, the issue experienced, and how we managed to resolve it.

The scenario on this project was as follows. We’ve been implementing a customer service solution for a sales company that manufactures multiple products, under multiple brands. Currently there are multiple systems used for order entry, which at some point will be moved to a single system.

However, for the moment they want to be able to carry out holistic customer service across all brands – enabling all customer service agents to have access to the same data, and customers to be serviced in the same way regardless of brand, etc.


As a result, Dynamics 365 Customer Service was the ticket, and it has many standard capabilities that address the needs of the customer.

Now, whilst sales (aka orders) will not be handled within Dynamics 365 itself, we didn’t want the customer service agents to have to look up order information in the ordering systems. Instead, we wanted to be able to bring the sales/order information into Dynamics 365 for reference (at some point it’s likely that the customer will actually use Dynamics 365 capabilities for sales as well).

In order to do this, we’ve had some amazing data architects bringing the data together into Azure Data Factory (ADF) from the multiple order systems, and then pushing the data into Dynamics 365 (users have a read-only view of it).

In bringing the data in, we were looking to capitalise on the native functionality of Dynamics 365, namely the ability for columns to be automatically calculated. An example of this would be bringing in the order line amount and the tax amount, and then having the total order line amount automatically calculated. This is standard system functionality, and when working in Dynamics 365, it has many different uses across the system.

Now, it’s important to note here that as we’re not actually handling orders within Dynamics 365, we’re also not holding a ‘proper’ product list within Dynamics 365 itself. However, orders need to show product information on them (bit useless otherwise!), so we’re using the capability of ‘write-in products’.

Note: If you haven’t come across write-in products before, it’s actually a really great item. Essentially, it allows products to be entered for opportunities, quotes, orders etc (wherever products are used), but for when the product/s aren’t in the system product catalogue. Write-in products allow you to simply type the name of a product or service, & then type in the price. This is very useful if, for instance, a product isn’t yet available in the product catalogue, but you still want to be able to quote it. In our scenario, we’re using write-in products to avoid the need to manage the product catalogue itself. It’s also helpful for when you don’t want to use price lists, as all products need to be associated to a price list.

So we start off the data migration, and it’s looking good. No issues being reported by the integration…

But then users go into the UAT system to check through things, and find that when looking at orders, the totals aren’t being calculated:

Order line not calculating
Order not calculating either!

Hmm. That’s strange. So we started to look at what could have caused this problem…

  • Is the environment in ‘admin mode’? If an environment is in admin mode, then auto-calculations won’t work at all. Well, the environment wasn’t in admin mode, so it wasn’t that
  • Is there a plugin not firing correctly? Well, this is native Microsoft standard functionality within the platform, so unlikely, but we double-checked to make sure. No, there wasn’t anything causing issues in that dimension
  • Does it work for users, when it’s created manually within the system? Yes, it DOES work when users enter an order/order line with a product. Hmm…this was getting VERY confusing

For clarification, we didn’t want to auto-calculate the information within ADF, and then push it into the relevant Dynamics 365 columns. We wanted to be able to rely on the system working in the way that it should!

Finally, we found out why the calculated columns weren’t working. There’s actually a system setting that governs how this works:

With this set, the auto calculations are now working in the system:

So, thankfully we managed to get this working, and everything went smoothly from that point.

Have you ever been caught out by something similar? I’d love to hear – please drop a comment below!

Security Roles & Assigning Records

Let’s face it, and call a spade a spade (or a shovel, depending on where in the world you happen to be). Security roles are very important within Dataverse, to control what users can (& can’t!) do within the system. Setting them up can be quite time-consuming, and troubleshooting them can sometimes be a bit of a nightmare.

Obviously we need to ensure that users can carry out the actions that they’re supposed to do, and stop them doing any actions that they’re not supposed to do. This, believe it or not, is generally common sense (which can be lacking at times, I’ll admit).

Depending on the size of the organisation, and of course the project, the number of security roles can range from a few, to a LOT!

Testing out security can take quite a bit of time, to ensure that testing covers all necessary functionality. It’s a very granular approach, and can often feel like opening a door, to then find another closed door behind the first one. Error messages appear, a resolution is implemented, then another appears, etc…

Most of us aren’t new to this, and understand that it’s vitally important to work through these. We’ve seen lots of different errors over our lifetime of projects, and can usually identify (quickly) what’s going on, and what we need to resolve.

Last week, however, I had something new occur, that I’ve never seen before. I therefore thought it might be good to talk about it, so that if it happens to others, they’ll know how to handle it!

The scenario is as follows:

  • The client is using Leads to capture initial information (we’re not using Opportunities, but that’s a whole other story)
  • Different teams of users have varying access requirements to the Leads table. Some need to be able to view, some need to be able to create/edit, and others aren’t allowed to view it at all
  • The lead process is driven by both region (where the lead is located), as well as products (which products the lead is interested in)

Now, initially we had some issues with different teams not having the right level of access, but we managed to handle those. Typically we’d see an error message along the following lines:

We’d then use this to narrow down the necessary permissions, adjust the security role, re-test, and continue (sometimes onto the next error message, but hey, that’s par for the course!).

However, just as we thought we had figured out all of the security roles, we had a small sub-set of users report an error that I had NEVER seen before.

The scenario was as follows:

  • The users were able to access Lead records. All good there.
  • The users were able to edit Lead records. All good there.
  • The users were trying to assign records (ie change the record owner) to another user. This generally worked, but when trying to assign the record to certain users, they got the following error:

Now this was a strange error. After all, the users were able to open/edit the lead record, and on checking the permissions in the security role, everything seemed to be set up alright.

The next step was to go look at the error log. In general, error logs can be a massive help (well, most of the time), assuming that the person looking at them can interpret what they mean. The error log gave us the following:

As an aside, the most amusing thing about this particular error log, in my opinion, was that the HelpLink URL provided actually didn’t work! Ah well…

So on taking a look, we see that the user is missing the Read privilege (on what we’re assuming is the Lead table). This didn’t make sense – we then went back to DOUBLE-check, and indeed the user who was trying to carry out the action had read privileges on the table. It also didn’t make sense, as the user was able to open the lead record itself (disclaimer – I’ve not yet tried doing a security role where the user has create/write access to a table, but no read access..I’m wondering what would happen in such a scenario)

Then we had a lightbulb moment.


In truth, we should have probably figured this out before, which I’ll freely admit. See, if we take a look at the original error that the user was getting, they were getting this when trying to assign the record to another user. We had also seen that the error was only happening when the record was being assigned to certain users (ie it wasn’t happening for all users). And finally, after all, the error message title itself says ‘Assignee does not hold the required read permissions’.

So what was the issue? Well, it was actually quite simple (in hindsight!). The error was occurring when the record was being attempted to be assigned to a user that did not have any permissions to the Lead table!

What was the resolution? Well, to simply grant (read) access to the Lead table, and ensure that all necessary users had this granted to them! Thankfully a quick resolution (once we had worked out what was going on), and users were able to continue testing out the rest of the system.

Has something like this ever happened to you? Drop a comment below – I’d love to hear the details!

MB-260: Microsoft Customer Data Platform Specialist

It’s been a while since I’ve taken an exam. Admittedly, this is for two reasons. Firstly, the renewal process for certifications now (as updated last year) is not to take the exam again, but rather to re-qualify through Microsoft Learn. The second reason is that I’ve been waiting for some new exams to come out (OK – there’s the DA-100, which is still on my list of things to do…).

Well, there’s a new exam on the block. In fact, it’s a different type of exam – this is a ‘Speciality’ exam, rather than focusing on a specific type of application. It’s the first of its kind, though there are likely to be more to follow in the future.

It’s the MB-260, which is all around Customer Data. That’s right – it’s not about how to do sales, or customer service, or something else. It’s about taking the (holistic) approach to ALL of the data that we can hold on customers, and do something with it.

The official page for it is at https://docs.microsoft.com/en-us/learn/certifications/exams/mb-260. The specification for it is:

Candidates for this exam implement solutions that provide insights into customer profiles and that track engagement activities to help improve customer experiences and increase customer retention.

Candidates should have firsthand experience with Dynamics 365 Customer Insights and one or more additional Dynamics 365 apps, Power Query, Microsoft Dataverse, Common Data Model, and Microsoft Power Platform. They should also have direct experience with practices related to privacy, compliance, consent, security, responsible AI, and data retention policy.

Candidates need experience with processes related to KPIs, data retention, validation, visualization, preparation, matching, fragmentation, segmentation, and enhancement. They should have a general understanding of Azure Machine Learning, Azure Synapse Analytics, and Azure Data Factory.

Note that there’s quite a bit of Azure in there – it’s not just about Power Platform, Dataverse, or Dynamics 365. People who handle reporting on customer data should have various Azure skills as well.

There’s also a new type of badge that will be available:

At the time of writing, there are no official Microsoft Learning paths available to use to study. I do expect this to change in the near future, and will update this article when they’re out. However the objectives/sub-objectives are available to view from the main exam page, and I’d highly recommend going ahead & taking a good look at these.

As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) I’ve tried to group things together as best as possible for the different subject areas.

Overall, I had 51 questions, which was towards the higher number of questions that I’ve experienced in my exams over the last year or so. There was only a single case study though.

Some of the naming conventions weren’t updated to the latest terminology, which I would have expected them to be. I still had a few references to ‘entities’ and ‘fields’ come up, though for the most part ‘tables’ and ‘columns’ were used. I guess it’s a matter of time until everything is brought up to speed.

  • Differences between Audience Insights and Engagement Insights
    • What are the benefits of each
    • When would you use each one
    • What types of users will benefit from each type
    • How to create customer insights
  • Environments
    • Types of environments
    • How to create a new environment
    • What options are available when creating an environment
    • What is possible to copy from an existing environment
  • Relationships
    • Different types of relationships
    • What is each one used for
    • Limitations of different relationship types
  • Business level measures vs customer level measures
    • What each one is, and what they’re used for
  • Power Query
    • How to use
    • How to configure
    • How to load data
  • Data mapping
    • Different types available to use
    • Scenarios each type should be used for
    • Limitations of each type
    • How to set it up
  • Segments
    • What are segments, how are they set up, how are they used
    • What are quick segments, how are they set up, how are they used
    • What are segment overlaps, how are they set up, how are they used
    • What are segment differentiators, how are they set up, how are they used
  • Measures
    • What are measures, how are they set up, how are they used
  • Data refresh
    • Automated vs manual options
    • Limitations of each type
    • Availability of each type
    • How to set up each type
    • How to apply each type
  • Data Unification
    • What is this
    • How it can be used
    • How to set it up
    • Limitations of it
    • Process validation
    • Changing existing models
  • AI for Audience Insights
    • What is this
    • What can it be used for
    • How to use it
    • Factors that can affect outcomes
  • Security
    • Using Azure Key Vault
    • Capabilities of this
    • How to set it up
    • How to use it
  • Dynamics 365
    • Capabilities for interacting with Dynamics 365
    • How to set it up
    • How to display data, and where it can be displayed
    • What actions users are able to carry out within Dynamics 365

Wow. It’s a lot of stuff. It’s definitely an exam where, if you’re not already hands-on with the skills needed, I’d highly recommend getting a decent amount of experience before taking it!

I can’t tell you if I’ve passed it or not…YET! Results aren’t going to be out for several months, and to be honest, I’m not quite sure how well I’ve actually done.

So, if you’re aiming to take it – I wish you the very best of luck, and let me know your experience!