The story of MFA & the Centre of Excellence

I’ve been rolling out the Microsoft Centre of Excellence solution for several years now at customers. It’s a great place to start getting a handle on what exactly is going on within a Power Platform tenant, though there’s obviously so much more that takes place within a Centre of Excellence team.

The solution gathers telemetry around environments, Power Apps, Power Automate flows etc. through the use of the admin connectors for Power Platform within Power Automate (see Power Platform for Admins – Connectors | Microsoft Learn for further information on these).

Now obviously we need a user account to run these, and this has usually been done through the use of a ‘pseudo service account’, as using a service principal has been tricky, to say the least. So we would get customers to set up an appropriate account with licensing & permissions in place, and use this to own & run the Power Automate flows that bring the information into the CoE solution.

It is important to note that usage of these connectors does require a pretty high level of permissions – in fact, we usually suggest applying the Power Platform Admin role (within the Microsoft 365 Admin Centre) to the user account. All good so far.

The tricky part has, to date, been around security. Organisations usually require (for good reasons) multi-factor authentication to be in place (aka MFA). Now this is fine for users logging in & accessing systems. However, it proves to be somewhat trickier for automations.

See, when a user logs in & authenticates through MFA, a token is stored to allow them to access systems. Automations can also use this. However, the token will expire at some point (based on how each organisation has implemented MFA access/controls). When the token expires, the automations will stop running, and fail silently. There’s no prompt that the token has expired, and the only way of knowing is to take a look at the Power Automate flow history. This can be interesting though, as signing in (with the pseudo service account) will prompt for MFA authentication, and then everything will start running again!

So this has usually resulted in conversations with the client to politely point out that implementing MFA on the service account will mean that, at some point, the Power Automate flows are going to start failing. Discussions with security teams take place, mitigations using tools such as Azure Sentinel are implemented, and things move ahead (cautiously). It’s been, to date, the most annoying pain point for the technical implementation (in my experience, at least).

Now you’d think that a change in this would be shouted from the rooftops, people talking about it, social media blowing up, etc. Well, I was starting an implementation recently for a customer, and was talking to them around this, as I’d usually do. Imagine my surprise when Todd, one of the Microsoft technical people attached to the client, asked why we weren’t recommending MFA.

Taking a look at the online documentation, I noticed that something had slipped in. Finally there was the ability to use MFA!

Trawling back through the GitHub history (after all, I wanted to find out EXACTLY when this had slipped in), I discovered that it was a few months old. I was still very surprised that there hadn’t been more publicity around this (though it was definitely a good incentive to write about it, and a great blog post to start off 2023 with!).

So moving forward, we’re now able to use MFA for the CoE user account. This is definitely going to put a lot of minds at rest (especially those who are in security and/or governance). The specifics around the MFA implementation can be found at Conditional access and multi-factor authentication in Flow – Power Automate | Microsoft Learn – but it’s important to note that specific MFA policies will need to be set up & implemented for this account.

So, now the job will be to retro-fit this to all organisations that already have the CoE toolkit in place. Thankfully this shouldn’t be too difficult to do, and will most definitely enhance the security controls around it!

Have you implemented any mitigation in the past to handle non-MFA? I’m curious if you have – please drop a comment below!

PL-500: Microsoft Power Automate RPA Developer

RPA (or Robotic Process Automation) is a capability that Microsoft has been developing for a while within the Power Platform space. Whilst cloud flows can be used to interact with any system that has an API in place, many organisations have (legacy) systems that have no API, so interacting with them can be challenging. RPA capabilities allow organisations to interact with these systems as well (by automating them through their user interfaces), thereby enabling & empowering businesses holistically.

I’ve been aware for a while that there’s been an exam coming out for RPA, though it’s taken a bit of time to land. That’s fine though – I can’t really think of any absolute rush to have it in place. I do think that over time, just as with some of the other certifications, it will become a requirement for solution or specialisation status.

The official page for it is at https://docs.microsoft.com/en-us/certifications/exams/pl-500. The specification for it is:

Candidates for this exam automate time-consuming and repetitive tasks by using Microsoft Power Automate. They review solution requirements, create process documentation, and design, develop, troubleshoot, and evaluate solutions.

Candidates work with business stakeholders to improve and automate business workflows. They collaborate with administrators to deploy solutions to production environments, and they support solutions.

Additionally, candidates should have experience with JSON, cloud flows and desktop flows, integrating solutions with REST and SOAP services, analyzing data by using Microsoft Excel, VBScript, Visual Basic for Applications (VBA), HTML, JavaScript, one or more programming languages, and the Microsoft Power Platform suite of tools (AI Builder, Power Apps, Dataverse, and Power Virtual Agents).

Now here’s the thing. I occasionally work in the automation space, either on customer projects, or when training users in the technologies. I wouldn’t describe myself as an advanced automation developer (whether cloud or RPA capabilities). I’m most definitely NOWHERE near the level of legends such as Matt Collins-Jones, for example (go check him out if you don’t know about him!).

So I knew that I might be a bit challenged when taking the exam, especially in the more ‘pro dev’ space (aka JSON etc). In fact, I didn’t actually realise that the exam specification included that sort of thing. I know, I should have – it’s aimed at developers overall…shows that I need to brush up on reading things properly!

Also, there’s still quite a bit of a focus on Power Automate cloud flows – it’s not JUST about RPA capabilities.

Now, really nicely, there are already Microsoft Learn pathways available (which have been around for a while, and updated appropriately). This really is a big help, I feel, especially for people who are new’ish to RPA.

Of course, there’s a lovely shiny two star badge awarded when passing the exam, along with the title of ‘Microsoft Certified: Power Automate RPA Developer Associate’:

As with previous exams, I sat it from home (the proctored experience). Learning from previous times that I’ve taken exams, I ensured that my workspace was entirely clear from everything. As a result, the check-in process happened automatically, and I didn’t need to engage with any proctors at all (which was quite nice actually).

As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). I’ve tried to group things together as best as possible for the different subject areas.

  • Cloud flows vs RPA flows
    • Capabilities of each
    • When to use each (ie how to handle different scenarios)
    • How to trigger each one
  • Cloud flows
    • Different types of triggers, & when each type should be used
    • Different types of actions, and the capabilities of them (at a high’ish level – expected to know common Microsoft actions, but not need to know all of the hundreds of different ones!)
    • Controls/operators. What they are, how they can be used to accomplish different requirements
    • JSON formatting & syntax
  • Business Process Flows vs Business Rules
    • What each is
    • When to use each one
    • Capabilities
  • RPA flows
    • Common actions, how they work, capabilities of them
    • How expression syntax works within them
    • Debugging capabilities, and what to use when
    • How to interact with desktop applications
    • How to interact with websites
      • How data values can be used
      • How data tables can be used
      • How to use data that’s extracted from a website
    • Troubleshooting functionality
  • Usage of automation capabilities from Office 365 applications such as Excel & Visio
  • Loops
    • How they work for cloud & RPA flows
    • Troubleshooting
    • Implementing success/fail criteria
    • Error handling
  • Process Advisor
    • What it is
    • What it does
    • How it can help organisations
    • Limitations
    • What it cannot do
    • Process Mining vs Task Mining, & the important differences between them
  • Variables
    • How to handle variables across different environments
    • How to declare them (cloud flow vs RPA flow)
  • Runtime operations
    • How flows are triggered (async vs sync)
    • How flows are queued (cloud vs RPA)
    • How RPA flows are carried out when using machine groups
  • Artificial Intelligence (AI) capabilities
    • How AI can be used within flows
    • Different AI capability types (what each one can be used for)
    • AI within Power Platform, & AI within Azure Cognitive Services
  • Sharing flows
    • Different ways to share cloud flows
    • Different ways to share RPA flows
  • Application Lifecycle Management (ALM)
    • Solutions (managed vs unmanaged). Capabilities of each, when to use each type
    • AzureDevOps (ADO). What it is, when/how to use it, capabilities
    • Solution imports
    • Solution layers. What these are, troubleshooting functionality
    • Upgrade/Stage for Upgrade/Update. What each is, what each does, how/when to use each one
    • Moving desktop flows between users
  • Security
    • Security roles needed to create
    • Security roles needed to share/modify
    • Security roles needed to register machine for RPA
    • Security roles needed to register machine groups for RPA
    • Security requirements to run different types of RPA flows (how it interacts with desktop/s)
    • Data Loss Prevention (DLP) – how it affects creation & runtime of flows

Overall, I had 46 questions, with a single case study. I’m used to having at least two case studies, so it was nice to have just one of them this time.

So….it’s a lot of stuff. Definitely targeted much more at the ‘pro-developer’ end of the scale than someone who might occasionally automate things. It’s absolutely necessary to understand coding conventions, ALM, etc.

It’s definitely an exam where, if you’re not already hands-on with the skills needed, I’d highly recommend getting a decent amount of experience before taking it! Ensure that you have an environment in which you’re able to be hands-on with all types of automation (cloud & desktop flows), and really understand how they can be handled with an eye on the enterprise scale!

If you’re aiming to take it – I wish you the very best of luck, and let me know your experience!

Environments & ‘Admin Mode’

With some recent events happening (both professional & personal), I’ve taken a slight step back from putting out posts on here. Thankfully things seem to be settling down, so I’m getting (back) into the swing of things!

I thought that it would be good to talk about a subject that I fell ‘foul’ of recently. This is around environments, and more specifically, the ‘admin mode’ that it’s possible to use on them.

So what exactly is this ‘admin mode’? Well, the aim of it is to restrict access to an environment to certain users, namely those with the System Administrator or System Customiser role. Why would we want to do this? There are several scenarios that come to mind:

  • Performing a system upgrade (such as enabling new features)
  • Changing environment type (eg Production to Sandbox, or vice-versa)
  • Restoring an environment

Essentially, any time we have operation-type work that we’re wanting to carry out. This way whatever we’re doing won’t affect users, and anything that the users are doing won’t affect things either (symbiotic relationship there!).

So as an example, if we’re doing a major release, which changes functionality within a system, we wouldn’t want users in the system carrying out their usual work, as this could cause data issues if they’re saving during the actual release. We of course SHOULD be communicating to users that a release is going to take place, and that they shouldn’t be in the system at the time, but ‘admin mode’ is how we can truly enforce it.

Something to bear in mind as well is that if you’re going ahead & restoring an environment to a previous state (whether that’s an automatic save point, or a manual one), it will automatically put the environment into ‘admin mode’ once the restore has been completed. This is very important to keep in mind!

There are three settings around administration mode:

  1. ‘Administration Mode’. This sets whether admin mode is on or off!
  2. ‘Background Operations’. This sets whether background processes, such as workflows, Power Automate flows, and Exchange synchronisation, are enabled (allowed to happen) or disabled (stopped from happening)
  3. ‘Custom Message’. This allows you to set a custom message that users (who aren’t a system administrator/system customiser) will see when they attempt to access the environment

So this is the scenario that tripped me up a few weeks back:

  • I was needing to restore an environment to an earlier save point (to be clear, this was NOT a production environment)
  • I went ahead with the restore, and it completed successfully
  • Given that I was doing this at night, one of my children woke up, and I had to deal with them
  • I came back to things, saw that it completed, and then went ahead with the release that I was needing to do

All seemed to go well. However, when users were testing (which admittedly was a few days later), they reported that some functionality wasn’t working. This was strange, as it had been working before the release (& the release that I did hadn’t actually touched it!).

It turned out to be Power Automate flows that just didn’t seem to be running. OK – I started to look into them, but couldn’t figure out why they hadn’t run.

Creating a test Power Automate flow didn’t seem to work either – despite running it to test it, the trigger never activated! I was quite puzzled by this, and couldn’t (initially) work out the reason.

Then I thought to check environment settings! Lo & behold, the environment was STILL in administration mode, and the Background Process option was disabled! Aha – I’ve found the source!

Flipping this out of administration mode thankfully then allowed all Power Automate flows to work/run, and users confirmed that functionality was indeed running as expected. As you can imagine, I was quite relieved!


Something that I hadn’t realised previously is that if you manually put an environment into administration mode, it doesn’t automatically disable background processes. However, if you restore an environment, it DOES disable background processes by default. So if you’re wanting to try out automation items within a restored environment that’s still in administration mode, you’re going to need to ensure that you toggle the Background Processes toggle to allow it to work!

One further thing to learn as well (which I’ve been asked already by some people, so thought that I would mention it here). I’ve mentioned above that users were in the system, but reporting that things weren’t working. Now given that the environment was in administration mode, people have asked how users could be in it! The answer is that these users actually had the system customiser role applied to them, which is why they could get in! If they hadn’t had the role, then perhaps I might have realised things a little sooner (ie that the environment was in administration mode).

So a (good) little lesson learned, and I’ll definitely take it forwards. Has this, or anything else like it, ever tripped you up? Drop a comment below – I’d love to hear!

Working with Opportunity Close table

I’ve recently had the experience of working with the Opportunity Close functionality within Dynamics 365, and given what occurred, thought it would be useful to document this so that others are able to see this as well. There are many scenarios in which we’d use this, and being able to give a comprehensive solution to clients does make all of the difference!

There are four areas that I’d like to cover:

  • Working with Opportunity Close table
  • Challenges with data
  • Power Automate to the rescue!
  • Caveats

So let’s get started then!

Thanks to various members of the community such as Matt Collins-Jones, Andrew Bibby & others, who helped me along the way

Working with Opportunity Close

The Opportunity Close functionality within Dynamics 365 (& yes, I’m going to refer to it as this, rather than Power Platform) is used to provide information around why an opportunity is being closed. This is regardless of whether the opportunity has been won, or it’s been lost. It’s still quite important to track the information around it, so that companies can better understand how the market views the products they offer, how they stack up against competitors, etc.

The default path in the system is to create a lead, and then qualify it. Qualifying a lead then automatically creates an opportunity record, which further information (quotes, etc) can be entered against. An account record (if company information is specified) is also created:


On the opportunity record, users are able to show if it’s been won or lost by clicking an appropriate button on the toolbar:

Doing this brings up the Opportunity Close pane on the right hand side of the screen:

Now it’s possible to customise this screen. In fact, the screenshot above shows 3 custom columns that have been added to it already in the system I was in.

To do this, we go to customise the solution (in the Maker Experience), and add the column/s that we’re wanting to:

Next, we need to remember to add it to the form! Otherwise it’s not going to show up. If we’re wanting it to appear on the side bar, then it’s important to customise the ‘Quick Create’ form version, to make our customisations show up.

Note: We’re able to put conditional visibility of the column/s if we want to, based on whether the opportunity is won or lost, using Business Rules. I haven’t done so in this scenario, but you’re obviously able to do so if you want to

Remember to save & publish the form, and then it’ll display within the system for users. Brilliant!

Challenges with data

So we’ve gone ahead & created the custom columns, and users are actually using them to record data. Wonderful – that’s exactly what we’ve been wanting to achieve.

OK – let’s now review the data so that we can see overall what’s happened with our opportunities. Of course we’re wanting to do this simply & easily, so we’ll open an Advanced Find window, go to the Opportunity Close table, add columns from the associated Opportunity, and….hold on. Opportunity Close ISN’T displaying in the Advanced Find????

It’s just NOT there. In case you’re wondering if you saved/published things correctly, or forgot some system setting, stop worrying. It’s not you – it’s the system.

See, Opportunity Close, though a table in its own right, is a SPECIAL sort of table. It doesn’t show up, and can’t be directly queried. I know – frustrating. I felt exactly the same way.

On digging deeper into things, I found out that there’s actually an activity record saved. It’s possible to query against this:

However, and this is the BIG catch, it’s NOT possible to return custom columns when carrying out this query. The search will ONLY return the (system) columns that are present for activities. So this leaves us with a problem.

Essentially, though we can set up custom columns to track the data that we’re needing to, it’s not possible (through the front end) to query it. This sort of negates what we’re trying to achieve here overall, and is a pain.

So what’s the way round it? Well, it’s actually going to be Power Automate!

Power Automate to the rescue

In order to handle our issue, what we need to do is the following:

  • Add custom columns to the Opportunity table (these should mimic the custom columns that we’ve added to the Opportunity Close table)
  • Use Power Automate for automation purposes!

The first step is easy. We need to go & create custom columns on the Opportunity table. These WILL show up in the Advanced Find search. They obviously need to be the same as the custom columns on the Opportunity Close table. If we’ve used Choice or Choices there, point the Opportunity column to the same source (it’s a good argument for using Global, rather than Local, choice/s).

We can then go and create a Power Automate flow. This should trigger when an Opportunity Close record is created.

Note: For this, I’ve made it so that it runs under the user triggering the action, rather than a system account. This is to keep in line with licensing limits etc

You’ll then need to add a ‘Get Dataverse row’ step, and get the Opportunity Close record that has just been created. This is annoying, but for some strange reason the trigger doesn’t present the custom columns/values in the JSON that it returns. Hopefully Microsoft fixes this at some point, but for the moment, we need to work around it.

The last step is to add an ‘Update Dataverse row’ step. This should point to the Opportunity table, & we can simply map the values across (from the SECOND step, NOT the first one – VERY IMPORTANT).
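To illustrate the mapping (the ‘cr123_’ column prefix and the step name here are made up for illustration – yours will be whatever you’ve actually created), the values going into the ‘Update Dataverse row’ step come from the outputs of the ‘Get Dataverse row’ step, along these lines:

    Row ID (Opportunity):     outputs('Get_Opportunity_Close_row')?['body/_opportunityid_value']
    cr123_closereason:        outputs('Get_Opportunity_Close_row')?['body/cr123_closereason']
    cr123_competitorinsight:  outputs('Get_Opportunity_Close_row')?['body/cr123_competitorinsight']

If you map from the trigger instead, the custom column values simply won’t be there – which is exactly the issue mentioned above.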

Once this is all done, save & test it, and you should see it working. I generally don’t add the Opportunity custom columns to the form, but rather leave them for querying against.

Caveats

It’s important to keep in mind that when an opportunity is marked as either won or lost, it’s then closed, and changed to a read-only state. That’s how the system is designed to be, and makes sense.

However it’s ALSO possible to re-activate a closed opportunity, and then close it again – ie a single Opportunity record could have multiple Opportunity Close records against it. This solution won’t handle that scenario (it would need to be built out further) – the Opportunity record itself will only show the values from the latest Opportunity Close action, so please do keep this in mind!

Have you ever come up against something like this? How have you handled it? I’d love to hear – please drop a comment!

Canvas Apps & Power Automates

So it’s been a busy few weeks here, which is why I haven’t really been putting up any articles. March/April is always a busy time for our family with stuff going on, and this year I decided not to push myself to get articles out, as otherwise I’d be running very low on sleep!

That being said, I’ve still had some great ideas about things that I’d like to share, and have been keeping a series of short notes for me to pick up. Today’s topic is one of them, which I think has been a major pain to anyone involved in canvas app development!

So, the back story to this is that we’re able to use Power Automate flows together with canvas apps. What I mean by this is that we’re able to directly trigger them from within the canvas app, rather than needing to do something like edit or create a record, and then have the Power Automate flow trigger from the record creation or modification.

There’s a specific Power Apps trigger that’s available within Power Automate exactly for this purpose:

When clicked, it gives us the trigger line in the steps as follows:

So what we’d do is within the canvas app, we would bind a button (or another control) that when selected, it would then go away & trigger the Power Automate flow. Great – so many different things that we can get to happen! One of the benefits of doing things like this is that we can then pass information from the Power Automate flow back to the canvas app directly:

This can then mean that the user can know, within the canvas app itself, that the Power Automate flow has run, and use data (or other things) that have come out of it.
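To make this a bit more concrete, here’s a minimal sketch of what the button’s OnSelect property could look like in Power Fx. The flow name (‘Process Order’), the text input control and the output property are all made up for illustration – the output property only exists if the flow ends with a ‘Respond to a PowerApp or flow’ action:

    // OnSelect of the button – flow & control names here are purely illustrative
    Set(
        varFlowResult,
        'Process Order'.Run(txtOrderId.Text)
    );
    // Use a value returned by the flow's 'Respond to a PowerApp or flow' action
    Notify("Flow completed: " & varFlowResult.confirmationmessage)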

OK – all good so far.

The main issue to date has been with deploying canvas apps together with Power Automate flows. See, as per best practice, we would create a solution, place the canvas app, flows, and anything else that’s necessary for it to work within it, and then deploy the solution to our target environment/s. And that’s where things just…didn’t go quite right.

Obviously within the development environment, the canvas app would be hooked up to the flows, and everything would work. Clicking the button would cause the flow to run, etc. User authentication would be in place (along with licenses of course!), and it was just fine.

But when deploying a solution containing canvas apps and associated flows between environments (regardless of whether it’s been manually deploying, or automated using a tool such as Azure DevOps), the connections to the flows would be broken. Ie, the canvas app would run, but the flows wouldn’t trigger. Looking at the connections in the canvas app within Studio would show something like the following:

All of the connections to Power Automate flows would show as ‘Not connected’. It’s not even possible to click the ellipsis next to them and re-connect them – the only option available is to remove them from the canvas app!

So in order to get things working again, we’d need to do the following steps:

  • Open up the canvas app
  • Remove all connections to Power Automate flows
  • Add a temporary button, set it to be a Power Automate trigger
  • Click through all of the Power Automates needing to be connected (waiting for each one to connect, then go to the next one)
  • Remove the temporary button
  • Save and publish the solution

This, in a nutshell, has been a (major) headache. For example, I’ve been working with a solution that has over 30 Power Automate flows that can be triggered from the canvas app (lots of different functionality!). Each deployment has needed the above process to be carried out, which has usually added on at least an hour to the deployment process!

Now, this hasn’t been something that’s been unknown. In fact, the official Microsoft documentation noted the following:

So this is something that Microsoft has been well aware of, but it’s been a pain point that we’ve had to work with.

However, this has now ALL changed, which I (and MANY others) are really pleased about!

Microsoft rolled out an update last month that means that canvas app connections to Power Automate flows will NOT break when they’re deployed across environments! This is such a massive time-saver that I’m now trying to work out what to do with all of my free time! Only kidding…more project work will commence!

So what we can now do is take our solution, deploy it across the different environment/s that we need to get it out to (whether manually, or automated using tools such as Azure DevOps), publish the solution, and then everything works! Amazing!!

One small caveat though – to ensure that this works, you will need to go into the app, and re-publish it on the latest Power Apps version. This should of course be done in a development environment, and then it can be exported and deployed as required.

Microsoft have also updated their documentation at https://docs.microsoft.com/en-us/powerapps/maker/data-platform/solutions-overview to remove the limitation text shown above. It’s a good place to keep an eye on changes that occur over time too.

This is definitely a welcome piece of development, and I know that we’ve been eagerly waiting for this for a while, and now it’s here!

Record security with Power Automate

Today’s post is around record security, and how Power Automate can really be quite useful with this!

Let’s take a quick recap of how security works (which is applicable to both Dynamics 365, as well as Power Platform apps). We have the following:

  • Security roles, which are set up with specific privileges (Create/Read/Update/Delete etc) across each entity table, as well as for other system permissions
  • Users, who can have one (or more) security roles applied to them (security roles being additive in nature)
  • Teams, who can have one (or more) security roles applied to them. Users are added into the team, and inherit all permissions that the team has (much easier than applying multiple roles on a ‘per user’ basis)

That’s great for general security setup, but it does take a system admin to get it handled. Alternatively, of course, it’s possible to use AAD Security Groups which are connected to security teams within Power Platform, and users added to them will inherit the necessary permissions.

But what if we want to allow users who aren’t system administrators to allow other users access to the records? Well, it’s also possible to share a specific record with another user – doing this allows the second user to see/access the record, even if they usually wouldn’t be able to do so. This is really great, but does require a manual approach (in that each record would need to be opened, shared with the other user/s, and then closed).

I’ve been working on a project recently where we have the need to share/un-share a larger number of records, but with a different user for each record. We’ve been looking into different ways of doing this, and obviously Power Automate came into mind! We didn’t want to use code for this, for a variety of reasons.


The scenario we had in mind was to have a lookup to the User record; when this was populated with a user, the record would then be shared with them. This would be great, as we could bulk-update records as needed (even from an integration perspective), and hopefully all would work well.

So with that, I started to investigate what options could be available. Unfortunately, there didn’t seem to be any out of the box connectors/actions that could be used for this, which was quite disheartening.

My next move was to look at the user forums, & see if anyone had done anything similar. I was absolutely excited to come across a series of responses from Chad Althaus around this exact subject! It turns out that there’s something called ‘Unbound Actions’, which is perfect for the scenario that we’re trying to achieve.

There are two types of actions available within Power Automate:

  • Bound actions. These are actions that target a single entity table or a set of records for a single entity table
  • Unbound actions. These aren’t bound to an entity type and are called as static operations. They can be used in different ways

There are quite a lot of unbound actions available to use:

The one I’m interested in for this scenario is the GrantAccess action. More information around this can be found at https://docs.microsoft.com/en-us/dynamics365/customer-engagement/web-api/grantaccess?view=dynamics-ce-odata-9

It does require some JSON input, but when formatted correctly, it shows along the following lines:
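As a rough sketch (the account table & the GUIDs below are purely illustrative – your Target will be whichever table the record belongs to), the input for the GrantAccess action looks something like this:

    {
      "Target": {
        "@odata.type": "Microsoft.Dynamics.CRM.account",
        "accountid": "00000000-0000-0000-0000-000000000001"
      },
      "PrincipalAccess": {
        "Principal": {
          "@odata.type": "Microsoft.Dynamics.CRM.systemuser",
          "systemuserid": "00000000-0000-0000-0000-000000000002"
        },
        "AccessMask": "ReadAccess,WriteAccess"
      }
    }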

The different parts of this work as follows:

  • Target is the actual record we’re wanting to apply the action to
  • SystemUserID is the actual system user, and we also need to specify the odatatype
  • AccessMask is what we’re wanting to grant when sharing the record (as there are different access rights available, ie ReadAccess, WriteAccess, ShareAccess, etc)

Using this, we’ve therefore built out the following scenario:

  1. Field added to the record, looking up to Users
  2. Relevant users who are able to access the record can set this lookup field to be a specific user record (who doesn’t have access to this record)
  3. Power Automate flow fires on the update of the record when it’s saved (filtering on just this attribute), sharing the record with the selected user
  4. The user then gets an email to notify them that the record has been shared with them, with a URL link to it (it’s somewhat annoying that there’s no inbuilt system notification when a record has been shared with you, but I guess that’s something we’re having to live with!)
  5. They can then go in & access the record as they need to

We’ve also given some thought to general record security, and have additionally implemented the following as well:

  1. If the user lookup value is changed, we obviously share the record with the new user that’s been saved to it
  2. Using a different Unbound Action (RevokeAccess), we remove the sharing of the record with the previous user (we have another field that’s being updated with the value of it, which we’re using to pass the action in, as otherwise we don’t actually know who the previous user was!)

All in all, we’re quite happy that we’ve managed to come up with this solution, which is working splendidly for us. Also, major thanks to Chad for his assistance in getting the syntax correct!

Have you ever needed to do something like this? Did you manage to implement it in some way? Drop a comment below – I’d love to hear how your experience was!

PL-400: Microsoft Power Platform Developer Exam

I’ve been continuing with taking new exams as they come out. Having recently taken the MB-400 exam (see MB-400 Power Apps & Dynamics 365 Developer Exam), I was slightly surprised to see the announcement that it was going to be replaced!

Admittedly, I was also surprised (in a good way) that I passed the MB-400, not being a developer! It’s been quite amusing to tell people that I’m a certified Microsoft Dynamics Developer. It definitely puts a certain look on their faces, which always cracks me up.

Then again, the general approach seems to be to move all of the ‘traditional’ Dynamics 365 exams to the new Power Platform (PL) format. This obviously includes re-doing the exams to be more Power Platform centric, covering the different parts of the platform rather than just the ‘first party apps’. It’s going to be interesting to see how this landscape extends & matures over time.

The learning path came out in the summer, and is located at https://docs.microsoft.com/en-us/learn/certifications/exams/pl-400. It’s actually quite good. There’s quite a lot that overlaps with the MB-400 exam material, as well as the information that’s recently been covered by Julian Sharp & Joe Griffin.

The official description of the exam is:

Candidates for this exam design, develop, secure, and troubleshoot Power Platform solutions. Candidates implement components of a solution, including application enhancements, custom user experience, system integrations, data conversions, custom process automation, and custom visualizations.

Candidates must have strong applied knowledge of Power Platform services, including in-depth understanding of capabilities, boundaries, and constraints. Candidates should have a basic understanding of DevOps practices for Power Platform.

Candidates should have development experience that includes Power Platform services, JavaScript, JSON, TypeScript, C#, HTML, .NET, Microsoft Azure, Microsoft 365, RESTful web services, ASP.NET, and Microsoft Power BI.

So the PL-400 was announced on the Wednesday of Ignite this year (at least in my timezone). Waking up to hear of the announcement, I went right ahead to book it! Unfortunately, there seemed to be some issues with the Pearson Vue booking system. It took around 12 hours to be sorted out, & I then managed to get it booked Wednesday evening, to take it Thursday.

So, as before, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change.

There were a few glitches during the actual exam. One or two questions with answers that didn’t make sense (eg line 30 does X, but the code sample finished at line 18), and question numbers that seemed to jump back & forth (first time it’s happened to me). I guess that I’ve gotten used to at least ONE glitch happening somewhere, so this was par for the course.

I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

  • Model Apps.
    • Charts. How they work, what drives them, what they need in order to actually work, configuring them
    • Visualisation components for forms. What they are, examples of them, what each one does, when to use each one
    • Custom ribbon buttons. What these are, different tools able to be used to create/set them up, troubleshooting them
    • Entity alternate keys. What these are, when they should be used, how to set them up & configure them
    • Business Process Flows. What these are, how they can be used across different scenarios, limitations of them
    • Business Rules. What these are, how they can be used across different scenarios, limitations of them
  • Canvas apps
    • Different code types, expressions, how to use them & when to use them
    • Network connectivity, & how to handle this correctly within the app for data capture (this was an interesting one, which I’ve actually been looking at for a client project!)
    • Power Apps solution checker. How to run it, how to handle issues identified in it
  • Power Automates
    • Connectors – what these are, how to use them, security around them, querying/returning results in the correct way
    • Triggers. What is a trigger, how do they work, when to use/not use them
    • Actions. What these are, how they can be used, examples of them
    • Conditions. What these are, how to use them, types of conditions/expressions/data
    • Timeouts. How to use them, when to use them, how to configure
  • Power Virtual Agents. How to set them up, how to configure them, how to deploy them, how to connect them to other systems
  • Power App Portals. Different types, how to set them up, how to configure them, how they can work with underlying data & users
  • Solutions
    • Managed, unmanaged, differences between them, how to use each one.
    • Deploying solutions. Different methods that can be used to do it, best practise for each, when to use each one
    • Package Deployer & how to use it correctly
  • Security.
    • All of the different security types within Dynamics 365/Power Platform. Roles/Teams/Environment/Field level. How to set up, configure, use in the right way.
    • Hierarchy security
    • Wider platform security. How to use Azure Active Directory for authentication methods, what to know around this, how to set it up correctly to interact with CDS/Dynamics 365
    • What authentication methods are allowed, when/how they can be used, how to configure them
  • ‘Development type stuff’
    • APIs. The different APIs that can be used, methods that are valid with each one, the Organisation service
    • Discovery URLs. What these are, which ones are able to be used, how they’d be used/queried
    • Plugins. How to set up, how to register, how to deploy. Steps needed for each
    • Plugin debugging/troubleshooting. Synchronous vs asynchronous
    • Component types. Actions/conditions/expressions/data operations. What these are, when each is used
    • Custom ribbon buttons. What these are, different tools able to be used to create/set them up, troubleshooting them
    • JavaScript web resources. How to use these correctly, how to set them up on entities/forms/fields
    • Power Apps Component Framework (PCF). What these are, how to develop them, how to use them in the right way
  • System Design
    • Entity relationship types. What they are, what each one does, how they work, when to use them appropriately. Tools that can be used to display them for system design purposes
    • Storage considerations across different types, including CDS & Azure options
  • Azure items
    • Azure Consumption API. How to monitor, how to handle, how to change/update
    • Azure Event Grid. What it is, the different ways in which it can be used, when each source should be used
  • Dynamics 365 for Finance. Native functionality included in it

The biggest surprise that I had, really, when thinking back to things, was the inclusion of Dynamics 365 for Finance in it. Generally the world is split into ‘front of house’ (being Dynamics 365/Power Platform), and ‘back of house’ (Dynamics 365 for Finance & Supply Chain Management). The two don’t really overlap, though they’re supposed to be coming more together over time. Given that this is going to happen, I guess it’s only natural that exam questions that cross over between them will come up!

Overall it was quite a good exam. Some of the more ‘code-style’ questions were somewhat out of my comfort zone, and I’ll freely admit to guessing some of the answers around them! Time will tell, as they say, to see how I’ve done in it.

I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it!

Good news for Power Automate Flows!

As a starter for 10, this wasn’t actually the blog post that I was going to write today. In fact, the subject of the post wasn’t even going to be about Power Automate! However, there was some really amazing news that dropped today from Microsoft, which I just couldn’t pass up being able to talk about.

You’ve guessed it – it’s about Power Automate! Well, I suppose that the post title was somewhat of a giveaway, wasn’t it…ah well. So let’s go ahead and find out what this is all about then!

To date, we’ve been able to put Power Automate flows into a solution. Well, it wasn’t there exactly at the beginning of things, but it happened somewhere along the way. This was very convenient, as we didn’t then need to deploy each one individually to different environments. Some solutions can contain dozens & dozens of flows, and we really do love to package them all up together for ease of movement.

So that was good. But there was still a (major) ‘bugbear’ (as I like to refer to them as). This is the fact that after we deploy a Power Automate flow, we then need to go into it & (re)authenticate it. This is due to the fact that the connector/s that it uses contains what is referred to as a ‘secret’, and these can’t be moved across environments. As a result, we need to essentially recreate the ‘secret’ in the connector (ie authentication details) every time we move it. This is an annoyance (if you have one or two flows), and an absolute bloody nightmare if you have lots.

For the technically minded – every action in a flow is bound to a specific instance of a connection that it will use to “execute” that action. This is why, when moving flows across environments, users are required to rebind every operation to a connection.

For example, I’ve been working with COVID-19 triage solutions. These contain lots of flows within them, connecting to multiple different sources, and doing different things. Every time we’ve performed a release (even if it’s just a simple update), we’ve needed to manually go through each flow, (re)authenticate them, and turn them on. If you forgot one, then everything can come crashing down & not work! But there’s been no other way to do it. To represent this visually, we have the following diagram

For each & every Power Automate, the connection line gets ‘broken’ when it’s deployed, and needs to be re-made.

Until now, that is. For today, Microsoft has announced the Public Preview for ‘Connection References’. Now when something is put into Preview, I usually caveat the usage of it with saying things like ‘it might go away, or not be released for a while’. But I’m going to be quietly confident about this particular piece of functionality, as I really don’t think it’s going to be pulled!

So what exactly are these? Well, in (mostly) simple terms, Connection References provide an ‘in-between’ or ‘abstraction’ layer for the connections that use them. Let’s show this visually as well

We still need to re-authenticate the Connection Reference once we deploy things. But let’s now see how we can save ourselves a massive headache, and LOTS of time:

Oooo…now this is looking better. Instead of having to update three Power Automate flows, we only have to update the SINGLE Connection Reference that’s sitting in the middle. Now multiply that by however many flows you have (eg sending emails out, etc), and start calculating how much time you’ll now be able to spend on coffee breaks, rather than doing this manually one at a time…

We can create Connection References directly from within the solution:

We then give it a name & description, choose which connector we’re going to be using, and either select an existing connection or set a new one up:

Once we’re finished, we click ‘Create’ at the bottom. Voila – we can now see it within our solution!

Note: Interestingly enough I couldn’t actually see this within the solution after I created it, even with the component selector set to show ‘All’. How I actually got them to display was changing the component selector to ‘Connection Reference’, and they then showed up. I’m thinking that this is due to it being new today/in the process of rolling out, and am expecting it to display without any issues in the near future

Let’s take a look at a Power Automate flow itself now to see how it’s referenced. When we open an item with a connector, we can now see the following:

We’re able to select the Connection Reference that we’re wanting to use. Simple, yet so powerful.

When importing a solution containing a Connection Reference, we will be prompted during the import process to set the actual connection that should be used with it:

If you don’t have any connections set up already in the environment, you’ll be able to create a new one from the dropdown.

Some things to note around this:

  • During the preview phase, Microsoft has specified that a single Connection Reference can only be used by up to 16 flows. This limitation will be removed once it goes GA
  • Existing flows will not be automatically upgraded. What you can do though is export the unmanaged solution, re-import it to the same environment, and then they will be automatically created for you. The flow/s can then be edited to update them to the correct connection reference record
  • The connection name and connection reference name are not currently synchronised, so they can be different. It’s therefore best to keep the naming conventions the same – don’t set different names for connections and their associated connection references

In summary – this is an awesome step forward with Power Automate functionality. I’m already tasking some of the developers on the team to re-do existing solutions to use it for ease of use. How do you think it’ll best benefit you? Drop a comment below!

Lookup fields & Power Automate

This is an interesting post, for several reasons. Firstly, it’s the first one in 3 weeks – I was off on holiday, and decided to take an (almost) absolute break from all things digital, which included this blog. It was actually quite refreshing, though now coming back & starting to write again does seem a bit daunting, I’ll admit!

Thankfully, whilst wondering what exactly to start with, a scenario came up that I was working on. It seemed quite simple at first, but then actually got somewhat complicated. I therefore thought it would be helpful to others if I wrote about it, so here it is.

The scenario was as follows. We had records being auto-created in the system, and needed to create child records for them. This, as I’m sure you’ll agree, is really quite simple to do with Power Automate. We also needed to set lookup values on the child record, that were already populated on the parent record (for reference purposes).

So for example, the parent record has a lookup to Country (being a separate entity), and the child record also has a lookup to Country. These need to be the same.

Being both lookup fields, I figured that I’d be able to take the value from the parent record, and simply plop it into the corresponding field on the child record in Power Automate:

So I did that – and immediately hit an error. Not just any error, but the fabled ‘Resource not found for the segment’ error!

Obviously, I did what anyone would do at first – I put it into Google & Twitter, and took a look at what came up.

The ‘problem’ was coming from using the ‘CDS Current Environment’ connector, which is the latest version available (the old one is no longer available to use). It’s really great for a lot of things, but unfortunately not so great in a few areas. See, in the old CDS Connector, you could just drop the lookup field value into the field you were wanting to populate. Power Automate had no issues with that, & it would run just fine.

However in the ‘new’ CDS Connector, you can’t just do that. Instead, you need to use an OData reference (which I haven’t done much of before, to tell the truth). So based on the blogs I had come across, I went to work to try to get this working.

Part of the challenge was that there didn’t seem to be a unified consensus in how to do it. I came across the following variations:

  • /entityname(Lookup Field Value)
  • /entityname/(Lookup Field Value)
  • /pluralentityname(Lookup Field Value)
  • /pluralentityname/(Lookup Field Value)

Somewhat confusing, as I’m sure you’d agree. Nevertheless, I ploughed through all of the different possibilities. But nothing was working – every single time, I still got the ‘segment not found’ error message. This, as you can imagine, was extremely frustrating!

Thankfully, one of my good friends was around & able to help out. Namely, Tricia Sinclair came to the rescue!

We took a look at the code I was using, and she took a look at some of her own use cases (where it had worked for her). I was starting to think down the path of needing a capital letter in the entity name (some systems can be REALLY finicky around things like that), but thankfully it wasn’t.

Instead, it was the following. See, this was a custom entity. It turns out that for a custom entity (& heck, for all I know system entities as well) the syntax needed is ‘publisherprefix_pluralentityname(lookupfieldvalue)’. Now that’s not something that I had come across ANYWHERE at all!

Looking at it, I guess it makes sense. After all it would technically be possible to have multiple entities with the same name, though with different publishers. As a result, the system needs to know WHICH exact entity is being needed for the Power Automate, so uses this. Somewhat complicated (and hey – it worked without all of this in the OLD CDS Connector), but we got it to work!
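As a made-up example: if the Country table had been created under a publisher with the prefix ‘abc’, the value to drop into the lookup column within the ‘Update a record’ step would look something along these lines (with the GUID being the parent record’s Country lookup value, usually passed in as dynamic content):

    abc_countries(3f2504e0-4f89-11d3-9a0c-0305e82c3301)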

Testing it out, everything worked smoothly. The Power Automates fired off without any issues, the data got created & populated, and everyone was happy.

So there you go. Another interesting little twist in syntax needed, which hopefully will NOT change in the (near) future!

Have you come across anything like this? I’d love to hear – drop a comment below around it!

Updating User Settings with Power Automate

Here’s a scenario that could be all too familiar to us. We’re on-boarding users (to either Dynamics 365 or a Power Platform app), & they’re new to the environment that it’s deployed to. So they’re set up, and all ready to go. Suddenly they start asking why records created (or modified) by colleagues show up as having the wrong time on them.


Does this sound familiar? I’m sure it does to quite a few people out there! See, there’s no way to set a default system-wide time zone in Dynamics 365 (or Power Platform). At least not that I’ve come across – if you know of one, please comment below with instructions as to how to do this!

As a result, users are given the default timezone, and need to change it. This is easily done through the Personalization settings area in the app. Users click here, and then select their appropriate time-zone. Brilliant…or so you’d think.

See, when it’s one or two users, it’s generally OK to tell them to do that. However, when it’s 200 or 2000 users, you’re going to get push-back. The last thing you want is for a large number of them to start contacting you to work out how to do it (read the instructions, perhaps?).


I’ve had this scenario over the last week, where the client actually told us that they didn’t want us to tell users to update it manually. They wanted a better solution.

Well, there is a solution out there to update users. It’s the ‘User Settings Utility’ app that’s in the XrmToolBox (https://www.xrmtoolbox.com/plugins/MsCrmTools.UserSettingsUtility/). Really neat & nifty, and does just what it says on the box. Simple enough to select users (or all of them at a time), select the time-zone you’re wanting to apply to them, and click a button. Hey presto – it’s been updated

Hmm. But what if you didn’t want to have to do this manually? Or (and this is what I was dealing with), there was a decent number of users being added to the app every few days, & I didn’t want to have to do this as a manual task.

So I started digging into how the time-zone setting was actually stored. It turns out that there’s an entity called ‘User Settings’, which is associated with a User record. Oh, and if you’re going to want to take a look at this entity to see what it contains, it’s NOT available through the front end. You can’t go into the entity list and just display it (though if you’ve found a way to do this through the Power Platform NATIVELY, drop me a line, please?).

Anyhow, back to things. There’s a value for ‘TimeZoneCode’, which maps to a specific time-zone. Aha, I thought! Right – now what’s the best way that I could work out to do this automatically. Checking in with some contacts in the tech community (thanks BlackOps etc!), Power Automate was suggested, so I started to see about how I could go about it…

So, I created a Power Automate Flow (haha…I got the name right there!). On creation of a new user record, it would programmatically go away and update the value to the one for the time-zone that I wanted it to be set as. This actually worked really well.
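For reference, this is roughly the shape of the update step (a simple update against the User Settings table, keyed on the user ID from the trigger – the time-zone code of 85 is what maps to UK/GMT in my environment, so do check the code for the time-zone that you actually need):

    Table name:    User Settings (usersettings)
    Row ID:        triggerOutputs()?['body/systemuserid']
    timezonecode:  85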

The only drawback is that through the user interface, it’s not actually shown as being updated, though it has been. Or sometimes it changes, but doesn’t reflect it accurately. This is somewhat annoying, and caused me quite some confusion between checking the front end to see if things were working, & confirming through the back end (& opening records up) to see that it was. I still have NO idea why this was happening.

(Screenshots: before changing my settings, and after changing my time zone to USA (EST).)

For my specific scenario, all of the users are in the UK, so I set it to update every user on creation to the UK time-zone. Obviously if you have users in different time-zones, you’d want to set this differently. This shouldn’t be an issue though, as you can expand the Power Automate Flow and add logic conditions/branches to be able to do this.

Now I think that this is pretty cool, and I couldn’t find anything out there for this. I’ve therefore decided to release this in a small solution, for others to be able to use. Part of this is the entire list of time-zones with their specific codes, so that you can update to whichever one you need to.

I hope that this helps solve a small but annoying problem (at least it did for me). Please do provide feedback if you want to!