Canvas Apps & Power Automates

So it’s been a busy few weeks here, which is why I haven’t really been putting up any articles. March/April is always a busy time for our family with stuff going on, and this year I decided not to push myself to get articles out, as otherwise I’d be running very low on sleep!

That being said, I’ve still had some great ideas about things that I’d like to share, and have been keeping a series of short notes for me to pick up. Today’s topic is one of them, which I think has been a major pain to anyone involved in canvas app development!

So, the back story to this is that we’re able to use Power Automate flows together with canvas apps. What I mean by this is that we’re able to directly trigger them from within the canvas app, rather than needing to do something like edit or create a record, and then have the Power Automate flow trigger from the record creation or modification.

There’s a specific Power Apps trigger that’s available within Power Automate exactly for this purpose:

When clicked, it gives us the trigger line in the steps as follows:

So what we’d do is, within the canvas app, bind a button (or another control) so that when it’s selected, it goes away & triggers the Power Automate flow. Great – so many different things that we can get to happen! One of the benefits of doing things like this is that we can then pass information from the Power Automate flow back to the canvas app directly:

This can then mean that the user can know, within the canvas app itself, that the Power Automate flow has run, and use data (or other things) that have come out of it.
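To make this a bit more concrete, here’s a minimal sketch of what the button’s OnSelect property could look like. The flow name (‘ProcessEnquiry’), its input, and the ‘status’ output are all hypothetical – they’ll depend entirely on how your own flow is set up (the output comes from the flow’s ‘Respond to a PowerApp or flow’ action):

    // OnSelect of the button – run the flow & capture its response
    Set(
        varFlowResponse,
        ProcessEnquiry.Run(TextInput_Reference.Text)
    );
    // 'status' is an output defined in the flow's respond action
    Notify("The flow returned: " & varFlowResponse.status)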

OK – all good so far.

The main issue to date has been with deploying canvas apps together with Power Automate flows. See, as per best practice, we would create a solution, place the canvas app, flows, and anything else that’s necessary for it to work within it, and then deploy the solution to our target environment/s. And that’s where things just…didn’t go quite right.

Obviously within the development environment, the canvas app would be hooked up to the flows, and everything would work. Clicking the button would cause the flow to run, etc. User authentication would be in place (along with licenses of course!), and it was just fine.

But when deploying a solution containing canvas apps and associated flows between environments (regardless of whether it’s deployed manually, or automated using a tool such as Azure DevOps), the connections to the flows would be broken. Ie, the canvas app would run, but the flows wouldn’t trigger. Looking at the connections in the canvas app within Studio would show something like the following:

All of the connections to Power Automate flows would show as ‘Not connected’. It’s not even possible to click the ellipsis next to them and re-connect them – the only option available is to remove the flow from the canvas app!

So in order to get things working again, we’d need to do the following steps:

  • Open up the canvas app
  • Remove all connections to Power Automate flows
  • Add a temporary button, and set it to trigger a Power Automate flow
  • Click through all of the Power Automates needing to be connected (waiting for each one to connect, then go to the next one)
  • Remove the temporary button
  • Save and publish the canvas app

This, in a nutshell, has been a (major) headache. For example, I’ve been working with a solution that has over 30 Power Automate flows that can be triggered from the canvas app (lots of different functionality!). Each deployment has needed the above process to be carried out, which has usually added on at least an hour to the deployment process!

Now, this hasn’t exactly been unknown. In fact, the official Microsoft documentation noted the following:

So this is something that Microsoft has been well aware of, but it’s been a pain point that we’ve had to work with.

However, this has now ALL changed, which I (and MANY others) are really pleased about!

Microsoft rolled out an update last month that means that canvas app connections to Power Automate flows will NOT break when they’re deployed across environments! This is such a massive time-saver that I’m now trying to work out what to do with all of my free time! Only kidding…more project work will commence!

So what we can now do is take our solution, deploy it across the different environment/s that we need to get it out to (whether manually, or automated using tools such as Azure DevOps), publish the solution, and then everything works! Amazing!!

One small caveat though – to ensure that this works, you will need to go into the app, and re-publish it on the latest Power Apps version. This should of course be done in a development environment, and the app can then be exported and deployed as required.

Microsoft have also updated their documentation at https://docs.microsoft.com/en-us/powerapps/maker/data-platform/solutions-overview to remove the limitation text shown above. It’s a good place to keep an eye on changes that occur over time too.

This is definitely a welcome piece of development, and I know that many of us have been eagerly waiting for this for a while – and now it’s here!

PL-600: Microsoft Power Platform Solution Architect

Well, it’s FINALLY here. And by finally, I guess I’m saying that I’ve been waiting for this for a while? The PL-600 exam is the new ‘Holy Grail’ for Dynamics 365/Power Platform people, being the Solution Architect (3 star) exam. Ten minutes after it went live, I booked to take it, and four hours after it went live I sat it! (I would have taken it sooner, but had to have supper first, get the kids to bed, etc…)

The first solution architect exam that Microsoft released in this space was the MB-600 (see my exam experience write-up on it at MB-600 Solution Architect Exam). However, with the somewhat recent shift towards certifications for the wider Power Platform, it was inevitable that this exam would change as well.

Interestingly enough, the MB-600 now counts towards some of the Microsoft Partner qualifications. I’d expect that when it retires (currently planned for June 2021), the PL-600 will take its place in the list of required certifications.

So, how to discuss it? Well, the obvious first start is to link to the official Microsoft page for it, which is at https://docs.microsoft.com/en-us/learn/certifications/power-platform-solution-architect-expert/. According to the specification for it:

Microsoft Power Platform solution architects lead successful implementations and focus on how solutions address the broader business and technical needs of organizations.
A solution architect has functional and technical knowledge of the Power Platform, Dynamics 365 customer engagement apps, related Microsoft cloud solutions, and other third-party technologies. A solution architect applies knowledge and experience throughout an engagement. The solution architect performs proactive and preventative work to increase the value of the customer’s investment and promote organizational health. This role requires the ability to identify opportunities to solve business problems.
Solution architects have experience across functional and technical disciplines of the Power Platform. Solution architects should be able to facilitate design decisions across development, configuration, integration, infrastructure, security, availability, storage, and change management. This role balances a project’s business needs while meeting functional and non-functional requirements.

So it hasn’t really changed that much from the MB-600, though obviously there’s now an expectation for solutions to bring in other parts of the Power Platform, as well as dip into Azure offerings. Pretty much par for the course, in my experience, with how recent projects that I’ve been on have been implemented.

At the time of writing, there are no official Microsoft Learn paths available to study from. I do expect this to change in the near future, and will update this article when they’re out. However, the objectives/sub-objectives are available to view on the main exam page, and I’d highly recommend taking a good look at these.

Passing the exam (along with having either the PL-200 Microsoft Power Platform Functional Consultant or PL-400: Microsoft Power Platform Developer Exam qualifications as well) will result in a lovely (new) shiny badge. Oh, we do so love those three stars on it!

As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) I’ve tried to group things together as best as possible for the different subject areas.

Overall, I had 47 questions, which is around the usual amount that I’ve experienced in my exams over the last year or so. What was slightly unusual was that instead of two case studies, I got three of them! (Note that your own experience may well vary from mine.)

Some of the naming conventions hadn’t been updated to the latest terminology, which I would have expected. I still had a few references to ‘entities’ and ‘fields’ come up, though for the most part ‘tables’ and ‘columns’ were used. I guess it’s a matter of time until everything is brought up to speed.

  • Environments
    • Region locations, handling scenarios with multiple countries
    • Analytics
    • Data migrations
  • Requirement Gathering
    • Functional
    • Non-functional
  • Data structure
    • Tables
      • Types of tables
        • Standard vs custom functionality
        • Virtual tables. What these are, when they would be used, limitations to them
        • Activity types
      • Table relationships & behaviours
      • Types of columns, what each one is suited for
      • Business rules. What they are, how they can be used
      • Business process flows. What they are, how they can be used
  • App types (differences between them, scenarios each one is best suited for)
    • Model
    • Canvas
    • Portal
  • Model-driven apps
    • Form controls (standard vs custom)
    • Form layout (standard functionality vs custom functionality)
    • Formatting inputs
    • Restricting inputs
  • Automation
    • Power Automate flows. What they are, how they can be used, restrictions with them
    • Azure Logic Apps. What they are, how they can be used, restrictions with them
    • Power Virtual Agents
  • Communication channels
    • Self service abilities through Power Virtual Agent chatbots. How this works, when you’d use them, limitations that exist
    • Live agent abilities through Omnichannel. How this is implemented, how customers can connect to a live agent (directly, as well as through chatbots)
    • Teams. When this can be used, how other platform abilities can be used through it
  • Integration
    • Integration tools
    • Power Platform systems
    • Azure systems
    • Third party systems
    • Reporting across data held in different systems
    • Dynamics 365 API
  • Reporting
    • Power BI. What it is, how it’s used, how it’s configured, limitations with it, how to share information with other users
    • Interactive Dashboards. What these are, how these are set up and used, limitations to them
  • Troubleshooting
    • Canvas app issues
    • Model driven app issues
    • Data migration
  • Security
    • Data Protection. What is it, where it’s set up, how it’s used across different requirements in the platform
    • Types of users (interactive/non-interactive)
    • Azure Active Directory, and the role/s it can play, different types of AAD authentication
    • Power Platform security roles
    • Power Platform security teams, types
    • Portal security
    • Restricting who can view forms
    • Field level security
    • Hierarchy abilities
    • Auditing abilities and controls

Wow. It’s a lot of stuff. Not that I’m surprised by that, as essentially it’s the sort of thing that I was expecting (being familiar with the MB-600). I think that on a ‘day to day’ basis, I cover most of these items already, so didn’t have to do a massive amount of revision for items that I wasn’t familiar with.

From my experience in taking it, I’d say that around 30% of the questions seemed to be focused on Dynamics 365, with 70% being focused on Power Platform capabilities. It’s about what I thought it would be when the exam was first announced. Obviously some people are more Dynamics 365 focused, and others are more Power Platform focused, but the aim of the exam (& qualification) is to really understand the breadth of the offerings available.

I can’t tell you if I’ve passed it or not…YET! Results aren’t going to be out for several months, based on previous experience with Beta exams, but I’ve got a good feeling about this.

So, if you’re aiming to take it – I wish you the very best of luck, and let me know your experience!

Customising Case Resolutions

Well, the title is a bit of a mouthful, I’ll admit. Hopefully though this brings some good information, and can help people out.

Cases are wonderful things, and can be used for tracking client interactions, compliments/complaints, and so many other things. Cases also come with the ability to be resolved, along with information being recorded around the resolution.

Now, the standard way of doing this provides the following screen:

There’s the ability to set the Resolution Type (a dropdown, aka Choice, field), & to put in free text for the Resolution itself (allowing us to track information around it). There are also time fields, which can be used for working out the time spent, as well as any time that’s going to be chargeable.

Now, when going in to modify these, we’d think to open up the Case Resolution table. However, this isn’t actually the right place to do it. Instead, we need to update the Case table itself, as the Case Resolution items come from the Case Status field!

Somewhat annoyingly, it’s not possible to do this through the new ‘Maker’ interface:

In order to actually handle this, we need to switch across to the Classic editor to set it up. This could be because it’s actually a situation of having both parent & child entries. What I mean by this is that there’s the actual status (being Active, Resolved or Cancelled), and then a reason under each one. Hopefully at some point it’ll be updated into the new UI, so that we can do it from there.

We’ll need to change the Status item to ‘Resolved’, & can then add in the options that we want:

After adding them, we need to save & publish, and then they’ll show up for us, and are able to be selected:

So that’s great – we’re able to customise it. But what if we’re wanting to customise the actual ‘Resolve Case’ form itself? Not everyone wants to show Time/Billable Time on it (quite a few of our clients ask us to remove it), and perhaps they want to add additional custom fields.

So from the usual perspective of doing this, we’d open up the Case Resolution table, create new fields as required, and modify the existing form (we’re not able to create any other forms for this specific table). After all, this is how we’d do it for any table in the system (whether a standard one, or a custom one). This is going to be the Main form, rather than the Quick Create one:

We save & publish it, and then would open up a Case record, click ‘Resolve Case’, and expect to see it. However, that doesn’t happen, which was most puzzling to me!

It turns out that there are two things that need to be done in order to see our ‘custom’ form (though it’s not really custom, as it’s modifying the default form, but whatever).

1. We need to modify security permissions for users – this is a critical requirement. An example of this is shown below:
Security Role: Customer Service Representative

2. We need to enable customisable dialogues. Yes, it’s a setting that needs to be updated in order for users to see the custom layout of the form. If we don’t do this, they’re shown the default form, even though we’ve modified it! It seems a little strange that the system has this concept of a ‘shadow’ form, but I guess that’s how it is.

To do this, we need to go into the Service Management settings area. I usually launch this through the Customer Service Hub app, though it’s available through several of the other standard apps as well:

Once there, we need to click into the Service Configuration menu item, and then change the ‘Resolve Case Dialogue’ option as shown below:

Remember to click the ‘Save’ button to save this.

Finally we can go back to our Case record, click ‘Resolve Case’, and look what appears!

So in summary, it’s definitely possible to modify & change the way that Case resolutions work in the system. It does take a little bit of fiddling around with settings in different areas, which can be confusing if we’re not used to them, but it can give a great result in the end.

Have you ever come across this, and wondered how to do it? Have you developed Case Resolutions any further? Drop a comment below – I’d love to hear!

Data Export Service Connection Issues

This is a slightly different post from the usual stuff that I talk about. It’s much more ‘techy/developer’ focused, but I thought it would still be quite useful for people to keep in mind.

The background to this comes from a project that I’ve been working on with some colleagues. Part of the project involves setting up an Azure SQL database, and replicating CDS data to it. Why, I hear you ask? Well, there are some downstream systems that may be heavy users of the data, and as we well know, CDS isn’t specifically built to handle a large number of queries against it. In fact, if you start hammering the CDS layer, Microsoft is likely to reach out to ask what exactly you’re trying to do!

Therefore (as most people would do), we’re putting in database layer/s within Azure to handle the volume of data requests that we’re expecting to occur.


So with setting up things like databases, we need to create the name for them, along with access credentials. All regular ‘run of the mill’ stuff – no surprises there. To ensure adequate security, we usually use one of a handful of password generators that we keep to hand. These have many advantages, such as ensuring that the password isn’t something we (as humans) have dreamed up, which might be easier to guess. I’ve used password generators over the years for many different professional & personal projects, and they really are quite good overall.

Example of a password generation tool

Once we had the credentials & everything set up, we then logged in (using SQL Server Management Studio), and all was good. Everything that we needed was in place, and it was looking superb (from the front end, at least).

OK – on to getting the data actually loaded in. To do this, we’re using the Data Export Service (see https://docs.microsoft.com/en-us/power-platform/admin/replicate-data-microsoft-azure-sql-database for further information around this). The reason for using this is that the Data Export Service intelligently synchronises the entire database initially, and thereafter synchronises on a continuous basis as changes occur (delta changes) in the system. This is really good, and means we don’t need to build anything custom to handle it. Wonderful!

Setting up the Data Export Service takes a little bit of time. I’m not going to go into the details of how to set it up – instead there’s a wonderful walkthrough by the AMAZING Scott Durow at http://develop1.net/public/post/2016/12/09/Dynamic365-Data-Export-Service. Go take a look at it if you’re needing to find out how to do it.

So we were going through the process. Part of this involves copying the Azure connection string into a script that you run. When you do this, you need to re-insert the password (as Azure doesn’t include it in the string). For our purposes (as we had generated this), we copied/pasted the password, and ran things.

However all we were getting was a red star, and the error message ‘Unable to validate profile’.

As you’d expect, this was HIGHLY frustrating. We started to dig down to see what actual error log/s were available (with hopefully more information on them), but didn’t make much progress there. We logged in through the front end again – yes, no problems there, all was working fine. Back to the Export Service & scripts, but again the error. As you can imagine, we weren’t very positive about this, and were really trying to find out what could possibly be causing this. Was it a system error? Was there something that we had forgotten to do, somewhere, during the initial setup process?

It’s at these sorts of times that self-doubt can start to creep in. Did we miss something small & minor, but that was actually really important? We went over the deployment steps again & again. Each time, we couldn’t find anything that we had missed out. It was getting absolutely exasperating!

Finally, after much trial & error, we narrowed the issue down to one source. It’s something we hadn’t really expected, but had indeed caused all of this to happen!

What happened was that the password that we had auto-generated had a semi-colon (‘;’) in it. In & of itself, that’s not an issue (usually). As we had seen, we were able to log into SSMS (the ‘front-end’) successfully, with no issues at all.

However, when put into code, the connection string treats the semi-colon as a special character (a separator between settings). It was therefore not recognising the entire password, which was causing the entire thing to fail! The resolution was simple – we regenerated the password, ensuring that it didn’t include a semi-colon character within it!
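To illustrate this (with made-up values), imagine a connection string along these lines:

    Server=tcp:myserver.database.windows.net,1433;Initial Catalog=MyDb;User ID=sqladmin;Password=Tr4x;9fQ!

The parser splits the string on every semi-colon, so the password is read as ‘Tr4x’, and ‘9fQ!’ is left over as a meaningless fragment. As an aside, ADO.NET-style connection strings do allow a value containing a semi-colon to be wrapped in quotes (eg Password='Tr4x;9fQ!'), but regenerating the password was the simpler fix for us.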

Now, this is indeed something that’s quite simple, and should be at the core of programming knowledge. Most password generators will have an option to avoid this happening, but not all of them do. Unfortunately we had fallen foul of this, but thankfully all was resolved in the end.

The setup then carried on successfully, and we were able (after all of the effort above) to achieve what we had set out to do initially.

Have you ever had a similar issue? Either with passwords, or where something worked through a front-end system, but not in code? Drop a comment below – I’d love to hear!

PL-200 Microsoft Power Platform Functional Consultant

Well, the last week has been quite busy, on many fronts! One of those is having a few new exams come out in Beta. I’ve already taken the PL-400 (see PL-400: Microsoft Power Platform Developer Exam for my review of it). Last Friday, the new PL-200 exam was released as well, so I scheduled it in for as soon as I could sit it.

Now the PL-200 is scheduled to be replacing the MB-200 exam at the end of this year (2020), assuming it comes out of beta by then of course. I remember sitting my MB-200, though I didn’t write up about it at the time. Compared to some of the other exams I’ve taken, it was hefty. I’ll freely admit that I didn’t pass on first go of it – it took me 3 tries to gain it! People will be required to take this as a pre-requisite for attaining the Microsoft Certified: Power Platform Functional Consultant Associate badge.

So I’ve been expecting this new PL-200 to be quite similar, but with more of a Power Platform focus. It’s still heavy on Dynamics 365, and I wasn’t expecting that part to change. The existing MB-2xx series are also staying in place (for the moment, anyhow).

According to the official description for the exam:

Candidates for this exam perform discovery, capture requirements, engage subject matter experts and stakeholders, translate requirements, and configure Power Platform solutions and apps. They create application enhancements, custom user experiences, system integrations, data conversions, custom process automation, and custom visualizations.

Candidates implement the design provided by and in collaboration with a solution architect and the standards, branding, and artifacts established by User Experience Designers. They design integrations to provide seamless integration with third party applications and services.

Candidates actively collaborate with quality assurance team members to ensure that solutions meet functional and non-functional requirements. They identify, generate, and deliver artifacts for packaging and deployment to DevOps engineers, and provide operations and maintenance training to Power Platform administrators.

The official Microsoft Learn page for the exam is at https://docs.microsoft.com/en-us/learn/certifications/exams/pl-200, and I’d highly recommend people go check it out. I didn’t use it that much, but felt that I was on reasonably solid ground with existing knowledge. It’s mostly there, but (at least in my exam) there were some sneaky extras that I was NOT really expecting. Hopefully I managed to get them (mostly) accurate!

Once again, I sat the exam through the proctored option (ie from home). The experience went without issues for once – sign in was fine, no issues with my headset during check-in, exam loaded & worked without problems at all.

So, as before, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). I’ve tried to group things together as best as possible for the different subject areas.

  • Environments
    • Different types of environments, what each one is used for, how to set/switch them between the different types
    • How to handle security/restrict access as necessary
  • Field types. All of the available field types, what are the benefits of each, and when each type should be used
  • Data storage types. Differences between Office documents (eg Excel), CDS, SQL Server, Azure SQL. When to use each one best
  • Charts. How they’re set up, how they can be shared with other users.
  • System views. What these are, who can access them, how to set them up
  • Entity forms. The different types of forms available, how to set them up, limitations of each. When each one should be used for a given scenario
  • Model apps. Site map. What this is, how it’s used. Implementing/customising it, the different controls available & what each one does
  • Entity editable grids
    • What these are, how they can be used, how to enable & set them up
    • Limitations that they have within the system
  • Entity/record ownership. The different types of ownerships available, benefits of each, when each should be used for a given scenario
  • Data management
    • Data importing from different sources, different methods to import data
    • What is data mapping for import, and how it’s used
  • Duplicate detection. What it is, what it does, how it works. How to implement & configure it
  • Microsoft Word templates. How they can interact with Dynamics 365, how to set them up/adjust them, what they can be used for
  • Canvas Apps
    • Expression/function types, what they are, how they’re used
    • Handling data (eg collections)
    • Offline usage & data storage
    • Controls that can be used, navigating around, loading/saving data.
  • Power Virtual Agent/Chatbots.
    • Setting them up, deploying them onto websites, deploying them into Teams
    • Configuring topics, routing, handling unknown questions
    • Bot model data, including being able to access across multiple chatbots
    • Reporting on their usage, & how customer engagements have been processed
  • Power App portals
    • Registering users, registration code process
    • Validating/confirming user accounts
    • Forms security, displaying/hiding forms & data
  • AI capabilities. AI models available. Pre-built models vs custom training, capabilities (eg text scanning), and when to use each one.
  • Omnichannel
    • What it is, when it’s used
    • How to implement, deploy & configure customers being able to be sent through to it
  • Automation
    • Workflows, Power Automate, Business Process Flows
    • What each one is, benefits/use cases for each one, when to use each for specific scenarios
  • Power Automate
    • What are triggers, & how do they work
    • What are actions, and how do they work
    • What are connectors, and how do they work
    • Prebuilt vs custom connectors, capabilities, and when to use each one
    • How to set up each type & configure them
    • Instant vs Scheduled vs Triggered
    • Security – how to enable/disable their use by users
  • Business Process Flows
    • What they are, how they’re used, limitations that they have
    • How to handle security for them
  • Business rules
    • What they are, how they’re used, how to set up/configure
    • How to use them in different parts of the system (eg forms, apps, etc)
    • Actions vs Conditions vs Recommendations
  • UI Flows (RPA)
    • What these are, how they are used
    • Requirements in order to use them
    • Desktop vs Cloud
    • Implementation, customisation, configuration & deployment
    • Limitations of them
    • Data extraction from runs
  • Security & Compliance
    • Security roles, security teams, security groups
    • What each one is, how it’s used
    • System auditing, what it is, how it’s used, how to implement & configure
    • How to access & run user audit log reports
  • Power BI. Setting up & sharing dashboards, setting up & configuring alerts, security options/roles & how they work with data
  • Dynamics 365 integrations. What other systems can integrate directly with Dynamics 365, & any limitations that they may have

The main surprise for me was mostly around the UI flows, and the various questions I had on them. I’ve not played around with them (yet!), but they are really cool!

If you’re going to take this, I’d love to hear how your experience of it went. Drop a comment below for me to see!

PL-400: Microsoft Power Platform Developer Exam

I’ve been continuing with taking new exams as they come out. Having recently taken the MB-400 exam (see MB-400 Power Apps & Dynamics 365 Developer Exam), I was slightly surprised to see the announcement that it was going to be replaced!

Admittedly, I was also surprised (in a good way) that I passed the MB-400, not being a developer! It’s been quite amusing to tell people that I’m a certified Microsoft Dynamics Developer. It definitely puts a certain look on their faces, which always cracks me up.

Then again, the general approach seems to be to move all of the ‘traditional’ Dynamics 365 exams to the new Power Platform (PL) format. This includes obviously re-doing the exams to be more Power Platform centric, covering the different parts of the platform rather than just the ‘first party apps’. It’s going to be interesting to see how this landscape extends & matures over time.

The learning path came out in the summer, and is located at https://docs.microsoft.com/en-us/learn/certifications/exams/pl-400. It’s actually quite good. There’s quite a lot that overlaps with the MB-400 exam material, as well as the information that’s recently been covered by Julian Sharp & Joe Griffin.

The official description of the exam is:

Candidates for this exam design, develop, secure, and troubleshoot Power Platform solutions. Candidates implement components of a solution, including application enhancements, custom user experience, system integrations, data conversions, custom process automation, and custom visualizations.

Candidates must have strong applied knowledge of Power Platform services, including in-depth understanding of capabilities, boundaries, and constraints. Candidates should have a basic understanding of DevOps practices for Power Platform.

Candidates should have development experience that includes Power Platform services, JavaScript, JSON, TypeScript, C#, HTML, .NET, Microsoft Azure, Microsoft 365, RESTful web services, ASP.NET, and Microsoft Power BI.

So the PL-400 was announced on the Wednesday of Ignite this year (at least in my timezone). Waking up to hear of the announcement, I went right ahead to book it! Unfortunately, there seemed to be some issues with the Pearson Vue booking system. It took around 12 hours to be sorted out, & I then managed to get it booked Wednesday evening, to take it Thursday.

So, as before, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change.

There were a few glitches during the actual exam. One or two questions with answers that didn’t make sense (eg line 30 does X, but the code sample finished at line 18), and question numbers that seemed to jump back & forth (first time it’s happened to me). I guess that I’ve gotten used to at least ONE glitch happening somewhere, so this was par for the course.

I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

  • Model Apps.
    • Charts. How they work, what drives them, what they need in order to actually work, configuring them
    • Visualisation components for forms. What they are, examples of them, what each one does, when to use each one
    • Custom ribbon buttons. What these are, different tools able to be used to create/set them up, troubleshooting them
    • Entity alternate keys. What these are, when they should be used, how to set them up & configure them
    • Business Process Flows. What these are, how they can be used across different scenarios, limitations of them
    • Business Rules. What these are, how they can be used across different scenarios, limitations of them
  • Canvas apps
    • Different code types, expressions, how to use them & when to use them
    • Network connectivity, & how to handle this correctly within the app for data capture (this was an interesting one, which I’ve actually been looking at for a client project!)
    • Power Apps solution checker. How to run it, how to handle issues identified in it
  • Power Automates
    • Connectors – what these are, how to use them, security around them, querying/returning results in the correct way
    • Triggers. What is a trigger, how do they work, when to use/not use them
    • Actions. What these are, how they can be used, examples of them
    • Conditions. What these are, how to use them, types of conditions/expressions/data
    • Timeouts. How to use them, when to use them, how to configure
  • Power Virtual Agents. How to set them up, how to configure them, how to deploy them, how to connect them to other systems
  • Power App Portals. Different types, how to set them up, how to configure them, how they can work with underlying data & users
  • Solutions
    • Managed, unmanaged, differences between them, how to use each one.
    • Deploying solutions. Different methods that can be used to do it, best practise for each, when to use each one
    • Package Deployer & how to use it correctly
  • Security.
    • All of the different security types within Dynamics 365/Power Platform. Roles/Teams/Environment/Field level. How to set up, configure, use in the right way.
    • Hierarchy security
    • Wider platform security. How to use Azure Active Directory for authentication methods, what to know around this, how to set it up correctly to interact with CDS/Dynamics 365
    • What authentication methods are allowed, when/how they can be used, how to configure them
  • ‘Development type stuff’
    • APIs. The different APIs that can be used, methods that are valid with each one, the Organisation service
    • Discovery URLs. What these are, which ones are able to be used, how they’d be used/queried
    • Plugins. How to set up, how to register, how to deploy. Steps needed for each
    • Plugin debugging/troubleshooting. Synchronous vs asynchronous
    • Component types. Actions/conditions/expressions/data operations. What these are, when each is used
    • Custom ribbon buttons. What these are, different tools able to be used to create/set them up, troubleshooting them
    • Javascript web resources. How to use these correctly, how to set them up on entities/forms/fields
    • Power Apps Component Framework (PCF). What these are, how to develop them, how to use them in the right way
  • System Design
    • Entity relationship types. What they are, what each one does, how they work, when to use them appropriately. Tools that can be used to display them for system design purposes
    • Storage considerations across different types, including CDS & Azure options
  • Azure items
    • Azure Consumption API. How to monitor, how to handle, how to change/update
    • Azure Event Grid. What it is, the different ways in which it can be used, when each source should be used
  • Dynamics 365 for Finance. Native functionality included in it

The biggest surprise for me, thinking back over it, was the inclusion of Dynamics 365 for Finance. Generally the world is split into ‘front of house’ (being Dynamics 365/Power Platform) and ‘back of house’ (Dynamics 365 for Finance & Supply Chain Management). The two don’t really overlap, though they’re supposed to be coming closer together over time. Given that this is going to happen, I guess it’s only natural that exam questions crossing between the two will come up!

Overall it was quite a good exam. Some of the more ‘code-style’ questions were somewhat out of my comfort zone, and I’ll freely admit to guessing some of the answers around them! Time will tell, as they say, to see how I’ve done in it.

I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it!

Good news for Power Automate Flows!

As a starter for 10, this wasn’t actually the blog post that I was going to write today. In fact, the subject of the post wasn’t even going to be about Power Automate! However, there was some really amazing news that dropped today from Microsoft, which I just couldn’t pass up being able to talk about.

You’ve guessed it – it’s about Power Automate! Well, I suppose that the post title was somewhat of a giveaway, wasn’t it…ah well. So let’s go ahead and find out what this is all about then!

To date, we’ve been able to put Power Automate flows into a solution. Well, it wasn’t there exactly at the beginning of things, but it happened somewhere along the way. This was very convenient, as we didn’t then need to deploy each one individually to different environments. Some solutions can contain dozens & dozens of flows, and we really do love to package them all up together for ease of movement.

So that was good. But there was still a (major) ‘bugbear’, as I like to refer to them. This is the fact that after we deploy a Power Automate flow, we then need to go into it & (re)authenticate it. This is due to the fact that the connector/s that it uses contain what is referred to as a ‘secret’, and these can’t be moved across environments. As a result, we need to essentially recreate the ‘secret’ in the connector (ie the authentication details) every time we move it. This is an annoyance if you have one or two flows, and an absolute bloody nightmare if you have lots.

For the technical minded – every action in a flow is bound to a specific instance of a connection that it will use to “execute” that action. This is why when moving flows across environments, users are required to rebind every operation to a connection.

For example, I’ve been working with COVID-19 triage solutions. These contain lots of flows within them, connecting to multiple different sources, and doing different things. Every time we’ve performed a release (even if it’s just a simple update), we’ve needed to manually go through each flow, (re)authenticate them, and turn them on. If you forgot one, then everything could come crashing down & not work! But there’s been no other way to do it. To represent this visually, we have the following diagram:

For each & every Power Automate, the connection line gets ‘broken’ when it’s deployed, and needs to be re-made.

Until now, that is. For today, Microsoft has announced the Public Preview for ‘Connection References’. Now when something is put into Preview, I usually caveat the usage of it with saying things like ‘it might go away, or not be released for a while’. But I’m going to be quietly confident about this particular piece of functionality, as I really don’t think it’s going to be pulled!

So what exactly are these? Well, in (mostly) simple terms, Connection References provide an ‘in-between’ or ‘abstraction’ layer for the connections that use them. Let’s show this visually as well:

We still need to re-authenticate the Connection Reference once we deploy things. But let’s now see how we can save ourselves a massive headache, and LOTS of time:

Oooo…now this is looking better. Instead of having to update three Power Automate flows, we only have to update the SINGLE Connection Reference that’s sitting in the middle. Now multiply that by however many flows you have (eg sending emails out, etc), and start calculating how much time you’ll now be able to spend on coffee breaks, rather than doing this manually one at a time…

We can create Connection References directly from within the solution:

We then give it a name & description, choose which connector we’re going to be using, and either select an existing connection or set a new one up:

Once we’re finished, we click ‘Create’ at the bottom. Voila – we can now see it within our solution!

Note: Interestingly enough I couldn’t actually see this within the solution after I created it, even with the component selector set to show ‘All’. How I actually got them to display was changing the component selector to ‘Connection Reference’, and they then showed up. I’m thinking that this is due to it being new today/in the process of rolling out, and am expecting it to display without any issues in the near future

Let’s take a look at a Power Automate flow itself now to see how it’s referenced. When we open an item with a connector, we can now see the following:

We’re able to select the Connection Reference that we’re wanting to use. Simple, yet so powerful.

When importing a solution containing a Connection Reference, we will be prompted during the import process to set the actual connection that should be used with it:

If you don’t have any connections set up already in the environment, you’ll be able to create a new one from the dropdown.

Some things to note around this:

  • During the preview phase, Microsoft has specified that a single Connection Reference can only be used by up to 16 flows. This limitation will be removed once it goes GA
  • Existing flows will not be automatically upgraded. What you can do though is export the unmanaged solution, re-import it to the same environment, and then they will be automatically created for you. The flow/s can then be edited to update them to the correct connection reference record
  • The connection name and connection reference name are not currently synchronised, so they can end up being different. It’s therefore best to keep the naming consistent – don’t set different names for connections and their associated connection references.

In summary – this is an awesome step forward with Power Automate functionality. I’m already tasking some of the developers on the team to re-do existing solutions to use it for ease of use. How do you think it’ll best benefit you? Drop a comment below!

Keeping belief in oneself

Although I usually post around technical matters & such, occasionally I digress into personal reflection. After all, this is my personal blog, & I feel it’s sometimes good/relevant to share certain personal things. Today’s post is along those lines, though it does relate to a technical matter.

Let’s set the scene. As many of you know (either from knowing me personally, or from reading my blog posts), I’m from the ‘model-driven app’ background. Canvas apps are really cool, but I wouldn’t say that I’m a very advanced creator of them. I’m learning the whole time about them (well, when I have a free minute here & there). There are many people in the community who are extremely more advanced than I am, and I love being able to learn from them.

I’m also considered to be in ‘Delivery’. This is the fancy word for those who run/are involved in projects, rather than selling concepts to clients. I’d run a mile if someone tried to put me in a Sales role (though I do admire the power suits that Sales have, occasionally). I’ve done a bit of Pre-Sales (where I’m helping out from a technical perspective), but haven’t been heavily involved. It’s actually something that I’m trying to work on, with being a tech evangelist. After all, if people already know/rave about the tech, how can you evangelise about it to them!!


So last week I get a call from our Sales team. They’re really nice, and know their stuff. However they’re not ‘techies’. They had a situation – we’d been talking to a client about a potential project, and the client told us to pitch for it. Brilliant, right? Well…

The client told us that we had 4 days until the pitch deadline. Not only were we needing to pitch with the usual presentation pack (however would Sales operate without PowerPoint…?), we also had to do a live demo. Not for a completed product, but rather a Proof of Concept (PoC).

The only person available was…yes, you guessed it…me. There wasn’t anyone else around with the necessary knowledge/skills to create the PoC in the time-frame needed. I’ll freely admit that I was absolutely slammed with existing projects, but wanted to be able to help out.

However, things then got ‘better’. And by ‘better’, I meant ‘interesting’. I got told who else was pitching to the client. Obviously I’m not going to mention any specific details here, but I knew who they are. More importantly, I figured that I had a very good idea of who from their side would be creating the tech, & doing the pitch.

Now as I’m not mentioning any identifiable details, I’m feeling free to say this. They’re not at my level of tech skills. They’re nowhere NEAR my level of tech skills. This is NOT because I’m better than they are. Totally the opposite – they’re SO far ahead of me with their knowledge of things, I can barely see the dust that they kick up in a race.

Knowing this, I knew that I couldn’t build a model-driven app (though it would have worked perfectly for the scenario/s we were given). I HAD to do a canvas app. But even with doing that, it wasn’t going to be anywhere near as good as what the other side would be able to put on.

The phrase ‘gibbering in fear’ does come to mind with my reaction to finding all of this out. I did feel slightly like a deer caught in the headlights. I wanted to do well, both for myself & my company, but I honestly had no idea how we could stack up.

How I felt I looked!

Thankfully, my company has an extremely open culture, and I was therefore able to talk to my manager about it. He understood where I was coming from, but encouraged me to go for it & do what I would be able to create.

My wife also encouraged me to go for it. Well, actually her words were ‘it’s not sexy when a husband says that he can’t do it, so man up and go for it!’. Ha…after that I couldn’t very well NOT do it.

So I applied myself, and with some VERY late nights (I did have other projects on, as I mentioned above), managed to get something in place. Not only did I create it, I think it looked really good. There was some really nice (canvas app) functionality, and it all came together pretty well.

Everything was in place in time (including some last minute tweaks). I even decided to spice up the demo a bit, and borrowed some dinosaurs from the kids to use for personas. We were using live camera feeds for part of the demo, and suddenly the demo was joined by ‘Rexy’, the ‘Customer Service Representative’ T-Rex! They were quite amused by it (thankfully!), and our team thought it was absolutely hilarious.

‘Good afternoon, how may I be of assistance?’

I have no idea how the other partner pitched to the client, or what the decision will be from the client. It’s way too early for that.

What I do know is that sometimes we can lose track of ourselves. I’m not going to go into the subject of ‘Imposter Syndrome’ (check out Em D’Arcy if you want to read up about that). Rather, it’s that having others around to encourage us, even though they may be more skilled, can really make the difference.

In life, we can often face challenges. How we handle them, and how we decide to move forward, can define who we are. When dealing with technology items such as the Power Platform, where there’s constant change, it can sometimes feel very daunting, but we still need to push ahead.

Yesterday I was listening to Lisa Crosbie talking about her journey into technology (and canvas apps). As she put it – ‘there is no comfort zone here – you need to find a place to feel comfortable with this level of discomfort, and ride it to be successful’. It’s really so true. It’s not just needing to push ourselves in the traditional way, but to keep up our own confidence in our skills & abilities. With this, we can continue to drive forward, keep on learning, and continue our journey of greatness!

I’m really glad that I was able to do this, and hope that I can keep this with me. By doing so, I’ll be able to continue along my own journey.

Have you ever had a time when a challenge seemed insurmountable? How did you cope with it? Drop a comment below – I’d love to hear!

Strange behaviour with views

Normally when I write a blog post, it’s about sharing some cool features, new functionality, etc. However this post is going to be a little different, because I don’t actually have an answer (yet!) to what is going on here.

Let me explain the situation.

I’m needing to show some very specific data for reference purposes. For the purposes of this, let’s say that I’m looking at Contacts, and needing to report on Phone Calls. The reason is to identify Contacts who are frequent callers. My criteria are as follows:

  • At least one phone call (that has the Contact as the Regarding value) need to have a specific field set
  • At least one phone call (that has the Contact as the Regarding value) needs to have its Activity Status as Open

These two conditions are separate. So the contact essentially needs to have at least 2 phone calls against them, with each one meeting one of the conditions. There can be more than 1 phone call record with the same condition – that’s not an issue here.

Back in the (good old) day, I’d have written some cool SQL to return this data. Two Left Outer Joins, and we’d be done. However I can’t do that now (I’ve recently started dipping into FetchXML, which is an entirely different story to cover at some point). So I’m having to use Advanced Find to check that I’m getting the right data.
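For illustration only, the sort of SQL that I have in mind would have looked something like this (the table & column names are made up, of course):

    -- Contacts with at least one regarding phone call that has the
    -- specific field set, AND at least one that is still Open
    SELECT DISTINCT c.ContactId, c.FullName
    FROM Contact c
    LEFT OUTER JOIN PhoneCall p1
        ON p1.RegardingObjectId = c.ContactId
        AND p1.SpecificField IS NOT NULL -- condition 1
    LEFT OUTER JOIN PhoneCall p2
        ON p2.RegardingObjectId = c.ContactId
        AND p2.StateCode = 0 -- condition 2: Open
    WHERE p1.ActivityId IS NOT NULL
      AND p2.ActivityId IS NOT NULL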

Recreating this in Advanced Find isn’t the easiest of things to do. I’m needing to start from Contact, go down to Phone Call, go back up to Contact, & go back down to Phone Call. But hey, this is what it looks like:

So with this set up, I run the query, and get some results (in this specific scenario/time, there are 3 results). I go through the data to check that the results are actually satisfying my requirements, which they are:

Wonderful – let’s move forward then!

My next step is to look to set this up as a system view. To do this, I go to the Power Apps Maker (http://make.powerapps.com/), open my solution & find the Contact entity. Opening it, I switch to the Views tab:

I create a new view, add the columns I need, and then open up the Filter Criteria to start setting this up. I’m using the Advanced Find as a reference guide for the conditions I’m needing to use. Going through it, I replicate the values across:

That looks about the same as the Advanced Find, right? It’s laid out slightly differently, but that’s just the designer. OK – let’s go ahead to save/publish it, and see it in the app:

Hold on. There’s only 1 record showing up there. Admittedly it’s in the list that came from Advanced Find, but what’s happened to the other 2 records?

So I go to check the data. I had already done this before, but I thought that perhaps I overlooked something, so I checked again. Nope – all of the data is fine/correct. There should indeed be 3 records showing up in the system view, but 2 are missing…

Note: As an aside, I do know that this isn’t permissions related. I’m doing all of this as a systems administrator with full privileges to everything. So it’s not that

OK – next steps:

  • Clear browser cache, reload and see if they’re showing up (useful tip – Control+F5 does this!). Nope, they’re not showing
  • Use Incognito mode, log in and see if they’re showing there. No, they’re still hiding away
  • Use a different browser, with a different system administrator login. Unbelievably they’re still being very shy, and refusing to appear!

Even more perplexing is the following. I can open up Advanced Find, select the system view (without doing ANYTHING else) & click ‘Results’. When doing this, all of the records appear! So through the entity view they’re not showing, but when I use that same system view through Advanced Find, they are!

I’m scratching my head at this. It just doesn’t make sense. I have no idea why this is happening. Reaching out to others, they also don’t seem to have any idea either.

My next step (I’m feeling SO proud of this, and so dev!) was to check the FetchXML. Perhaps there was something underlying in it that’s causing this? Using the FetchXML Builder in XrmToolBox, I loaded both views up, and compared them. It’s crazy – they seem to be exactly the same! (well, some cosmetic differences with where aliases appeared on the line, but this wouldn’t affect it):

At this point, I’m thinking that there are some magic elves under the hood, squirrelling away the data. It has to be the only logical reason for this, right?

The only thing I could find in the FetchXML that might make a difference is that there’s a ‘Distinct’ clause at the top of it in the one that’s working:

Why this would cause the issue, I have NO idea. Views return distinct results anyhow, so I’m not sure what this clause is actually doing here.

Regardless, using FetchXML Builder I updated the code, and WOW – it worked! I’m now returning 3 records in my system view! Absolutely strange, but hey – if it’s working now, who am I to question it…
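For reference, the change was simply a matter of adding the attribute to the top-level fetch element (the entity & attribute names here are illustrative):

    <fetch distinct="true">
      <entity name="contact">
        <attribute name="fullname" />
        <!-- the link-entity joins down to Phone Call sit in here -->
      </entity>
    </fetch>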

I’m going to try to raise this through official Microsoft Channels, and see what I might be able to find out from them. However if you’ve come across this (or similar), or have some ideas about how to work around it, I’d LOVE to hear from you!

Canvas Apps, Collections & Dropdown Fields

This post is based around some recent work that I’ve been doing, which includes canvas apps. For those of you who aren’t familiar with canvas apps, imagine if PowerPoint & Excel had a baby! Though I’m expecting most people who are reading this to already know all about them 🙂

So enough with the waffle, let’s get on with things…let me paint the scenario for you.

The app is aimed to be used by a contact centre. Part of their function is to capture address information. So far this has been done absolutely manually. The issue with this is that data can be typed incorrectly, or in the wrong fields. We’re also needing to enhance the data with geographic-specific information (for reporting purposes). This information isn’t known by either the callers, or by the contact centre agents (for those who are curious, it’s the unique property reference number, which is unique to every address in the UK).

Thankfully, we’ve been given a source from the client which we can look this up against. In essence, we pass a postcode to it, and values are returned (in a JSON format). This includes the data that we’re looking for. Brilliant, so far.

When we got to thinking about things, there are several ways in which we could implement this:

  • Capture the data as we are already doing, & use Power Automate to get the relevant additional information

or

  • Automate this within the canvas app itself, and even give the customer service agents a bespoke address picker!

Deciding to go with the second option (it was a no-brainer, really), we moved ahead with this. We had the details that we needed in order to hit the address lookup API. One of the developers on the team created the Custom Connector, and got it working. We tested it out, and amazingly we got information back!

The next step was to see how we could do this within the canvas app itself. Now I’m going to admit here that although I’ve HEARD great things about Collections, I had never used them myself. In fact not only had I NOT used them before, I had NO idea how they worked! That was to change VERY quickly though…

Within a few hours, I had learned enough about collections to get how they worked, and pull data into them. It was actually really simple – I used the ClearCollect command to create a collection fed by the API query, which loaded the data into a collection table for me to use. I was very impressed!

The code to return the postcode data. We had to do some manipulation due to the API constraints
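For those curious, the general shape of the formula is something like the following. This is a sketch only – the connector name (‘AddressLookup’), its operation, and the response column are hypothetical, and will depend on the API being called:

    // Populate the collection from the custom connector, using the
    // postcode that the customer service agent has entered
    ClearCollect(
        colAddresses,
        AddressLookup.GetAddresses(TextInput_Postcode.Text).results
    )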

OK – so I had my data in the collection now:

What were my next steps? Well, I was wanting to achieve the following:

  • Give the customer service agents an ‘address picker’ to use. They’d enter the customer postcode, & then be presented with a list of addresses that they could pick the correct one from
  • Automatically populate the customer address fields on the form from the selected address

Well, the first item (the ‘address picker’) was simple enough. Using a dropdown field, I pointed it at the collection data. This worked great, but the dropdown would only allow me to select a single column from the collection to display:

I can only select a single column!

1 column from the collection. OK, I thought – should be simple enough to handle. Let’s go and concatenate column values in the dropdown, to present the interface I’m looking for:

Now that’s more like it! Much easier for the customer service agents to use. OK – onto the next stage. Let’s go & set the fields to point to the collection, match to the value that’s selected in the dropdown, and populate. Should be simple to do, right?

Well…um, no, it’s not simple to do. In fact, it’s actually impossible to do. I was expecting to point to the dropdown selected value, & have the columns returned (from the collection). I could then select which column to use for a specific field. This, however, was not the case:

You have to love the ‘.’ (or ‘dot’) notation used in canvas app code. It shows you what values are available, and saves having to do lots of typing. In this case, however, it also showed me that there was only ONE column of data to select from to display in the field. This was the ‘Result’ column.

This got me very confused. I tried going back to basics, and stripping out the concatenation in the dropdown. Wonderfully I was then presented with all of the different collection columns to use:

So let’s sum up things so far:

  • If I want to present the best option to the customer service agents (using concatenation), I can’t select different parts of the data for auto-population into fields
  • If I want to be able to auto-populate field values from the collection, I can’t use concatenation (& therefore can’t present user-friendly data to the customer service agents).

Note: Leaving aside wanting to show the house number & street, one of the main reasons for wanting to concatenate was to handle buildings that have flats (aka apartments) in them. This is stored in a different column in the collection, and it would therefore be difficult to show both of these to the customer service agents otherwise

In essence, the behaviour of the dropdown field seemed to be that I couldn’t just change the displayed values without it ‘losing’ connection to the rest of the data. There was no ID that I could use to match on, or display what I wanted to.

This seemed to be a massive Catch-22. I tried various things, but couldn’t see a way out of it. I started to create a second collection, & concatenate fields from the first collection into it. This seemed like a good idea, though (being totally new to it) I got lost. At one point I even managed to collect the entire contents of the collection into a new column for EACH ROW!!

Thankfully, the community helped me out, in the form of Peter Bryant & Clarissa Gillingham (I had posted about my issues on Twitter – the hashtag #poweraddicts is really great!).

With the help provided, I managed to work out the CORRECT syntax to use for the ‘AddColumns’ command. With this in hand, I was successfully able to create a second collection & add concatenated field values to it:
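In text form, it looks something like this (the column names are from my scenario – adjust them to match your own collection). The key point is that AddColumns keeps all of the original columns, and simply bolts the concatenated one on:

    // Build a second collection with a friendly display column,
    // while keeping every original column intact
    ClearCollect(
        colAddressPicker,
        AddColumns(
            colAddresses,
            "DisplayAddress",
            FlatNumber & " " & HouseNumber & " " & Street
        )
    );

The dropdown’s Items property then points at colAddressPicker (displaying the DisplayAddress column), and because Selected now returns the whole record, each form field can be populated individually, eg:

    // Default property of the 'Street' text input
    Dropdown_Address.Selected.Street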

Now for the moments of truth. Would the dropdown show this new column, & could I point the form fields to auto-populate specific columns?

Not me, but exactly how I was feeling!

The answer…was YES! It was working! I felt SO relieved. Let’s take a peek:

This was brilliant! We’re also populating other data in the background, but that doesn’t need to be visible to the customer service agents.

So in summary, I learned about collections, & how to use them. I also learned about the limitations of dropdown controls (when referencing them from other places), but came up with a way around it. Finally I achieved the result that I was aiming for. Very pleasing all round!

Have you come across something like this in an implementation? How did you manage to handle it (if you did)? Drop a comment below – I’d love to hear all about it!