Good news for Power Automate Flows!

As a starter for 10, this wasn’t actually the blog post that I was going to write today. In fact, the subject of the post wasn’t even going to be about Power Automate! However, some really amazing news dropped today from Microsoft, which I just couldn’t pass up the opportunity to talk about.

You’ve guessed it – it’s about Power Automate! Well, I suppose that the post title was somewhat of a giveaway, wasn’t it…ah well. So let’s go ahead and find out what this is all about then!

To date, we’ve been able to put Power Automate flows into a solution. Well, it wasn’t there exactly at the beginning of things, but it happened somewhere along the way. This was very convenient, as we didn’t then need to deploy each one individually to different environments. Some solutions can contain dozens & dozens of flows, and we really do love to package them all up together for ease of movement.

So that was good. But there was still a (major) ‘bugbear’ (as I like to refer to them). This is the fact that after we deploy a Power Automate flow, we then need to go into it & (re)authenticate it. This is due to the fact that the connector(s) that it uses contain what is referred to as a ‘secret’, and secrets can’t be moved across environments. As a result, we need to essentially recreate the ‘secret’ in the connector (ie the authentication details) every time we move it. This is an annoyance (if you have one or two flows), and an absolute bloody nightmare if you have lots.

For the technical minded – every action in a flow is bound to a specific instance of a connection that it will use to ‘execute’ that action. This is why, when moving flows across environments, users are required to rebind every operation to a connection.
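To make that a bit more concrete – inside a solution, each flow’s definition includes a section that maps the connectors it uses to one specific connection. I’m simplifying the shape here (the names & IDs below are illustrative, rather than taken from a real export), but conceptually it looks something along these lines:

    "connectionReferences": {
        "shared_commondataserviceforapps": {
            "connectionName": "shared-commondataser-1a2b3c4d-0000-0000-0000-000000000000",
            "source": "Invoker",
            "id": "/providers/Microsoft.PowerApps/apis/shared_commondataserviceforapps"
        }
    }

That ‘connectionName’ points at ONE particular connection in ONE particular environment – which is exactly why the rebinding dance is needed on every deployment.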

For example, I’ve been working with COVID-19 triage solutions. These contain lots of flows within them, connecting to multiple different sources, and doing different things. Every time we’ve performed a release (even if it’s just a simple update), we’ve needed to manually go through each flow, (re)authenticate them, and turn them on. If you forget one, then everything can come crashing down & not work! But there’s been no other way to do it. To represent this visually, we have the following diagram:

For each & every Power Automate flow, the connection line gets ‘broken’ when it’s deployed, and needs to be re-made.

Until now, that is. Today, Microsoft has announced the Public Preview for ‘Connection References’. Now when something is put into Preview, I usually caveat its usage by saying things like ‘it might go away, or not be released for a while’. But I’m going to be quietly confident about this particular piece of functionality, as I really don’t think it’s going to be pulled!

So what exactly are these? Well, in (mostly) simple terms, Connection References provide an ‘in-between’ or ‘abstraction’ layer for the connections that use them. Let’s show this visually as well:

We still need to re-authenticate the Connection Reference once we deploy things. But let’s now see how we can save ourselves a massive headache, and LOTS of time:

Oooo…now this is looking better. Instead of having to update three Power Automate flows, we only have to update the SINGLE Connection Reference that’s sitting in the middle. Now multiply that by however many flows you have (eg sending emails out, etc), and start calculating how much time you’ll now be able to spend on coffee breaks, rather than doing this manually one at a time…

We can create Connection References directly from within the solution:

We then give it a name & description, choose which connector we’re going to be using, and either select an existing connection or set a new one up:

Once we’re finished, we click ‘Create’ at the bottom. Voila – we can now see it within our solution!

Note: Interestingly enough, I couldn’t actually see this within the solution after I created it, even with the component selector set to show ‘All’. How I actually got it to display was by changing the component selector to ‘Connection Reference’ – it then showed up. I’m thinking that this is due to it being new today/in the process of rolling out, and am expecting it to display without any issues in the near future.

Let’s take a look at a Power Automate flow itself now to see how it’s referenced. When we open an item with a connector, we can now see the following:

We’re able to select the Connection Reference that we’re wanting to use. Simple, yet so powerful.

When importing a solution containing a Connection Reference, we will be prompted during the import process to set the actual connection that should be used with it:

If you don’t have any connections set up already in the environment, you’ll be able to create a new one from the dropdown.

Some things to note around this:

  • During the preview phase, Microsoft has specified that a single Connection Reference can only be used by up to 16 flows. This limitation will be removed once it goes GA
  • Existing flows will not be automatically upgraded. What you can do though is export the unmanaged solution & re-import it to the same environment; connection references will then be automatically created for you. The flow/s can subsequently be edited to point to the correct connection reference record
  • The connection name and connection reference name are not currently synchronised, and can therefore be different. It’s best to keep the naming conventions the same – don’t set different names for connections and their associated connection references

In summary – this is an awesome step forward with Power Automate functionality. I’m already tasking some of the developers on the team to re-do existing solutions to use it for ease of use. How do you think it’ll best benefit you? Drop a comment below!

Handling ‘Out of Hours’

Let’s face it – we can be quite spoiled at times. As customers, we can sometimes expect companies to be available 24/7 to service our requests, needs, issues, etc. That would be wonderful, wouldn’t it! Imagine that you have a mobile phone issue at 2am – you could call up your provider, and have it handled (or a new handset sent out) immediately. That would be quite nice!

Unfortunately the real world doesn’t (always) quite work like that. Of course there are companies that operate on a multi-national or even global scale, and there’s always customer service available (Amazon – I’m thinking of you right now!).

Previously I’ve gone into how we can set operating hours for a company, so that the ability to contact a customer support agent is only shown during these times. Take a look at Handling Company Hours for a refresher on this.

But sometimes not showing the ability to contact support could potentially be counter-productive. Customers may think that our website isn’t working properly, and attempt to reach us through other means. This could quite well frustrate them.

Due to this, we have a nice little piece of functionality that’s now come out in Omnichannel. It’s small and simple, yet quite brilliant in my humble opinion. This is the ability to have a chat widget available, but let customers know that it’s currently out of company hours.

To activate this, we need to open the Chat record in the Omnichannel Administration Hub, and go to the Design tab:

Quite helpfully, the section is labelled ‘Offline’! How much better could we get?

We do need to understand that (at the time of writing this post) it’s currently in Preview, with all of the usual caveats around how that works.

We have several items available here:

  • Show widget during offline hours. This is what actually activates the setting – leaving this set to false won’t do anything for us!
  • Theme colour. This allows us to set the specific theme to be used during ‘offline’ hours. It’s actually really helpful, as it gives the customer a clear visual indication that it’s out of hours
  • Title. The title of the chat widget, which will be displayed to the user
  • Subtitle. This allows us to place a subtitle as well, for the user to be able to see

So what does this then look like? Well, let’s take a look:

Personally I think that being able to set a theme colour for offline access gives it that little edge. Customers will become aware of this (subconsciously) when visiting the website, and come to the point of not even trying to start a chat when they see that it’s out of hours.

One MAJOR thing to bear in mind. We’re only going to be given the option to set this when we have a value set for Operating Hours. Without this being set, we won’t be shown this option. Go try it for yourself and see!

There’s not really much else to this, to be honest. But I’m liking it. I know that from a personal perspective I’ve been on various websites, and have no idea if the support chat is actually working or not. With this in place, I’m able to see that it is available for use at the correct time, and not have to wonder about it.

Have you ever thought about implementing something like this? Have you actually done so? I’d be really interested to hear from you about how you went about it – please drop a comment below!

Dynamics 365 Admin Centre for Omnichannel

I’ll freely admit that the title for this post is a bit of a mouthful! I’ll also admit that I used the British spelling of ‘centre’, rather than what it actually is. You’ll have to excuse my grammar 😉

This post is about something that we all knew was coming. The old Admin Centre is no longer – and we shall miss it! It was inevitable that it would be moved over to the new Power Platform interface, as so many other things have already. Therefore I thought it would be good to do a quick article about where it is now, how to access it, etc.

After all, it is vitally important when needing to carry out the initial configuration for Omnichannel, or to check for upgrades to the Omnichannel installed solution!

Let us, however, cast our minds back to the very familiar layout shown below. We’ve spent so many years here that its departure seems quite sudden. But though you may be gone, you will not be forgotten!

Manage Omnichannel application

Right – now onto the new version of it! This actually took me a few minutes of digging around to find & get to.

The first thing I tried was looking in the environment settings, but alas, I didn’t find it there. So I continued digging around.

Wishing to spare you the exact itinerary of everywhere that I looked, I’ve decided to just show you it! I can hear the sighs of relief at this point…

What we need to do is navigate to the Power Platform Admin Centre, at https://admin.powerplatform.microsoft.com/. Once there, we expand Resources on the left hand side, and select ‘Dynamics 365 apps’. Note that you do NOT have to select a specific environment first to be able to do this.

Now we can see a list of all apps installed. Nicely we’re able to scroll, which we couldn’t do in the old interface! That’s actually really helpful, and avoids needing to navigate to a different page. If we scroll down, we can see the entry for Omnichannel:

Click on ‘Manage’, and we get the following lovely popup:

Click OK to this, and we get taken to the (familiar) interface for configuring the initial items for Omnichannel:

Here we can go about the usual items, such as checking each environment to see if there are any updates available, or configuring the main channels.

Nicely, Microsoft has actually updated (some of) their documentation, which is obviously very good. I’m now going to have to go and check through previous articles of mine, and update as necessary!

Lookup fields & Power Automate

This is an interesting post, for several reasons. Firstly, it’s the first one in 3 weeks – I was off on holiday, and decided to take an (almost) absolute break from all things digital, which included this blog. It was actually quite refreshing, though now coming back & starting to write again does seem a bit daunting, I’ll admit!

Thankfully, whilst wondering what exactly to start with, a scenario came up that I was working on. It seemed quite simple at first, but then actually got somewhat complicated. I therefore thought it would be helpful to others if I wrote about it, so here it is.

The scenario was as follows. We had records being auto-created in the system, and needed to create child records for them. This, as I’m sure you’ll agree, is really quite simple to do with Power Automate. We also needed to set lookup values on the child record which were already populated on the parent record (for reference purposes).

So for example, the parent record has a lookup to Country (being a separate entity), and the child record also has a lookup to Country. These need to be the same.

Being both lookup fields, I figured that I’d be able to take the value from the parent record, and simply plop it into the corresponding field on the child record in Power Automate:

So I did that – and immediately hit an error. Not just any error, but the fabled ‘Resource not found for the segment’ error!

Obviously, I did what anyone would do at first – I put it into Google & Twitter, and took a look at what came up.

The ‘problem’ was coming from using the ‘CDS Current Environment’ connector, which is the latest version available (the old one is no longer available to use). It’s really great for a lot of things, but unfortunately not so great in a few areas. See, in the old CDS Connector, you could just drop the lookup field value into the field you were wanting to populate. Power Automate had no issues with that, & it would run just fine.

However in the ‘new’ CDS Connector, you can’t just do that. Instead, you need to use an OData reference (which I haven’t done much of before, to tell the truth). So based on the blogs I had come across, I went to work to try to get this working.

Part of the challenge was that there didn’t seem to be a unified consensus in how to do it. I came across the following variations:

  • /entityname(Lookup Field Value)
  • /entityname/(Lookup Field Value)
  • /pluralentityname(Lookup Field Value)
  • /pluralentityname/(Lookup Field Value)

Somewhat confusing, as I’m sure you’d agree. Nevertheless, I ploughed through all of the different possibilities. But nothing was working – every single time, I still got the ‘segment not found’ error message. This, as you can imagine, was extremely frustrating!

Thankfully, one of my good friends was around & able to help out. Namely, Tricia Sinclair came to the rescue!

We took a look at the code I was using, and she took a look at some of her own use cases (where it had worked for her). I was starting to think down the path of needing a capital letter in the entity name (some systems can be REALLY finicky around things like that), but thankfully it wasn’t.

Instead, it was the following. See, this was a custom entity. It turns out that for a custom entity (& heck, for all I know, system entities as well) the syntax needed is ‘publisherprefix_pluralentityname(lookupfieldvalue)’. Now that’s not something that I had come across ANYWHERE at all!
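To illustrate with entirely made-up names (a ‘new’ publisher prefix, a custom Country entity whose plural form is ‘new_countries’, & a lookup field called ‘new_countryid’), populating the Country lookup on the child record ends up looking something along these lines:

    new_countries(@{triggerOutputs()?['body/_new_countryid_value']})

The ‘_new_countryid_value’ part is (as far as I’ve seen) how the parent record’s lookup GUID comes through from the trigger, & the ‘new_countries(…)’ wrapper is the OData reference to the entity set itself.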

Looking at it, I guess it makes sense. After all, it would technically be possible to have multiple entities with the same name, though with different publishers. As a result, the system needs to know WHICH exact entity is needed for the Power Automate flow, so it uses this. Somewhat complicated (and hey – it worked without all of this in the OLD CDS Connector), but we got it to work!

Testing it out, everything worked smoothly. The Power Automate flows fired off without any issues, the data got created & populated, and everyone was happy.

So there you go. Another interesting little twist in syntax needed, which hopefully will NOT change in the (near) future!

Have you come across anything like this? I’d love to hear – drop a comment below around it!

PL-100: Microsoft Power Platform App Maker Exam

As many people are aware, Microsoft is changing the certification landscape somewhat. With the emergence of the Power Platform, there’s a need to test skills other than the traditional Dynamics 365 ones.

To this end, a new series (the PL-XXX) has been created. The first (main) one of these exams is the PL-100, which is the entry level exam.

You can take a look at the exam requirements & learning paths by going to https://docs.microsoft.com/en-us/learn/certifications/exams/pl-100.

Now, when I say ‘entry level’, I’m not referring to basics. This isn’t a Fundamentals exam – for that, you’ll be wanting to take a look at the PL-900 exam (which came out a while ago). To put it into perspective, the PL-200 (which is aimed to launch in September 2020) will replace the MB-200 exam!

So, the exam went live (in Beta) just over a week ago (July 17th). I’ve been waiting for this for a while, as I’ve really been wanting to see how the new exams are structured. Taking it in Beta means I’m going to have to wait (a little while) for my results to come through, but it gives me the opportunity to see the new landscape upfront.

I booked it as soon as it was available, for Wednesday July 22nd. Nicely (as mentioned above), there were already learning paths in place, so I eagerly went through them (again) in preparation. I was feeling pretty well prepared, but then….

See, I had signed up to attend the Power Platform Virtual Happy Hour (PPPVHH) on the same day as I had booked the exam for. Incidentally, if you haven’t come across this before, take a look. It’s hosted every month, and has some AMAZING speakers. Clarissa Gillingham presented on the ‘Infinity Form’, and it was a joy to behold. But I’m digressing.

After the event had finished, some of us remained chatting in the virtual room. Amongst them was none other than Chris Huntingford, who we all love and adore! I mentioned that I had to sign off soon to get ready for the exam. No sooner had I mentioned this than Chris said to me something along the lines of ‘BRO….WATCH OUT!! It’s REALLY HARD!!’.

I might mention here that I have a slight (friendly) rivalry with Chris, in seeing who can take newly released exams first. I had figured that he’d be so busy with everything going on that I’d get this one before he did. Little did I know that he had ALREADY taken it.

Here I was, about to go sit down for the exam, and he got me TOTALLY freaked out. I’m not sure how much of it he did on purpose, but I’m sure that when I get him into a corner, I’ll find out…one day!

Anyhow – I sat the exam, took most of the time available (pretty sure I hit the 2 hour mark), and found it quite good overall. One or two things seemed to be totally random/in the wrong place, but otherwise it was fine. Definitely much better than the MB-600 (MB-600 Solution Architect Exam), and I felt much more comfortable than I did with the MB-400 (MB-400 Power Apps & Dynamics 365 Developer Exam).

It really is very clearly aimed at app developers (both model & canvas), as well as other Power Platform skills. According to the exam description:

The app maker builds solutions to simplify, automate, and transform tasks and processes for themselves and their team where they have deep expertise in the solution domain. They are skilled in key technical business analyst tasks such as data modeling, basic UX design, requirements analysis, process analysis, etc.

The app maker creates and enforces business processes, structures digital collection of information, improves efficiency of repeatable tasks, and automates business processes.

The app maker uses the maker tools of Power Platform to solve business problems. They may have experience with Visual Basic for Applications, Excel pivot tables, Teams, and other tools. They should have a basic understanding of data models, user interface, and processes. The app maker is aware of the capabilities and limitations of available tools and understands how to apply them.

The app maker is self-directed, and solution focused. They may not have formal IT training but are comfortable using technology to solve business problems with a personal growth mindset. They understand the operational need and have a vision of the desired outcome. They approach problems with phased and iterative strategies.

So, as before, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!).

  • Canvas App Test Studio. What it does, how to carry out tests in it, how to set up Test Suites, etc
  • Developing Power Automate Flows. Different types of connectors, different types of steps/actions. How to deploy properly between environments using solutions
  • Field Level Security. What it is, what it does, what it can/can’t be used for
  • Canvas Apps:
    • Publishing rights
    • Access rights
    • Versioning
    • Editing vs using
    • Sharing & security
    • Saving changes, & deploying them to users
    • Collections. What they are, what they do, how they work
    • Galleries. What they are, what they can do, how to configure them in different ways
    • Navigation around screens. How to set this up, how to pass information from one screen to the next
    • New vs Display vs Edit forms. What each one is, how each one is used
    • Charts. Which ones are available, how they’re configured
    • Using AI features, such as text/data recognition. What’s able to be used, how are they configured, what the benefits of each are
    • Versioning. How to handle this, what the benefits are
    • Accessibility for less-abled users. What options are available to facilitate this, how are they configured
  • Data Security. Different types of security available (roles/teams/access teams/business units) etc. Configuring security roles with different levels of permissions
  • Power BI Security. Showing/hiding information for specific users/teams, and how to configure this
  • Solution publishers. How these are set up, what you can modify after they’ve been set, considerations between default & other solutions
  • Business Logic. Differences between Business Process Flows, Business Rules & Power Automate. What each one can/can’t do, and is best suited for
  • Creating environments. Where to do this, how to do this, what steps are needed
  • Connecting to data sources. Different types of data connections, what each one is suited for
  • Model Apps:
    • Forms & Views. What these are, how to set up & configure them
    • Navigation, Sitemap etc.
  • Business Rules. How they work, what the different levels of scope are, how they affect functionality
  • Automation. Workflows vs Power Automate Flows. The different types (eg On Demand, Instant, Scheduled)
  • Arrays. What they are, what they do, how they work
  • DLP (Data Loss Prevention). How this works, how to set it up, different options available
  • Data field types. What each one is, how each one is used & able to be configured
  • Calculated/Rollup vs Autonumber. What each is, when to use each one

That’s quite a lot of stuff, with an emphasis on canvas app functionality & solutions. It definitely is important to ensure that you’re really on top of these. Thankfully not too much mention of Power BI (at least not in my exam), and for that I’m quite grateful!

I do have to say that in one respect, I found something quite amusing. See, on the same day as I took the exam, Microsoft Inspire was taking place. One of the major announcements was the ‘rename/rebrand’ of CDS to Dataflex (Pro). I therefore kept laughing when questions would refer to CDS again & again! Obviously I’m expecting this to change in the exam (at some point?).

In summary, I think that this is a good start for the new range of exams, and look forward to the other ones in the series coming out!

Have you taken this? What was your experience like? Drop a comment below – I’d love to hear!

Keeping belief in oneself

Although I usually post around technical matters & such, occasionally I digress into personal reflection. After all, this is my personal blog, & I feel it’s sometimes good/relevant to share certain personal things. Today’s post is along those lines, though it does relate to a technical matter.

Let’s set the scene. As many of you know (either from knowing me personally, or from reading my blog posts), I’m from the ‘model-driven app’ background. Canvas apps are really cool, but I wouldn’t say that I’m a very advanced creator of them. I’m learning the whole time about them (well, when I have a free minute here & there). There are many people in the community who are far more advanced than I am, and I love being able to learn from them.

I’m also considered to be in ‘Delivery’. This is the fancy word for those who run/are involved in projects, rather than selling concepts to clients. I’d run a mile if someone tried to put me in a Sales role (though I do admire the power suits that Sales have, occasionally). I’ve done a bit of Pre-Sales (where I’m helping out from a technical perspective), but haven’t been heavily involved. It’s actually something that I’m trying to work on, with being a tech evangelist. After all, if people already know/rave about the tech, how can you evangelise about it to them!!


So last week I get a call from our Sales team. They’re really nice, and know their stuff. However they’re not ‘techies’. They had a situation – we’d been talking to a client about a potential project, and the client told us to pitch for it. Brilliant, right? Well…

The client told us that we had 4 days until the pitch deadline. Not only were we needing to pitch with the usual presentation pack (however would Sales operate without PowerPoint…?), we also had to do a live demo. Not for a completed product, but rather a Proof of Concept (PoC).

The only person available was….yes, you guessed it…me. There wasn’t anyone else around with the necessary knowledge/skills to create the PoC in the time-frame needed. I’ll freely admit that I was absolutely slammed with existing projects, but wanted to be able to help out.

However, things then got ‘better’. And by ‘better’, I mean ‘interesting’. I got told who else was pitching to the client. Obviously I’m not going to mention any specific details here, but I knew who they were. More importantly, I figured that I had a very good idea of who from their side would be creating the tech, & doing the pitch.

Now as I’m not mentioning any identifiable details, I’m feeling free to say this. They’re not at my level of tech skills. They’re nowhere NEAR my level of tech skills. This is NOT because I’m better than they are. Totally the opposite – they’re SO far ahead of me with their knowledge of things, I can barely see the dust that they kick up in a race.

Knowing this, I knew that I couldn’t build a model-driven app (though it would have worked perfectly for the scenario/s we were given). I HAD to do a canvas app. But even with doing that, it wasn’t going to be anywhere near as good as what the other side would be able to put on.

The phrase ‘gibbering in fear’ does come to mind with my reaction to finding all of this out. I did feel slightly like a deer caught in the headlights. I wanted to do well, both for myself & my company, but I honestly had no idea how we could stack up.

How I felt I looked!

Thankfully, my company has an extremely open culture, and I was therefore able to talk to my manager about it. He understood where I was coming from, but encouraged me to go for it & do what I would be able to create.

My wife also encouraged me to go for it. Well, actually her words were ‘it’s not sexy when a husband says that he can’t do it, so man up and go for it!’. Ha…after that I couldn’t very well NOT do it.

So I applied myself, and with some VERY late nights (I did have other projects on, as I mentioned above), managed to get something in place. Not only did I create it, I think it looked really good. There was some really nice (canvas app) functionality, and it all came together pretty well.

Everything was in place in time (including some last minute tweaks). I even decided to spice up the demo a bit, and borrowed some dinosaurs from the kids to use for personas. We were using live camera feeds for part of the demo, and suddenly the demo was joined by ‘Rexy’, the ‘Customer Service Representative’ T-Rex! They were quite amused by it (thankfully!), and our team thought it was absolutely hilarious.

‘Good afternoon, how may I be of assistance?’

I have no idea how the other partner pitched to the client, or what the decision will be from the client. It’s way too early for that.

What I do know is that sometimes we can lose track of ourselves. I’m not going to go into the subject of ‘Imposter Syndrome’ (check out Em D’Arcy if you want to read up about that). Rather, it’s that having others around to encourage us, even though they may be more skilled, can really make the difference.

In life, we can often face challenges. How we handle them, and how we decide to move forward, can define who we are. When dealing with technology items such as the Power Platform, where there’s constant change, it can sometimes feel very daunting, but we still need to push ahead.

Yesterday I was listening to Lisa Crosbie talking about her journey into technology (and canvas apps). As she put it – ‘there is no comfort zone here – you need to find a place to feel comfortable with this level of discomfort, and ride it to be successful’. It’s really so true. It’s not just needing to push ourselves in the traditional way, but to keep up our own confidence in our skills & abilities. With this, we can continue to drive forward, keep on learning, and continue our journey of greatness!

I’m really glad that I was able to do this, and hope that I can keep this with me. By doing so, I’ll be able to continue along my own journey.

Have you ever had a time when a challenge seemed insurmountable? How did you cope with it? Drop a comment below – I’d love to hear!

Strange behaviour with views

Normally when I write a blog post, it’s about sharing some cool features, new functionality, etc. However this post is going to be a little different, because I don’t actually have an answer (yet!) to what is going on here.

Let me explain the situation.

I’m needing to show some very specific data for reference purposes. For the purposes of this, let’s say that I’m looking at Contacts, and needing to report on Phone Calls. The reason is to identify Contacts who are frequent callers. My criteria are as follows:

  • At least one phone call (that has the Contact as the Regarding value) needs to have a specific field set
  • At least one phone call (that has the Contact as the Regarding value) needs to have its Activity Status as Open

These two conditions are separate. So the contact essentially needs to have at least 2 phone calls against them, with each one meeting one of the conditions. There can be more than 1 phone call record with the same condition – that’s not an issue here.

Back in the (good old) day, I’d have written some cool SQL to return this data. Two Left Outer Joins, and we’d be done. However I can’t do that now (I’ve recently started dipping into FetchXML, which is a whole other story to cover at some point). So I’m having to use the Advanced Find to check that I’m getting the right data.

This isn’t the easiest of things to do. I’m needing to start from Contact, go to Phone Call, go back up to Contact, & go back down to Phone Call. But hey, this is what it looks like:

So with this set up, I run the query, and get some results (in this specific scenario/time, there are 3 results). I go through the data to check that the results are actually satisfying my requirements, which they are:

Wonderful – let’s move forward then!

My next step is to look to set this up as a system view. To do this, I go to the Power Apps Maker (http://make.powerapps.com/), open my solution & find the Contact entity. Opening it, I switch to the Views tab:

I create a new view, add the columns I need, and then open up the Filter Criteria to start setting this up. I’m using the Advanced Find as a reference guide for the conditions I’m needing to use. Going through it, I replicate the values across:

That looks about the same as the Advanced Find, right? It’s laid out slightly differently, but that’s just the designer. OK – let’s go ahead to save/publish it, and see it in the app:

Hold on. There’s only 1 record showing up there. Admittedly it’s in the list that came from Advanced Find, but what’s happened to the other 2 records?

So I go to check the data. I had already done this before, but I thought that perhaps I overlooked something, so I checked again. Nope – all of the data is fine/correct. There should indeed be 3 records showing up in the system view, but 2 are missing…

Note: As an aside, I do know that this isn’t permissions related. I’m doing all of this as a systems administrator with full privileges to everything. So it’s not that

OK – next steps:

  • Clear browser cache, reload and see if they’re showing up (useful tip – Control+F5 does this!). Nope, they’re not showing
  • Use Incognito mode, log in and see if they’re showing there. No, they’re still hiding away
  • Use a different browser, with a different system administrator login. Unbelievably they’re still being very shy, and refusing to appear!

Even more confusing about all of this is something truly perplexing. I can open up Advanced Find, select the system view (without doing ANYTHING else) & click ‘Results’. When doing this, all of the records appear! So in the entity view they’re not, but when I use that same system view through Advanced Find, they are!

I’m scratching my head at this. It just doesn’t make sense. I have no idea why this is happening. Reaching out to others, they also don’t seem to have any idea either.

My next step (I’m feeling SO proud of this, and so dev!) was to check the FetchXML. Perhaps there was something underlying in it that’s causing this? Using the FetchXML Builder in XrmToolBox, I loaded both views up, and compared them. It’s crazy – they seem to be exactly the same! (well, some cosmetic differences with where aliases appeared on the line, but this wouldn’t affect it):

At this point, I’m thinking that there are some magic elves under the hood, squirrelling away the data. It has to be the only logical reason for this, right?

The only thing I could find in the FetchXML that might make a difference is that there’s a ‘Distinct’ clause at the top of it in the one that’s working:

Why this would cause the issue, I have NO idea. Views return distinct results in them anyhow, so I’m not sure what this is actually doing here.
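To show roughly what I mean (with the field name changed – the real one is specific to the implementation), the overall shape of the query was along these lines, with the ‘Distinct’ clause sitting on the top-level fetch node:

    <fetch version="1.0" distinct="true" output-format="xml-platform" mapping="logical">
      <entity name="contact">
        <attribute name="fullname" />
        <!-- Condition 1: at least one phone call (Regarding = the contact) with the specific field set -->
        <link-entity name="phonecall" from="regardingobjectid" to="contactid" link-type="inner" alias="pc1">
          <filter>
            <condition attribute="new_specificfield" operator="not-null" />
          </filter>
        </link-entity>
        <!-- Condition 2: at least one phone call (Regarding = the contact) that's still Open -->
        <link-entity name="phonecall" from="regardingobjectid" to="contactid" link-type="inner" alias="pc2">
          <filter>
            <condition attribute="statecode" operator="eq" value="0" />
          </filter>
        </link-entity>
      </entity>
    </fetch>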

Regardless, using FetchXML Builder I updated the code, and WOW – it worked! I’m now returning 3 records in my system view! Absolutely strange, but hey – if it’s working now, who am I to question it…

I’m going to try to raise this through official Microsoft Channels, and see what I might be able to find out from them. However if you’ve come across this (or similar), or have some ideas about how to work around it, I’d LOVE to hear from you!

Omnichannel – Wave 2 2020

Yesterday was an extremely exciting day, for a number of reasons! The main reason (& the purpose of this article) is that the release notes for Wave 2 2020 dropped. This covers Dynamics 365 (https://docs.microsoft.com/en-us/dynamics365-release-plan/2020wave2/) and Power Platform (https://docs.microsoft.com/en-us/power-platform-release-plan/2020wave2/). I’ve been quite eager to see (& talk about) some of the features that will be in it, so here goes!

Now there’s obviously a lot of different stuff in there, covering all of the different first party apps in Dynamics 365, as well as functionality for the different parts of the Power Platform. I’m going to focus on the Omnichannel features, as after all that’s what I (mostly) talk about 🙂

As I’ve done before, I’m going to include the dates that are applicable (at this point in time) for each item.

Agent suggestions for similar cases

Public preview – October 2020. No current date for GA release


Customer service agents will usually use multiple resources to efficiently handle & resolve customer cases. When doing so, the ideal is to provide consistent responses across agents & sessions. Carrying this out can involve knowledge articles, involving other agents, or reviewing active/similar cases.

The aim is therefore to improve this, enabling agents to be more efficient. Items such as being able to identify other cases where similar actions have occurred will result in a much more empowered experience.

The key highlights for this new feature include:

  • AI driven case suggestions based on the context & historical success rate
  • Secondary actions that can be taken, such as collaboration with an expert
  • Continuous improvement of the recommendation model through a comprehensive feedback mechanism

Embedding chats in mobile experience

GA – October 2020

Until now, the only way possible for customers to engage with Omnichannel customer support on mobile has been through a mobile browser. This hasn’t always been the best of experiences.

This feature allows businesses with mobile applications to provide customers the ability to engage with customer service directly within the mobile app. There are two options available for this across iOS & Android devices:

  • By embedding the Omnichannel chat widget into an iFrame, with minimal code customisation. The colour & logo can be set through the Omnichannel Administrator app. This method includes a sample app on AppSource with examples for common scenarios
  • Through the React Native Mobile SDK for Omnichannel for Customer Service. This will allow developers to build fully customised chat widgets in mobile apps

Persistent messaging for chat

GA – October 2020

One of the main challenges around chat is that when a conversation is closed, the history is only accessible via the chat transcript. It’s not easily visible in the chat conversation itself. Asynchronous messaging channels such as WhatsApp don’t have this ‘restriction’, as the chat history is directly available in the session.

If an agent needs to go look up the chat history, it can cause a delay in responding to the customer whilst it’s located, opened & read. This in turn can degrade the customer experience.

In this new feature, admins are able to enable persistent messaging, so that the customer’s previous conversation shows in the actual conversation window. This in turn allows the agent to have the full context of the customer’s previous engagement before responding.

Outbound messaging

Public preview – August 2020. GA October 2020

There are occasions when organisations need to proactively reach out to customers. This could be to notify them of a case update, regulatory information, upcoming appointments, etc. Being able to do this via the customer’s preferred method of communication is important in delivering best-in-class service.

With this feature, organisations will be able to dynamically message their customers based on specific events through supported channels. This includes the following capabilities:

  • Creating message templates that can be adopted for outbound messages
  • Configuring outbound messages based on specific events on any entity, & sending messages when these events are triggered

The customer will now be able to respond back upon receiving one of these messages. This response will be treated like any other incoming conversation in Omnichannel. It will flow through the appropriate routing & work distribution, agent assignment, etc, and the agent will be able to respond back to the customer in real time.

Post-conversation surveys using Forms Pro

Public preview – August 2020. GA – October 2020

Until now, there was no (easy) method for post-conversation surveys to be sent out to customers. Businesses wish to ensure that customer satisfaction is met, and often make use of such things. Custom Power Automate actions could be implemented, but this obviously takes additional time &/or resource to develop.

With this new feature, Omnichannel administrators can configure post-conversation surveys using Forms Pro. These can be presented natively as part of the customer experience to provide feedback. There’s also support for sending offline surveys through one of the various enabled channels.

Real-time language translation of messages

Public preview – August 2020. GA – October 2020


Customers would like to receive support in their native language. Businesses are not always able to do this, as it would require having staff who can communicate in different languages to their customers.

In this feature, real-time translation of messages is carried out between the customer & the support agent. It’s also available for the internal collaboration between agents. It’s enabled as a plug-in that exposes APIs to bring in 3rd party translation services, & also provides a native implementation through Azure Cognitive Services.

This is a feature that I can see being used heavily not only by multi-national organisations, but also by smaller national-based organisations that have an international customer base.

Agent personalisation of quick replies


Agents spend a considerable amount of time engaging with customers. We already have quick responses within Omnichannel to save agents time in typing replies, & to deliver consistent replies.

However these can’t be personalised for individual agents to represent their identity. This new feature allows the pre-configured quick responses to be customised by agents to represent their styles. With this, they can then address communication scenarios in a manner that’s common & personal to them.

Agent personalisation of sound notifications

Public preview – September 2020. GA – October 2020

This feature enables agents to customise sound notifications for incoming conversations, so that they can easily distinguish between different sessions. It will also allow them to differentiate their sessions from others around them in a call centre setting.

Agent suggestions for knowledge

Public preview – September 2020. GA – October 2020

Agents typically use several different resources to efficiently resolve customer cases. These include knowledge articles, collaboration with other agents, and reviewing other cases. Up until now, this has all been done manually.

The new feature for this uses AI to proactively surface relevant knowledge articles, taking case context & previous history into account. It also includes a comprehensive feedback mechanism to continuously improve the recommendation model.

Experience designer for multi-session apps

Public preview – September 2020. GA – October 2020

The experience designer is an out-of-the-box solution that lets organisations create targeted app experiences, to be used by agents & supervisors. Administrators can select specific channels & tools to be used by these profiles, as an alternative to building & maintaining custom apps.

All in all, I think that there are some really helpful additions to the core Omnichannel product, aimed at making the customer & agent experiences more productive. I’m looking forward to getting hands on with these, and writing about them!

Canvas Apps, Collections & Dropdown Fields

This post is based around some recent work that I’ve been doing, which includes canvas apps. For those of you who aren’t familiar with canvas apps, imagine if PowerPoint & Excel had a baby! Though I’m expecting most people who are reading this to already know all about them 🙂

So enough with the waffle, let’s get on with things…let me paint the scenario for you.

The app is aimed to be used by a contact centre. Part of their function is to capture address information. So far this has been done absolutely manually. The issue with this is that data can be typed incorrectly, or in the wrong fields. We’re also needing to enhance the data with geographic-specific information (for reporting purposes). This information isn’t known by either the callers, or by the contact centre agents (for those who are curious, it’s the unique property reference number, which is unique to every address in the UK).

Thankfully, we’ve been given a source from the client which we can look this up against. In essence, we pass a postcode to it, and values are returned (in a JSON format). This includes the data that we’re looking for. Brilliant, so far.
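Purely for illustration (the actual service & field names are client-specific, so the below is made up, but it gives a flavour of what we had to work with), the response was shaped something like this:

    {
      "addresses": [
        {
          "flat": "Flat 2",
          "houseNumber": "10",
          "street": "High Street",
          "town": "Townsville",
          "postcode": "AB1 2CD",
          "uprn": "100012345678"
        }
      ]
    }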

When we got to thinking about things, there are several ways in which we could implement this:

  • Capture the data as we are already doing, & use Power Automate to get the relevant additional information

or

  • Automate this within the canvas app itself, and even give the customer service agents a bespoke address picker!

Deciding to go with the second option (it was a no-brainer, really), we moved ahead with this. We had the details that we needed in order to hit the address lookup API. One of the developers on the team created the Custom Connector, and got it working. We tested it out, and amazingly we got information back!

The next step was to see how we could do this within the canvas app itself. Now I’m going to admit here that although I’ve HEARD great things about Collections, I had never used them myself. In fact not only had I NOT used them before, I had NO idea how they worked! That was to change VERY quickly though…

Within a few hours, I had learned enough about collections to get how they worked, and pull data into them. It was actually really simple – I used the ClearCollect command to create a collection that was fed by the API query, which then loaded the data into a collection table for me to use. I was very impressed!

The code to return the postcode data. We had to do some manipulation due to the API constraints
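In case it helps anyone, the pattern boils down to something like this (the connector & method names are placeholders for our custom connector, & the ‘addresses’ property matches the illustrative JSON shown earlier):

    // Look up the entered postcode & load the results into a collection
    ClearCollect(
        colAddresses,
        AddressLookup.GetAddressesByPostcode(txtPostcode.Text).addresses
    )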

OK – so I had my data in the collection now:

What were my next steps? Well, I was wanting to achieve the following:

  • Give the customer service agents an ‘address picker’ to use. They’d enter the customer postcode, & then be presented with a list of addresses that they could pick the correct one from
  • Automatically populate the customer address fields on the form from the selected address

Well, the first item (the ‘address picker’) was simple enough. Using a dropdown field, I pointed it at the collection data. This worked great, but the dropdown was only allowing me to select a single column from the collection to display. This meant that I could only select ONE column of data to return:

I can only select a single column!

1 column from the collection. OK, I thought – should be simple enough to handle. Let’s go and concatenate column values in the dropdown, to present the interface I’m looking for:

Now that’s more like it! Much easier for the customer service agents to use. OK – onto the next stage. Let’s go & set the fields to point to the collection, match to the value that’s selected in the dropdown, and populate. Should be simple to do, right?

Well…um, no, it’s not simple to do. In fact, it’s actually impossible to do. I was expecting to point to the dropdown selected value, & have the columns returned (from the collection). I could then select which column to use for a specific field. This, however, was not the case:

You have to love the ‘.’ (or ‘dot’) notation used in canvas app code. It shows you what values are available, and saves having to do lots of typing. In this case, however, it also showed me that there was only ONE column of data to select from to display in the field. This was the ‘Result’ column.

This got me very confused. I tried going back to basics, and stripping out the concatenation in the dropdown. Wonderfully I was then presented with all of the different collection columns to use:

So let’s sum up things so far:

  • If I want to present the best option to the customer service agents (using concatenation), I can’t select different parts of the data for auto-population into fields
  • If I want to be able to auto-populate field values from the collection, I can’t use concatenation (& therefore can’t present user-friendly data to the customer service agents).

Note: Leaving aside wanting to show the house number & street, one of the main reasons for wanting to concatenate was to handle buildings that have flats (aka apartments) in them. This is stored in a different column in the collection. It would therefore be difficult to show both of these to the customer service agents

In essence, the behaviour of the dropdown field seemed to be that I couldn’t just change the displayed values without it ‘losing’ connection to the rest of the data. There was no ID that I could use to match on, or display what I wanted to.

This seemed to be a massive Catch-22. I tried various things, but couldn’t see a way out of this. I started to try to create a second collection, & concatenate fields from the first collection. This seemed like a good idea, though (being totally new to it) I got lost. I tried various things; I even ended up managing to collect the entire data from the collection into a new column for EACH ROW!!

Thankfully, the community helped me out, in the forms of Peter Bryant & Clarissa Gillingham (I had posted about my issues on Twitter – the hashtag #poweraddicts is really great!).

With the help provided, I managed to work out the CORRECT syntax to use for the ‘AddColumns’ command. This now being in hand, I was successfully able to create a second collection & add concatenated field values to it:
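For anyone facing the same thing, the working pattern looked roughly like this (column names are illustrative, matching the earlier examples):

    // Build a second collection, adding a concatenated 'display' column
    ClearCollect(
        colAddressesDisplay,
        AddColumns(
            colAddresses,
            "DisplayAddress",
            flat & " " & houseNumber & " " & street & ", " & town
        )
    )

The dropdown’s Items property then points at colAddressesDisplay (with ‘DisplayAddress’ as the displayed value), & because Selected returns the whole record, each form field can pull out whichever column it needs, eg:

    // Default property of the 'Street' text input
    drpAddress.Selected.street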

Now for the moments of truth. Would the dropdown show this new column, & could I point the form fields to auto-populate specific columns?

Not me, but exactly how I was feeling!

The answer….was YES! It was working! I felt SO relieved. Let’s take a peek:

This was brilliant! We’re also populating other data in the background, but that doesn’t need to be visible to the customer service agents.

So in summary, I learned about collections, & how to use them. I also learned about the limitations of dropdown controls (when referencing them from other places), but came up with a way around it. Finally I achieved the result that I was aiming for. Very pleasing all round!

Have you come across something like this in an implementation? How did you manage to handle it (if you did)? Drop a comment below – I’d love to hear all about it!

MB-400 Power Apps & Dynamics 365 Developer Exam

I haven’t usually been putting up posts around the exams that I take. A few months back I did decide to write one on the MB-600 exam (MB-600 Solution Architect Exam), which just took off! It was quite amazing (& pleasing) how many people were looking at it, & asking me questions around the exam.

As a result, I’ve decided to continue this, and am therefore now writing this post on the MB-400 exam.

There are several different ‘ranges’ of exams within the Dynamics 365/Power Platform space. These are aimed at different types of roles, or specific specialisation/s within a role. A good example of this is the MB-2xx range. It covers functional technology, and is split across the different ‘main’ areas of Dynamics 365.

The MB-400 (the only one in the range at the moment) is aimed at developers. According to the official description for the exam:

Candidates for this exam are Developers who work with Microsoft Power Apps model-driven apps in Dynamics 365 to design, develop, secure, and extend a Dynamics 365 implementation. Candidates implement components of a solution that include application enhancements, custom user experience, system integrations, data conversions, custom process automation, and custom visualizations.

Candidates must have strong applied knowledge of Power Apps model-driven apps in Dynamics 365, including in-depth understanding of customization, configuration, integration, and extensibility, as well as boundaries and constraints. Candidates should have a basic understanding of DevOps practices for Power Apps model-driven apps in Dynamics 365. Candidates must expose, store, and report on data.

Candidates should have development experience that includes JavaScript, TypeScript, C#, HTML, .NET, Microsoft Azure, Office 365, RESTful Web Services, ASP.NET, and Power BI.

As anyone who knows me will attest, I am NOT a developer. However I decided (for several reasons) to give this one a go, and see what would happen! I knew I’d be pushing myself out of my comfort zone, there would be things I wouldn’t understand/know at ALL, but hey – I was curious to see what would happen! Even more challenging, I decided to book & take it within a 24 hour period!

Now as this has been out for a little while (& isn’t in Beta), there’s thankfully some good resources on Microsoft Learn about it. Take a look at https://docs.microsoft.com/en-us/learn/certifications/exams/mb-400, where there are several learning paths that can be followed.

A big shout out as well to Julian Sharp & Joe Griffin who recently ran a multi-week course around it. The official Microsoft learning paths are great of course, but seem to miss out quite a bit of what’s actually needed to be known for this. The course that they ran covered a lot more. Hopefully there will be more courses like this run in the future!

When passing it (& assuming that you’ve passed the MB-200 as well), you get a lovely shiny badge!

Microsoft Certified: Power Apps + Dynamics 365 Developer Associate
I’m SO proud of this!

Once again, I sat the exam through the proctored option (ie from home). The experience went somewhat better than previous times. Amusingly I got told off by the proctor during the exam for ‘looking down at the keyboard’, rather than looking at the screen! I explained that I was using a different computer, & kept clicking the wrong mouse button on it (leaving aside that I was exhausted when doing it!).

So, as before, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!).

  • Model driven apps:
    • User experience
    • Show/hide fields
    • Change field labels
  • Canvas apps – functionality, online/offline capabilities, field types (including searching/filtering data)
  • Plugin debugging
  • Configuring security for system connections (security types)
  • D365 Web API – how it’s used, types of calls made from/to it
  • Azure API – making calls to/from it
  • Code for importing data (debugging, variables)
  • Advanced Find
  • Types of calls (synchronous, asynchronous, etc)
  • Data modelling
  • Creating & deploying solutions through different methods
  • Publisher versioning
  • Identifying code variables, and saying what would happen in given scenarios
  • Power Apps Component Framework (PCF) – how to use, how to package components, how to deploy
  • PCF components & classes
  • JavaScript – code examples, what happens when a given scenario happens
  • JavaScript functions
  • Dynamics 365 Ribbon – what it is, what you can do with it, different types of functionality & ways to do things with it
  • Security & Permissions, including roles, teams, field level security, business units
  • Workflows, Power Automate Flows (how they’re set up, different functionality within them, how to do things with them given a specific scenario)
  • Business Rules (what they can/can’t do, different scopes, etc)
  • Field types (eg option-sets, calculated fields, roll-up fields, multi-select, etc)
  • Importing solutions – requirements for this, versioning, deployment between environments
  • Compatibility with Microsoft Teams

Now many of these (as I said above) are outside of my comfort zone. In fact, I’d say that even with absolutely cramming for a whole day for the exam, I still felt that I was guessing the answer for at least 30% of the questions. Admittedly though, as Julian Sharp says, a ‘gut feeling’ answer is usually right most of the time, coming from what the subconscious has absorbed during revision.

I was REALLY happy that I got a passing mark for this, & admittedly was VERY relieved as well. So now another lovely shiny badge in my collection, and I’m now going to go and update it on LinkedIn as well!

If you have any questions on this, feel free to drop them below, and I’ll try to help out as best as I can!