Omnichannel Admin Center (Part II)

We started off looking at the new Omnichannel Admin Center in Part I. I’m going to continue going through the wonderful new app (interface?), showcasing the functionality that’s different (there’s no point in me mentioning things that are the same, right?).

So having taken a look at the general overview, let’s start delving deeper into how it really is better!

Queues

Queues are really the backbone of Omnichannel. Customer interactions come through to a queue, where agents can then pick them up & respond. Without a queue, nothing would ever happen!

In the new interface, the functionality around queues has been extended. This is what the new interface looks like overall:

You’ll note that the default queues aren’t showing up in here. I’m not quite sure why that is, but am looking into it, and will post about it when I find out the reason behind it.

Opening up a queue record gives us the following:

I’m loving the cleanliness of the new layout – it’s something I’m probably going to keep saying! The new UI is just so much nicer on the eye, in my opinion. We have the information laid out well.

New users can be added from the ‘Add Users’ button at the top right, which is a pretty standard interface (ie adding new/existing records into a subgrid on a form).

But there are several new features here that weren’t present through the old interface. The first to talk about is the ability to set Operation Hours (the block at the bottom of the screenshot above). It’s great to see the prompt that if no operating hours are set, it’ll default to 24/7 operation.

Previously, it was a slight pain (ie clicking around a lot!) to get these to be associated. Now all we need to do is click the ‘Set Operation Hours’ button at the bottom of the page, and we can then add an existing record for this, or set up a new one:

Choosing an existing record will also give us the option to modify the settings for it:

One of the really nice things about this is the Assignment Method, which shows how work items will be prioritised. It’s possible to change this, as well as create a NEW assignment method:

So there are quite a few additional functionality options available from the initial interface, rather than needing to click around. I’m liking it!
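Before moving on, it’s worth being clear on what an assignment method actually decides: which agent a new work item in the queue gets handed to (eg round robin, or highest capacity). Purely as a mental model – this is NOT Omnichannel’s actual implementation – a round-robin picker over a queue’s agents looks something like this:

```typescript
// Toy round-robin assignment, for illustration only.
interface Agent {
  id: string;
  name: string;
}

class RoundRobinQueue {
  private nextIndex = 0;

  // Assumes at least one agent is present in the queue.
  constructor(private agents: Agent[]) {}

  // Each new work item goes to the next agent in the rotation.
  assignNext(): Agent {
    const agent = this.agents[this.nextIndex];
    this.nextIndex = (this.nextIndex + 1) % this.agents.length;
    return agent;
  }
}

const queue = new RoundRobinQueue([
  { id: "1", name: "Priya" },
  { id: "2", name: "Sam" },
]);
console.log(queue.assignNext().name); // Priya
console.log(queue.assignNext().name); // Sam
console.log(queue.assignNext().name); // Priya again
```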

Workstreams

Just as with Queues, the Workstreams interface has been streamlined as well. One of the important things to note is that workstreams will need to be migrated over from the old interface to the new interface (I guess that there’s something happening behind the scenes?). I’m going to cover how to do this in a future post (stay tuned!), but let’s take a look at the functionality in the new interface:

Clicking into a workstream record gives us the following information:

That’s already MUCH better laid out than the previous way, I think!

So let’s see what we have here. Well firstly, we’re able to move between the channels that are associated to the workstream. This is really helpful, as it can allow us to flip quickly backwards & forwards, and see the relevant information for each channel. We’re able to directly edit each individual channel just by clicking on it (loving the ‘fly out’ side screens for this!), and change the behaviour of it:

The ability to do all of this so quickly is just wonderful, rather than needing to have a concrete understanding of the (complex) relationship structures within the system, and clicking around.

It’s also possible to add a new channel directly from this screen, which will easily walk (admin) users through setting up a new channel as needed:

Moving down the options available, we’re able to set routing rules, as well as work classifications. I’m going to talk about this in a separate post, but there are some really interesting new capabilities here!

Looking at the Work Distribution information, we’re also able to view more information around this, as well as modify some of the settings available. Again, this comes in as a ‘fly out’ style window:

One of the neat pieces of functionality that has been slipped in is the ‘Keep same agent for entire conversation’ option. This means that if the customer interaction drops for some reason & they come back, the system can look for the same agent that they were chatting with previously (if it’s been set up this way).

Finally, we then have the ‘Advanced Settings’ tab, which gives us information around sessions, notifications, context variables, smart-assist bots, and quick replies. All of these are able to be viewed & configured directly from within the workstream, rather than needing to jump around different parts of the Omnichannel system, & then associating them together:

So to wrap up here (don’t worry, more to come shortly!), the new interface really enables admins to quickly & easily create the necessary setup. It avoids having to click around different parts of the system. Omnichannel is complex enough as it is, and being able to do the setup from one screen makes life a LOT easier overall in getting the initial setup in place!

What are your thoughts on the new app? Have you used it yet? Have you found that it’s saving you time/effort? Drop a comment below – I’d love to hear!

Omnichannel Admin Center (Part I)

So there’s a new kid on the block. Or rather, it’s probably more accurate to say that there’s a new app available in Dynamics 365! This is the ‘Omnichannel Admin Center’ app that’s now present for anyone who currently has Omnichannel installed in their environment, or who is creating a new installation of Omnichannel.

So, what is this all about then?

Well, let’s back up a step here. Previously, to set up Omnichannel, users had to go into the Dynamics 365 Settings, find the Omnichannel App, start the setup of it, and then go ahead & manually configure everything in the Omnichannel Administration app.

This, to be frank, took quite a bit of time to do, and needed users to be very familiar with the different parts of the interface. I’ve previously covered the (multiple) steps needed to do all of this in various blog posts, to help users understand what actually needs to be done.

Thankfully, Microsoft realised the complexity around this, and have come out with a simplified administration experience. I’m very much in support of this, as it reduces the complexity of getting things started for Omnichannel in the first instance!

So let’s go ahead & take a look at this new app.

The first thing to notice when opening the new Omnichannel Admin Center app is the interface itself. I think that this is really nice – rather than a ‘typical’ model-driven app experience, users are able to see some useful information on the home page itself!

Also, very nicely done in my opinion, are the three links at the bottom of the page:

  • Release Notes. This takes users to the release notes section on the Microsoft Docs website. It’s a great little thing that can help users understand the latest/greatest features that are being released.
  • Ideas forum. People come up with great ideas to suggest to Microsoft to be able to include in their products. The Ideas forum is the location for these, where users can upvote popular concepts, or submit their own ideas. The Microsoft engineering teams do actually keep an eye on this!
  • Support community. The community forums are really helpful in allowing users to raise questions around the products, and give the ability for other users to help them out by giving answers etc. Most users will have already experienced the support forums in one way or another, but having a link directly to it is definitely quite useful to have.

Now one thing that’s usually asked is ‘how can we quickly/easily see & set up chat in Omnichannel’? It’s one of the first things asked, as people tend to want to deploy (web)chat capabilities first, and then add other capabilities later on. Setting this up manually does take several steps, along with some waiting time (or, as I like to refer to it, a coffee/snack break!)

It’s possible to quickly launch this through the button at the top of the page, rather than needing to go through the multiple configuration steps manually:

Click the button to launch it, and you’ll see the following window come up:

Clicking ‘Open chat demo’ will allow the system to start automatically configuring it for you – no more need for manual steps! You’re also able to use sample data if you wish, to show the experience without needing to load data in manually.

Yes, this really does only take a minute or two to happen!

Once the system has auto-configured everything, you’re now able to go ahead & launch the demo. Again, all the links & information are presented easily to us, telling/showing us what we need to do.

You’ll notice the chat widget in the lower right hand corner, which I’ve outlined in the image above. This launches into the chat widget directly, rather than needing to deploy it first to a webpage:
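For comparison, deploying the widget to your own webpage afterwards normally just means pasting in the embed snippet that Omnichannel generates on the chat widget record. As a rough sketch of what that amounts to (the script URL and data attribute values below are placeholders – always copy the exact snippet from your own chat widget record):

```typescript
// Sketch: injecting the Omnichannel chat widget into a webpage.
// The src URL and data-* values are placeholders; use the exact embed
// snippet generated on your chat widget record.
function addChatWidget(): void {
  const script = document.createElement("script");
  script.id = "Microsoft_Omnichannel_LCWidget";
  script.src = "https://<widget-cdn>/livechatwidget/scripts/LiveChatBootstrapper.js";
  script.setAttribute("data-app-id", "<your-app-id>");
  script.setAttribute("data-org-id", "<your-org-id>");
  script.setAttribute("data-org-url", "https://<your-org>.crm.dynamics.com");
  document.body.appendChild(script);
}

addChatWidget();
```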

There’s no need to get into the setup of workstreams, queues, channels, routing capabilities, etc. It’s all configured for you, to get you started immediately!

Of course, to test it out fully you’ll also need someone logged in as an Omnichannel Agent, to be able to respond to the chat instance. This could be the same user (in a different tab/browser on the same machine), or a different user on another machine. It’s really up to you as to how you would like to go about it.

So this is a really great feature to be able to have now. It’s not the ONLY great thing about the new app, however – stay tuned for Part II next week when I’ll go into more capabilities that it provides!

Environments & ‘Admin Mode’

With some recent events happening (both professional & personal), I’ve taken a slight step back from putting out posts on here. Thankfully things seem to be settling down, so I’m getting (back) into the swing of things!

I thought that it would be good to talk about a subject that I fell ‘foul’ of recently. This is around environments, and more specifically, the ‘admin mode’ that it’s possible to use on them.

So what exactly is this ‘admin mode’? Well, the aim of it is to restrict access to certain users, namely System Administrators & System Customisers. Why would we want to do this? There are several scenarios that come to mind:

  • Performing a system upgrade (such as enabling new features)
  • Changing environment type (eg Production to Sandbox, or vice-versa)
  • Restoring an environment

Essentially, any time we have operation-type work that we’re wanting to carry out. This way whatever we’re doing won’t affect users, and anything that the users are doing won’t affect things either (symbiotic relationship there!).

So as an example, if we’re doing a major release, which changes functionality within a system, we wouldn’t want users in the system carrying out their usual work, as this could cause data issues if they save records during the actual release. We of course SHOULD be communicating to users that a release is going to take place, and that they shouldn’t be in the system at the time, but ‘admin mode’ is how we can truly enforce it.

Something to bear in mind as well is that if you’re going ahead & restoring an environment to a previous state (whether that’s an automatic save point, or a manual one), it will automatically put the environment into ‘admin mode’ once the restore has been completed. This is very important to keep in mind!

There are three settings around administration mode:

  1. ‘Administration Mode’. This sets whether admin mode is on or off!
  2. ‘Background Operations’. This sets whether background processes, such as workflows, Power Automate flows, and Exchange synchronisation, are enabled (allowed to happen) or disabled (stopped from happening).
  3. ‘Custom Message’. This allows you to set a custom message that users (who are not a system administrator/system customiser) will see when they attempt to access the environment.
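To make those three settings concrete, here’s a small sketch modelling how they interact (purely illustrative – this isn’t how the platform implements it internally):

```typescript
// Illustrative model of the three administration mode settings.
interface EnvironmentSettings {
  administrationMode: boolean;   // 1. admin mode on/off
  backgroundOperations: boolean; // 2. workflows, flows, Exchange sync
  customMessage: string;         // 3. shown to blocked users
}

type Role = "SystemAdministrator" | "SystemCustomiser" | "Other";

function attemptAccess(settings: EnvironmentSettings, role: Role): string {
  if (!settings.administrationMode) return "Access granted";
  // Only admins & customisers get in whilst admin mode is on.
  if (role === "SystemAdministrator" || role === "SystemCustomiser") {
    return "Access granted";
  }
  return settings.customMessage || "This environment is in administration mode";
}

// A restored environment defaults to admin mode ON with background
// operations DISABLED - which is exactly what tripped me up below!
const restoredEnvironment: EnvironmentSettings = {
  administrationMode: true,
  backgroundOperations: false, // flows will NOT run until re-enabled
  customMessage: "Maintenance in progress - please try again later",
};

console.log(attemptAccess(restoredEnvironment, "Other")); // custom message
```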

So this is the scenario that tripped me up a few weeks back:

  • I was needing to restore an environment to an earlier save point (to be clear, this was NOT a production environment)
  • I went ahead with the restore, and it completed successfully
  • As I was doing this at night, one of my children woke up, and I had to deal with them
  • I came back to things, saw that the restore had completed, and then went ahead with the release that I needed to do

All seemed to go well. However, when users were testing (which admittedly was a few days later), they reported that some functionality wasn’t working. This was strange, as it had been working before the release (& the release that I did hadn’t actually touched it!).

It turned out to be Power Automate flows that just didn’t seem to be running. OK – I started to look into them, but couldn’t figure out why they hadn’t run.

Creating a test Power Automate flow didn’t seem to work either – despite running it to test it, the trigger never activated! I was quite puzzled by this, and couldn’t (initially) work out the reason.

Then I thought to check the environment settings! Lo & behold, the environment was STILL in administration mode, and the Background Operations option was disabled! Aha – I’ve found the source!

Flipping this out of administration mode thankfully then allowed all Power Automate flows to work/run, and users confirmed that functionality was indeed running as expected. As you can imagine, I was quite relieved!


Something that I hadn’t realised previously is that if you manually put an environment into administration mode, it doesn’t automatically disable background processes. However, if you restore an environment, it DOES disable background processes by default. So if you’re wanting to try out automation items within a restored environment that’s still in administration mode, you’re going to need to ensure that you re-enable the Background Operations setting to allow it to work!

One further thing to note (which I’ve already been asked about by some people, so thought that I would mention it here). I’ve mentioned above that users were in the system, but reporting that things weren’t working. Now given that the environment was in administration mode, people have asked how users could be in it! The answer is that these users actually had the system customiser role applied to them, which is why they could get in! If they hadn’t had the role, then perhaps I might have realised things a little sooner (ie that the environment was in administration mode).

So a (good) little lesson learned, and I’ll definitely take it forwards. Has this, or anything else like it, ever tripped you up? Drop a comment below – I’d love to hear!

Working with the Opportunity Close table

I’ve recently had the experience of working with the Opportunity Close functionality within Dynamics 365, and given what occurred, thought it would be useful to document this so that others are able to see this as well. There are many scenarios in which we’d use this, and being able to give a comprehensive solution to clients does make all of the difference!

There are four areas that I’d like to cover:

  • Working with the Opportunity Close table
  • Challenges with data
  • Power Automate to the rescue!
  • Caveats

So let’s get started then!

Thanks to various members of the community such as Matt Collins-Jones, Andrew Bibby & others, who helped me along the way.

Working with Opportunity Close

The Opportunity Close functionality within Dynamics 365 (& yes, I’m going to refer to it as this, rather than Power Platform) is used to provide information around why an opportunity is being closed. This is regardless of whether the opportunity has been won, or it’s been lost. It’s still quite important to track the information around it, so that companies can understand better how the market views the products it offers, how it stacks up against others, etc.

The default path in the system is to create a lead, and then qualify it. Qualifying a lead then automatically creates an opportunity record, which further information (quotes, etc) can be entered against. An account record (if company information is specified) is also created:


On the opportunity record, users are able to show if it’s been won or lost by clicking an appropriate button on the toolbar:

Doing this brings up the Opportunity Close pane on the right hand side of the screen:

Now it’s possible to customise this screen. In fact, the screenshot above shows 3 custom columns that have been added to it already in the system I was in.

To do this, we go to customise the solution (in the Maker Experience), and add the column/s that we want to:

Next, we need to remember to add it to the form! Otherwise it’s not going to show up. If we’re wanting it to appear on the side bar, then it’s important to customise the ‘Quick Create’ form version, to make our customisations show up.

Note: We’re able to put conditional visibility of the column/s if we want to, based on whether the opportunity is won or lost, using Business Rules. I haven’t done so in this scenario, but you’re obviously able to do so if you want to

Remember to save & publish the form, and then it’ll display within the system for users. Brilliant!

Challenges with data

So we’ve gone ahead & created the custom columns, and users are actually using them to record data. Wonderful – that’s exactly what we’ve been wanting to achieve.

OK – let’s now review the data so that we can see overall what’s happened with our opportunities. Of course we’re wanting to do this simply & easily, so we’ll open an Advanced Find window, go to the Opportunity Close table, add columns from the associated Opportunity, and….hold on. Opportunity Close ISN’T displaying in the Advanced Find????

It’s just NOT there. In case you’re wondering if you saved/published things correctly, or forgot some system setting, stop worrying. It’s not you – it’s the system.

See, Opportunity Close, though a table in its own right, is a SPECIAL sort of table. It doesn’t show up, and can’t be directly queried. I know – frustrating. I felt exactly the same way.

On digging deeper into things, I found out that there’s actually an activity record saved. It’s possible to query against this:

However, and this is the BIG catch, it’s NOT possible to return custom columns when carrying out this query. The search will ONLY return the (system) columns that are present for activities. So this leaves us with a problem.
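To make that concrete, here’s a sketch of the sort of query you could run against the underlying activity record via the Dataverse Web API (the entity set & column names follow standard Dataverse conventions, but verify them against your own environment’s metadata):

```typescript
// Sketch: querying Opportunity Close data via the underlying activity record.
// Entity set / column names are assumed from standard Dataverse conventions.
const orgUrl = "https://<your-org>.crm.dynamics.com"; // placeholder

async function getOpportunityCloseActivities(accessToken: string) {
  const query =
    "/api/data/v9.2/activitypointers" +
    "?$select=subject,description,actualend" +
    "&$filter=activitytypecode eq 'opportunityclose'";

  const response = await fetch(orgUrl + query, {
    headers: {
      Authorization: `Bearer ${accessToken}`,
      Accept: "application/json",
      "OData-MaxVersion": "4.0",
      "OData-Version": "4.0",
    },
  });
  const data = await response.json();
  // Only the standard activity columns come back this way - the custom
  // columns added to Opportunity Close are NOT available here.
  return data.value;
}
```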

Essentially, though we can set up custom columns to track the data that we’re needing to, it’s not possible (through the front end) to query it. This sort of negates what we’re trying to achieve here overall, and is a pain.

So what’s the way round it? Well, it’s actually going to be Power Automate!

Power Automate to the rescue

In order to handle our issue, what we need to do is the following:

  • Add custom columns to the Opportunity table (these should mimic the custom columns that we’ve added to the Opportunity Close table)
  • Use Power Automate for automation purposes!

The first step is easy. We need to go & create custom columns on the Opportunity table. These WILL show up in the Advanced Find search. They obviously need to be the same as the custom columns on the Opportunity Close table. If we’ve used Choice or Choices there, point the Opportunity column to the same source (it’s a good argument for using Global, rather than Local, choice/s).

We can then go and create a Power Automate flow. This should trigger when an Opportunity Close record is created.

Note: For this, I’ve made it so that it runs under the user triggering the action, rather than a system account. This is to keep in line with licensing limits etc

You’ll then need to add a ‘Get Dataverse row’ step, and get the Opportunity Close record that has just been created. This is annoying, but for some strange reason the trigger doesn’t present the custom columns/values in the JSON that it returns. Hopefully Microsoft fixes this at some point, but for the moment, we need to work around it.

The last step is to add an ‘Update Dataverse row’ step. This should point to the Opportunity table, & we can simply map the values across (from the SECOND step, NOT the first one – VERY IMPORTANT).

Once this is all done, save & test it, and you should see it working. I generally don’t add the Opportunity custom columns to the form, but rather leave them for querying against.
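If it helps to see the shape of the logic, here’s the flow expressed as code (a sketch only – the ‘new_’ column names and the thin DataverseApi wrapper are placeholders I’ve made up for illustration):

```typescript
// Sketch of the flow's logic: when an Opportunity Close record is created,
// re-fetch it (the trigger payload omits the custom columns), then copy the
// values onto the parent Opportunity. All "new_" names are placeholders.
interface DataverseApi {
  getRow(entitySet: string, id: string, columns: string[]): Promise<any>;
  updateRow(entitySet: string, id: string, values: object): Promise<void>;
}

async function onOpportunityCloseCreated(
  api: DataverseApi, // assumed thin wrapper over the Dataverse Web API
  opportunityCloseId: string,
  opportunityId: string
): Promise<void> {
  // Step 2: 'Get Dataverse row' - the full record, custom columns included.
  const close = await api.getRow("opportunitycloses", opportunityCloseId, [
    "new_closereason",
    "new_competitorname",
  ]);

  // Step 3: 'Update Dataverse row' - map the values across from the
  // SECOND step (the re-fetched record), NOT from the trigger output.
  await api.updateRow("opportunities", opportunityId, {
    new_closereason: close.new_closereason,
    new_competitorname: close.new_competitorname,
  });
}
```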

Caveats

It’s important to keep in mind that when an opportunity is marked as either won or lost, it’s then closed, and changed to a read-only state. That’s how the system is designed to be, and makes sense.

However it’s ALSO possible to re-activate a closed opportunity, and then close it again. Ie a single Opportunity record could have multiple Opportunity Close records against it. This solution won’t handle this (it would need to be built out further) – the Opportunity record itself will only show the values from the latest Opportunity Close action, so please do keep this in mind!

Have you ever come up against something like this? How have you handled it? I’d love to hear – please drop a comment!

Canvas Apps & Power Automates

So it’s been a busy few weeks here, which is why I haven’t really been putting up any articles. March/April is always a busy time for our family with stuff going on, and this year I decided not to push myself to get articles out, as otherwise I’d be running very low on sleep!

That being said, I’ve still had some great ideas about things that I’d like to share, and have been keeping a series of short notes for me to pick up. Today’s topic is one of them, which I think has been a major pain to anyone involved in canvas app development!

So, the back story to this is that we’re able to use Power Automate flows together with canvas apps. What I mean by this is that we’re able to directly trigger them from within the canvas app, rather than needing to do something like edit or create a record, and then have the Power Automate flow trigger from the record creation or modification.

There’s a specific Power Apps trigger that’s available within Power Automate exactly for this purpose:

When clicked, it gives us the trigger line in the steps as follows:

So what we’d do is within the canvas app, we would bind a button (or another control) that when selected, it would then go away & trigger the Power Automate flow. Great – so many different things that we can get to happen! One of the benefits of doing things like this is that we can then pass information from the Power Automate flow back to the canvas app directly:

This can then mean that the user can know, within the canvas app itself, that the Power Automate flow has run, and use data (or other things) that have come out of it.
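Canvas apps call these flows through Power Fx rather than through code, but the round trip itself is easy to picture. As a stand-in illustration – using a flow exposed via the ‘When an HTTP request is received’ trigger, which is a different trigger type, purely to show the request/response shape – it boils down to this:

```typescript
// Stand-in illustration of the canvas app <-> flow round trip, using an
// HTTP-triggered flow. The URL is a placeholder for the flow's endpoint.
const flowUrl =
  "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?<signature>";

async function runFlow(customerName: string): Promise<string> {
  const response = await fetch(flowUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: customerName }), // input passed to the flow
  });
  const result = await response.json();
  return result.message; // value the flow sends back in its response step
}

runFlow("Contoso").then((message) => console.log(message));
```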

OK – all good so far.

The main issue to date has been with deploying canvas apps together with Power Automate flows. See, as per best practice, we would create a solution, place the canvas app, flows, and anything else that’s necessary for it to work within it, and then deploy the solution to our target environment/s. And that’s where things just…didn’t go quite right.

Obviously within the development environment, the canvas app would be hooked up to the flows, and everything would work. Clicking the button would cause the flow to run, etc. User authentication would be in place (along with licenses of course!), and it was just fine.

But when deploying a solution containing canvas apps and associated flows between environments (regardless of whether it’s been deployed manually, or automated using a tool such as Azure DevOps), the connections to the flows would be broken. Ie, the canvas app would run, but the flows wouldn’t trigger. Looking at the connections in the canvas app within Studio would show something like the following:

All of the connections to Power Automate flows would show as ‘Not connected’. It’s not even possible to click the ellipsis next to them and re-connect them – the only option available is to remove it from the canvas app!

So in order to get things working again, we’d need to do the following steps:

  • Open up the canvas app
  • Remove all connections to Power Automate flows
  • Add a temporary button, set it to be a Power Automate trigger
  • Click through all of the Power Automate flows that need to be connected (waiting for each one to connect, then moving to the next one)
  • Remove the temporary button
  • Save and publish the solution

This, in a nutshell, has been a (major) headache. For example, I’ve been working with a solution that has over 30 Power Automate flows that can be triggered from the canvas app (lots of different functionality!). Each deployment has needed the above process to be carried out, which has usually added on at least an hour to the deployment process!

Now, this hasn’t been something that’s been unknown. In fact, the official Microsoft documentation noted the following:

So this is something that Microsoft has been well aware of, but it’s been a pain point that we’ve had to work with.

However, this has now ALL changed, which I (and MANY others) are really pleased about!

Microsoft rolled out an update last month that means that canvas app connections to Power Automate flows will NOT break when they’re deployed across environments! This is such a massive time-saver that I’m now trying to work out what to do with all of my free time! Only kidding…more project work will commence!

So what we can now do is take our solution, deploy it across the different environment/s that we need to get it out to (whether manually, or automated using tools such as Azure DevOps), publish the solution, and then everything works! Amazing!!

One small caveat though – to ensure that this works, you will need to go into the app, and re-publish it on the latest Power Apps version. This should of course be done in a development environment, and then can be exported and deployed as required.

Microsoft have also updated their documentation at https://docs.microsoft.com/en-us/powerapps/maker/data-platform/solutions-overview to remove the limitation text shown above. It’s a good place to keep an eye on changes that occur over time too.

This is definitely a welcome piece of development, and I know that we’ve been eagerly waiting for this for a while, and now it’s here!

PL-600: Microsoft Power Platform Solution Architect

Well, it’s FINALLY here. And by finally, I guess I’m saying that I’ve been waiting for this for a while? The PL-600 exam is the new ‘Holy Grail’ for Dynamics 365/Power Platform people, being the Solution Architect (3 star) exam. Ten minutes after it went live, I booked to take it, and four hours after it went live I sat it! (I would have taken it sooner, but had to have supper first, get the kids to bed, etc…)

The first solution architect exam that Microsoft has done in this space has been the MB-600 (see my exam experience write-up on it at MB-600 Solution Architect Exam). However, with the somewhat recent shift towards certifications for the wider Power Platform, it was inevitable that this exam would change as well.

Interestingly enough, the MB-600 now counts towards some of the Microsoft Partner qualifications. I’d expect that when it retires (currently planned for June 2021), the PL-600 will take the place of it in the required certifications to have.

So, how to discuss it? Well, the obvious first start is to link to the official Microsoft page for it, which is at https://docs.microsoft.com/en-us/learn/certifications/power-platform-solution-architect-expert/. According to the specification for it:

Microsoft Power Platform solution architects lead successful implementations and focus on how solutions address the broader business and technical needs of organizations.
A solution architect has functional and technical knowledge of the Power Platform, Dynamics 365 customer engagement apps, related Microsoft cloud solutions, and other third-party technologies. A solution architect applies knowledge and experience throughout an engagement. The solution architect performs proactive and preventative work to increase the value of the customer’s investment and promote organizational health. This role requires the ability to identify opportunities to solve business problems.
Solution architects have experience across functional and technical disciplines of the Power Platform. Solution architects should be able to facilitate design decisions across development, configuration, integration, infrastructure, security, availability, storage, and change management. This role balances a project’s business needs while meeting functional and non-functional requirements.

So it hasn’t really changed that much from the MB-600, though obviously there’s now an expectation for solutions to bring in other parts of the Power Platform, as well as dipping into Azure offerings. Pretty much par for the course, in my experience, with how recent projects that I’ve been on have been implemented.

At the time of writing, there are no official Microsoft Learning paths available to use to study. I do expect this to change in the near future, and will update this article when they’re out. However the objectives/sub-objectives are available to view from the main exam page, and I’d highly recommend going ahead & taking a good look at these.

Passing the exam (along with having either the PL-200 Microsoft Power Platform Functional Consultant or PL-400: Microsoft Power Platform Developer Exam qualifications as well) will result in a lovely (new) shiny badge. Oh, we do so love those three stars on it!

As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) I’ve tried to group things together as best as possible for the different subject areas.

Overall, I had 47 questions, which is around the usual amount that I’ve experienced in my exams over the last year or so. What was slightly unusual was that instead of two case studies, I got three of them! (note that your own experience may well vary from mine).

Some of the naming conventions weren’t updated to the latest methods, which I would have expected. I still had a few references to ‘entities’ and ‘fields’ come up, though for the most part ‘tables’ and ‘columns’ were used. I guess it’s a matter of time to get everything up to speed with it.

  • Environments
    • Region locations, handling scenarios with multiple countries
    • Analytics
    • Data migrations
  • Requirement Gathering
    • Functional
    • Non-functional
  • Data structure
    • Tables
      • Types of tables
        • Standard vs custom functionality
        • Virtual tables. What these are, when they would be used, limitations to them
        • Activity types
      • Table relationships & behaviours
      • Types of columns, what each one is suited for
      • Business rules. What they are, how they can be used
      • Business process flows. What they are, how they can be used
  • App types (differences between them, scenarios each one is best suited for)
    • Model
    • Canvas
    • Portal
  • Model-driven apps
    • Form controls (standard vs custom)
    • Form layout (standard functionality vs custom functionality)
    • Formatting inputs
    • Restricting inputs
  • Automation
    • Power Automate flows. What they are, how they can be used, restrictions with them
    • Azure Logic Apps. What they are, how they can be used, restrictions with them
    • Power Virtual Agents
  • Communication channels
    • Self service abilities through Power Virtual Agent chatbots. How this works, when you’d use them, limitations that exist
    • Live agent abilities through Omnichannel. How this is implemented, how customers can connect to a live agent (directly, as well as through chatbots)
    • Teams. When this can be used, how other platform abilities can be used through it
  • Integration
    • Integration tools
    • Power Platform systems
    • Azure systems
    • Third party systems
    • Reporting across data held in different systems
    • Dynamics 365 API
  • Reporting
    • Power BI. What it is, how it’s used, how it’s configured, limitations with it, how to share information with other users
    • Interactive Dashboards. What these are, how these are set up and used, limitations to them
  • Troubleshooting
    • Canvas app issues
    • Model driven app issues
    • Data migration
  • Security
    • Data Protection. What is it, where it’s set up, how it’s used across different requirements in the platform
    • Types of users (interactive/non-interactive)
    • Azure Active Directory, and the role/s it can play, different types of AAD authentication
    • Power Platform security roles
    • Power Platform security teams, types
    • Portal security
    • Restricting who can view forms
    • Field level security
    • Hierarchy abilities
    • Auditing abilities and controls

Wow. It’s a lot of stuff. Not that I’m surprised by that, as essentially it’s the sort of thing that I was expecting (being familiar with the MB-600). I think that on a ‘day to day’ basis, I cover most of these items already, so didn’t have to do a massive amount of revision for items that I wasn’t familiar with.

From my experience in taking it, I’d say that around 30% of the questions seemed to be focused on Dynamics 365, with 70% being focused on Power Platform capabilities. It’s about what I thought it would be when the exam was first announced. Obviously some people are more Dynamics 365 focused, and others are more Power Platform focused, but the aim of the exam (& qualification) is to really understand the breadth of the offerings available.

I can’t tell you if I’ve passed it or not…YET! Results aren’t going to be out for several months, based on previous experience with Beta exams, but I’ve got a good feeling about this.

So, if you’re aiming to take it – I wish you the very best of luck, and let me know your experience!

App Profile Manager

When going through a backlog of various items, I suddenly realised that although the App Profile Manager was released in September 2020, I hadn’t devoted any space to it! So I’ve therefore decided (finally) to do an article to cover it.

First of all, what exactly is the App Profile Manager? Well, it’s a (somewhat) new feature that never existed beforehand. Essentially, the Omnichannel Agent App window has a number of configurable items, such as tabs to load at start-up, etc. Trying to work out where the configuration for each item is can, at times, be slightly frustrating, and I (for one) can’t always remember it correctly! But there’s also more, as I’ll go into below…

So, enter the App Profile Manager. At the moment, it’s only able to be used for two specific standard apps. These are the Customer Service Workspace app, and the Omnichannel for Customer Service app. In the future this may open up some more, but we’re limited to these for the moment.

So what does it do? Well, it’s there to enable system administrators to add configurations to an app. Essentially, it’s focused on giving users access to certain items & functionality within an app.

As Microsoft puts it, it ‘allows administrators to create targeted app experiences for agents and supervisors as an alternative to building and maintaining custom apps’. Wow – Marketing sure can come up with some interesting lines at times!

I can hear you asking ‘so why should we use it’? After all, customer support agents will just log into the app, for example, and see the interface. Why should we use this, when we can just use the Omnichannel Administration app to configure things?

There’s actually a really simple answer to this. See, if we’re carrying out the configuration just through the Omnichannel Administration app, this will be set company-wide. All users logging in will have the same experience. However, there are companies that, although it’s all based around customer service, will have different teams that handle different things, and want them to have different screen layouts. Perhaps they’re even a multi-national.

It’s exactly for this purpose that the App Profile Manager exists. See, using it we can set up different screen profiles, showing different tabs, having different notifications, etc. We then assign users to it (unfortunately we can’t use a security group at this point in time). When the users log in, they’ll then be presented with whichever layout they’re associated with. We can create custom profiles as we need, to handle the business needs!

Right – enough of talking about the concept. How do we actually get to it? Well, we need to go to make.powerapps.com, click into the list of apps, and then select either the Customer Service Workspace app, or the Omnichannel for Customer Service app. Clicking the ellipsis next to it will give us the option for the App Profile Manager at the bottom of the fly-out menu:

This will then launch the App Profile Manager homepage. Some nice information shown here, with even a link to a video that we can launch to see how to go about things.

On the left hand side, we can see the apps in place, along with the ability to launch directly into the different settings areas for them. This is all standard stuff.

The power of App Profile Manager really comes when we’re going into the App Profiles section. Here we can see all of the app profiles that exist in the system. The ones with padlocks next to them are default system ones, which we can’t modify. But the other ones we ARE able to change, as well as being able to set up new ones:

When we open up one that we’ve created, we can see how we can go about customising it. There’s even a handy little visual guide to help users understand what/where each section is:

We’re able to configure the following (per app profile):

Once we’re happy with the setup performed, we then need to assign users to the app profile itself. To do this, simply click the ‘Assign users’ button on the menu bar:

This will open up the screen to add users to the app profile. We can easily select from existing users, and then associate them to the app profile:

And voila, we’re done!

Users will access the app in the normal way, either through launching it in the browser, or using their bookmarks. When they log in, they’ll be presented with the app profile that they’ve been associated with.

If a user doesn’t have an app profile associated with them, then the default system app profile will be assigned to them, and they’ll see that when they log in.

Note: Although the system doesn’t enforce it, you should ensure to only assign users to a single app profile!

So there you have it. A way to customise the customer service agent experience across an organisation, to provide the best interface possible to each group of users.

How could this help you with your own scenarios? I’d love to hear – drop a comment below to share.

Solution Dependencies & Management

Solutions are marvellous things. They enable us to be able to package up lots of components, and deploy them to different environments all together as one single package.

However, there have been changes over time as to how solutions are used. I’m not (for the most part) going to go into the Managed VS Unmanaged debate, which I leave to people who are more in the know….

Microsoft Dynamics 365 apps are installed using solutions. Third party apps provided by Independent Software Vendors (ISVs) also use solutions.

In Power Apps, solutions are leveraged to transport apps and components from one environment to another or to apply a set of customisations to existing apps. A solution can contain one or more apps as well as other components such as entities, option sets, etc. You can get a solution from AppSource or from an independent software vendor (ISV).

Custom development should also take place within a solution, to allow it to be deployed appropriately.

But it’s important to take a closer look at how solutions work overall, as we can be involved on multiple projects within the same environment. Not only that, some solutions may require other solutions to be present first, in order to actually work! A great example of this is Master Data Management (or MDM), which is where companies have a ‘backbone’ of data, which other parts of the system then hang off.

To understand this concept better, let’s take a quick look at solution layering.

Solution Layering

Layering occurs on the import of solutions and describes the dependency chain of components from the root solution introducing it, through each solution that extends or changes the components’ behaviours. Layers are created through an extension of an existing component (taking a dependency on it) or creation of a new component or version of a solution.

Managed and unmanaged solutions exist at different levels within a Microsoft Dataverse environment. In Dataverse, there are two distinct layer levels:

  • Unmanaged layer. All imported unmanaged solutions and unmanaged customizations exist at this layer. The unmanaged layer is a single layer.
  • Managed layers. All imported managed solutions and the system solution exist at this level. When multiple managed solutions are installed, the last one installed is above the managed solution installed previously. This means that the second solution installed can customize the one installed before it. When two managed solutions have conflicting definitions, the runtime behaviour is either “Last one wins” or a merge logic is implemented. If you uninstall a managed solution, the managed solution below it takes effect. If you uninstall all managed solutions, the default behaviour defined within the system solution is applied. At the base of the managed layers level is the system layer. The system layer contains the tables and components that are required for the platform to function.

The following diagram introduces how managed and unmanaged solutions interact with the system solution to control application behavior.

  • The system solution represents the solution components defined within Dynamics 365 or the Power Platform. Without any managed solutions or customisations, the system solution defines the default application behaviour. Many of the components in the system solution are customisable and can be used in managed solutions or unmanaged customisations.
  • Managed solutions are installed on top of the system solution and can modify any customisable solution components or add more solution components. Managed solutions can also be layered on top of other managed solutions. As long as a managed solution enables customization of its solution components, other managed solutions can be installed on top of it and modify any customisable solution components that it provides.
  • Unmanaged customisations. All customisable solution components provided by the system solution or any managed solutions can be customised in the unmanaged customisations.
  • Unmanaged solutions are groups of unmanaged customisations. Any unmanaged customised solution component can be associated with any number of unmanaged solutions. These can be edited & modified, regardless of the environment they’ve been deployed to.
  • The ultimate behaviour of an instance of Dynamics 365 or Power Platform application is the culmination of the system solution, any managed solutions, and any unmanaged customisations.
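As a mental model for ‘last one wins’, you can picture resolving a component’s definition by walking the layer stack from the most recently installed solution downwards (purely illustrative – the platform’s real logic also includes merge behaviour for some component types):

```typescript
// Toy model of managed-layer resolution: the most recently installed
// solution that defines a value wins. NOT the platform's real merge logic.
interface Layer {
  solution: string;
  labels: Record<string, string>; // eg table display names
}

const layers: Layer[] = [
  { solution: "System", labels: { account: "Account" } },
  { solution: "ManagedSolutionA", labels: { account: "Client" } },
  { solution: "ManagedSolutionB", labels: { account: "Customer" } }, // installed last
];

function resolveLabel(table: string): string | undefined {
  // Walk from the top (last installed) down to the system layer.
  for (let i = layers.length - 1; i >= 0; i--) {
    const label = layers[i].labels[table];
    if (label !== undefined) return label; // "last one wins"
  }
  return undefined;
}

console.log(resolveLabel("account")); // "Customer" - from ManagedSolutionB
// Uninstall ManagedSolutionB, and ManagedSolutionA's "Client" takes effect.
```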

The official stance of Microsoft, according to its Application Lifecycle Management (ALM) documentation, is that unmanaged solutions are used for development, and that managed solutions are released downstream to further environments. For bespoke solutions, however, this may not fit, and an appropriate balance must be found.

Data ‘Backbone’ & Solution Dependencies

Given the way that companies are adopting Power Platform (and Dynamics 365, of course!) it’s highly likely that we will build out system structures that will form the backbone for multiple applications on an on-going basis. With this in mind, it’s appropriate to put in place proper planning for this, to avoid any issues that could occur in the future with appropriate system designs.

Solution Dependencies

When creating system structures within an environment, using unmanaged solutions, connecting two (or more) tables together will create dependencies on each other. In simple terms, if we connect Table A to Table B, there’s a reciprocal relationship created back from Table B to Table A:

This happens even if Table A is in Solution 1, and Table B is in Solution 2. If they’re in the same environment (& both solutions are unmanaged), it will create the two-way dependency.

This will cause issues if trying to deploy each solution individually, and the import will fail, as the system will require all items to be available in the solution.

Workable scenario

The way in which to handle the issue of solution dependencies is to ensure that the ‘master backbone’ of system design is created in the main development environment, and then to use that in secondary development environments as the core of additional solutions:

This is in line with recently emerging Microsoft best practice guidance around solution management (which is likely to be moving towards having a single environment per developer, rather than multiple developers working in the same environment).

The steps for doing this are as follows:

  1. Main ‘core solution’ exists (as unmanaged) within the main development environment
  2. When a project requires this to build upon:
    1. Secondary development environment is created
    2. ‘Core solution’ is exported as managed from the main development environment, & imported into the secondary development environment
    3. Project work is carried out within the secondary development environment
    4. Once the project solution is complete (or when appropriate for deployment), it can be exported from the secondary development environment
      1. If deploying directly from the secondary development environment to downstream environments, it should be exported as managed
    5. The solution should be exported as unmanaged, and imported back into the main development environment. This will not cause dependencies to be created with the ‘core solution’ in it

Note: The main ‘core solution’ should consist of the items that are needed for core system work. If additional items are needed for multiple projects to work off (eg Account Manager field), this would need to be added to the core solution, rather than the individual project solution/s, as otherwise there could be further issues downstream.
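If you end up scripting these exports (eg from a pipeline), the Dataverse Web API exposes ExportSolution & ImportSolution actions that cover the steps above. Here’s a rough sketch of exporting the ‘core solution’ as managed (request shape simplified, with the solution name & org URL as placeholders, and assuming you already have an access token):

```typescript
// Sketch: exporting the 'core solution' as managed via the Dataverse
// Web API ExportSolution action. Names/URLs are placeholders.
async function exportCoreSolutionAsManaged(
  orgUrl: string, // eg "https://<your-org>.crm.dynamics.com"
  accessToken: string
): Promise<string> {
  const response = await fetch(`${orgUrl}/api/data/v9.2/ExportSolution`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      SolutionName: "CoreSolution", // placeholder solution name
      Managed: true, // managed, ready for the secondary environment
    }),
  });
  const result = await response.json();
  return result.ExportSolutionFile; // base64-encoded solution zip
}
```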

If the project is completed, but requires further work to be carried out later on (or development support), then the following should be done:

  1. Secondary development environment is created
  2. ‘Core solution’ exported from the main development environment as a managed solution, and imported into the secondary development environment
  3. Project solution exported as unmanaged from the main development environment, and imported into the secondary development environment
  4. Work and/or support can be carried out within the secondary development environment, and released appropriately

I’m expecting further information around this to be released by Microsoft in due course (I’m a little surprised there’s not more out there at the moment, to be honest!). It’s vital that we ensure that we’re working with solutions in the right way, to stop any issues occurring later on down the line.

Have you ever had a problem around this? Drop a comment below – I’d love to hear your experiences!

MB-910: Microsoft Dynamics 365 Fundamentals Customer Engagement Apps

So here’s the thing. There used to be the MB-900 exam, which was the Microsoft Dynamics 365 Fundamentals exam. This was aimed at people who had a small knowledge of Dynamics 365, and it was really the base/entry-level exam into the qualifications for it.

However, Dynamics 365 is actually comprised of two ‘parts’. There’s the ‘front office’ part that’s usually referred to as Customer Engagement (well, depending on how Microsoft wish to refer to it as, which can change from time to time!), and there’s the ‘back office’ part, which is the ERP side of things. This is the finance & operations sphere, where those functions take place.

The MB-900 was a slightly strange exam, in my opinion, because it covered both. There were questions around things like Sales, Customer Service, etc, but there were also Supply Chain Management questions as well, for example. Now I’m not saying that people shouldn’t know about both ‘sides’ of the equation, but people usually (for the most part) handle one or the other. It’s generally unusual to find someone knowledgeable about both.

Furthermore, if we take a look at the more in-depth exams in the MB range, we find that there’s a definitive split there. The MB-2xx series cover Customer Engagement, whereas the MB-3xx series covers the ERP side of things. So it’s definitely not the norm to have both sides included in a single exam.

Microsoft came to the realisation around this, and have therefore decided to update the Fundamentals space. In doing this, they’ve split things out. There’s the MB-910 exam (which is what this post is about), and the MB-920 exam, which focuses specifically on the ERP space. A good move, in my opinion.

The MB-910 launched this past weekend, and I took it around a day after it went live. Let’s go take a look at it, and recap my experience with it.

The official description of the exam is:

This exam covers the features and capabilities of Microsoft Dynamics 365 customer engagement apps.

Candidates for this exam should have general knowledge of or relevant working experience in an Information Technology (IT) environment. They should also have a fundamental understanding of customer engagement principles and business operations.

Taking it leads to the qualification for ‘Microsoft Certified: Dynamics 365 Fundamentals Customer Engagement Apps (CRM)’.

The description around the qualification is:

If you’re familiar with business operations, customer relationship management (CRM), and are IT savvy—either generally or through work experience—take advantage of this certification to highlight those skills. Validate your broad exposure to the customer engagement capabilities of Dynamics 365 to enhance your career journey.

People in different roles and at various stages in their careers can benefit from this fundamentals certification. Here are some examples:

IT professionals who want to show a general understanding of the applications they work with

Business stakeholders and others who know Dynamics 365 and who want to validate their skills and experience

Developers who want to highlight their understanding of business operations and CRM

Students, recent graduates, and people changing careers who want to leverage Dynamics 365 customer engagement capabilities to move to the next level

The official page for the exam is at https://docs.microsoft.com/en-us/learn/certifications/exams/mb-910 where it gives quite a good overview of things. Go take a look at it, and also take a look at the associated learning paths.

Once again, I sat the exam through the proctored option (ie from home). This is the way that I now usually take exams (even if I could go to an exam centre, I think that I’d be unlikely to, given the travel/time needed!). Checking in for the exam went without issues (the process definitely seems to be getting smoother each time), and I was ready to go within a few minutes.

As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) I’ve tried to group things together as best as possible for the different subject areas.

  • Project Operations
    • Scheduling resources
    • Entering project time/costs
    • Skills
    • Roles
    • Different types of project costings
  • Customer Service
    • SLAs, what they are, which ones to use
    • Omnichannel, including capabilities and channel functions/availabilities
    • Power Virtual Agents
  • Sales
    • Lead processes, deactivating & reactivating
    • Opportunity processes
    • LinkedIn Sales Navigator. How it interacts, which capabilities it has within it, how it works
    • Quotes. How they work, what’s required to handle them, document generation
  • Marketing
    • Website forms
    • Automation around responses
    • A/B testing
    • Event management
  • Field Service
    • Work orders
    • Route optimization
    • Scheduling boards
  • Document options
    • Attachments that users can access within the system, as well as outside of Dynamics 365
    • File collaboration tools, and integration with them
  • Timelines & activities
  • System currencies, default options, additional currencies, and updating them
  • Understanding different types of tables, and when you’d use each one
  • Reporting capabilities
    • How data is able to be reported on
    • Report Builder Wizard
    • Reporting on data held in Dataverse
    • Reports in dashboards
    • Usage of Power BI, including data gateways

I was slightly surprised with the level of detail in some of the areas. I wasn’t, for example, expecting the emphasis on Project Operations and Field Service that came up for me. Some of the level of detail seemed more fitting for an MB-2xx exam than this Fundamentals exam.

In a similar vein, I also wasn’t expecting Power BI and Power Automate so much. Perhaps that’s just my own perspective, though obviously with the Power Platform it would be there. However there is a PL-900 exam, around Power Platform capabilities, that I’d expect those sorts of questions to be in, rather than here in this exam.

Otherwise I think that it was generally on point for what I’d expect to find at this level of exam. The questions have definitely evolved over time, and I found myself giving more consideration to answers than I would have on the previous version.

It’s a good place to start for people who are looking to get qualified around Dynamics 365! If you do decide to take it, please drop a comment below to let me know how it was for you – I’d love to hear about your experience!

Finding Employment

I thought that, given it’s now a new year, this would be a topic that could be of use to people. Maybe it’s making all those new year’s resolutions that fills you with thoughts of new possibilities, but it’s usually around this time that people consider what they want their year to look like, and whether to move employers, or stay in the same place.

I frequently get messages on LinkedIn, as well as direct emails, from recruiters. What happens next usually seems to fall along the same sorts of lines. They try to get some information from me, promise me the world, etc. Usually they’d try to get me on the phone, whilst not really providing any information for me to go on, or showing why having a call with them would be of any value to me.

Together with the amazing Alison Mulligan, we’ve drawn up the below. Alison is not only a seasoned recruiter, she’s also another Microsoft BizApps MVP! It’s a topic that we’ve been discussing on & off over the last few years since we first met. Alison also does a ‘One Minute Monday’ quick tips session every week. I’d strongly recommend to go & check it out!

We both chat with lots of people, and thought that giving a view from ‘both sides of the fence’ would be helpful to others. With this, we’ve put the time into drawing up a shortlist of points that both sides might think to take into consideration. Our aim here? Purely to help out – we’re not getting anything for this at all.

Tips for people looking for a new position

  1. Ensure that your CV is up to date, with all relevant information on it. Include any professional qualifications, employment history, etc, & it’s laid out well. Personally, I’m a great fan of ‘Words in Tables’ by John Moon (https://www.jmoon.co.uk/index.cfm). Free registration (or an optional charitable donation – what better way to do good for yourself and someone else at the same time) on his site will give you a great CV template that will stand out from most others (I use it myself!)
  2. Think about what you’re wanting in a new position. Be comfortable with discussing these, as you’ll need to mention them to recruiters. They could include:
    • Salary
    • Benefits
    • Career progression options
    • Volunteering
    • Work/Life balance
  3. Ensure that your LinkedIn profile is up to date, with all relevant information. Include your qualifications as well as any other experiences. Use the space available – any good recruiter will read the information that you’re including, rather than just skim the first line. A good rule of thumb is to do at least two paragraphs for each position, detailing your achievements, & what you brought to the company. You can also use LinkedIn’s own 20 steps to a better profile
  4. Use the LinkedIn ‘About’ section to describe why you’d be an asset to an employer, your skills & expertise that you bring to the table, and what you absolutely enjoy & love doing. If you don’t know what you love doing, sit down and give it some thought – start with tasks that, when you do them, get you into a ‘flow state’ (as in time seems to pass quickly)
  5. Keep your overall career plan in mind – if you haven’t done that yet, then now is a good time to start.

Above all, if someone contacts you with a role, be open and honest about things, and if you feel it’s not appropriate, you can say so. Alternatively if you think that someone else you know would be suitable for the position, you can always recommend them.

Tips for recruiters

  1. Have a proper job specification available, listing out the required & wished for items. If you don’t have one, be open about what the role is actually supposed to be, rather than guessing at it. Or at the very least have a detailed view of the company and WHY they are looking to hire. If you’re speaking to someone with more than a few years experience in their line of work, be pro-active about giving the spec to them. They’re able to take an initial look & assess whether it’s appropriate or not in much less time than you (as a recruiter) might be able to.
  2. Ensure that you actually know the salary range for the position and are happy to share it. I’ve been approached multiple times with absolutely no salary information, and when it is finally available, it turns out to be half of what I’m currently on – this feels like a giant waste of time for everyone, recruiter included, as no matter how awesome the company or role, no one is taking a 50% pay cut (unless it’s to work with Elon Musk or Satya Nadella)
  3. If using LinkedIn to search for candidates, take a proper look at their experience & information. All too often we’re asked about Developer opportunities, when it’s quite clear from our profiles that we’re not Developers. Use the appropriate filtering tools/options available to return pertinent results. And, we know you are under pressure in terms of time, but if you spend five minutes reading our profile (particularly if we have bothered to make it as detailed as possible), you will get a 10x greater return than if you don’t.

One final thing to keep in mind, in general. If you feel that you’re being pushed into something, take a step back, and consider if it is indeed the right move for you. It could just be the way that it’s being pitched at you, but taking a few minutes to make sure you’re alright is very important. You could also consult with a mentor around it, who we’re sure would be only too happy to help you out.

Oh, and if you have any tips you’d like to share, feel free to post them below in the comments – we’d love to see them!