Finding out about Afshan’s love of a specific TV show genre (you’ll need to watch to find out!), and discussing the challenges of finding a new role in difficult times. Some very important tips to keep in mind.
If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!
Click here to take a look at the other videos that are available to watch.
I’ve recently had the experience of working with the Opportunity Close functionality within Dynamics 365, and given what occurred, I thought it would be useful to document it so that others can benefit as well. There are many scenarios in which we’d use this functionality, and being able to give a comprehensive solution to clients does make all the difference!
There are four areas that I’d like to cover:
Working with the Opportunity Close table
Challenges with data
Power Automate to the rescue!
Caveats
So let’s get started then!
Thanks to the various members of the community, such as Matt Collins-Jones, Andrew Bibby & others, who helped me along the way!
Working with Opportunity Close
The Opportunity Close functionality within Dynamics 365 (& yes, I’m going to refer to it as this, rather than Power Platform) is used to provide information around why an opportunity is being closed, regardless of whether the opportunity has been won or lost. It’s still quite important to track this information, so that companies can better understand how the market views the products they offer, how they stack up against competitors, etc.
The default path in the system is to create a lead, and then qualify it. Qualifying a lead automatically creates an opportunity record, against which further information (quotes, etc.) can be entered. An account record (if company information is specified) is also created:
On the opportunity record, users are able to show if it’s been won or lost by clicking an appropriate button on the toolbar:
Doing this brings up the Opportunity Close pane on the right hand side of the screen:
Now, it’s possible to customise this screen. In fact, the screenshot above shows three custom columns that had already been added to it in the system I was working in.
To do this, we go to customise the solution (in the Maker experience), and add the column/s that we want:
Next, we need to remember to add it to the form! Otherwise it’s not going to show up. If we want our changes to appear in the side pane, it’s important to customise the ‘Quick Create’ version of the form, as that’s the one the pane uses, to make our customisations show up.
Note: We’re able to apply conditional visibility to the column/s, based on whether the opportunity is won or lost, using Business Rules. I haven’t done so in this scenario, but you’re obviously able to if you want to.
Remember to save & publish the form, and then it’ll display within the system for users. Brilliant!
Challenges with data
So we’ve gone ahead & created the custom columns, and users are actually using them to record data. Wonderful – that’s exactly what we’ve been wanting to achieve.
OK – let’s now review the data so that we can see overall what’s happened with our opportunities. Of course we want to do this simply & easily, so we’ll open an Advanced Find window, go to the Opportunity Close table, add columns from the associated Opportunity, and… hold on. Opportunity Close ISN’T displaying in the Advanced Find????
It’s just NOT there. In case you’re wondering if you saved/published things correctly, or forgot some system setting, stop worrying. It’s not you – it’s the system.
See, Opportunity Close, though a table in its own right, is a SPECIAL sort of table. It doesn’t show up in Advanced Find, and can’t be directly queried. I know – frustrating. I felt exactly the same way.
On digging deeper into things, I found out that there’s actually an activity record saved behind the scenes (Opportunity Close is an activity type). It’s possible to query against this:
However, and this is the BIG catch, it’s NOT possible to return custom columns when carrying out this query. The search will ONLY return the (system) columns that are present for activities. So this leaves us with a problem.
Essentially, though we can set up custom columns to track the data that we’re needing to, it’s not possible (through the front end) to query it. This sort of negates what we’re trying to achieve here overall, and is a pain.
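To make that limitation a bit more concrete, here’s a rough sketch (my own illustration, not from the product documentation) of what querying those underlying activity records looks like via the Dataverse Web API, written in Python. The org URL and token are placeholders, and the exact filter syntax for the activity type is an assumption on my part; the point is simply that only the common activity columns are available at this level, not the custom columns we added:

```python
# Illustrative sketch only: querying the activity records behind Opportunity Close
# via the Dataverse Web API. ORG_URL and TOKEN are placeholders - in practice the
# token would come from Azure AD (e.g. via MSAL).
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<access token acquired from Azure AD>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

# Pull back activities of type Opportunity Close. Only the shared activity
# columns (subject, description, actualend, ...) are exposed here - custom
# columns added to the Opportunity Close table are not part of this view.
response = requests.get(
    f"{ORG_URL}/api/data/v9.2/activitypointers",
    params={
        "$select": "subject,description,actualend",
        "$filter": "activitytypecode eq 'opportunityclose'",  # assumed filter syntax
    },
    headers=headers,
)
response.raise_for_status()

for activity in response.json().get("value", []):
    print(activity.get("subject"), activity.get("actualend"))
```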
So what’s the way round it? Well, it’s actually going to be Power Automate!
Power Automate to the rescue
In order to handle our issue, what we need to do is the following:
Add custom columns to the Opportunity table (these should mimic the custom columns that we’ve added to the Opportunity Close table)
Use Power Automate for automation purposes!
The first step is easy. We need to go & create custom columns on the Opportunity table. These WILL show up in the Advanced Find search. They obviously need to be the same as the custom columns on the Opportunity Close table. If we’ve used Choice or Choices there, point the Opportunity column to the same source (it’s a good argument for using Global, rather than Local, choice/s).
We can then go and create a Power Automate flow. This should trigger when an Opportunity Close record is created.
Note: For this, I’ve made it so that it runs under the user triggering the action, rather than a system account. This is to keep in line with licensing limits, etc.
You’ll then need to add a ‘Get Dataverse row’ step to retrieve the Opportunity Close record that has just been created. This is annoying, but for some strange reason the trigger doesn’t present the custom columns/values in the JSON that it returns. Hopefully Microsoft fixes this at some point, but for the moment, we need to work around it.
The last step is to add an ‘Update Dataverse row’ step. This should point to the Opportunity table, & we can simply map the values across (from the SECOND step, NOT the trigger – VERY IMPORTANT, since the trigger output doesn’t contain the custom values).
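If it helps to see the logic written out, here’s a minimal sketch of what the flow is effectively doing, expressed as Dataverse Web API calls in Python rather than as flow steps. The environment URL, the token handling, and the custom column name (new_closereasondetail) are all hypothetical placeholders; substitute whatever custom columns you’ve actually created:

```python
# Minimal sketch of the flow's logic (not the flow itself): re-read the newly
# created Opportunity Close record, then copy the custom values onto the parent
# Opportunity. Column names and the environment URL are hypothetical.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<access token acquired from Azure AD>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
    "Content-Type": "application/json",
}


def copy_close_details(opportunity_close_id: str) -> None:
    """Mirror custom values from an Opportunity Close record onto its Opportunity."""
    # Equivalent of the 'Get Dataverse row' step: the create trigger doesn't
    # surface the custom columns, so the record has to be read back explicitly.
    close_resp = requests.get(
        f"{ORG_URL}/api/data/v9.2/opportunitycloses({opportunity_close_id})",
        params={"$select": "new_closereasondetail,_opportunityid_value"},
        headers=headers,
    )
    close_resp.raise_for_status()
    close = close_resp.json()

    # Equivalent of the 'Update Dataverse row' step: map the values across to
    # the mirrored custom column(s) on the parent Opportunity.
    update_resp = requests.patch(
        f"{ORG_URL}/api/data/v9.2/opportunities({close['_opportunityid_value']})",
        json={"new_closereasondetail": close.get("new_closereasondetail")},
        headers=headers,
    )
    update_resp.raise_for_status()
```

The important bit mirrors the flow exactly: the values come from the second (Get row) step, never from the trigger output.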
Once this is all done, save & test it, and you should see it working. I generally don’t add the Opportunity custom columns to the form, but rather leave them for querying against.
Caveats
It’s important to keep in mind that when an opportunity is marked as either won or lost, it’s then closed, and changed to a read-only state. That’s how the system is designed, and it makes sense.
However, it’s ALSO possible to re-activate a closed opportunity, and then close it again, i.e. a single Opportunity record could have multiple Opportunity Close records against it. This solution won’t handle that scenario (it would need to be built out further), as the Opportunity record itself will only show the values from the latest Opportunity Close action, so please do keep this in mind!
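If you did need to build it out further to cope with re-activations, one possible approach (again a hedged Python/Web API sketch, with the same hypothetical column name as above) would be to always look up the most recent Opportunity Close record for an opportunity before mapping values across:

```python
# Rough sketch: fetch only the newest Opportunity Close record for a given
# opportunity, e.g. when an opportunity has been re-activated and closed again.
# Names and URLs are hypothetical placeholders.
from typing import Optional

import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<access token acquired from Azure AD>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}


def latest_opportunity_close(opportunity_id: str) -> Optional[dict]:
    """Return the most recently created Opportunity Close record, or None."""
    response = requests.get(
        f"{ORG_URL}/api/data/v9.2/opportunitycloses",
        params={
            "$select": "new_closereasondetail,createdon",  # hypothetical custom column
            "$filter": f"_opportunityid_value eq {opportunity_id}",
            "$orderby": "createdon desc",
            "$top": "1",
        },
        headers=headers,
    )
    response.raise_for_status()
    rows = response.json().get("value", [])
    return rows[0] if rows else None
```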
Have you ever come up against something like this? How have you handled it? I’d love to hear – please drop a comment!
Discussing various technical interests, such as the wonders of 3D printing, with Andrew, and going into the right (& wrong!) approaches to data security & protection.
If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!
Click here to take a look at the other videos that are available to watch.
Talking to Carmen about Belgium, her love of the countryside, and the beauty of nature and hiking through it. Also touching on the ‘wonders’ of canvas app functions…
If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!
Click here to take a look at the other videos that are available to watch.
Finding out about Kylie’s love of knitting (some of the things she’s created over time too), discussing pizza and monkey, and what happens when we’re not careful enough about system security! Cue a rescue mission at an unexpected time!
If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!
Click here to take a look at the other videos that are available to watch.
So it’s been a busy few weeks here, which is why I haven’t really been putting up any articles. March/April is always a busy time for our family with stuff going on, and this year I decided not to push myself to get articles out, as otherwise I’d be running very low on sleep!
That being said, I’ve still had some great ideas about things that I’d like to share, and have been keeping a series of short notes for me to pick up. Today’s topic is one of them, which I think has been a major pain for anyone involved in canvas app development!
So, the back story to this is that we’re able to use Power Automate flows together with canvas apps. What I mean by this is that we’re able to directly trigger them from within the canvas app, rather than needing to do something like edit or create a record, and then have the Power Automate flow trigger from the record creation or modification.
There’s a specific Power Apps trigger that’s available within Power Automate exactly for this purpose:
When clicked, it gives us the trigger line in the steps as follows:
So what we’d do is, within the canvas app, bind a button (or another control) so that when it’s selected, it goes away & triggers the Power Automate flow. Great – so many different things that we can get to happen! One of the benefits of doing things like this is that we can then pass information from the Power Automate flow back to the canvas app directly:
This can then mean that the user can know, within the canvas app itself, that the Power Automate flow has run, and use data (or other things) that have come out of it.
OK – all good so far.
The main issue to date has been with deploying canvas apps together with Power Automate flows. See, as per best practice, we would create a solution, place the canvas app, flows, and anything else necessary for it to work within it, and then deploy the solution to our target environment/s. And that’s where things just…didn’t go quite right.
Obviously within the development environment, the canvas app would be hooked up to the flows, and everything would work. Clicking the button would cause the flow to run, etc. User authentication would be in place (along with licenses of course!), and it was just fine.
But when deploying a solution containing canvas apps and associated flows between environments (regardless of whether it’s deployed manually, or automated using a tool such as Azure DevOps), the connections to the flows would be broken. I.e. the canvas app would run, but the flows wouldn’t trigger. Looking at the connections in the canvas app within Studio would show something like the following:
All of the connections to Power Automate flows would show as ‘Not connected’. It’s not even possible to click the ellipsis next to them and re-connect them – the only option available is to remove them from the canvas app!
So in order to get things working again, we’d need to do the following steps:
Open up the canvas app
Remove all connections to Power Automate flows
Add a temporary button, and set it to trigger a Power Automate flow
Click through all of the Power Automate flows that need to be connected (waiting for each one to connect before moving to the next one)
Remove the temporary button
Save and publish the solution
This, in a nutshell, has been a (major) headache. For example, I’ve been working with a solution that has over 30 Power Automate flows that can be triggered from the canvas app (lots of different functionality!). Each deployment has needed the above process to be carried out, which has usually added on at least an hour to the deployment process!
Now, this hasn’t been something that’s been unknown. In fact, the official Microsoft documentation noted the following:
So this is something that Microsoft has been well aware of, but it’s been a pain point that we’ve had to work with.
However, this has now ALL changed, which I (and MANY others) am really pleased about!
Microsoft rolled out an update last month which means that canvas app connections to Power Automate flows will NOT break when they’re deployed across environments! This is such a massive time-saver that I’m now trying to work out what to do with all of my free time! Only kidding…more project work will commence!
So what we can now do is take our solution, deploy it across the different environment/s that we need to get it out to (whether manually, or automated using tools such as Azure DevOps), publish the solution, and then everything works! Amazing!!
One small caveat though – to ensure that this works, you will need to go into the app and re-publish it on the latest Power Apps version. This should of course be done in a development environment, and the app can then be exported and deployed as required.
Talking to Olena about different hobbies such as painting, and the challenges that can come with projects. However, also finding out about ways to overcome those challenges, and what actually can allow us to overcome challenges and succeed!
If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!
Click here to take a look at the other videos that are available to watch.
Finding out what Keegan has been learning during lockdown (awesome use of time!), challenges that raising children brings, and why it’s important to listen to your gut feelings.
If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!
Click here to take a look at the other videos that are available to watch.
Well, it’s FINALLY here. And by finally, I guess I’m saying that I’ve been waiting for this for a while? The PL-600 exam is the new ‘Holy Grail’ for Dynamics 365/Power Platform people, being the Solution Architect (3 star) exam. Ten minutes after it went live, I booked to take it, and four hours after it went live I sat it! (I would have taken it sooner, but had to have supper first, get the kids to bed, etc…)
The first solution architect exam that Microsoft released in this space was the MB-600 (see my exam experience write-up on it at MB-600 Solution Architect Exam). However, with the recent shift towards certifications for the wider Power Platform, it was inevitable that this exam would change as well.
Interestingly enough, the MB-600 now counts towards some of the Microsoft Partner qualifications. I’d expect that when it retires (currently planned for June 2021), the PL-600 will take its place in the list of required certifications.
Microsoft Power Platform solution architects lead successful implementations and focus on how solutions address the broader business and technical needs of organizations. A solution architect has functional and technical knowledge of the Power Platform, Dynamics 365 customer engagement apps, related Microsoft cloud solutions, and other third-party technologies. A solution architect applies knowledge and experience throughout an engagement. The solution architect performs proactive and preventative work to increase the value of the customer’s investment and promote organizational health. This role requires the ability to identify opportunities to solve business problems. Solution architects have experience across functional and technical disciplines of the Power Platform. Solution architects should be able to facilitate design decisions across development, configuration, integration, infrastructure, security, availability, storage, and change management. This role balances a project’s business needs while meeting functional and non-functional requirements.
So it hasn’t really changed that much from the MB-600, though obviously there’s now an expectation for solutions to bring in other parts of the Power Platform, as well as dip into Azure offerings. Pretty much par for the course, in my experience, with how recent projects that I’ve been on have been implemented.
At the time of writing, there are no official Microsoft Learn paths available to study from. I do expect this to change in the near future, and will update this article when they’re out. However, the objectives/sub-objectives are available to view on the main exam page, and I’d highly recommend going ahead & taking a good look at these.
As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) I’ve tried to group things together as best as possible for the different subject areas.
Overall, I had 47 questions, which is around the usual amount that I’ve experienced in my exams over the last year or so. What was slightly unusual was that instead of two case studies, I got three of them! (Note that your own experience may well vary from mine.)
Some of the naming conventions hadn’t been updated to the latest terminology, which I would have expected. I still had a few references to ‘entities’ and ‘fields’ come up, though for the most part ‘tables’ and ‘columns’ were used. I guess it’s just a matter of time until everything is brought up to speed.
Environments
Region locations, handling scenarios with multiple countries
Analytics
Data migrations
Requirement Gathering
Functional
Non-functional
Data structure
Tables
Types of tables
Standard vs custom functionality
Virtual tables. What these are, when they would be used, limitations to them
Activity types
Table relationships & behaviours
Types of columns, what each one is suited for
Business rules. What they are, how they can be used
Business process flows. What they are, how they can be used
App types (differences between them, scenarios each one is best suited for)
Model
Canvas
Portal
Model-driven apps
Form controls (standard vs custom)
Form layout (standard functionality vs custom functionality)
Formatting inputs
Restricting inputs
Automation
Power Automate flows. What they are, how they can be used, restrictions with them
Azure Logic Apps. What they are, how they can be used, restrictions with them
Power Virtual Agents
Communication channels
Self service abilities through Power Virtual Agent chatbots. How this works, when you’d use them, limitations that exist
Live agent abilities through Omnichannel. How this is implemented, how customers can connect to a live agent (directly, as well as through chatbots)
Teams. When this can be used, how other platform abilities can be used through it
Integration
Integration tools
Power Platform systems
Azure systems
Third party systems
Reporting across data held in different systems
Dynamics 365 API
Reporting
Power BI. What it is, how it’s used, how it’s configured, limitations with it, how to share information with other users
Interactive Dashboards. What these are, how these are set up and used, limitations to them
Troubleshooting
Canvas app issues
Model driven app issues
Data migration
Security
Data Protection. What it is, where it’s set up, how it’s used across different requirements in the platform
Types of users (interactive/non-interactive)
Azure Active Directory, and the role/s it can play, different types of AAD authentication
Power Platform security roles
Power Platform security teams, types
Portal security
Restricting who can view forms
Field level security
Hierarchy abilities
Auditing abilities and controls
Wow. It’s a lot of stuff. Not that I’m surprised by that, as it’s essentially the sort of thing that I was expecting (being familiar with the MB-600). I think that on a day-to-day basis I cover most of these items already, so I didn’t have to do a massive amount of revision for the items I was less familiar with.
From my experience in taking it, I’d say that around 30% of the questions seemed to be focused on Dynamics 365, with 70% being focused on Power Platform capabilities. It’s about what I thought it would be when the exam was first announced. Obviously some people are more Dynamics 365 focused, and others are more Power Platform focused, but the aim of the exam (& qualification) is to really understand the breadth of the offerings available.
I can’t tell you if I’ve passed it or not…YET! Results aren’t going to be out for several months, based on previous experience with Beta exams, but I’ve got a good feeling about this.
So, if you’re aiming to take it – I wish you the very best of luck, and let me know your experience!