Exam AB-731: AI Transformation Leader

What better way to start 2026 than to talk about a Microsoft certification, especially one for a totally NEW type of user!

Following in the footsteps of the other AB exams I’ve been writing about my experience with (see Exam AB-730: AI Business Professional, Exam AB-100: Agentic AI Business Solutions Architect and Exam AB-900: Microsoft 365 Copilot and Agent Administration Fundamentals), this article will cover the AB-731 exam.

This exam focuses on the Microsoft AI capabilities from a Business Leader perspective, and to the best of my knowledge it’s the first time that Microsoft has ever created an exam from a ‘Business Leader’ perspective. Taking this exam was a complete mindset shift for me, especially when seeing the questions – it’s not about understanding the in-depth technical capabilities, but more around the breadth of technology options (spanning Azure, Microsoft 365 Copilot, Copilot Studio & other tools), and what they bring/enable from a BUSINESS perspective.

The official description of the proposed exam candidate is:

As a candidate for this Microsoft Certification, you should understand how to recognize opportunities for AI transformation, identify the right AI tools and resources, plan for AI adoption, optimize business processes, and drive innovation by using Microsoft 365 Copilot and Azure AI services.

This Certification is designed for business decision-makers at all levels who are responsible for guiding transformation and innovation within their teams or organizations. In this role, you’re expected to demonstrate AI fluency, strategic vision, and the ability to lead AI adoption across teams and functions but are not expected to write any code.

As a candidate for this Certification, you should be able to evaluate AI opportunities, champion responsible AI practices, and align AI investments with business goals. You need experience leading adoption or change management in a business context. You must also be familiar with Microsoft 365 services, Azure AI services, and general AI capabilities.

The overall information for the exam can be found at Microsoft Certified: AI Transformation Leader, and there is an official Learning Path available for it.

As I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change for when it comes out of beta.

Overall, the exam approach was quite different for me – though I do talk with organisations frequently around general AI matters, I’ve never taken an exam written in this way before. However, I do feel that it’s very helpful to have this in place, to ensure that business leaders can demonstrate that they actually do know what they’re talking about 😉

I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

  • Azure Components & Capabilities
    • AI Vision – what it can be used for, benefits of using it, capabilities that it has
    • AI Language – what it can be used for, benefits of using it, capabilities that it has
    • AI Document Intelligence – what it can be used for, benefits of using it, capabilities that it has
    • Machine Learning – what it can be used for, benefits of using it, capabilities that it has
    • AI Foundry – what it can be used for, benefits of using it, capabilities that it has
    • AI Search – what it can be used for, benefits of using it, capabilities that it has
  • Microsoft 365 Copilot Chat
    • What license is needed
    • What data does it have access to
    • What security controls are in place
  • Microsoft 365 Copilot
    • What is it, what can it be used for
    • What can it do
    • How does it connect to data
    • What are the connectors for it (standard & custom)
    • Benefits of using it (vs 3rd party AI tooling)
    • Different agents (eg Analyst & Researcher) within it – what they do, how to access and use them
  • Microsoft Copilot Studio
    • What is it, what can it be used for
    • What can it do
    • What license is needed
    • What data can it access
  • Microsoft Security Copilot
    • What is it, what can it be used for
    • What can it do
    • Benefits that it provides
  • Security & Governance
    • Content filtering controls within Copilot
    • Policies
    • Handling requirements to prevent inappropriate language & responses
    • Responsible AI principles
    • Governance ownership, responsibility & requirements
  • Generative AI
    • AI model hallucinations
    • Grounding in data
    • Improving response quality
    • Prompt engineering
    • Pre-trained models vs fine-tuned models
    • Reasoning models vs non-reasoning models
    • Understanding usage costs (including different pricing models)
    • What is RAG, and how can it be used for business scenarios
    • Adoption throughout organisations – personas to involve in adoption team

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    Exam AB-900: Microsoft 365 Copilot and Agent Administration Fundamentals

    Following in the footsteps of the other AB exams I’ve been writing about my experience with (see Exam AB-730: AI Business Professional & Exam AB-100: Agentic AI Business Solutions Architect), this article will cover the AB-900 exam.

    This exam focuses on the Microsoft 365 Copilot capabilities from a user & administration perspective, and doesn’t cover/include anything from Copilot Studio.

    Now, though it’s a Fundamentals exam, to be honest it’s the HARDEST fundamentals exam that I’ve ever taken!

    The approach is around being able to demonstrate an understanding of how to use Microsoft 365 Copilot, as well as a lot of focus on how to control & administer it.

    The official description of the proposed exam candidate is:

    As a candidate for this Microsoft Certification, you should be familiar with Microsoft 365, including core services, security, identity and access, data protection, and governance, along with Microsoft 365 Copilot and agents.

    Additionally, you should be familiar with the admin centers used to access Microsoft 365 workloads, such as Exchange Online, SharePoint in Microsoft 365, Microsoft Teams, Microsoft Entra, and Microsoft Purview. You need to have experience with AI-driven productivity tools and modern IT management practices.

    You must be able to identify the roles of the core features and objects available in Microsoft 365, such as users, groups, teams, sites, and libraries. Plus, you should understand the core security features of Microsoft 365, such as authentication methods, conditional access policies, and single sign-on (SSO).

    The overall information for the exam can be found at Microsoft 365 Certified: Copilot and Agent Administration Fundamentals, and there is an official Learning Path available for it.

    As I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change for when it comes out of beta.

    One thing to keep in mind about this exam – though I do mention Microsoft Purview in the list of items below, I haven’t gone into it extensively. However, there were a LOT of questions that touched on Purview (& other governance stuff as well) – you REALLY need to know & understand these capabilities to be able to take & pass the exam. Just guessing the answers is not going to help at all!

    Overall, the exam seemed to me to be pretty decent, though with indeed a heavy focus on security & governance (as I’ve mentioned above). I don’t see this as a bad thing though, as it can help to show that administrators really do know what they’re talking about.

    I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

    • Agent types
      • Native Microsoft 365 Copilot agent
      • Native Microsoft 365 Copilot advanced agents (eg Researcher & Analyst). What they are, how to access, what to use them for
      • Custom Microsoft 365 Copilot agent
      • SharePoint agent
    • Creating/using Agents
      • Using natural language to create agents
      • How to handle/perform multi-step reasoning
      • Use of notebooks
      • Custom instructions
      • Scheduling prompts
      • Querying data types
        • Structured
        • Unstructured
    • Governance & security
      • Blocking access to different types of searches & collateral
      • Blocking access to specific agents
      • Tools to use for blocking
      • How to share agents with other users
      • Assigning licenses to users
      • Data retention policies
      • Data labelling policies
      • Use of Microsoft Purview, covering capabilities, tools, auditing, how to use, etc
      • Use of DLP
      • Data source permissions
      • Conditional access policies
      • Microsoft Defender – what it is, capabilities it has, how to use it, etc.
      • Types of authentication
    • Reporting
      • Licensing & usage
      • Adoption & interactions
    • Payment options & capabilities
      • Credit usage – internal vs external users
      • Pay As You Go Billing, and scenarios you can use it for

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    Exam AB-730: AI Business Professional

    Following on from the recent launch of the new Exam AB-100: Agentic AI Business Solutions Architect exam, Microsoft has now developed & released other exams in the AB series – this post is on the AB-730 exam.

    The approach continues to be around how to use AI within technology for business purposes, rather than needing to be able to create AI or code. This exam focused on the Microsoft 365 Copilot experience, how to use it within various Microsoft Office applications, etc.

    The official description of the proposed exam candidate is:

    As a candidate for this Microsoft Certification, you should have experience using generative AI–powered productivity tools, including Microsoft 365 Copilot, Researcher, and Analyst. You take advantage of AI to improve daily work, drive business outcomes, and make informed decisions in business contexts—without building AI apps or writing code.

    You should have a basic understanding of Microsoft 365 and should be comfortable navigating core apps, such as Outlook, Word, Microsoft Teams, PowerPoint, and Excel. You should also be familiar with common business processes, including drafting emails, creating presentations, generating images, and managing documents.

    The overall information for the exam can be found at Microsoft Certified: AI Business Professional (beta) – Certifications | Microsoft Learn, and there is an official Learning Path available for it.

    As I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change for when it comes out of beta.

    One thing that I found I didn’t like about the exam is the new question type of ‘Best Answer’. This question type gives various options, whilst telling you that more than one answer choice may achieve the goal, but asking you to select the BEST answer. I believe that questions like this are subjective, and the answers will vary based on each person’s knowledge, understanding & experience, so I’m not quite sure why Microsoft have decided that this would be good to use. It will be interesting to see what happens when the exam comes out of Beta, and if these questions are still around or not then.

    Overall, the exam seemed to me to be pretty decent – I initially thought it would be quite generic, but you really do need to know how all the Copilot offerings work including Copilot Chat and Copilot in the Office applications.

    If you’re new to Copilot, and/or not really sure as to how it actually works & the capabilities, I’d suggest not to take the exam yet. Instead, go and take a look at the learning paths, and look to find out how it actually works & operates.

    I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

    • Microsoft 365 Copilot vs Microsoft 365 Copilot Chat
      • What each one does/doesn’t do
      • When to use each one
    • What to include when prompting Copilot
    • Copilot security framework
      • How data is used
      • The different data controls that are in place
      • How data protection works, different data protection capabilities & using them
      • Removing data & prompts from Copilot
    • Copilot capabilities in Microsoft Word
    • Copilot capabilities in Microsoft Excel
    • Copilot capabilities in Microsoft Outlook
    • Copilot capabilities in Microsoft PowerPoint
    • Copilot capabilities for Teams
      • Using Copilot within Teams for queries
      • Using Copilot within Teams for meetings (preparing for them, during the meeting, after the meeting)
    • Collaboration with Copilot report outputs
    • Copilot Researcher agent – getting access, capabilities & use cases, inputs & outputs
    • Copilot Analyst agent – getting access, capabilities & use cases, inputs & outputs
    • Using custom instructions within Copilot – how to do this, how it is used/applied, etc
    • Using documents with Copilot for answers & generating material. Updating new versions of documents, and how Copilot will behave
    • Microsoft 365 agents – creating, configuring, sharing, security etc
    • Creating, sharing & scheduling prompts, including limitations

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    Exam AB-100: Agentic AI Business Solutions Architect

    It’s always interesting when Microsoft release a new type of exam, especially when it’s not tied to specific functionality, but rather to an overall approach. The AB-100 exam (don’t pay too much attention to the ‘100’ designator, in my opinion) follows the approach that we’re seeing Microsoft taking – needing to use technology (& here, specifically AI in technology) holistically across multiple solutions.

    I took the exam in Beta as soon as it launched, though due to preparing for the Power Platform Community Conference (which I’m currently writing this at), it’s taken a bit of time to get this blog post up and published.

    As an architect, AI isn’t new to us – we know of multiple different capabilities (spanning Microsoft 365, Copilot Studio & Azure AI Foundry), which we need to use appropriately to handle customer scenarios. AI isn’t new to exams either – there are multiple Azure exams with AI in them, we have multiple Business Application exams with Copilot Studio in them, etc.

    However, exams to date focus on a specific part of the technology stack. For example, the PL-600 focused on Power Platform & Dynamics 365 Customer Engagement. The MB-700 focused on Dynamics 365 Finance & Operations, and so on and so forth.

    This new exam is somewhat of a paradigm shift – needing to understand AI holistically as an architect across multiple parts of the technology stack, what it’s used for, how, and where, etc. This is most definitely a new approach, and it will be interesting to see how users react to it.

    Truthfully, having taken it, I’d personally say that it feels a bit more like an enterprise architect exam approach (which also doesn’t exist in the Microsoft stack), albeit focused around Business Applications. Given the way in which Microsoft partners have specialists in each technology part of the stack, it will be interesting to see if this approach will pivot the way in which people are trained/skilled, and deliver projects. I think that there’s likely to be a lot of feedback to Microsoft that it’s not the way that the partner landscape currently works – though perhaps Microsoft is specifically trying to influence this itself to change. Only time will tell…

    The overall information for the exam can be found at Microsoft Certified: Agentic AI Business Solutions Architect (beta) – Certifications | Microsoft Learn, though there is NO learning path that’s been created (at the time of writing). I think that this is because Microsoft may want to see the reaction to this new approach, and pivot appropriately, rather than needing to create a lot of content that may potentially need to be re-done.

    The official description of the exam can be found at the link above (it’s too long to post here), so please go take a look!

    So, as I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change for when it comes out of beta.

    I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

    • Business usage of AI
      • Different agents usage and results
      • How to use appropriately for business/agent analysis
      • Different types of metrics and results
      • Best practices for building Copilot Studio agents, and using Copilot Studio agents
      • Looking at the ROI for using Copilot Studio agents
      • Designing the usage of different AI and agent capabilities for business needs
    • Building agents
      • What Copilot Studio agents need to work
      • Data types that agents can use
      • Data sources that agents can use
      • Use of knowledge sources for agents
      • Usage of custom connectors
      • Handling token usage with Azure AI Foundry
      • How to handle testing for Copilot Studio agents
      • Different testing types & approaches
      • Extending Microsoft 365 Copilot
      • Using Power Automate with Copilot Studio agents
      • Speech to Text/Text to Speech
      • Handing a conversation over to a live customer service representative using Dynamics 365 Contact Centre
      • Using RPA within an agent
    • Models
      • Different types of models that could be used within Azure AI Foundry
      • Orchestration
      • Improving performance
    • Security
      • How to handle Copilot Studio security
      • Governance & compliance tooling (eg Purview)
      • Handling/restricting connectors for Copilot Studio agents
      • Ensuring user security when using agents (ie not able to retrieve data that the user cannot access directly)
    • Reporting
      • Monitoring tools for Copilot Studio agents
      • Metrics, usage & analytics for Copilot Studio agents
      • Investigating Copilot Studio agent transcripts
      • Monitoring tools for Azure AI models
      • Evaluating Azure AI Foundry model outputs
    • Application Lifecycle Management
      • Focusing on AI Agents for Dynamics 365 CE, Finance & Operations, and Power Platform
      • How/what components to use and include
      • What tooling to use for ALM

    Overall, the exam seemed to me to be pretty decent – I was worried that it would focus just on Copilot Studio, with not much else in it, but there’s a good balance across other AI capabilities as well.

    The big change, for me at least, was the questions around Dynamics 365 Finance and Operations – this isn’t an area that I’m an expert in generally, and most definitely not for AI tooling. I think that this, as I mention above, is what may get the biggest pushback/feedback into Microsoft.

    I’m going to be quite interested in seeing how the exam is actually launched (as it’s currently in Beta of course). Having chatted with a few others who have taken the exam (whilst obviously respecting the NDA!), they also think that this is an approach pivot from Microsoft, and are wondering about the real world application of it.

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    Changes in the FTRSA Program

    Firstly for those who are not aware, the acronym ‘FTRSA’ stands for ‘Fast Track Recognised Solution Architect’. This is an award that Microsoft bestows on people working for Microsoft Partners who have demonstrated clear technical expertise & understanding of the Microsoft Business Applications Platform at (enterprise) scale.

    To quote from the Microsoft documentation for the program:

    The FTRSA designation is awarded by Microsoft’s Business Industry & Copilot (BIC) engineering team to enterprise solution architects who exhibit outstanding expertise in architecture and deliver high-quality solutions. Recipients are typically nominated based on their exceptional skills, extensive experience with Microsoft products, relevant certifications, and leadership in projects.

    The award covers two main areas – Power Platform & Dynamics 365, with different capabilities under each area.

    The program has been around for 6 years now (since 2019), with people needing to submit for annual (re)award & recognition. On average, approx. 120 people are recognised with this award globally. It is definitely something that Microsoft Partners can place a large emphasis on if they have people with this!

    Generally over the last few years, the categories for being awarded have included:

    • Power Apps
    • Power Automate
    • Power BI
    • Dynamics 365 (CE)
    • Dynamics 365 (ERP)

    Changes over the last few years have included the Power BI category being retired. This is to be expected, I guess, given that Microsoft programs tend to flex/pivot over time.

    The process for application is simple. By this, I mean that nominees need to fill in a form (located at https://aka.ms/FTRSANomination). In this form, they then need to provide various pieces of information, such as their personal information, the partner that they work for (including the Microsoft Partner ID), as well as submitting proofs to show that they currently fulfil the necessary requirements for the program. These requirements can vary based on the technology, and over the last few years I’ve seen a few different versions (based on the year).

    The form is usually open for around 3 months or so, opening at some point in October, and closing at some point in January.

    Once submitted, the information is then sent to the relevant Microsoft team who oversee & run the program for review. There are several stages to the review that is carried out:

    1. The team carry out an initial review of the information provided, ensuring that it meets the program requirements. Applicants who have not provided the information to meet the program requirements/criteria, or who do not pass the initial review threshold as evaluated by the team (this is why applicants are recommended to focus on the quality of the information being submitted), are not progressed and are notified.
    2. Applicants who pass the first stage are then invited to an interview. This is carried out with one of the wider team members, based on region & availability. The interview usually lasts around one hour, and is an evaluation of the technical skills & expertise of the applicant. During this interview, candidates are required to present on a project that they have implemented, and to demonstrate their in-depth knowledge & role that they played on the project.
    3. Finally, the team reviews the interviews, and decides which applicants have successfully shown their skills & expertise. Applicants who have not met the level required are notified, along with feedback and areas that they could look to work on for a future nomination.
    4. Successful applicants are also notified directly, though the news is not publicised until May or so, when the public announcement takes place and the relevant FTRSA websites are updated with their information.

    Business Contributions

    Having taken a look at the nomination form for this year, there are some new changes coming in that will be quite important (in my opinion) to pay attention to. These are being referred to as ‘Business Contributions’. Specifically, applicants will not only need to demonstrate technical/project expertise, but will also need to demonstrate one or more business contributions.

    Depending on the technical area being selected for the application (Power Apps or Dynamics 365), these are the areas that contributions can be submitted for:

    Power Apps

    • Published Microsoft Customer Stories or Microsoft Partner Stories, or evidence of nomination to be published
    • Contribution of product feedback to engineering teams, advisory boards, focus groups, communication forums or private preview programs
    • Published technical samples (e.g. code snippets, data migration templates, integration samples, etc) in the PowerCAT GitHub channel
    • Proof of escalation reduction in customer implementations
    • Reference architecture article/s used with a customer that leverages the Power Platform Well Architected framework

    Dynamics 365

    • Onboarded customer implementation project(s) in the Dynamics 365 implementation portal, leveraging Dynamics 365 guidance hub frameworks
    • Published Microsoft Customer Stories or Microsoft Partner Stories, or evidence of nomination to be published
    • Contribution of product feedback to engineering teams, advisory boards, focus groups, communication forums or private preview programs
    • Published technical samples (e.g. code snippets, data migration templates, integration samples, etc) in the Dynamics 365 guidance hub
    • Published contributions to the Business Process Guide Catalogue
    • Proof of escalation reduction in customer implementations (either partner led or FastTrack led implementation)
    • Submit additional reference architecture articles for review and potential publication

    This is a significant change for the program – for the last 6 years, it’s been purely expertise recognised from client engagements. Now (in the 7th year, and I’d think very likely going forward), people considering nominating for FTRSA will need to prove that they’re giving back to Microsoft in some way, other than just running client engagements.

    Overall, I think this is an interesting concept, and generally a good one. Let’s face it – being able to talk about technology (at scale) is something quite a few people can do, but it doesn’t mean that they’re necessarily good at it. I know of several over-architected projects that I was brought in on, where just because lots of technology components were used, it didn’t mean the project was going well. Part of the skillset of an experienced/knowledgeable architect is also knowing when less is more!

    Additionally, being technically competent is of course important, but personally I believe that being clear & communicative is also a very important part of the role of a solution architect. Essentially, having that functional view, as well as being able to engage appropriately with customers (as the owner of the project), is vital as well.

    I also think that Microsoft is wanting to see that the program in which they’re investing time, effort & resources (yes, FTRSAs get a wonderful SWAG box – THANK YOU TEAM!) is providing ROI back into Microsoft in terms of feedback, input & other information. This way products can (hopefully!) get better, visions can be assisted with customer information, and others can be helped as well.

    Some people may say that this is becoming more like the Microsoft MVP program. Given how much MVPs are required to do, in terms of community (& Microsoft) engagement, I can understand the thoughts, but really don’t think that it’s anything anywhere near to that. My only note on this would be that I hope that contributions remain business/technical focused, which to me seems in line with the stated goals of the program, rather than also include (other) community contributions.

    Of course, there are those people who may choose not to do such things, and just focus on the project/s that they’re working on. This is a valid scenario, and there is of course absolutely NOTHING wrong with this. Not all of us may wish to engage with Microsoft engineering teams, or provide information publicly. And that’s all fine. However I would politely point out that nothing remains static, and if you’re wanting to receive (or continue to receive) the FTRSA award, you may need to do some thinking around how you’re approaching it, with the change that’s come this year.

    I’d also encourage people who are considering applying for the FTRSA award recognition to reach out to an existing FTRSA, who could possibly help mentor, review & guide you. They’ve already been through the process and are recognised as such, and therefore have a pretty good idea of what ‘hits the bar’ and what may not.

    So if you’re thinking of going for it – I wish you the best of luck!

    MB-280: Microsoft Dynamics 365 Customer Experience Analyst

    It’s been a while since taking a Microsoft certification exam, but with the new MB-280 exam being launched in the last few days, I’ve obviously needed to take a look at it! It felt a little strange, as I’m now used to the certification renewal process (which is why I haven’t taken any exams in a while), but thankfully things went alright with the overall exam.

    For those who haven’t been following the news, Microsoft made an announcement a few months back that some exams would be retiring, and the new MB-280 exam would be the replacement for them. In short, this is supposed to replace the MB-210 (Sales), MB-220 (Customer Insights – Journeys) & MB-260 (Customer Insights – Data). Malin Martnes wrote a good blog post in June – I’d suggest taking a look at it for more general information around it.

    Now I’m all up for new certifications being created & made available. However, and I know this could be considered controversial, I have ABSOLUTELY NO IDEA as to why this exam was created in THIS specific way. If an exam had been created, for example, to bring together the two sides of Customer Insights (ie to cover both Data & Journeys in a single exam), I think that would have been quite good.

    But having taken it, my thoughts (& feedback to Microsoft directly) are that they should un-deprecate (if that’s a word/phrase?) the MB-210 exam, and continue it forward. There’s no reason that I can see for having Marketing & Sales together in a single exam – it feels like two (or technically 3?) lego bricks lumped together without any rhyme or reason.

    The learning path for the exam was also launched in the last few days, and can be found at Study guide for Exam MB-280: Microsoft Dynamics 365 Customer Experience Analyst | Microsoft Learn

    The official description of the exam is:

    As a candidate for this exam, you’re a Microsoft Dynamics 365 customer experience analyst who has:

    • Participated in or plans to participate in Dynamics 365 Sales implementations.
    • An understanding of an organization’s sales process.
    • An understanding of the seller’s perspective (user experience).
    • The ability to demonstrate Dynamics 365 Customer Insights – Data and Customer Insights – Journeys capabilities.

    You’re responsible for configuring, customizing, and expanding the functionality of Dynamics 365 Sales to create business solutions that support, automate, and accelerate the company’s sales process. You use your knowledge of customer experience capabilities in Dynamics 365 Sales and Microsoft Power Platform to inform the following design and implementation tasks:

    • Configure Dynamics 365 Sales standard and premium features.
    • Implement collaboration features.
    • Configure the security model.
    • Perform Dynamics 365 Sales customizations.
    • Extend Dynamics 365 Sales with Microsoft Power Platform.
    • Deploy the Dynamics 365 App for Outlook.

    As a candidate, you need:

    • An understanding of the Dataverse security model and features, including business units, security roles, and row ownership and sharing.
    • Experience configuring model-driven apps in Microsoft Power Apps.
    • An understanding of accounts, contacts, and activities.
    • An understanding of leads and opportunities.
    • An understanding of the components of model-driven apps, including forms, views, charts, and dashboards.
    • An understanding of model-driven app personal settings.
    • Experience working with Dataverse solutions.
    • An understanding of Dataverse, including tables, columns, and relationships.
    • Familiarity with Power Automate cloud flow concepts, such as connectors, triggers, and actions.

    More can be found at the exam page itself, which is located at Exam MB-280: Microsoft Dynamics 365 Customer Experience Analyst (beta) – Certifications | Microsoft Learn

    Now during my exam, I was looking forward to seeing the ‘new’ capability around being able to use Microsoft Learn during the exam (new to me – as I haven’t taken any other exams in the last year or so since it was announced!). However there didn’t seem to be any capability to launch Microsoft Learn – I’m not sure why it wasn’t available, as this isn’t a Fundamentals-level exam.

    Questions also used the older terms of reference rather than the newer/accepted terms – ie using ‘field’ instead of ‘column’, and ‘entity’ instead of ‘table’. Again, I have no idea why this is – all other exams (including the renewals for them) are using these properly (in my summary below I have ensured I use the correct terms).

    So, as I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change.

    I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

    • Sales Apps
      • Configuring forms, columns & tables
      • Configuring security roles & access to records
      • Configuring relationships between records (including deletion properties)
      • Sales Mobile App – security & deployment
      • Forecasting – setting up & configuring
      • Configuring Goals
      • Configuring Opportunities
      • Handling currencies
    • Copilot for Sales
      • Setting up & deploying to users
      • Configuring access
    • Outlook App
      • Deploying & setting up
      • Configuring forms & information
    • Exchange
      • Connecting to mailboxes
      • Configuring folder permissions
      • Configuring multiple domains
    • Product Families & Catalogue
      • Creating & setting up
      • Configuring options
      • Adding items to be used
    • Price Lists
      • Creating & setting up
      • Configuring options, including discounts
      • Using time-restricted price lists
      • Handling currencies
    • Document Management
      • Different document management capabilities
      • Usage of SharePoint in different ways
    • Data Import
      • Usage of Power Query
      • Data manipulation
      • Handling duplicate records
    • SMS
      • Setting up & configuring SMS provider
    • Journeys
      • Different triggers to use based on scenarios & requirements
      • How to trigger journeys
      • How to set up emails to be used within a journey
    • Segments
      • Different types of segments
      • Creating & modifying segments
    • Searching/Filtering
      • Using Advanced Find
      • Setting up/modifying queries to include/exclude records based on conditions
    • Business Process Flows
      • Modifying business process flows
      • Handling conditions within business process flows

    As a Sales exam, it seemed alright. But as mentioned above, the Customer Insights questions just seemed strange to me – I’d expect a consultant to be very technically skilled in Customer Insights, but not in Sales (& vice versa), so I’m not understanding bringing these two sides together.

    I’m going to be quite interested in seeing how the exam is actually launched (as it’s currently in Beta of course). Having chatted with a few others who have taken the exam (whilst obviously respecting the NDA!), they also can’t really understand the landscape. Personally, I think that if it continues like this, Microsoft is going to hear quite a few complaints around it.

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    The story of MFA & the Centre of Excellence

    I’ve been rolling out the Microsoft Centre of Excellence solution for several years now at customers. It’s a great place to start getting a handle on what exactly is going on within a Power Platform tenant, though there’s obviously so much more that takes place within a Centre of Excellence team.

    The solution gathers telemetry around environments, Power Apps, Power Automate flows, etc, through the usage of the Power Automate Admin connectors for Power Platform (see Power Platform for Admins – Connectors | Microsoft Learn for further information on these).

    Now obviously we need a user account to run these, and this usually has been through the use of a ‘pseudo service account’, as using a service principal has been tricky, to say the least. So we would get customers to set up an appropriate account with licensing & permissions in place, and use this to own & run the Power Automate flows that bring in the information to the CoE solution.

    It is important to note that usage of these connectors does require a pretty high level of permissions – in fact, we usually suggest applying the Power Platform Admin security role (within the Microsoft 365 Admin Centre) to the user account. All good so far.

    The tricky part has, to date, been around security. Organisations usually require (for good reasons) multi-factor authentication to be in place (aka MFA). Now this is fine for users logging in & accessing systems. However, it proves to be somewhat trickier for automations.

    See, when a user logs in & authenticates through MFA, a token is stored to allow them to access systems. Automations can also use this. However the token will expire at some point (based on how each organisation has implemented MFA access/controls). When the token expires, the automations will stop running, and fail silently. There’s no prompt that the token has expired, and the only way of knowing is to take a look at the Power Automate flow history. This can be interesting though, as signing in (with the pseudo service account) will prompt for MFA authentication, and then everything will start running again!

    So this has usually resulted in conversations with the client to politely point out that implementing MFA on the service account will mean that, at some point, the Power Automate flows are going to start failing. Discussions with security teams take place, mitigations using tools such as Azure Sentinel are implemented, and things move ahead (cautiously). It’s been, to date, the most annoying pain for the technical implementation (that I can think of at least, in my experience).

    Now you’d think that a change in this would be shouted from the rooftops, people talking about it, social media blowing up, etc. Well, I was starting an implementation recently for a customer, and was talking to them around this, as I’d usually do. Imagine my surprise when Todd, one of the Microsoft technical people attached to the client, asked why we weren’t recommending MFA.

    Taking a look at the online documentation, I noticed that something had slipped in. Finally there was the ability to use MFA!

    Trawling back through the GitHub history (after all, I wanted to find out EXACTLY when this had slipped in), I discovered that it was a few months old. I was still very surprised that there hadn’t been more publicity around this (though definitely a good incentive to write about it, and a great blog post to start off 2023 with!).

    So moving forward, we’re now able to use MFA for the CoE user account. This is definitely going to put a lot of minds at rest (especially those who are in security and/or governance). The specifics around the MFA implementation can be found at Conditional access and multi-factor authentication in Flow – Power Automate | Microsoft Learn – but it’s important to note that specific MFA policies will need to be set up & implemented for this account.
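
To make that last point a little more concrete, below is a minimal sketch of what such a policy could look like if created through the Microsoft Graph conditional access API rather than the Entra admin centre. The display name, the placeholder object ID for the CoE service account, and the choice to require MFA across all cloud apps in report-only mode are all assumptions for illustration; the actual policy design should follow the linked Microsoft documentation and your security team's guidance.

```python
import requests

# Illustrative sketch only: create a conditional access policy via Microsoft Graph
# that requires MFA for a single (hypothetical) CoE service account.
# The token needs the Policy.ReadWrite.ConditionalAccess permission, and the
# object ID below is a placeholder for the account's Microsoft Entra object ID.
GRAPH_TOKEN = "<access-token>"
COE_ACCOUNT_OBJECT_ID = "<object-id-of-coe-service-account>"

policy = {
    "displayName": "Require MFA - CoE service account (example)",
    "state": "enabledForReportingButNotEnforced",  # report-only while testing impact
    "conditions": {
        "users": {"includeUsers": [COE_ACCOUNT_OBJECT_ID]},
        "applications": {"includeApplications": ["All"]},
        "clientAppTypes": ["all"],
    },
    "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
}

response = requests.post(
    "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies",
    headers={
        "Authorization": f"Bearer {GRAPH_TOKEN}",
        "Content-Type": "application/json",
    },
    json=policy,
    timeout=30,
)
response.raise_for_status()
print("Created policy:", response.json()["id"])
```

Starting the policy in report-only mode means you can check (via the sign-in logs) how the CoE flows behave under it before actually enforcing it.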

    So, now the job will be to retro-fit this to all organisations that already have the CoE toolkit in place. Thankfully this shouldn’t be too difficult to do, and will most definitely enhance the security controls around it!

    Have you implemented any mitigation in the past to handle non-MFA? I’m curious if you have – please drop a comment below!

    PL-500: Microsoft Power Automate RPA Developer

    RPA (or Robotic Process Automation) is a capability that Microsoft has been developing for a while within the Power Platform space. Whilst cloud flows can be used to interact with any system that has an API in place, many organisations have (legacy) systems that have no API, so interacting with them can be challenging. RPA capabilities allow organisations to interact with any system overall, thereby enabling & empowering businesses holistically.

    I’ve been aware for a while that there’s been an exam coming out for RPA, though it’s taken a bit of time to land. That’s fine though – I can’t really think of any absolute rush to have it in place. I do think that over time, just as with some of the other certifications, it will become a requirement for solution or specialisation status.

    The official page for it is at https://docs.microsoft.com/en-us/certifications/exams/pl-500. The specification for it is:

    Candidates for this exam automate time-consuming and repetitive tasks by using Microsoft Power Automate. They review solution requirements, create process documentation, and design, develop, troubleshoot, and evaluate solutions.

    Candidates work with business stakeholders to improve and automate business workflows. They collaborate with administrators to deploy solutions to production environments, and they support solutions.

    Additionally, candidates should have experience with JSON, cloud flows and desktop flows, integrating solutions with REST and SOAP services, analyzing data by using Microsoft Excel, VBScript, Visual Basic for Applications (VBA), HTML, JavaScript, one or more programming languages, and the Microsoft Power Platform suite of tools (AI Builder, Power Apps, Dataverse, and Power Virtual Agents).

    Now here’s the thing. I occasionally work in the automation space, either on customer projects, or when training users in the technologies. I wouldn’t describe myself as an advanced automation developer (whether cloud or RPA capabilities). I’m most definitely NOWHERE near the level of legends such as Matt Collins-Jones, for example (go check him out if you don’t know about him!).

    So I knew that I may be a bit challenged when taking the exam, especially in the more ‘pro dev’ space (aka JSON etc). In fact, I didn’t actually realise that the exam specification included that sort of thing. I know, I should have – it’s aimed at developers overall…shows that I need to brush up on reading things properly!

    Also, there’s still quite a bit of a focus on Power Automate cloud flows – it’s not JUST about RPA capabilities.

    Now, really nicely, there are already Microsoft Learn pathways available (which have been around for a while, and updated appropriately). This really is a big help, I feel, especially for people who are new’ish to RPA.

    Of course, there’s a lovely shiny two star badge awarded when passing the exam, along with the title of ‘Microsoft Certified: Power Automate RPA Developer Associate’.

    As with previous exams, I sat it from home (the proctored experience). Learning from previous times that I’ve taken exams, I ensured that my workspace was entirely clear from everything. As a result, the check-in process happened automatically, and I didn’t need to engage with any proctors at all (which was quite nice actually).

    As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). I’ve tried to group things together as best as possible for the different subject areas.

    • Cloud flows vs RPA flows
      • Capabilities of each
      • When to use each (ie how to handle different scenarios)
      • How to trigger each one
    • Cloud flows
      • Different types of triggers, & when each type should be used
      • Different types of actions, and the capabilities of them (at a high’ish level – expected to know common Microsoft actions, but not need to know all of the hundreds of different ones!)
      • Controls/operators. What they are, how they can be used to accomplish different requirements
      • JSON formatting & syntax
    • Business Process flow vs Business Rules
      • What each is
      • When to use each one
      • Capabilities
    • RPA flows
      • Common actions, how they work, capabilities of them
      • How expression syntax works within them
      • Debugging capabilities, and what to use when
      • How to interact with desktop applications
      • How to interact with websites
        • How data values can be used
        • How data tables can be used
        • How to use data that’s extracted from a website
      • Troubleshooting functionality
    • Usage of automation capabilities from Office 365 applications such as Excel & Visio
    • Loops
      • How they work for cloud & RPA flows
      • Troubleshooting
      • Implementing success/fail criteria
      • Error handling
    • Process Advisor
      • What it is
      • What it does
      • How it can help organisations
      • Limitations
      • What it cannot do
      • Process Mining vs Task Mining, & the important differences between them
    • Variables
      • How to handle variables across different environments
      • How to declare them (cloud flow vs RPA flow)
    • Runtime operations
      • How flows are triggered (async vs sync)
      • How flows are queued (cloud vs RPA)
      • How RPA flows are carried out when using machine groups
    • Artificial Intelligence (AI) capabilities
      • How AI can be used within flows
      • Different AI capability types (what each one can be used for)
      • AI within Power Platform, & AI within Azure Cognitive Services
    • Sharing flows
      • Different ways to share cloud flows
      • Different ways to share RPA flows
    • Application Lifecycle Management (ALM)
      • Solutions (managed vs unmanaged). Capabilities of each, when to use each type
      • AzureDevOps (ADO). What it is, when/how to use it, capabilities
      • Solution imports
      • Solution layers. What these are, troubleshooting functionality
      • Upgrade/Stage for Upgrade/Update. What each is, what each does, how/when to use each one
      • Moving desktop flows between users
    • Security
      • Security roles needed to create
      • Security roles needed to share/modify
      • Security roles needed to register machine for RPA
      • Security roles needed to register machine groups for RPA
      • Security requirements to run different types of RPA flows (how it interacts with desktop/s)
      • Data Loss Prevention (DLP) – how it affects creation & runtime of flows

    Overall, I had 46 questions, with a single case study. I’m used to having at least two case studies, so it was nice to have just one of them this time.

    So….it’s a lot of stuff. Definitely targeted much more at the ‘pro-developer’ end of the scale than someone who might occasionally automate things. It’s absolutely necessary to understand coding conventions, ALM, etc.

    It’s definitely an exam where, if you’re not already hands-on with the skills needed, I’d highly recommend getting a decent amount of experience before taking it! Make sure that you have an environment in which you’re able to be hands-on with all types of automation (cloud & desktop flows), and really understand how they can be handled with an eye on the enterprise scale!

    If you’re aiming to take it – I wish you the very best of luck, and let me know your experience!

    Environments & ‘Admin Mode’

    With some recent events happening (both professional & personal), I’ve taken a slight step back from putting out posts on here. Thankfully things seem to be settling down, so I’m getting (back) into the swing of things!

    I thought that it would be good to talk about a subject that I fell ‘foul’ of recently. This is around environments, and more specifically, the ‘admin mode’ that it’s possible to use on them.

    So what exactly is this ‘admin mode’? Well, the aim of it is to restrict access to certain users, namely System Administrators & System Customisers. Why would we want to do this? There are several scenarios that come to mind:

    • Performing a system upgrade (such as enabling new features)
    • Changing environment type (eg Production to Sandbox, or vice-versa)
    • Restoring an environment

    Essentially, any time we have operation-type work that we’re wanting to carry out. This way whatever we’re doing won’t affect users, and anything that the users are doing won’t affect things either (symbiotic relationship there!).

    So as an example, if we’re doing a major release, which changes functionality within a system, we wouldn’t want users in the system carrying out their usual work, as this could cause data issues if they’re saving during the actual release. We of course SHOULD be communicating to users that a release is going to take place, and that they shouldn’t be in the system at the time, but ‘admin mode’ is how we can truly enforce it.

    Something to bear in mind as well is that if you’re going ahead & restoring an environment to a previous state (whether that’s an automatic save point, or a manual one), it will automatically put the environment into ‘admin mode’ once the restore has been completed. This is very important to keep in mind!

    There are three settings around administration mode:

    1. ‘Administration Mode’. This sets whether admin mode is on or off!
    2. ‘Background Operations’. This sets whether background processes, such as workflows, Power Automate flows, and Exchange synchronisation, are enabled (allowed to happen) or disabled (stopped from happening).
    3. ‘Custom Message’. This allows you to set a custom message that users (who are not a system administrator/system customiser) will see when they attempt to access the environment.

    So this is the scenario that tripped me up a few weeks back:

    • I was needing to restore an environment to an earlier save point (to be clear, this was NOT a production environment)
    • I went ahead with the restore, and it completed successfully
    • Given that I was doing this at night, one of my children woke up, and I had to deal with them
    • I came back to things, saw that it completed, and then went ahead with the release that I was needing to do

    All seemed to go well. However, when users were testing (which admittedly was a few days later), they reported that some functionality wasn’t working. This was strange, as it had been working before the release (& the release that I did hadn’t actually touched it!).

    It turned out to be Power Automate flows that just didn’t seem to be running. OK – I started to look into them, but couldn’t figure out why they hadn’t run.

    Creating a test Power Automate flow didn’t seem to work either – despite running it to test it, the trigger never activated! I was quite puzzled by this, and couldn’t (initially) work out the reason.

    Then I thought to check environment settings! Lo & behold, the environment was STILL in administration mode, and the Background Process option was disabled! Aha – I’ve found the source!

    Flipping this out of administration mode thankfully then allowed all Power Automate flows to work/run, and users confirmed that functionality was indeed running as expected. As you can imagine, I was quite relieved!


    Something that I hadn’t realised previously is that if you manually put an environment into administration mode, it doesn’t automatically disable background processes. However, if you restore an environment, it DOES disable background processes by default. So if you’re wanting to try out automation items within a restored environment that’s still in administration mode, you’re going to need to ensure that you toggle the Background Processes toggle to allow it to work!

    One further thing to learn as well (which I’ve been asked already by some people, so thought that I would mention it here). I’ve mentioned above that users were in the system, but reporting that things weren’t working. Now given that the environment was in administration mode, people have asked how users could be in it! The answer is that these users actually had the system customiser role applied to them, which is why they could get in! If they hadn’t had the role, then perhaps I might have realised things a little sooner (ie that the environment was in administration mode).

    So a (good) little lesson learned, and I’ll definitely take it forwards. Has this, or anything else like it, ever tripped you up? Drop a comment below – I’d love to hear!

    Working with Opportunity Close table

    I’ve recently had the experience of working with the Opportunity Close functionality within Dynamics 365, and given what occurred, thought it would be useful to document this so that others are able to see this as well. There are many scenarios in which we’d use this, and being able to give a comprehensive solution to clients does make all of the difference!

    There are four areas that I’d like to cover:

    • Working with Opportunity Close table
    • Challenges with data
    • Power Automate to the rescue!
    • Caveats

    So let’s get started then!

    Thanks to various members of the community such as Matt Collins-Jones, Andrew Bibby & others, who helped me along the way.

    Working with Opportunity Close

    The Opportunity Close functionality within Dynamics 365 (& yes, I’m going to refer to it as this, rather than Power Platform) is used to provide information around why an opportunity is being closed. This is regardless of whether the opportunity has been won, or it’s been lost. It’s still quite important to track the information around it, so that companies can understand better how the market views the products it offers, how it stacks up against others, etc.

    The default path in the system is to create a lead, and then qualify it. Qualifying a lead then automatically creates an opportunity record, which further information (quotes, etc) can be entered against. An account record (if company information is specified) is also created:


    On the opportunity record, users are able to show if it’s been won or lost by clicking an appropriate button on the toolbar:

    Doing this brings up the Opportunity Close pane on the right hand side of the screen:

    Now it’s possible to customise this screen. In fact, the screenshot above shows 3 custom columns that have been added to it already in the system I was in.

    To do this, we go to customise the solution (in the Maker Experience), and add the column/s that we’re wanting to:

    Next, we need to remember to add it to the form! Otherwise it’s not going to show up. If we’re wanting it to appear on the side bar, then it’s important to customise the ‘Quick Create’ form version, to make our customisations show up.

    Note: We’re able to set conditional visibility on the column/s if we want to, based on whether the opportunity is won or lost, using Business Rules. I haven’t done so in this scenario, but you’re obviously able to do so if you want to.

    Remember to save & publish the form, and then it’ll display within the system for users. Brilliant!

    Challenges with data

    So we’ve gone ahead & created the custom columns, and users are actually using them to record data. Wonderful – that’s exactly what we’ve been wanting to achieve.

    OK – let’s now review the data so that we can see overall what’s happened with our opportunities. Of course we’re wanting to do this simply & easily, so we’ll open an Advanced Find window, go to the Opportunity Close table, add columns from the associated Opportunity, and….hold on. Opportunity Close ISN’T displaying in the Advanced Find????

    It’s just NOT there. In case you’re wondering if you saved/published things correctly, or forgot some system setting, stop worrying. It’s not you – it’s the system.

    See, Opportunity Close, though a table in its own right, is a SPECIAL sort of table. It doesn’t show up, and can’t be directly queried. I know – frustrating. I felt exactly the same way.

    On digging deeper into things, I found out that there’s actually an activity record saved. It’s possible to query against this:

    However, and this is the BIG catch, it’s NOT possible to return custom columns when carrying out this query. The search will ONLY return the (system) columns that are present for activities. So this leaves us with a problem.

    Essentially, though we can set up custom columns to track the data that we’re needing to, it’s not possible (through the front end) to query it. This sort of negates what we’re trying to achieve here overall, and is a pain.

    So what’s the way round it? Well, it’s actually going to be Power Automate!

    Power Automate to the rescue

    In order to handle our issue, what we need to do is the following:

    • Add custom columns to the Opportunity table (these should mimic the custom columns that we’ve added to the Opportunity Close table)
    • Use Power Automate for automation purposes!

    The first step is easy. We need to go & create custom columns on the Opportunity table. These WILL show up in the Advanced Find search. They obviously need to be the same as the custom columns on the Opportunity Close table. If we’ve used Choice or Choices there, point the Opportunity column to the same source (it’s a good argument for using Global, rather than Local, choice/s).

    We can then go and create a Power Automate flow. This should trigger when an Opportunity Close record is created.

    Note: For this, I’ve made it so that it runs under the user triggering the action, rather than a system account. This is to keep in line with licensing limits etc

    You’ll then need to add a ‘Get Dataverse row’ step, and get the Opportunity Close record that has just been created. This is annoying, but for some strange reason the trigger doesn’t present the custom columns/values in the JSON that it returns. Hopefully Microsoft fixes this at some point, but for the moment, we need to work around it.

    The last step is to add an ‘Update Dataverse row’ step. This should point to the Opportunity table, & we can simply map the values across (from the SECOND step, NOT the first one – VERY IMPORTANT).

    Once this is all done, save & test it, and you should see it working. I generally don’t add the Opportunity custom columns to the form, but rather leave them for querying against.
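
The flow itself is built in the Power Automate designer, but to make the get-then-update logic clearer, here's a minimal sketch of the same two steps expressed against the Dataverse Web API in Python. The custom column names, org URL and token handling are all placeholders/assumptions (use whatever columns you've actually created on both tables), and it's worth confirming the entity set names against your own environment's metadata.

```python
import requests

# Illustrative sketch of the flow's logic via the Dataverse Web API: read the
# custom columns from the Opportunity Close record, then copy them onto the
# parent Opportunity. Org URL, token & column names below are placeholders.
ORG_URL = "https://yourorg.crm.dynamics.com"
TOKEN = "<dataverse-access-token>"
HEADERS = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
    "Content-Type": "application/json",
}

# Hypothetical custom columns that exist on BOTH tables
CUSTOM_COLUMNS = ["cr123_closereason", "cr123_competitor", "cr123_closenotes"]


def copy_close_details(opportunity_close_id: str) -> None:
    # Step 1 ('Get Dataverse row'): fetch the Opportunity Close record,
    # including the lookup back to the parent Opportunity.
    select = ",".join(CUSTOM_COLUMNS + ["_opportunityid_value"])
    close_response = requests.get(
        f"{ORG_URL}/api/data/v9.2/opportunitycloses({opportunity_close_id})"
        f"?$select={select}",
        headers=HEADERS,
        timeout=30,
    )
    close_response.raise_for_status()
    close_row = close_response.json()

    # Step 2 ('Update Dataverse row'): map the values across to the Opportunity,
    # taking them from the record just retrieved, not from the trigger output,
    # since the trigger doesn't surface the custom columns.
    opportunity_id = close_row["_opportunityid_value"]
    update_response = requests.patch(
        f"{ORG_URL}/api/data/v9.2/opportunities({opportunity_id})",
        headers=HEADERS,
        json={col: close_row.get(col) for col in CUSTOM_COLUMNS},
        timeout=30,
    )
    update_response.raise_for_status()
```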

    Caveats

    It’s important to keep in mind that when an opportunity is marked as either won or lost, it’s then closed, and changed to a read-only state. That’s how the system is designed to be, and makes sense.

    However it’s ALSO possible to re-activate a closed opportunity, and then close it again. Ie a single Opportunity record could have multiple Opportunity Close records against it. This solution won’t handle that (it would need to be built out further) – the Opportunity record itself will only show the values from the latest Opportunity Close action, so please do keep this in mind!

    Have you ever come up against something like this? How have you handled it? I’d love to hear – please drop a comment!