Exam AB-731: AI Transformation Leader

What better way to start 2026 than to talk about a Microsoft certification, especially one for a totally NEW type of user!

Following in the footsteps of the other AB exams I’ve been writing about my experience with (see Exam AB-730: AI Business Professional, Exam AB-100: Agentic AI Business Solutions Architect and Exam AB-900: Microsoft 365 Copilot and Agent Administration Fundamentals), this article will cover the AB-731 exam.

This exam focuses on the Microsoft AI capabilities from a Business Leader perspective, and to the best of my knowledge it’s the first time that Microsoft has ever created an exam from a ‘Business Leader’ perspective. Taking this exam was a complete mindset shift for me, especially when seeing the questions – it’s not about understanding the in-depth technical capabilities, but more around the breadth of technology options (spanning Azure, Microsoft 365 Copilot, Copilot Studio & other tools), and what they bring/enable from a BUSINESS perspective.

The official description of the proposed exam candidate is:

As a candidate for this Microsoft Certification, you should understand how to recognize opportunities for AI transformation, identify the right AI tools and resources, plan for AI adoption, optimize business processes, and drive innovation by using Microsoft 365 Copilot and Azure AI services.

This Certification is designed for business decision-makers at all levels who are responsible for guiding transformation and innovation within their teams or organizations. In this role, you’re expected to demonstrate AI fluency, strategic vision, and the ability to lead AI adoption across teams and functions but are not expected to write any code.

As a candidate for this Certification, you should be able to evaluate AI opportunities, champion responsible AI practices, and align AI investments with business goals. You need experience leading adoption or change management in a business context. You must also be familiar with Microsoft 365 services, Azure AI services, and general AI capabilities.

The overall information for the exam can be found at Microsoft Certified: AI Transformation Leader, and there is an official Learning Path available for it.

As I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change when it comes out of beta.

Overall, the exam approach was quite different for me – though I do talk with organisations frequently around general AI matters, I’ve never taken an exam written in this way before. However, I do feel that it’s very helpful to have this in place, to ensure that business leaders can demonstrate that they actually do know what they’re talking about 😉

I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

  • Azure Components & Capabilities
    • AI Vision – what it can be used for, benefits of using it, capabilities that it has
    • AI Language – what it can be used for, benefits of using it, capabilities that it has
    • AI Document Intelligence – what it can be used for, benefits of using it, capabilities that it has
    • Machine Learning – what it can be used for, benefits of using it, capabilities that it has
    • AI Foundry – what it can be used for, benefits of using it, capabilities that it has
    • AI Search – what it can be used for, benefits of using it, capabilities that it has
  • Microsoft 365 Copilot Chat
    • What license is needed
    • What data does it have access to
    • What security controls are in place
  • Microsoft 365 Copilot
    • What is it, what can it be used for
    • What can it do
    • How does it connect to data
    • What are the connectors for it (standard & custom)
    • Benefits of using it (vs 3rd party AI tooling)
    • Different agents (eg Analyst & Researcher) within it – what they do, how to access and use them
  • Microsoft Copilot Studio
    • What is it, what can it be used for
    • What can it do
    • What license is needed
    • What data can it access
  • Microsoft Security Copilot
    • What is it, what can it be used for
    • What can it do
    • Benefits that it provides
  • Security & Governance
    • Content filtering controls within Copilot
    • Policies
    • Handling requirements to prevent inappropriate language & responses
    • Responsible AI principles
    • Governance ownership, responsibility & requirements
  • Generative AI
    • AI model hallucinations
    • Grounding in data
    • Improving response quality
    • Prompt engineering
    • Pre-trained models vs fine-tuned models
    • Reasoning models vs non-reasoning models
    • Understanding usage costs (including different pricing models)
    • What is RAG, and how can it be used for business scenarios
    • Adoption throughout organisations – personas to involve in adoption team
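Since RAG came up above from a business-scenario angle, here’s a minimal sketch of the pattern to make it concrete. This is a hypothetical illustration only – a real system (e.g. one built on Azure AI Search) would use vector embeddings for retrieval and an LLM call to generate the answer; here a naive keyword-overlap retriever stands in for both of those, just to show the shape of “retrieve, then ground the prompt”:

```python
# Minimal sketch of Retrieval-Augmented Generation (RAG).
# Hypothetical illustration: the keyword scorer below stands in for a
# real retriever (such as a vector index), and the grounded prompt
# would normally be sent to an LLM rather than printed.

def score(query: str, doc: str) -> int:
    """Naive relevance: count query words that appear in the document."""
    words = set(query.lower().split())
    return sum(1 for w in words if w in doc.lower())

def retrieve(query: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k most relevant documents for the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:top_k]

def build_grounded_prompt(query: str, docs: list[str]) -> str:
    """Ground the model's prompt in the retrieved business data."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return (
        "Answer using ONLY the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

# Illustrative business documents (made up for this example).
docs = [
    "Refund policy: customers may return items within 30 days.",
    "Shipping: orders ship within 2 business days.",
    "Warranty: hardware is covered for 12 months.",
]
print(build_grounded_prompt("What is the refund policy?", docs))
```

The point for the exam context is the grounding step: the model is constrained to answer from retrieved organisational data, which is what reduces hallucinations and makes RAG useful for business scenarios.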

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    Exam AB-900: Microsoft 365 Copilot and Agent Administration Fundamentals

    Following in the footsteps of the other AB exams I’ve been writing about my experience with (see Exam AB-730: AI Business Professional & Exam AB-100: Agentic AI Business Solutions Architect), this article will cover the AB-900 exam.

    This exam focuses on the Microsoft 365 Copilot capabilities from a user & administration perspective, and doesn’t cover/include anything from Copilot Studio.

    Now, though it’s a Fundamentals exam, to be honest it’s the HARDEST fundamentals exam that I’ve ever taken!

    The approach is around being able to demonstrate understanding of how to use the Microsoft 365 Copilot, as well as a lot of focus on how to control & administer it.

    The official description of the proposed exam candidate is:

    As a candidate for this Microsoft Certification, you should be familiar with Microsoft 365, including core services, security, identity and access, data protection, and governance, along with Microsoft 365 Copilot and agents.

    Additionally, you should be familiar with the admin centers used to access Microsoft 365 workloads, such as Exchange Online, SharePoint in Microsoft 365, Microsoft Teams, Microsoft Entra, and Microsoft Purview. You need to have experience with AI-driven productivity tools and modern IT management practices.

    You must be able to identify the roles of the core features and objects available in Microsoft 365, such as users, groups, teams, sites, and libraries. Plus, you should understand the core security features of Microsoft 365, such as authentication methods, conditional access policies, and single sign-on (SSO).

    The overall information for the exam can be found at Microsoft 365 Certified: Copilot and Agent Administration Fundamentals, and there is an official Learning Path available for it.

    As I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change when it comes out of beta.

    One thing to keep in mind about this exam – though I do mention Microsoft Purview in the list of items below, I haven’t gone into it extensively. However, there were a LOT of questions that touched on Purview (& other governance stuff as well) – you REALLY need to know & understand these capabilities to be able to take & pass the exam. Just guessing the answers is not going to help at all!

    Overall, the exam seemed to me to be pretty decent, though with indeed a heavy focus on security & governance (as I’ve mentioned above). I don’t see this as a bad thing though, as it can help to show that administrators really do know what they’re talking about.

    I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

    • Agent types
      • Native Microsoft 365 Copilot agent
      • Native Microsoft 365 Copilot advanced agents (eg Researcher & Analyst). What they are, how to access, what to use them for
      • Custom Microsoft 365 Copilot agent
      • SharePoint agent
    • Creating/using Agents
      • Using natural language to create agents
      • How to handle/perform multi-step reasoning
      • Use of notebooks
      • Custom instructions
      • Scheduling prompts
      • Querying data types
        • Structured
        • Unstructured
    • Governance & security
      • Blocking access to different types of searches & collateral
      • Blocking access to specific agents
      • Tools to use for blocking
      • How to share agents with other users
      • Assigning licenses to users
      • Data retention policies
      • Data labelling policies
      • Use of Microsoft Purview, covering capabilities, tools, auditing, how to use, etc
      • Use of DLP
      • Data source permissions
      • Conditional access policies
      • Microsoft Defender – what it is, capabilities it has, how to use it, etc.
      • Types of authentication
    • Reporting
      • Licensing & usage
      • Adoption & interactions
    • Payment options & capabilities
      • Credit usage – internal vs external users
      • Pay As You Go Billing, and scenarios you can use it for

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    Exam AB-730: AI Business Professional

    Following on from the recent launch of the new Exam AB-100: Agentic AI Business Solutions Architect exam, Microsoft has now developed & released other exams in the AB series – this post is on the AB-730 exam.

    The approach continues to be around how to use AI within technology for business purposes, rather than needing to be able to create AI or code. This exam focused on the Microsoft 365 Copilot experience, how to use it within various Microsoft Office applications, etc.

    The official description of the proposed exam candidate is:

    As a candidate for this Microsoft Certification, you should have experience using generative AI–powered productivity tools, including Microsoft 365 Copilot, Researcher, and Analyst. You take advantage of AI to improve daily work, drive business outcomes, and make informed decisions in business contexts—without building AI apps or writing code.

    You should have a basic understanding of Microsoft 365 and should be comfortable navigating core apps, such as Outlook, Word, Microsoft Teams, PowerPoint, and Excel. You should also be familiar with common business processes, including drafting emails, creating presentations, generating images, and managing documents.

    The overall information for the exam can be found at Microsoft Certified: AI Business Professional (beta) – Certifications | Microsoft Learn, and there is an official Learning Path available for it.

    As I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change when it comes out of beta.

    One thing that I found I didn’t like about the exam is the new question type of ‘Best Answer’. This question type gives various options, whilst telling you that more than one answer choice may achieve the goal, but asking you to select the BEST answer. I believe that questions like this are subjective, and the answers will vary based on each person’s knowledge, understanding & experience, so I’m not quite sure why Microsoft have decided that this would be good to use. It will be interesting to see what happens when the exam comes out of Beta, and if these questions are still around or not then.

    Overall, the exam seemed to me to be pretty decent – I initially thought it would be quite generic, but you really do need to know how all the Copilot offerings work including Copilot Chat and Copilot in the Office applications.

    If you’re new to Copilot, and/or not really sure as to how it actually works & the capabilities, I’d suggest not to take the exam yet. Instead, go and take a look at the learning paths, and look to find out how it actually works & operates.

    I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

    • Microsoft 365 Copilot vs Microsoft 365 Copilot Chat
      • What each one does/doesn’t do
      • When to use each one
    • What to include when prompting Copilot
    • Copilot security framework
      • How data is used
      • The different data controls that are in place
      • How data protection works, different data protection capabilities & using them
      • Removing data & prompts from Copilot
    • Copilot capabilities in Microsoft Word
    • Copilot capabilities in Microsoft Excel
    • Copilot capabilities in Microsoft Outlook
    • Copilot capabilities in Microsoft PowerPoint
    • Copilot capabilities for Teams
      • Using Copilot within Teams for queries
      • Using Copilot within Teams for meetings (preparing for them, during the meeting, after the meeting)
    • Collaboration with Copilot report outputs
    • Copilot Researcher agent – getting access, capabilities & use cases, inputs & outputs
    • Copilot Analyst agent – getting access, capabilities & use cases, inputs & outputs
    • Using custom instructions within Copilot – how to do this, how it is used/applied, etc
    • Using documents with Copilot for answers & generating material. Updating new versions of documents, and how Copilot will behave
    • Microsoft 365 agents – creating, configuring, sharing, security etc
    • Creating, sharing & scheduling prompts, including limitations

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    Exam AB-100: Agentic AI Business Solutions Architect

    It’s always interesting when Microsoft release a new type of exam, especially when it’s not tied to specific functionality, but rather to an overall approach. The AB-100 exam (don’t pay too much attention to the ‘100’ designator, in my opinion) follows the approach that we’re seeing Microsoft taking – needing to use technology (& here, specifically AI in technology) holistically across multiple solutions.

    I took the exam in Beta as soon as it launched, though due to preparing for the Power Platform Community Conference (which I’m currently writing this at), it’s taken a bit of time to get this blog post up and published.

    As an architect, AI isn’t new to us – we know of multiple different capabilities (spanning Microsoft 365, Copilot Studio & Azure AI Foundry), which we need to use appropriately to handle customer scenarios. AI isn’t new to exams either – there are multiple Azure exams with AI in them, we have multiple Business Application exams with Copilot Studio in them, etc.

    However, exams to date focus on a specific part of the technology stack. For example, the PL-600 focused on Power Platform & Dynamics 365 Customer Engagement. The MB-700 focused on Dynamics 365 Finance & Operations, and so on and so forth.

    This new exam is somewhat of a paradigm shift – needing to understand AI holistically as an architect across multiple parts of the technology stack: what it’s used for, how, and where, etc. This is most definitely a new approach, and it will be interesting to see how users react to it.

    Truthfully, having taken it, I’d personally say that it feels a bit more like an enterprise architect exam approach (which also doesn’t exist in the Microsoft stack), albeit focused around Business Applications. Given the way in which Microsoft partners have specialists in each technology part of the stack, it will be interesting to see if this approach will pivot the way in which people are trained/skilled, and deliver projects. I think that there’s likely to be a lot of feedback to Microsoft that it’s not the way that the partner landscape currently works – though perhaps Microsoft is specifically trying to influence this itself to change. Only time will tell…

    The overall information for the exam can be found at Microsoft Certified: Agentic AI Business Solutions Architect (beta) – Certifications | Microsoft Learn, though there is NO learning path that’s been created (at the time of writing). I think that this is because Microsoft may want to see the reaction to this new approach, and pivot appropriately, rather than needing to create a lot of content that may potentially need to be re-done.

    The official description of the exam can be found at the link above (it’s too long to post here), so please go take a look!

    So, as I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change when it comes out of beta.

    I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

    • Business usage of AI
      • Different agents usage and results
      • How to use appropriately for business/agent analysis
      • Different types of metrics and results
      • Best practices for building Copilot Studio agents, and using Copilot Studio agents
      • Looking at the ROI for using Copilot Studio agents
      • Designing the usage of different AI and agent capabilities for business needs
    • Building agents
      • What Copilot Studio agents need to work
      • Data types that agents can use
      • Data sources that agents can use
      • Use of knowledge sources for agents
      • Usage of custom connectors
      • Handling token usage with Azure AI Foundry
      • How to handle testing for Copilot Studio agents
      • Different testing types & approaches
      • Extending Microsoft 365 Copilot
      • Using Power Automate with Copilot Studio agents
      • Speech to Text/Text to Speech
      • Handing conversations over to a live customer service representative using Dynamics 365 Contact Centre
      • Using RPA within an agent
    • Models
      • Different types of models that could be used within Azure AI Foundry
      • Orchestration
      • Improving performance
    • Security
      • How to handle Copilot Studio security
      • Governance & compliance tooling (eg Purview)
      • Handling/restricting connectors for Copilot Studio agents
      • Ensuring user security when using agents (ie not able to retrieve data that the user cannot access directly)
    • Reporting
      • Monitoring tools for Copilot Studio agents
      • Metrics, usage & analytics for Copilot Studio agents
      • Investigating Copilot Studio agent transcripts
      • Monitoring tools for Azure AI models
      • Evaluating Azure AI Foundry model outputs
    • Application Lifecycle Management
      • Focusing on AI Agents for Dynamics 365 CE, Finance & Operations, and Power Platform
      • How/what components to use and include
      • What tooling to use for ALM

    Overall, the exam seemed to me to be pretty decent – I was worried that it would focus just on Copilot Studio, with not much else in it, but there’s a good balance across other AI capabilities as well.

    The big change, for me at least, was the questions around Dynamics 365 Finance and Operations – this isn’t an area that I’m an expert in generally, and most definitely not for AI tooling. I think that this, as I mention above, is what may get the biggest pushback/feedback to Microsoft.

    I’m going to be quite interested in seeing how the exam is actually launched (as it’s currently in Beta of course). Having chatted with a few others who have taken the exam (whilst obviously respecting the NDA!), they also think that this is an approach pivot from Microsoft, and are wondering about the real world application of it.

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    MB-280: Microsoft Dynamics 365 Customer Experience Analyst

    It’s been a while since taking a Microsoft certification exam, but with the new MB-280 exam being launched in the last few days, I’ve obviously needed to take a look at it! It felt a little strange, as I’m now used to the certification renewal process (which is why I haven’t taken any exams in a while), but thankfully things went alright with the overall exam.

    For those who haven’t been following the news, Microsoft made an announcement a few months back that some exams would be retiring, and the new MB-280 exam would be the replacement for them. In short, this is supposed to replace the MB-210 (Sales), MB-220 (Customer Insights – Journeys) & MB-260 (Customer Insights – Data). Malin Martnes wrote a good blog post in June – I’d suggest taking a look at it for more general information around it.

    Now I’m all up for new certifications being created & made available. However, and I know this could be considered controversial, I have ABSOLUTELY NO IDEA as to why this exam was created in THIS specific way. If an exam had been created, for example, to bring together the two sides of Customer Insights (ie to cover both Data & Journeys in a single exam), I think that would have been quite good.

    But having taken this, my thoughts (& feedback to Microsoft directly) are that they should un-deprecate (if that’s a word/phrase?) the MB-210 exam, and continue it forward. There’s no reason that I can see for having Marketing & Sales together in a single exam – it feels like two (or technically 3?) Lego bricks lumped together without any rhyme or reason.

    The learning path for the exam was also launched in the last few days, and can be found at Study guide for Exam MB-280: Microsoft Dynamics 365 Customer Experience Analyst | Microsoft Learn.

    The official description of the exam is:

    As a candidate for this exam, you’re a Microsoft Dynamics 365 customer experience analyst who has:

    • Participated in or plans to participate in Dynamics 365 Sales implementations.
    • An understanding of an organization’s sales process.
    • An understanding of the seller’s perspective (user experience).
    • The ability to demonstrate Dynamics 365 Customer Insights – Data and Customer Insights – Journeys capabilities.

    You’re responsible for configuring, customizing, and expanding the functionality of Dynamics 365 Sales to create business solutions that support, automate, and accelerate the company’s sales process. You use your knowledge of customer experience capabilities in Dynamics 365 Sales and Microsoft Power Platform to inform the following design and implementation tasks:

    • Configure Dynamics 365 Sales standard and premium features.
    • Implement collaboration features.
    • Configure the security model.
    • Perform Dynamics 365 Sales customizations.
    • Extend Dynamics 365 Sales with Microsoft Power Platform.
    • Deploy the Dynamics 365 App for Outlook.

    As a candidate, you need:

    • An understanding of the Dataverse security model and features, including business units, security roles, and row ownership and sharing.
    • Experience configuring model-driven apps in Microsoft Power Apps.
    • An understanding of accounts, contacts, and activities.
    • An understanding of leads and opportunities.
    • An understanding of the components of model-driven apps, including forms, views, charts, and dashboards.
    • An understanding of model-driven app personal settings.
    • Experience working with Dataverse solutions.
    • An understanding of Dataverse, including tables, columns, and relationships.
    • Familiarity with Power Automate cloud flow concepts, such as connectors, triggers, and actions.

    More can be found at the exam page itself, which is located at Exam MB-280: Microsoft Dynamics 365 Customer Experience Analyst (beta) – Certifications | Microsoft Learn

    Now during my exam, I was looking forward to seeing the ‘new’ capability of being able to use Microsoft Learn during the exam (new to me, as I haven’t taken any other exams in the year or so since it was announced!). However, there didn’t seem to be any capability to launch Microsoft Learn – I’m not sure why it wasn’t available, as this isn’t a Fundamentals-level exam.

    Questions also used the older terms of reference rather than the newer/accepted terms – ie using ‘field’ instead of ‘column’, and ‘entity’ instead of ‘table’. Again, I have no idea why this is – all other exams (including the renewals for them) use these properly (in my summary below, I have ensured I use the correct terms).

    So, as I’ve posted before around my exam experiences, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). It’s also in beta at the moment, which means that things can obviously change.

    I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

    • Sales Apps
      • Configuring forms, columns & tables
      • Configuring security roles & access to records
      • Configuring relationships between records (including deletion properties)
      • Sales Mobile App – security & deployment
      • Forecasting – setting up & configuring
      • Configuring Goals
      • Configuring Opportunities
      • Handling currencies
    • Copilot for Sales
      • Setting up & deploying to users
      • Configuring access
    • Outlook App
      • Deploying & setting up
      • Configuring forms & information
    • Exchange
      • Connecting to mailboxes
      • Configuring folder permissions
      • Configuring multiple domains
    • Product Families & Catalogue
      • Creating & setting up
      • Configuring options
      • Adding items to be used
    • Price Lists
      • Creating & setting up
      • Configuring options, including discounts
      • Using time-restricted price lists
      • Handling currencies
    • Document Management
      • Different document management capabilities
      • Usage of SharePoint in different ways
    • Data Import
      • Usage of Power Query
      • Data manipulation
      • Handling duplicate records
    • SMS
      • Setting up & configuring SMS provider
    • Journeys
      • Different triggers to use based on scenarios & requirements
      • How to trigger journeys
      • How to set up emails to be used within a journey
    • Segments
      • Different types of segments
      • Creating & modifying segments
    • Searching/Filtering
      • Using Advanced Find
      • Setting up/modifying queries to include/exclude records based on conditions
    • Business Process Flows
      • Modifying business process flows
      • Handling conditions within business process flows

    As a Sales exam, it seemed alright. But as mentioned above, the Customer Insights questions just seemed strange to me – I’d expect a consultant to be very technically skilled in Customer Insights, but not in Sales (& vice versa), so I’m not understanding bringing these two sides together.

    I’m going to be quite interested in seeing how the exam is actually launched (as it’s currently in Beta of course). Having chatted with a few others who have taken the exam (whilst obviously respecting the NDA!), they also can’t really understand the landscape. Personally, I think that if it continues like this, Microsoft is going to hear quite a few complaints around it.

    I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it! I’d also be interested in your thoughts/opinions around the direction that Microsoft has taken for this!

    Developer Environment Routing!

    Recently I talked about the wider vision that organisations would be able to use, for helping users get access to the right environments (Default Environment – How to handle? » The CRM Ninja). As part of this, I discussed the Microsoft vision of having environment routing in place, to move users automatically to specific environments.

    At the point of writing, there wasn’t anything that I could publicly talk about. However, overnight Microsoft have released functionality around this – what I see as being the first step that this direction is taking. The documentation for this is at https://powerapps.microsoft.com/en-us/blog/default-environment-routing-public-preview/

    The functionality released enables users who are new to Power Platform to automatically have a developer environment created for them to access, rather than landing in the Default environment within their tenant. Many organisations struggle with users creating content in the Default environment, when it’s not really (at least not in my opinion) the right place to do this.

    Now, when we say ‘new users’, this doesn’t actually mean users newly created in M365 (or Entra ID/AAD). What this means is ‘users who have not accessed anything within Power Platform before’. In the back end, there’s a counter on each user record that keeps track of this, which this functionality is using to determine if users have accessed Power Platform beforehand or not.
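    Conceptually, the routing decision described above can be sketched like this. To be clear, this is a hypothetical illustration of the logic only, not Microsoft’s actual implementation – the field name, environment names, and toggle are all made up for the example:

```python
# Hypothetical sketch of the environment-routing decision described above.
# NOT Microsoft's actual code: the counter field, environment naming, and
# the routing toggle are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class User:
    name: str
    # Stand-in for the back-end counter on each user record that tracks
    # whether the user has accessed Power Platform before.
    power_platform_access_count: int = 0

def route_user(user: User, routing_enabled: bool) -> str:
    """Return the environment a signing-in user should land in."""
    if routing_enabled and user.power_platform_access_count == 0:
        # First-ever Power Platform access: route to a freshly created
        # personal developer environment instead of Default.
        return f"Dev - {user.name}"
    return "Default"

print(route_user(User("Alice"), routing_enabled=True))   # → Dev - Alice
print(route_user(User("Bob", 5), routing_enabled=True))  # → Default
```

    The key point is that the check is “has this user ever accessed Power Platform?”, not “was this user recently created in Entra ID?” – which is why a long-standing M365 account can still be treated as a ‘new user’ by the routing.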

    It’s also important to note that the Default environment DOES NOT need to be set to Managed for this to work. The Microsoft documentation doesn’t make this clear at the moment, but hopefully it’ll be updated soon to clarify this.

    Two settings do need to be toggled on within the Power Platform Admin Centre for this to work:

    Once these have been set & saved, let’s take a look at how things actually happen. I’ve created a new user for testing purposes:

    When signing in, it then briefly shows the general interface that we’re used to for a few seconds:

    But, then we get this exciting NEW screen!

    And then after a minute or so, we get placed nicely in the new environment:

    Looking at the Power Platform Admin Centre, we can see the new environment that’s been created:

    To be candid, during my testing things didn’t always work – I had some differing behaviour, or (on one occasion) the interface just hung. I’m going to put this down to being newly released & the product team working through potential issues (remember of course – this is in PREVIEW), and am hoping that they’re resolved very soon.

    Also, it’s important to note that the developer environments created through this are MANAGED. Users will be able to create collateral in them, but to run apps etc will need premium licensing in place.

Moving forward, it would be great to have some information displayed to users if something hasn’t worked, as well as (configurable) notifications to admins so that they’re aware as well. Examples of this could include where an organisation has maxed out the number of (free) developer licenses available (yes, I know this sounds strange, but there’s a default limit of 9,999 developer licenses per org).

But I think it’s a great first step forward, and hopefully there will be many different ways that this product will be developed further. My initial thoughts would include:

    • Creating developer environments for existing Power Platform users who don’t have a personal developer environment
    • Routing existing Power Platform users who have their own Developer environment to it
    • Being able to route to other places as well, including being able to specify which users/groups of users should be routed

    It’s an exciting place to be in, and I look forward to seeing more of it!

What are your thoughts around this? Does your organisation allow users to have personal developer environments, or do they lock it down?

    Default Environment – How to handle?

    As we’re all aware, the default (Power Platform) environment in any Azure tenant is a very ‘interesting’ thing to have. It’s there by default when an Azure tenant is created, all users within the Azure tenant automatically have access to it, we’re not able to restrict users from being in it, etc etc.

    Though it’s able to be backed up, it’s not able to be restored over itself, there’s no SLA/support available on it….the list goes on & on…!

    Many of us have come up against issues caused by people using the default environment whilst not knowing about challenges involving it, which usually results in pulling out our hair, banging our head against the wall, and other like-minded productive approaches.

However, it is the first place that users who are new to Power Platform land, and instinctively they’ll start building applications, automations etc within it (though usually without using solutions as a container for the development of items). To date, there hasn’t really been anything that could be done about this, apart from monitoring users & chasing them after the fact.

    Now, we’re all about enabling our users in the right way, helping educate & support them. Telling them a big NO doesn’t help, and can even be an initial blocker to having people start playing around & building technological solutions.

So how can we go about enabling our users, but also having the appropriate level of governance over the top? Well, there are several steps that I think we can take, which will help us with these. Now, not all of these are yet in place, though they have been talked about publicly. So let’s go take a look at them.

    1. The first step, in my mind, is to start off with enabling the default environment as a managed environment (yes, this can ACTUALLY be done!). Managed environments have many different properties associated with them, but the one of most interest (for this at least) is the requirement to have a premium license in place.

    All users within an organisation should by default have an M365 license SKU against them (usually this would be an E3 or E5). Users with these can immediately use the seeded Power Platform capabilities within them to create Power Platform collateral (using standard connector capabilities). However, with the default environment being managed, they will NOT be able to access it!

    Note: For the moment, I’m leaving out users who have premium Power Platform licenses – this is deliberate

    2. Environment routing. Recently announced, the environment routing capability will enable users to be automatically routed to an appropriate environment, based on various conditions that can be set. With this, we could create appropriate business unit ‘sandboxes’, and route users to these. The user experience would be that when logging in, they would automatically go to the right environment, rather than trying to work out which environment they should actually go to. This will save on confusion, and be a good user experience (in my opinion).
    3. Just-In-Time (JIT) Environment Creation. One of the items mentioned by Charles Lamanna at the European Power Platform Conference 2023 in Dublin is a new capability that’s coming soon (I hope!). From the sound of it, this will give the ability to automatically create a new environment for users who do not already have one.

    This sounds really cool. With the recent advent of Development Environments (& the ability for all users to have multiples of these), this could work REALLY well with the environment routing capability mentioned above. When a user would log in for the first time, it could look to see if they have a developer environment – if yes, then route them to it. But if the user didn’t, then to automatically spin up & create a new developer environment, and route them to it.
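The combined routing + JIT flow I’ve described could look something like this (a hedged Python sketch of the idea only – every name here is hypothetical, and none of this is an actual Microsoft API):

```python
from typing import Callable

def route_on_first_login(user: str,
                         environments: list[dict],
                         create_dev_env: Callable[[str], dict]) -> dict:
    """Hypothetical JIT + routing flow: send the user to their existing
    developer environment if they have one; otherwise create one on the
    fly (the 'Just-In-Time' step) and route them there."""
    for env in environments:
        if env.get("type") == "Developer" and env.get("owner") == user:
            # Existing developer environment found: route straight to it.
            return env
    # No developer environment yet: create one, then route.
    return create_dev_env(user)
```

The design point is that the user never has to choose: the platform either finds or provisions the right environment before they land anywhere.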

    Now there are some caveats with this approach, leaving aside that some of the functionality isn’t GA yet.

    It would mean that organisations would need to be alright with changing the default environment to become a managed environment. Obviously, risk assessments would need to be carried out with this, and non-premium solutions migrated elsewhere.

    It’s also important to call out that organisations which have a CDS 1.0 implementation (ie before Power Platform became GA etc) will only have the ability to upgrade default to managed. They are not able to downgrade back to an unmanaged default environment, given limitations of the original CDS implementation (I’ve heard some truly HORRIFIC stories around this, so be careful!)

    The above, however, is just the start of things. There are many other concepts to keep in mind, such as Landing Zones, Policies, etc. I’m going to be looking to cover these in upcoming posts, so keep an eye out for them!

    New Power Platform Security Role Editor

    We’ve all been there. Security role wise, that is. It’s the point in any project where we start looking at configuring user security. To do this, we’ve used the Security Role section in the Settings area (once it’s actually loaded, of course):

    Ah, the joys of this – dating back to CRM 3.0 (to my recollection – though it possibly might be 4.0). All of these lovely little circles, which fill up more & more as we click on them, whilst trying to work out what each one actually does:

And that’s not to mention the ‘Missing Entities’ tab (did anyone ever figure out what this was supposed to be used for?), or the ‘Custom Entities’ tab, which seemed like a catch-all place. Plus the fact that non-table permissions (eg Export to Excel) were placed on random tabs, which meant we needed to hunt through each tab to find the appropriate item.

Now many of us spent hours in here (and then further hours troubleshooting user issues that were down to security role misconfiguration). The absolute ‘JOYS‘ of the header title row not being scrollable (though it was possible to hover over each permission, and it would tell you what it was). The power of clicking on the line item, and seeing ALL of the little circles fill up – if you haven’t ever done it, you’ll not have experienced the bliss that this could bring!

But all things come to an end (or, as the Wheel of Time series says: ‘The Wheel of Time turns, and Ages come and pass, leaving memories that become legend‘), and now we have a NEW security role experience.

First of all, the UI has changed. It’s cleaner, responsive, gives more information to users upfront….and the headings SCROLL!!!

    We’re able to show just the tables that have permissions assigned to them (rather than wading through dozens or hundreds of entries that have no relevance), or show everything:

    Oh, and those random non-table privileges that we had to try to find beforehand – these are now grouped very nicely. This is SO much easier to manage!

We can also take permissions that have been set on a specific table, and then copy them to another table (it prompts us to select – and we can select MULTIPLE tables to copy to!):
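Behaviour-wise, the copy action works roughly like this – a simple Python model of the idea, not the actual implementation (the role is modelled as a plain dict of table names to permission sets):

```python
def copy_table_permissions(role: dict, source: str, targets: list[str]) -> dict:
    """Model of the 'copy permissions' action: take the permission set
    defined on one table and stamp it onto each selected target table."""
    # Snapshot the source permissions so later edits to one table
    # don't leak into the others.
    perms = dict(role[source])
    for table in targets:
        role[table] = dict(perms)
    return role
```

Note the copies are independent: tweaking one target table afterwards doesn’t touch the source or the other targets, which matches how you’d expect the editor to behave.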

    But best of all is the way that we can now set permissions for items. There are several different ways of doing this.

    Firstly, Microsoft has now provided us with the ability to select standard pre-defined options. Using these will set permissions across all categories for the item appropriately:

    This is really neat, and is likely to save quite a bit of time overall. However if we’re needing to tweak security permissions to custom settings, we can do this as well. Instead of clicking on circles, we now have lovely dropdowns to use:
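The pre-defined options essentially act as templates that expand into an access depth for every privilege on the table. A hedged sketch of that idea follows – the option and privilege names here are illustrative approximations, not an exact match for the product:

```python
# Illustrative only: option and depth names approximate the new editor's
# pre-defined options, and are not an exact product match.
PRIVILEGES = ["Create", "Read", "Write", "Delete",
              "Append", "Append To", "Assign", "Share"]

DEPTH_BY_OPTION = {
    "Full access": "Organization",
    "Collaborate": "Business unit",
    "Private": "User",
}

def apply_preset(option: str) -> dict:
    """Expand a single pre-defined option into a depth per privilege."""
    if option == "Reference":
        # Read-only style option: Read across the organisation, nothing else.
        return {p: ("Organization" if p == "Read" else "None")
                for p in PRIVILEGES}
    depth = DEPTH_BY_OPTION[option]
    return {p: depth for p in PRIVILEGES}
```

One click sets every privilege consistently, and the dropdowns then let you override individual privileges from that starting point.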

    In short, I’m absolutely loving this. The interface is quick to load, intuitive, and works well without fuss.

Given how much time I’ve spent over the years wrestling with security roles, I think this is going to be a definite timesaver for so many people (though we’ll still need to troubleshoot the interesting error messages that testing throws up at times, and work out how/what we’re needing to tweak for security access to work).

    There are still some tweaks that I think Microsoft could make to get this experience even better. My top three suggestions would include:

    • The ability to select multiple lines, and then set a permission across all of them (sort of like bulk editing)
    • Being able to have this area solution aware. When we have various different projects going on, it would be great to have the ability to filter the permissions grid by a solution. This would be a timesaver, rather than having to wade through items that aren’t relevant
    • Export to Excel. Having a report generated to save digitally or print off is amazing for documentation purposes. There are 3rd party tools (thank you XrmToolBox!), but it would be great to be built into here

    Overall, I’m really quite happy and impressed with it (it’s definitely taken enough time for Microsoft to pay attention to this, and get it out), and hope that it’ll continue to improve!

    What have your bugbears with the legacy security editor been over the years, and how are you liking the new experience? Drop a comment below – I’d love to hear!

    Reflections on MVP Summit

A few weeks back, I was fortunate enough to travel to Redmond (near Seattle, in the USA) to partake in person in the Microsoft MVP Summit. To say that this was an experience like no other is likely to be an understatement.

Some people may be unfamiliar with what MVP Summit is. As a Microsoft ‘Most Valuable Professional’, we can engage with Microsoft in various different ways. These include discussions with Microsoft Engineering (aka ‘Product Group’ or ‘PG’), trying out features before they get generally released, etc. There are various methods by which these interactions occur, though in general (especially given how MVPs are based around the globe), these are via email, virtual calls, etc.

However, Microsoft has for many years hosted an MVP Summit at its headquarters in Redmond. This is an in-person event over multiple days, where any MVP can attend & take part. It includes up-to-date information on capabilities, discussions on how Microsoft foresees its roadmap, etc. As an MVP, we are under NDA to Microsoft, and MVP Summit is entirely under NDA as well (well, in terms of the content discussed – selfies, however, are widely shared around social media!).

Having been awarded during the initial pandemic lockdown in October 2020, I had never had the chance to attend an MVP Summit in person before. In 2020, Microsoft switched the format to being virtual, and this was the case for 2021 & 2022. I attended both of these virtually, but didn’t find them the most amazing experience – everyone had been telling me that it was about the in-person networking, connections & more, and although I hadn’t been to one before, I could sense that this was true. Also, given that sessions were held in PST, timezones were difficult (in 2022, I actually ‘based’ myself in PST, albeit being physically present in London – it was definitely quite strange!).

With the 2023 MVP Summit on the horizon, MVPs all around the world were hoping that the event would be in person, and indeed it was (though it was held slightly later in the year).

    I’m grateful to Kainos, the Microsoft Partner that I work at, who sponsored my ability to go to Redmond. Without this, it would have been extremely difficult for me to be able to attend.

    There were LOTS of selfies happening, with community & Microsoft people. My small ‘claim to fame’ is that I got the FIRST selfie with Scott Guthrie following the keynote!

    Overall, there were several main things that I took away from attending MVP Summit:

    • Meeting community members from across the globe. Though MVP Summit was announced at somewhat short notice, MVPs from many different areas (including APAC) managed to make it in. Meeting them in person was just amazing – including Dian Taylor for the first time – we’ve been chatting for ages, and she was the original person to nominate me for the MVP award! Everyone was just so calm & chilled, and happy to chat about anything. It really drove home what a family the community is, and how it enables so many people
    • Meeting Microsoft people in person. I’ve been interacting with various members of different Microsoft PGs over the last number of years (Power Platform SLT, governance, adoption, Dynamics 365 Customer Service, etc). Being able to meet them in person really deepened existing relationships, as well as allowing me to be introduced to people who I hadn’t met before. Something that also really hit home was being able to see how our feedback actually impacts & affects the Microsoft people involved. Being able to ‘rockstar’ them, and have them see our enthusiasm in person, was something that I wasn’t expecting at all, and it was truly wonderful that it happens.

Now although the event this year was in person, Microsoft trialled a new format for it – hybrid. This enabled MVPs who were not able to attend in person to join the sessions (though unfortunately not the networking between sessions). This, combined with also having sign-language capabilities for some of the sessions, really did go a long way to making the event as inclusive as possible.

    I was quite well prepared for the week, bringing gifts from the UK (aka PROPER chocolate), and some LEGO to do a giveaway competition (congratulations to Olena G & Azure M on winning!):

Coming back from it, it has taken a little bit of time to recover (not sure if it’s been jetlag specifically, more general exhaustion, or adaptation back to ‘regular’ family life). Reflecting back, this was an experience like no other – it only developed & got more amazing throughout the entire week that I was there. Being able to meet others, network, strengthen existing relationships…the opportunity to do this was just SO valuable.

    Now, I’m really hoping that I get the opportunity to go to MVP Summit next year!

    Power BI & Dataverse Solutions

With the recent announcement of Power BI being able to be included in Power Platform solutions, LOTS of people were celebrating. Finally there would be the ability to not only include Power BI reports within solutions, but also to automate them (aka ALM) as well! Celebrations all round….well, for the most part.

    See, although the documentation (see Power Platform solutions can now include Power BI reports and datasets – Power Platform Release Plan | Microsoft Learn) states that Power BI reports & datasets can now be included in solutions, it doesn’t actually quite work like that.

What happens is that when Power BI reports and datasets (depending on what you’re wanting to do) are included in solutions, although they appear in the solution explorer window, they’re actually just a sort of shortcut to where they actually live. Exporting the solution then brings the components into the exported solution file. This can be seen quite clearly when extracting the file on your computer:

As we can see from the image above, we now have the Power BI components within it.

    Note: If you were hoping to just go into it & see the Power BI report nicely, unfortunately you’re going to be disappointed. Instead, it’s exported as a ‘.pbipkg’ file, which doesn’t seem possible to open with Power BI Desktop at all!
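Since an exported solution is just a .zip file, you can poke at its contents with a few lines of standard-library Python – handy for confirming whether the Power BI components actually made it into the export. (The internal file paths used in the example are illustrative; the real layout of a solution file may differ.)

```python
import io
import zipfile

def list_solution_contents(solution_bytes: bytes) -> list[str]:
    """List every file inside an exported solution .zip."""
    with zipfile.ZipFile(io.BytesIO(solution_bytes)) as zf:
        return zf.namelist()

def powerbi_entries(solution_bytes: bytes) -> list[str]:
    """Pick out the .pbipkg entries, i.e. the Power BI components."""
    return [name for name in list_solution_contents(solution_bytes)
            if name.lower().endswith(".pbipkg")]
```

Running `powerbi_entries` over the exported file is a quick sanity check that the report and dataset were bundled in, even though you can’t open the .pbipkg files themselves.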

    But it’s there, and supposed to work. So let’s go ahead & import it into the destination environment. After all, this is the whole point of solutions – being able to move components between places!

    Note: For the purpose of this blog post, I’m using manual ALM (ie manually exporting & importing the solution). However, the same will be true for automated ALM (eg using Azure DevOps).

    Now this can be easier said than actually done. See, it’s quite possible that you could experience an error when importing the solution into the target environment, such as the following:

The error message (‘This solution contains Power BI components, so it couldn’t be imported here’) seems to be helpful – well, to a point. We know that there are Power BI components in the solution – after all, this is the point of it – but how come we’re not able to import it?

Usually at this point I’d go to download the log file, and try to pinpoint the exact cause of the error. When presented with this specific error though, the log file doesn’t really seem to be of much help, despite trawling through each & every line in it. All it does is confirm that there has indeed been an import error, and that it seems to be due to the Power BI components in the solution.

    Just to double-check this, I did remove the Power BI components, export the solution, and then import it in a different environment. This worked absolutely fine without any errors! So indeed it’s got something to do with the Power BI components – but WHAT exactly is happening?

    Well, the cause of this goes back to how Power BI components in Power Platform solutions actually work. As mentioned above, the Power BI items (report, dataset etc) are actually stored within Power BI itself. Yes, they’re included in the solution when we export it, but when importing them, they don’t actually save to Dataverse.

This is the absolutely KEY thing to know and understand. When importing a solution with Power BI components, they come in as part of the solution, but are published to Power BI. Not only are they published to Power BI, a Power BI workspace is CREATED for them to live in (which will be specific per environment – a single Power BI workspace will not be shared between multiple Power Platform environments):

What this means in reality is that when the solution is imported, the Power BI workspace is created. However it’s not created by the system itself – underneath everything, the creation of the Power BI workspace is driven by the USER that’s importing the solution. Now, if that user account does NOT have permissions to create Power BI workspaces…well then, it’s going to error out, which is EXACTLY what is happening here!

    So, it’s absolutely vital that if you are including Power BI components in a solution, you must ensure that however you’re importing it, the user account has privileges to create Power BI workspaces (as well as publish reports to an existing workspace). Without this in place, you’re going to be getting some very confusing errors happening!
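In an automated pipeline, it’s worth running a pre-flight check before attempting the import. A minimal sketch of the rule follows – the component type names and permission keys are my own, purely for illustration, and not actual Dataverse or Power BI identifiers:

```python
def can_import_solution(component_types: set[str],
                        user_permissions: dict) -> bool:
    """Pre-flight check: a solution carrying Power BI components can only
    be imported by a user who can create Power BI workspaces and publish
    reports; otherwise the import fails with the error described above."""
    powerbi_components = {"PowerBIReport", "PowerBIDataset"}  # illustrative names
    if component_types & powerbi_components:
        return (user_permissions.get("create_workspace", False)
                and user_permissions.get("publish_report", False))
    # No Power BI content, so no extra Power BI permissions are needed.
    return True
```

Failing fast like this, with a clear message about the missing workspace permission, would save a lot of head-scratching over the generic import error.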

    It’s also important to note that even if the solution is managed, it is still possible (with the appropriate user permissions) to edit the Power BI report & dataset. Including it in a managed solution does not lock it.

    Also, I’d like to thank Laura GB for her inspiration on this topic – with my limited Power BI knowledge, I usually turn to her for advice & help with Power BI.

    Have you been considering including Power BI components in your solutions, or already been doing so? Have you run into this error/issue before? Drop a note below – I’d love to hear how you managed to work out the issue!