Reflections on MVP Summit

A few weeks back, I was fortunate enough to travel to Redmond, Washington (just outside Seattle, in the USA) to take part in person in the Microsoft MVP Summit. To say that this was an experience like no other would likely be an understatement.

Some people may be unfamiliar with what MVP Summit is. As Microsoft ‘Most Valuable Professionals’, we can engage with Microsoft in various ways. These include discussions with Microsoft Engineering (aka ‘Product Group’ or ‘PG’), trying out features before they’re generally released, etc. There are various methods by which these interactions occur, though in general (especially given that MVPs are based around the globe), these are via email, virtual calls, etc.

However, Microsoft has for many years hosted an MVP Summit at its headquarters in Redmond. This is an in-person event over multiple days, where any MVP can attend & take part. It includes up-to-date information on capabilities, discussions on how Microsoft foresees its roadmap, etc. As MVPs, we are under NDA to Microsoft, and MVP Summit is entirely under NDA as well (well, in terms of the content discussed – selfies, however, are widely shared around social media!).

Having been awarded during the initial pandemic lockdown in October 2020, I had never had the chance to attend an MVP Summit in person before. In 2020, Microsoft switched the format to being virtual, and this was the case for 2021 & 2022. I attended both of these virtually, but didn’t find them the most amazing experience – everyone had been telling me that it was about the in-person networking, connections & more, and although I hadn’t been to one before, I could sense that this was true. Also, given that sessions were held in PST, time zones were difficult (in 2022, I actually ‘based’ myself in PST, albeit while being physically present in London – it was definitely quite strange!).

With the 2023 MVP Summit on the horizon, MVPs all around the world were hoping that the event would be in person, and indeed it was (though it was held slightly later in the year).

I’m grateful to Kainos, the Microsoft Partner that I work at, who sponsored my ability to go to Redmond. Without this, it would have been extremely difficult for me to be able to attend.

There were LOTS of selfies happening, with community & Microsoft people. My small ‘claim to fame’ is that I got the FIRST selfie with Scott Guthrie following the keynote!

Overall, there were several main things that I took away from attending MVP Summit:

  • Meeting community members from across the globe. Though MVP Summit was announced at somewhat short notice, MVPs from many different areas (including APAC) managed to make it in. Meeting them in person was just amazing – including Dian Taylor for the first time. We’ve been chatting for ages, and she was the original person to nominate me for the MVP award! Everyone was just so calm & chilled, and happy to chat about anything. It really drove home what a family the community is, and how it enables so many people.
  • Meeting Microsoft people in person. I’ve been interacting with various members of different Microsoft PGs over the last few years (Power Platform SLT, governance, adoption, Dynamics 365 Customer Service, etc). Being able to meet them in person really deepened existing relationships, as well as allowing me to be introduced to people who I hadn’t met before. Something that also really hit home was being able to see how our feedback actually impacts & affects the Microsoft people involved. Being able to ‘rockstar’ them, and have them see our enthusiasm in person, was something that I wasn’t expecting at all, and it was truly wonderful that it happens.

Now although the event this year was in-person, Microsoft trialled a new format for it – hybrid. This enabled MVPs who were not able to attend in person to join the sessions (though unfortunately not the networking between sessions). This, combined with also having sign-language capabilities for some of the sessions, really did go a long way to making the event as inclusive as possible.

I was quite well prepared for the week, bringing gifts from the UK (aka PROPER chocolate), and some LEGO to do a giveaway competition (congratulations to Olena G & Azure M on winning!):

Coming back from it, it has taken a little bit of time to recover (not sure if it’s been jetlag specifically, more general exhaustion, or adapting back to ‘regular’ family life). Reflecting back, this was an experience like no other – it only developed & got more amazing throughout the entire week that I was there. Being able to meet others, network, strengthen existing relationships…the opportunity to do this was just SO valuable.

Now, I’m really hoping that I get the opportunity to go to MVP Summit next year!

Developer Environment Deletion!

Strong title for a blog post, right? Well, I did want to catch your attention! So what exactly are we talking about here?

For the last few years, it’s been possible for users to sign up for a ‘Developer’ plan, which gives them a full-capability Power Platform environment for free (though with some limitations). This used to be called the ‘Community’ plan, and is an amazing resource for everyone, whether they’re a professional or citizen developer, to have their own personal ‘sandpit’ to play in, and try things out.

Let’s wind back a few months in time now – earlier this year, Microsoft announced that users would be able to create THREE of these Developer environments, rather than having just a single one! This was mind-blowing news, and something that has been extremely welcome. If you’re wanting to see more on the announcement, Phil Topness has a great video on it at Dataverse Environments For Everyone – New Developer Plan – Power CAT Live – YouTube.

Incidentally, I’m curious as to how much storage space Microsoft has in the background to handle these. After all, each environment takes up a minimum of 1 GB of space (& can grow to 2 GB). That means that each user could have up to 6 GB of storage in use (three environments at 2 GB each)….which, when multiplied across all users, gives a VERY large number!

Microsoft has now announced that these developer environments, however, need to be utilised – ie if they’ve been created but aren’t being used, Microsoft is going to delete them! Now, from a certain perspective, this is actually quite good – after all, there are storage considerations for environments that have been created but aren’t being used. However, from a different perspective, this could be a problem. What about if you’re doing something occasionally in an environment, but not too often? What about if you decide to go on a ‘Round the World’ cruise for several months?

So let’s look at the definition for this. Microsoft states that an environment is considered to be inactive when it hasn’t been used for 90 days. At that point in time, it is disabled, and the administrator or environment owner is notified. If there is no action taken within the next 30 days, then the developer environment is automatically deleted.

Now, how does Microsoft define ‘Activity’? It goes something like this:

  • User activity: Launch an app, execute a flow (whether automatic or not), chat with a Power Virtual Agents bot
  • Maker activity: Create, read, update, or delete an app, flow (desktop and cloud flows), Power Virtual Agents bot, custom connector
  • Admin activity: Environment operations such as copy, delete, back up, recover, reset

The above is all user-driven – ie a user needs to interact with something within the environment. However, it’s also important to note how automation is viewed:

  • Activity includes automated behaviors such as scheduled flow runs. For example, if there’s no user, maker, or admin activity in an environment, but it contains a cloud flow that runs daily, then the environment is considered active.

It’s also important to note that at this point in time, the above only applies to Developer environments. Other types of environment (Production, Sandbox etc) don’t have any auto-deletion policies called out for them – well, at least not yet (if something does pop up around these, I’ll definitely look to talk about them too!).

So to answer our question above about what happens with a (developer) environment that is only being used infrequently – the way to stop it being auto-deleted is to put some automation in place. This doesn’t need to be anything heavyweight – it can be something simple & easy, just to ensure that the environment registers activity happening within it.
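
A scheduled cloud flow inside the environment is the simplest option (and is the example the activity definitions above call out). If you’d rather drive the keep-alive centrally from outside the environment, a rough sketch could be a tiny script on a cron schedule that calls an HTTP-triggered flow living in that environment – executing a flow is explicitly listed as activity. The trigger URL below is a placeholder, and this is illustrative rather than a recommended pattern:

```python
# Minimal keep-alive sketch: the developer environment contains a simple cloud
# flow with a 'When an HTTP request is received' trigger, and this script (run
# on a schedule, e.g. a weekly cron job) calls that trigger. Executing a flow
# counts as user activity per the definitions above, so the environment should
# keep registering as active. The trigger URL is a placeholder - copy it from
# your own flow.
import requests

FLOW_TRIGGER_URL = "https://prod-00.westus.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?..."  # placeholder

response = requests.post(FLOW_TRIGGER_URL, json={"reason": "keep-alive ping"}, timeout=30)
response.raise_for_status()
print("Keep-alive flow triggered:", response.status_code)
```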

In my view, it would be nice to have some granularity & control over this as well – allowing organisations to set their own deletion policies. We have this in place for things like audit log retention – it would be nice to have it here too.

Power BI & Dataverse Solutions

With the recent announcement of Power BI being able to be included in Power Platform solutions, LOTS of people were celebrating. Finally there would be the ability not only to include Power BI reports within solutions, but also to automate their deployment (aka ALM)! Celebrations all round….well, for the most part.

See, although the documentation (see Power Platform solutions can now include Power BI reports and datasets – Power Platform Release Plan | Microsoft Learn) states that Power BI reports & datasets can now be included in solutions, it doesn’t actually quite work like that.

What happens is that when Power BI reports and datasets (depending on what you’re wanting to do) are included in a solution, though they do appear in the solution explorer window, they’re actually just a sort of shortcut to where they actually live. Exporting the solution then brings the components into the exported solution file. This can be seen quite clearly when extracting the file on your computer:

As we can see from the image above, we now have the Power BI components within it.

Note: If you were hoping to just go into it & see the Power BI report nicely, unfortunately you’re going to be disappointed. Instead, it’s exported as a ‘.pbipkg’ file, which doesn’t seem to be possible to open with Power BI Desktop at all!
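
If you’re curious about what exactly ends up in the exported file, a quick way to peek is to list the archive’s contents programmatically rather than extracting it by hand. This is just a rough sketch – the solution file name is a placeholder, and the internal folder layout of exported solutions isn’t documented, so it matches loosely:

```python
# Quick look inside an exported solution .zip to see which Power BI artifacts
# it contains (e.g. the '.pbipkg' file mentioned above). The file name is a
# placeholder - point it at your own exported solution.
import zipfile

SOLUTION_FILE = "MySolution_1_0_0_0.zip"  # placeholder

with zipfile.ZipFile(SOLUTION_FILE) as solution:
    for entry in solution.namelist():
        # Power BI components typically show up as .pbipkg files or
        # Power BI-related folders, but the exact layout isn't documented,
        # so match loosely on the entry name.
        if "pbi" in entry.lower() or "powerbi" in entry.lower():
            print(entry)
```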

But it’s there, and supposed to work. So let’s go ahead & import it into the destination environment. After all, this is the whole point of solutions – being able to move components between places!

Note: For the purpose of this blog post, I’m using manual ALM (ie manually exporting & importing the solution). However, the same will be true for automated ALM (eg using Azure DevOps).

Now this can be easier said than actually done. See, it’s quite possible that you could experience an error when importing the solution into the target environment, such as the following:

The error message (‘This solution contains Power BI components, so it couldn’t be imported here’) seems to be helpful – well, to a point. We know that there are Power BI components in the solution – after all, that’s the point of it – but how come we’re not able to import it?!

Usually at this point I’d go to download the log file, and try to pinpoint the exact cause of the error. When presented with this specific error though, the log file doesn’t really seem to be of much help, despite trawling through each & every line of it. All it does is confirm that there has indeed been an import error, and that it seems to be due to the Power BI components in the solution.

Just to double-check this, I removed the Power BI components, exported the solution, and then imported it into a different environment. This worked absolutely fine, without any errors! So it’s indeed got something to do with the Power BI components – but WHAT exactly is happening?

Well, the cause of this goes back to how Power BI components in Power Platform solutions actually work. As mentioned above, the Power BI items (report, dataset etc) are actually stored within Power BI itself. Yes, they’re included in the solution when we export it, but when importing them, they don’t actually save to Dataverse.

This is the absolutely KEY thing to know and understand. When importing a solution with Power BI components, they come in as part of the solution, but are published to Power BI. Not only are they published to Power BI, but a Power BI workspace is CREATED for them to live in (which will be specific per environment – a single Power BI workspace will not be shared with multiple Power Platform environments):

What this means in reality is that when the solution is imported, the Power BI workspace is created. However, it’s not created by the system itself – underneath everything, the creation of the Power BI workspace is being driven by the USER that’s importing the solution. Now, if the user account does NOT have permissions to create Power BI workspaces…well then, it’s going to error out, which is EXACTLY what is happening here!

So, it’s absolutely vital that if you are including Power BI components in a solution, you must ensure that however you’re importing it, the user account has privileges to create Power BI workspaces (as well as to publish reports to an existing workspace). Without this in place, you’re going to get some very confusing errors!
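
If you want to catch this before the import fails, one option is a small pre-flight check that confirms the importing identity can actually create a Power BI workspace. The sketch below uses the Power BI REST API’s Groups endpoints to create a throwaway workspace and delete it straight away – treat it as an illustration rather than a recommended pattern, and note that acquiring the access token is assumed to happen elsewhere:

```python
# Pre-flight sketch: before importing a solution that contains Power BI
# components, confirm that the importing identity can create a Power BI
# workspace. Creates a temporary workspace via the Power BI REST API and
# deletes it again immediately. The access token is a placeholder, assumed to
# be acquired elsewhere (delegated, with workspace permissions).
import requests

ACCESS_TOKEN = "<power-bi-access-token>"  # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
API = "https://api.powerbi.com/v1.0/myorg"

# Try to create a temporary workspace...
create = requests.post(f"{API}/groups", headers=HEADERS, json={"name": "solution-import-preflight"})
if create.status_code in (200, 201):
    workspace_id = create.json()["id"]
    print("Workspace creation allowed - the solution import should be able to proceed.")
    # ...and clean it up again straight away.
    requests.delete(f"{API}/groups/{workspace_id}", headers=HEADERS)
else:
    print(f"Workspace creation blocked ({create.status_code}) - "
          "the solution import will likely fail with the error above.")
```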

It’s also important to note that even if the solution is managed, it is still possible (with the appropriate user permissions) to edit the Power BI report & dataset. Including it in a managed solution does not lock it.

Also, I’d like to thank Laura GB for her inspiration on this topic – with my limited Power BI knowledge, I usually turn to her for advice & help with Power BI.

Have you been considering including Power BI components in your solutions, or already been doing so? Have you run into this error/issue before? Drop a note below – I’d love to hear how you managed to work out the issue!

Justin Wilkinson on The Oops Factor

Finding out from Justin how he got into his love of watches, starting off with on-premises systems, and learning the importance of saying no rather than taking on/helping everyone out.


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

Kartik Kanakasabesan on The Oops Factor

Talking to Kartik about his love of reading & science fiction (Isaac Asimov, anyone?), as well as touching on development tools, capabilities & limitations. Technology moves on in interesting ways!


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

The story of MFA & the Centre of Excellence

I’ve been rolling out the Microsoft Centre of Excellence solution for several years now at customers. It’s a great place to start getting a handle on what exactly is going on within a Power Platform tenant, though there’s obviously so much more that takes place within a Centre of Excellence team.

The solution gathers telemetry around environments, Power Apps, Power Automate flows, etc through the use of the Power Platform admin connectors in Power Automate (see Power Platform for Admins – Connectors | Microsoft Learn for further information on these).

Now obviously we need a user account to run these, and this has usually been through the use of a ‘pseudo service account’, as using a service principal has been tricky, to say the least. So we would get customers to set up an appropriate account with licensing & permissions in place, and use this to own & run the Power Automate flows that bring the information into the CoE solution.

It is important to note that usage of these connectors does require a pretty high level of permissions – in fact, we usually suggest applying the Power Platform Administrator role (within the Microsoft 365 Admin Centre) to the user account. All good so far.

The tricky part has, to date, been around security. Organisations usually require (for good reasons) multi-factor authentication to be in place (aka MFA). Now this is fine for users logging in & accessing systems. However, it proves to be somewhat trickier for automations.

See, when a user logs in & authenticates through MFA, a token is stored to allow them to access systems. Automations can also use this. However, the token will expire at some point (based on how each organisation has implemented MFA access/controls). When the token expires, the automations will stop running, and fail silently. There’s no prompt that the token has expired, and the only way of knowing is to take a look at the Power Automate flow history. This can be interesting though, as signing in (with the pseudo service account) will prompt for MFA authentication, and then everything will start running again!

So this has usually resulted in conversations with the client to politely point out that implementing MFA on the service account will mean that, at some point, the Power Automate flows are going to start failing. Discussions with security teams take place, mitigation using tools such as Azure Sentinel is implemented, and things move ahead (cautiously). It’s been, to date, the most annoying pain point in the technical implementation (that I can think of, at least, in my experience).

Now you’d think that a change in this would be shouted from the rooftops, people talking about it, social media blowing up, etc. Well, I was starting an implementation recently for a customer, and was talking to them about this, as I’d usually do. Imagine my surprise when Todd, one of the Microsoft technical people attached to the client, asked why we weren’t recommending MFA.

Taking a look at the online documentation, I noticed that something had slipped in. Finally there was the ability to use MFA!

Trawling back through the GitHub history (after all, I wanted to find out EXACTLY when this had slipped in), I discovered that it was a few months old. I was still very surprised that there hadn’t been more publicity around this (though it was definitely a good incentive to write about it, and a great blog post to start off 2023 with!).

So moving forward, we’re now able to use MFA for the CoE user account. This is definitely going to put a lot of minds at rest (especially those who work in security and/or governance). The specifics around the MFA implementation can be found at Conditional access and multi-factor authentication in Flow – Power Automate | Microsoft Learn – but it’s important to note that specific MFA policies will need to be set up & implemented for this account.

So, now the job will be to retrofit this to all organisations that already have the CoE toolkit in place. Thankfully this shouldn’t be too difficult to do, and it will most definitely enhance the security controls around it!

Have you implemented any mitigation in the past to handle non-MFA? I’m curious if you have – please drop a comment below!

Active or inactive, that is the question?!?!

Catchy title, right? Well I was wondering what exactly I should use for this blog post, and as you’ll see as we go through things, this is probably quite a good paraphrase to use.

So, where to start? Well, with a customer, of course! Now, this customer has been running live with a custom Dynamics 365 solution for a little while. Importantly for this story, there have not been ANY releases in quite a few months. This is of course good to bear in mind, given that we can all, um, occasionally find that a release could cause an issue, somewhere, sometimes…

Part of the capabilities that they’re using is bringing in Leads, and qualifying them appropriately. As part of this process, there are various custom attributes (aka columns) that have been added to the Lead table, along with corresponding columns added to the Contact table. There’s also some custom logic that, when a lead is qualified, copies the values from the Lead to the Contact record, updating it (essentially extending the standard capabilities of the system) – a sketch of this general pattern is below.
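
The customer’s actual logic isn’t shown here, but to make the pattern concrete, the sketch below illustrates the general idea: once a lead has been qualified, read the custom columns from the Lead row and copy them across to the related Contact row via the Dataverse Web API. The custom column names (‘new_*’), the token handling, and the assumption that the qualified lead’s Parent Contact lookup points at the newly created contact are all mine, purely for illustration:

```python
# Illustrative only - not the customer's actual implementation. Copies a couple
# of hypothetical custom columns from a qualified Lead to its related Contact
# via the Dataverse Web API. Environment URL, token, column names and lead id
# are all placeholders.
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"  # placeholder
HEADERS = {
    "Authorization": "Bearer <dataverse-access-token>",  # placeholder
    "Content-Type": "application/json",
}

def copy_lead_columns_to_contact(lead_id: str) -> None:
    # Read the custom columns plus the Parent Contact lookup (assumed to point
    # at the contact created/linked when the lead was qualified).
    lead = requests.get(
        f"{ENV_URL}/api/data/v9.2/leads({lead_id})"
        "?$select=new_sourcedetail,new_preferredchannel,_parentcontactid_value",
        headers=HEADERS,
    ).json()

    contact_id = lead["_parentcontactid_value"]

    # Update the corresponding custom columns on the Contact record.
    requests.patch(
        f"{ENV_URL}/api/data/v9.2/contacts({contact_id})",
        headers=HEADERS,
        json={
            "new_sourcedetail": lead.get("new_sourcedetail"),
            "new_preferredchannel": lead.get("new_preferredchannel"),
        },
    )

copy_lead_columns_to_contact("00000000-0000-0000-0000-000000000000")  # placeholder id
```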

This has all been working well to date, and the customer team has been very happy with their system. Until it stopped working, last week. Which was strange, as nothing seemed to have changed at all!

When trying to qualify leads in the system, they were getting the following error message:

Cryptic, right? This seemed a little more interesting as well, given that when only inputting basic information into a Lead record (eg First Name, Last Name, Phone Number), it didn’t matter how many leads existed with the same information – it qualified without a problem.

However, using any custom columns that had been added to the table caused this error to occur.

The first thing that I did was to check that there had been no updates released to Production. This was confirmed as being the case. I then also checked that there had been no OTHER solutions released to Production (as this could have impacted on it). Thankfully there hadn’t been – the system looked to be in as fine a shape as it had been for a while.

OK – on to the next step. What updates have been released by Microsoft? Well, with the fact that we were able to pinpoint the date that the functionality had stopped working, we went to find the corresponding Learn article about the release (Update 22102 – Release Notes | Microsoft Learn). Don’t worry about clicking through to read it – there’s essentially not much in it, and there’s nothing at all around the Lead table or its functionality!

Continuing to dig around, I really wasn’t sure of what was causing this, but obviously had to work it out & figure out a fix! It was quite a dilemma.

This is where the amazing Microsoft community came into play. I noticed a post by Jeroen Scheper on one of the channels that I’m on. It turns out that he was having the same issues, so we started to try to collaborate on it. This both reassured me (that it wasn’t just me) and increased the confusion, as we couldn’t work out what was going on underneath to cause this!

Raising it with Microsoft (we both actually raised support incidents), I had an amazing support call almost immediately. Demonstrating the problem, I was told that it was due to Duplicate Detection rules.

Now I’ll admit that this confused me somewhat. See, I had already checked the Duplicate Detection rules, but nothing had been changed, and no new rules had been implemented.
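
For anyone wanting to check the same thing outside the UI, the duplicate detection rules live in the ‘duplicaterules’ table in Dataverse, so they can be listed with a quick Web API query. This is a sketch with placeholder values; the ‘excludeinactiverecords’ column is, as far as I can tell, the one behind the setting discussed below, but treat that as an assumption:

```python
# Sketch: list the duplicate detection rules that target the Lead table, so
# they can be reviewed outside the UI. The 'excludeinactiverecords' column is
# assumed to be the one backing the 'exclude inactive records' setting.
# Environment URL and token are placeholders.
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"  # placeholder
HEADERS = {"Authorization": "Bearer <dataverse-access-token>"}  # placeholder

rules = requests.get(
    f"{ENV_URL}/api/data/v9.2/duplicaterules"
    "?$select=name,baseentityname,matchingentityname,statuscode,excludeinactiverecords"
    "&$filter=baseentityname eq 'lead'",
    headers=HEADERS,
).json()["value"]

for rule in rules:
    print(rule["name"], rule["statuscode"], rule["excludeinactiverecords"])
```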

Getting the support agent to walk me through things, they told me that I had to unpublish the rules, modify a setting on them, and then re-publish the rules. This was the setting (on each one) that had to be updated:

This again caused me to be confused. Why was the system having issues with inactive records? Surely qualified leads are active records, but just qualified (& then being locked down as a result)?

Well, it turns out that my perspective of how this works is actually incorrect. As we (hopefully) all know, whilst all records have a Status value (eg Active, Inactive), there are some records that also have a Status Reason value.

In fact, the ‘State Code’ choice in Dataverse is restricted (we can’t modify its options), and seems to have some quite interesting functionality running behind it. Depending on which table is accessed, there are different options available within it.

For example, the Lead table shows:

Whereas the Contact table shows:

And the Task table shows:

Anyhow – it turns out that when a Lead record is qualified or disqualified, though it’s not shown in the user interface, behind the scenes the record is actually being deactivated!

More information on this can be found at Qualify and convert leads to opportunity | Microsoft Learn.
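
If you’d like to see this for yourself, a quick read of a qualified lead’s statecode shows it: for the Lead table, statecode 0 is Open (the active state), 1 is Qualified and 2 is Disqualified. A minimal sketch, with placeholder values:

```python
# Sketch: confirm that qualifying a lead takes it out of the active (Open)
# state by reading its statecode. For the Lead table: 0 = Open, 1 = Qualified,
# 2 = Disqualified. Environment URL, token and lead id are placeholders.
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"  # placeholder
HEADERS = {"Authorization": "Bearer <dataverse-access-token>"}  # placeholder
LEAD_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

lead = requests.get(
    f"{ENV_URL}/api/data/v9.2/leads({LEAD_ID})?$select=statecode,statuscode",
    headers=HEADERS,
).json()

state_labels = {0: "Open (active)", 1: "Qualified", 2: "Disqualified"}
print("State:", state_labels.get(lead["statecode"], lead["statecode"]))
```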

So, this was the underlying reason behind the error message. Obviously Microsoft had updated something, which then caused this to fail. I don’t know how many different customers may have been (or still are?) experiencing the issue, but I think that the error message at least could be a little clearer? Perhaps including a link to the relevant Microsoft documentation page, for a start.

Well, this was finally put to bed, and I was quite thankful (as was the customer). And this is how I decided to come up with the title of this blog post!

Have you ever had something similar happen to you? Drop a comment below – I’d love to hear!

Adam Belak on The Oops Factor

Talking to Adam about how he decided (together with the AMAZING Kaila) to start up a new Microsoft conference, and the MAJOR twist that they’re bringing with it! Also covering a vehicle accident that he was involved in, & how it changed his life.

Make sure to also check out https://www.iberiansummit.com/ and register for it!


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.