Power Platform ALM Changes

As a starter for 10, if you haven’t yet looked into ALM for Power Platform, you should most definitely be doing so! ALM is, of course, Application Lifecycle Management. This is how, in a nutshell, we move solutions between environments.

In the good old days, this was done manually, of course (CRM 4.0, I’m looking at you!). Today, though it’s still possible to export/import solutions manually, it’s not the Microsoft Best Practice method. Doing it manually also means it’s unlikely that you’ll have appropriate source control for your solutions, which, let’s face it, isn’t the best.

Want to look at a previous solution version? Hmm – do you still have it saved on your machine or not?

So we should generally know why we’d want to use ALM. But which tooling do we actually use for it? Going back to the on-premises days, there was TFS (or Team Foundation Server, to give its full name). This was a full source control repository, allowing developers to check in/check out code, build solutions, deploy them, etc.

With the move to ‘cloud-based systems’, the TFS replacement is Azure DevOps (or ADO, as it’s usually referred to). ADO works in essentially the same way as TFS did (there are some differences, but they’re not really relevant here), but does so through the cloud.

When it comes to Power Platform solutions, ADO uses the ‘Power Platform Build Tools’ capabilities to hook into Dataverse & pick up solutions. The tools essentially give ADO the ability to connect to a Power Platform environment, build/export solutions, deploy solutions, etc.

More information on the toolset can be found at Microsoft Power Platform Build Tools for Azure DevOps – Power Platform | Microsoft Docs

Now there are some limitations to the Power Platform Build Tools. In fact, I’d be so bold as to say that currently they’re not in a fully mature state. It’s not possible to do everything that you can manually (well, not with the inbuilt capabilities – there are some ‘hacks’ around that can extend them). At the moment, it’s essentially 1.0.

Well, Microsoft is announcing that they’re now releasing 2.0 of the Power Platform Build Tools this week!

In fact, this is so new that at the time of writing, there’s no Microsoft Docs available for this! So what does version 2.0 bring, and why is Microsoft releasing a new version?

So Microsoft has actually had this in planning for a while. There’s a lot going on with GitHub, as we well know, and Microsoft wants to drive the consistency of the experience for users forward. At the moment, the two platforms work in somewhat different ways, and the aim is to bring them to parity.

The main change in the new version is that instead of the tasks being PowerShell-based (as they currently are), they will now be Power Platform CLI-based. So Microsoft is changing the underlying working method from PowerShell to the CLI. Some of us will, of course, already be familiar with the way that the CLI works, and it’s really nice to see that those capabilities will now be part of ADO.

Now don’t start worrying that your current ADO pipelines (v0) will suddenly stop working. Microsoft is not doing anything with v0 at this point in time (though they may potentially deprecate it in the future). So all of your existing ADO pipelines using the Power Platform Build Tools will continue to work, but no new features are going to be released for them.

In terms of switching to using v2, it’s really quite simple – you’ll need to change the task version on each of the Power Platform Build Tools tasks in your pipeline (each task has a version selector in the pipeline editor).

If you are currently using YAML (as so many wonderful developers do) to author pipelines, you’ll need to do the following in the YAML code:
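(The below is just a sketch – the task names follow the Power Platform Build Tools extension, and the service connection & solution names are placeholders – but the key change is simply the version suffix on each task.)

```yaml
steps:
  # Previously (v0):
  # - task: PowerPlatformToolInstaller@0
  # - task: PowerPlatformExportSolution@0

  # With the new version, bump the version suffix on every Build Tools task:
  - task: PowerPlatformToolInstaller@2

  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'My-Service-Connection'   # placeholder service connection
      SolutionName: 'MySolution'                  # placeholder solution name
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'
```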

It’s very important to note that it’s not possible to mix and match task versions. If you do this, the ADO pipeline will fail, so please don’t try this!

I’m really excited about this, and to see that the CLI capabilities are being brought into play for ADO. I’ll admit that I’m wondering what else will be released (in the fullness of time), as I’m sure that this is just the start of some great new stuff!

One of the things that I’m REALLY hoping for is the ability to use ADO pipelines to migrate Power Apps Portals (or Power Pages), as currently that’s only possible using the Power Platform CLI or the Configuration Migration Tool. It would be amazing to be able to do this with ADO pipelines as well!

Solution deployments: Automated vs Manual

Over the holiday period, I’ve been playing around with solution deployments. OK – don’t judge me too much…I did also take the necessary time off to relax & switch off from work!

But with some spare time in the evenings, I decided to look a bit deeper into the world of DevOps (more specifically, Azure DevOps), and how it works. I’ll admit that I did have some ulterior motives around it (for a project that I’m working on), but it was good to be able to get some time to do this.

So why am I writing this post? Well, there’s a variety of great material out there already around DevOps, such as Benedikt’s blog at https://benediktbergmann.eu/ (he’s also well worth following on Twitter) – he’s really great at this. I chat with him from time to time about DevOps, to understand it better.

However, I ran into some quite interesting behaviour (which I STILL have no explanation for, but more on this later), and thought that I would document it.

Right – let’s start off with manual deployments. As we know, manual deployments are done through the user interface. A user (with necessary permissions) would do the following:

  1. Go into the DEV environment, and export the solution (regardless of whether this is managed or unmanaged)
  2. Go into the target environment, and import the solution

Pretty simple, right?

Now, from a DevOps point of view, the process is similar, though not quite the same. Let’s see how it works:

  1. Run a Build pipeline, which will export the solution from the DEV environment, and put it into the repository
  2. Run a Release pipeline, which will get the solution from the repository, and deploy it to the necessary environment/s

All of that runs (usually) quite smoothly, which is great.
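To put that into rough YAML terms, the build side could look something like the below. This is just a sketch – the task & parameter names follow the Power Platform Build Tools extension (check the task reference for your version), and the service connection, solution name & paths are all placeholders:

```yaml
# Build pipeline (sketch): export the solution from DEV & publish it as a pipeline
# artifact, which the release pipeline then picks up & deploys to the target environment/s.
steps:
  - task: PowerPlatformToolInstaller@2

  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'DEV-Service-Connection'   # placeholder service connection
      SolutionName: 'MySolution'                   # placeholder solution name
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution_managed.zip'
      Managed: true                                # export as managed for the downstream environments

  - publish: '$(Build.ArtifactStagingDirectory)/MySolution_managed.zip'
    artifact: 'solution'
```

The release pipeline then just needs an ‘Import Solution’ task pointing at that artifact – and that’s where things got interesting for me, as you’ll see below.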

Now, let’s talk for a minute about managed solutions. I’m not going to get into the (heated) discussion around managed vs unmanaged solutions. There’s enough that’s been written, said, and debated around the topic to date, and I’m sure it will continue. Obviously we all know that the Microsoft Best Practice approach is to use managed solutions in all non-DEV environments.

Anyway – why am I bringing this up? Well, there’s one key difference in behaviour when deploying a managed solution vs an unmanaged solution (for a newer solution version), and this is to do with removing functionality from the solution in the DEV environment:

  • When deploying an unmanaged solution, it’s possible to remove items from the solution in the DEV environment, but when deploying to other environments, those items will still remain, even though they’re not present in the solution. Unmanaged solution deployments are additive only, and will not remove any components
  • When deploying a managed solution, any items removed from the solution in the DEV environment will also be removed from the other environments when the solution is deployed there. Managed deployments are both additive & subtractive (ie if a component isn’t present in the solution, it will be removed when the solution is deployed)

Now most of us know this already, which is great. It’s a very useful way to handle matters, and can assist with handling a variety of scenarios.

So, let’s go back to my first question – why am I writing this post? Well…it’s because of the difference in behaviour between manual & automated deployments that I discovered. Let’s look at this.

When deploying manually, we get a choice of how the solution should be imported – as an upgrade (the default), staged for upgrade, or as an update.

The default behaviour (outlined above) is to UPGRADE the solution. This will apply the solution with both additive & subtractive behaviour. This is what we’re generally used to, and essentially the behaviour that we’d expect with a managed solution.

Now, when running a release pipeline from Azure DevOps, we’d expect this to work in the same way. After all, systems should all be built to work in the same way, right?

Well, no, that’s not actually what happens. See, when an Azure DevOps release pipeline runs, the default behaviour is NOT to import the solution (we’re talking managed solutions here) as an upgrade. Instead (by default), it imports it as an UPDATE!!!

This is what was really confusing me. I had removed functionality in DEV, ran the build pipeline, then ran the release pipeline. However the functionality (which I had removed from DEV) was still present in UAT! It took me a while to find out what was actually happening underneath…

So how can we handle this? Well, apart from suggesting to Microsoft that they should (perhaps) make everything work in the SAME way, there’s a way to handle it within the release pipeline. For this, it’s necessary to do two things:

Firstly, on the ‘Import Solution’ task, we need to set it to import as a holding solution.

Secondly, we then need to use the ‘Apply Solution Upgrade’ task in the release pipeline.

What this will do is then upgrade the existing solution in the target environment with the holding solution that’s just been deployed.
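In YAML terms, the combination looks roughly like the below. Again, this is just a sketch – the task & parameter names follow the Power Platform Build Tools extension (do check the task reference for your version), and the service connection, solution & artifact names are placeholders:

```yaml
steps:
  - task: PowerPlatformToolInstaller@2

  # Import the new (managed) solution version as a holding solution
  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'UAT-Service-Connection'   # placeholder service connection
      SolutionInputFile: '$(Pipeline.Workspace)/solution/MySolution_managed.zip'
      HoldingSolution: true                        # import as a holding solution, rather than an update

  # Apply the upgrade - this replaces the existing solution with the holding one,
  # removing any components that are no longer present in the solution
  - task: PowerPlatformApplySolutionUpgrade@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'UAT-Service-Connection'
      SolutionName: 'MySolution'                   # placeholder solution name
```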

Note: You will need to change the solution version to a higher version number, in order for this to work properly. I’m going to write more about this another time, but it is important to know!
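Incidentally, if you want that version bump to be handled by the pipeline as well, the Build Tools include a ‘Power Platform Set Solution Version’ task that can be run against DEV before the export. Again, just a sketch with placeholder values:

```yaml
  - task: PowerPlatformSetSolutionVersion@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'DEV-Service-Connection'      # placeholder service connection
      SolutionName: 'MySolution'                      # placeholder solution name
      SolutionVersionNumber: '1.0.0.$(Build.BuildId)' # e.g. bump the last segment on every build
```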

So in my view, this is a bit annoying, and perhaps Microsoft will change the default behaviour within DevOps at some point. But for the moment, it’s necessary to do it this way.

Has this (or something similar) tripped you up in the past? How did you figure it out? Drop a comment below – I’d love to hear!