Michelle Wong on The Oops Factor

Chatting to Michelle about her long-time interest in anime/manga, the wonders of #PowerAutomate cloud flows, & what might happen as a result of an unexpected loop…


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

MB-260: Microsoft Customer Data Platform Specialist

It’s been a while since I’ve taken an exam. Admittedly, this is for two reasons. Firstly, the renewal process for exams (as updated last year) is no longer to re-take them, but rather to re-qualify through Microsoft Learn. The second reason is that I’ve been waiting for some new exams to come out (OK – there’s the DA-100, which is still on my list of things to do…).

Well, there’s a new exam on the block. In fact, it’s a different type of exam – this is a ‘Speciality’ exam, rather than focusing on a specific type of application. It’s the first of its kind, though there are likely to be more to follow in the future.

It’s the MB-260, which is all around Customer Data. That’s right – it’s not about how to do sales, or customer service, or something else. It’s about taking the (holistic) approach to ALL of the data that we can hold on customers, and doing something with it.

The official page for it is at https://docs.microsoft.com/en-us/learn/certifications/exams/mb-260. The specification for it is:

Candidates for this exam implement solutions that provide insights into customer profiles and that track engagement activities to help improve customer experiences and increase customer retention.

Candidates should have firsthand experience with Dynamics 365 Customer Insights and one or more additional Dynamics 365 apps, Power Query, Microsoft Dataverse, Common Data Model, and Microsoft Power Platform. They should also have direct experience with practices related to privacy, compliance, consent, security, responsible AI, and data retention policy.

Candidates need experience with processes related to KPIs, data retention, validation, visualization, preparation, matching, fragmentation, segmentation, and enhancement. They should have a general understanding of Azure Machine Learning, Azure Synapse Analytics, and Azure Data Factory.

Note that there’s quite a bit of Azure in there – it’s not just about Power Platform, Dataverse, or Dynamics 365. People who handle reporting on customer data should have various Azure skills as well.

There’s also a new type of badge that will be available for it.

At the time of writing, there are no official Microsoft Learn paths available to study from. I do expect this to change in the near future, and will update this article when they’re out. However, the objectives/sub-objectives are available to view from the main exam page, and I’d highly recommend going ahead & taking a good look at these.

As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!). I’ve tried to group things together as best as possible for the different subject areas.

Overall, I had 51 questions, which was towards the higher number of questions that I’ve experienced in my exams over the last year or so. There was only a single case study though.

Some of the naming conventions hadn’t been updated to the latest terminology, which I would have expected them to be. I still had a few references to ‘entities’ and ‘fields’ come up, though for the most part ‘tables’ and ‘columns’ were used. I guess it’s a matter of time until everything is brought up to speed.

  • Differences between Audience Insights and Engagement Insights
    • What are the benefits of each
    • When would you use each one
    • What types of users will benefit from each type
    • How to create customer insights
  • Environments
    • Types of environments
    • How to create a new environment
    • What options are available when creating an environment
    • What is possible to copy from an existing environment
  • Relationships
    • Different types of relationships
    • What is each one used for
    • Limitations of different relationship types
  • Business level measures vs customer level measures
    • What each one is, and what they’re used for
  • Power Query
    • How to use
    • How to configure
    • How to load data
  • Data mapping
    • Different types available to use
    • Scenarios each type should be used for
    • Limitations of each type
    • How to set it up
  • Segments
    • What are segments, how are they set up, how are they used
    • What are quick segments, how are they set up, how are they used
    • What are segment overlaps, how are they set up, how are they used
    • What are segment differentiators, how are they set up, how are they used
  • Measures
    • What are measures, how are they set up, how are they used
  • Data refresh
    • Automated vs manual options
    • Limitations of each type
    • Availability of each type
    • How to set up each type
    • How to apply each type
  • Data Unification
    • What is this
    • How it can be used
    • How to set it up
    • Limitations of it
    • Process validation
    • Changing existing models
  • AI for Audience Insights
    • What is this
    • What can it be used for
    • How to use it
    • Factors that can affect outcomes
  • Security
    • Using Azure Key Vault
    • Capabilities of this
    • How to set it up
    • How to use it
  • Dynamics 365
    • Capabilities for interacting with Dynamics 365
    • How to set it up
    • How to display data, and where it can be displayed
    • What actions users are able to carry out within Dynamics 365

Wow. It’s a lot of stuff. If you’re not already hands-on with the skills needed, I’d highly recommend getting a decent amount of experience with them before taking the exam!

I can’t tell you if I’ve passed it or not…YET! Results aren’t going to be out for several months, and to be honest, I’m not quite sure how well I’ve actually done.

So, if you’re aiming to take it – I wish you the very best of luck, and let me know your experience!

Antti Pajunen on The Oops Factor

Speaking to Antti around whiskey collecting (something that a lot of the #community seems to enjoy?), and the importance of PROPER requirements gathering during projects (especially around ‘low code’!)


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

Solution deployments: Automated vs Manual

Over the holiday period, I’ve been playing around with solution deployments. OK – don’t judge me too much…I also took the necessary time to relax & enjoy some time off work!

But with some spare time in the evenings, I decided to look a bit deeper into the world of DevOps (more specifically, Azure DevOps), and how it works. I’ll admit that I did have some ulterior motives around it (for a project that I’m working on), but it was good to be able to get some time to do this.

So why am I writing this post? Well, there’s a variety of great material out there already around DevOps, such as https://benediktbergmann.eu/ by Benedikt (check out his Twitter here), who’s really great at this. I chat to him from time to time around DevOps, to be able to understand it better.

However, I ran into some quite interesting behaviour (which I STILL have no idea the reason for, but more on this later), and thought that I would document it.

Right – let’s start off with manual deployments. As we know, manual deployments are done through the user interface. A user (with necessary permissions) would do the following:

  1. Go into the DEV environment, and export the solution (regardless of whether this is managed or unmanaged)
  2. Go into the target environment, and import the solution

Pretty simple, right?

Now, from a DevOps point of view, the process is similar, though not quite the same. Let’s see how it works (there’s a sketch of what this can look like after the steps below):

  1. Run a Build pipeline, which will export the solution from the DEV environment, and put it into the repository
  2. Run a Release pipeline, which will get the solution from the repository, and deploy it to the necessary environment/s
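
To make this a little more concrete, here’s a minimal sketch of what those two stages can look like, using the Microsoft Power Platform Build Tools tasks in YAML. Note that the service connection names, solution name & file paths are placeholders of my own, and I’m publishing the export as a pipeline artifact for simplicity (you may instead be unpacking & committing it to the repository itself):

```yaml
# Build pipeline (sketch): export the managed solution from DEV,
# then publish it as a pipeline artifact.
# 'DEV-ServiceConnection' & 'MySolution' are placeholder names.
steps:
  - task: PowerPlatformToolInstaller@2

  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'DEV-ServiceConnection'
      SolutionName: 'MySolution'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution_managed.zip'
      Managed: true

  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: 'drop'
```

```yaml
# Release pipeline (sketch): pick up the artifact & import it
# into the target environment.
steps:
  - task: PowerPlatformToolInstaller@2

  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'UAT-ServiceConnection'
      SolutionInputFile: '$(Pipeline.Workspace)/drop/MySolution_managed.zip'
```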

All of that runs (usually) quite smoothly, which is great.

Now, let’s talk for a minute about managed solutions. I’m not going to get into the (heated) discussion around managed vs unmanaged solutions. There’s enough that’s been written, said, and debated around the topic to date, and I’m sure it will continue. Obviously we all know that the Microsoft best practice approach is to use managed solutions in all non-DEV environments.

Anyway – why am I bringing this up? Well, there’s one key difference in behaviour when deploying a managed solution vs an unmanaged solution (for a newer solution version), and this is to do with removing functionality from the solution in the DEV environment:

  • When deploying an unmanaged solution, it’s possible to remove items from the solution in the DEV environment, but when deploying to other environments, those items will still remain there, even though they’re no longer present in the solution. Unmanaged solution deployments are additive only, and will not remove any components
  • When deploying a managed solution, any items removed from the solution in the DEV environment will also be removed from the other environments when the solution is deployed there. Managed deployments are both additive & subtractive (ie if a component isn’t present in the solution, it will be removed when the solution is deployed)

Now most of us know this already, which is great. It’s a very useful way to handle matters, and can assist with handling a variety of scenarios.

So, let’s go back to my first question – why am I writing this post? Well…it’s because of the difference in behaviour between manual & automated deployments, which I discovered. Let’s look at this.

When deploying manually, we get the following options:

The default behaviour (outlined above) is to UPGRADE the solution. This will apply the solution with both additive & subtractive behaviour. This is what we’re generally used to, and essentially the behaviour that we’d expect with a managed solution.

Now, when running a release pipeline from Azure DevOps, we’d expect this to work in the same way. After all, systems should be built to all work in the same way, right?

Well, no, that’s not actually what happens. See, when an Azure DevOps release pipeline runs, the default behaviour is NOT to import the solution (we’re talking managed solutions here) as an upgrade. Instead, it imports it as an UPDATE!!!

This is what was really confusing me. I had removed functionality in DEV, ran the build pipeline, then ran the release pipeline. However, the functionality (which I had removed from DEV) was still present in UAT! It took me a while to find out what was actually happening underneath…

So how can we handle this? Well, apart from suggesting to Microsoft that they should (perhaps) make everything work in the SAME way, there’s a way to handle it within the release pipeline. For this, it’s necessary to do two things:

Firstly, on the ‘Import Solution’ task, we need to set it to import as a holding solution.

Secondly, we then need to use the ‘Apply Solution Upgrade’ task in the release pipeline.

What this will do is then upgrade the existing solution in the target environment with the holding solution that’s just been deployed.

Note: You will need to change the solution version to a higher number, in order for this to work properly. I’m going to write more about this another time, but it is important to know!
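
To illustrate, here’s roughly how those two tasks fit together in YAML, following on from the sketch earlier (again, the service connection & solution names are placeholders of my own):

```yaml
# Release pipeline fix (sketch): import as a HOLDING solution first...
- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'UAT-ServiceConnection'
    SolutionInputFile: '$(Pipeline.Workspace)/drop/MySolution_managed.zip'
    HoldingSolution: true

# ...then apply the upgrade, which removes any components that are no
# longer present in the solution (ie the manual UPGRADE behaviour).
# As per the note above, the incoming solution file needs to carry a
# higher version number than the one already installed.
- task: PowerPlatformApplySolutionUpgrade@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'UAT-ServiceConnection'
    SolutionName: 'MySolution'
```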

So in my view, this is a bit annoying, and perhaps Microsoft will change the default behaviour within DevOps at some point. But for the moment, it’s necessary to do this.

Has this (or something similar) tripped you up in the past? How did you figure it out? Drop a comment below – I’d love to hear!

Aaron Ralls on The Oops Factor

Finishing up 2021 by chatting to ‘Chief Technical Penguin’ Aaron Ralls about his love of Cajun-inspired cooking (delicious!), & the importance of setting ourselves up for financial security (not just ‘the moment’).


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

Clifton Lenne on The Oops Factor

Talking with Clifton about his love of travel, the importance of community with both technical & non-technical matters, and just how vital it is to understand Digital Transformation. Without the correct mindset for this, it’s going to be difficult to succeed!


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.