Eswar Prakash on The Oops Factor

Talking to Eswar about his dual love of AI & music (some interesting correlations!), & covering what could happen with specific commands being used on a NON-Microsoft system…


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

Damien Bird on The Oops Factor

Talking to the community legend that’s Damien about his vegetable garden, how he got into gardening in the first place, and a sudden medical condition that came out of nowhere and has had an impact on life moving forward…


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

Ken Auguillard on The Oops Factor

Chatting to the AMAZING rockstar that is Ken around our common love of BBQ – find out if he’s a low/slow or high/fast kind of guy. Also touching on a motorbike incident some years back that shaped his approach to life moving forward – some VERY powerful lessons to hear & learn from!


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

Hardit Bhatia on The Oops Factor

Talking to the AMAZING Hardit about his love of sports, finding out just WHO his ultimate sports hero is, & discovering a life-changing event that happened as a child, with how it changed his life since…


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

Kristine Kolodziejski on The Oops Factor

Finding out from Kristine about her love of chocolate & drumming (not sure which one is more important to her?), some challenges when she was growing up, and how her career has evolved over time!


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

Interacting with Microsoft

People sometimes wonder about the best way to interact with Microsoft. In fact, this post isn’t strictly aimed at interacting with Microsoft, but can also be taken as a general guide to interacting with any organisation. The reason for deciding to write about this comes from a conversation that I had last week with a good friend, who was struggling to find a resolution to an issue.

Let’s start at the beginning. We, or our customers, have relationships with suppliers such as Microsoft. We’ll order software (licenses), need to have them supplied to us (show up in our account), and sometimes there may be issues that we need help/support with. There are obviously general support channels available that support tickets can be raised through, but there are also other avenues to consider as well.

Apart from the ‘professional’ relationship/s that may be in place, we may also have ‘personal’ relationships with members of different teams. These can happen in various ways, such as speaking together at events, organising communities, etc. They are very valuable to have in place, & many people that I know, myself included, strive to improve & increase the network & connections that we have with Microsoft & other organisations.

However, there’s something very important to keep in mind. Just as we are doing our day job (what we’re paid to do), they are as well. At the end of the day, they (as with ourselves) need to ensure that the job gets done.

So if we reach out to ask something from them, we’re essentially asking for a favour, usually without being able to offer anything reciprocal in return. A really good analogy for this, shared with me previously by Mark Smith & Chris Huntingford, is the ‘Sweet Jar Concept’.

Here’s how it goes. Imagine that the person has a jar with 100 sweets in it. There’s a limited number available (the number itself isn’t important though), and the person has to choose who to give the sweets to. If we ask for a favour without knowing them, it’s highly unlikely to be granted. Even if we do know them somewhat, it may still be unlikely – they’re not going to be getting any return on the sweet that they’re giving out. Potentially, if we know them well and have proven in the past that we’re of value to them, we’ll get a sweet.

But even if we do know them well, if we keep asking for sweets (aka favours), the likelihood of them being granted will diminish (rapidly). Again – there’s a limited supply of them, and we’re not going to be looked on favourably if we keep coming back & asking for more, whilst not giving anything in return.

So HOW could we go about this, to set ourselves up for success (ie getting the outcome that we desire)? Well, this is actually quite simple – we need to identify who will be gaining something by helping us. Let’s explain this in more detail.

Within Microsoft (& any organisation really), people have metrics that they need to meet for their role. These are usually referred to as KPIs (Key Performance Indicators), and are used for things like salary & role progression. What we should be doing is finding the right person (or team) that has one or more KPIs aligned to what we’re trying to accomplish.

Let’s use the example here of the situation with my friend last week. He had a client who had ordered licenses from Microsoft that were needed for a project to start, but they hadn’t appeared in the customer account yet. If the licenses weren’t there on time, the project would need to be delayed, which would be expensive (& very annoying) for the customer.

On hearing the situation, I suggested to him to find the person (or people) within Microsoft who’d be aligned towards ensuring the situation was remedied ASAP. Examples of these people could be:

  • Microsoft Account Manager. This person would be aligned from the Microsoft side to ensure that the customer would have everything that they needed to be successful
  • Microsoft Sales Team. If there was a sales team involved with the license purchase, they would be very aligned to ensuring that the licenses had actually been procured and were showing up in the customer account!
  • Microsoft Account Technology Strategist. This is the person responsible for designing the strategy and architecture to drive digitalisation and innovation for the customer

Now the above list isn’t exhaustive, and is of course specific to the scenario above. Additionally, the people mentioned might not be able to deal with the situation themselves, but if not, they’re more than likely to know the right person/team who can.

With this approach, we’d be lined up for success in three ways:

  1. We’d (hopefully) get the immediate situation looked at and resolved
  2. We’d be giving our connections the ability to align to their KPI’s, and show results for them
  3. We’d be showing our value to our connections, which can then help if we have a favour to ask in the future (one that’s not necessarily aligned to KPIs)

So in a nutshell – when we’re looking to get something dealt with or resolved, we should ask ourselves who’s best placed professionally to help us, in a way that lines up with their own professional goals. This way we can drive value, as well as spreading goodwill all round.

Have you ever been in a situation where this may have helped? How did you handle it? I’d love to hear – please drop a comment below!

Liz Pham on The Oops Factor

Talking to the wonderful Liz around her love of cooking (in specific circumstances), her rock climbing hobby, and some very important lessons learned from a specific rock climbing incident!


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

Power Platform ALM Changes

As a starter for 10, if you haven’t yet looked into ALM for Power Platform, you should most definitely be doing so! ALM is, of course, Application Lifecycle Management. This is how, in a nutshell, we move solutions between environments.

In the good old days, this was done manually of course (CRM 4.0, I’m looking at you!). Today, though it’s of course still possible to export/import solutions manually, it’s not the Microsoft best practice method. Doing it manually also means that it’s unlikely you’ll have appropriate source control for your solutions, which, let’s face it, isn’t the best.

Want to look at a previous solution version? Hmm – do you still have it saved on your machine or not?

So we should generally know why we’d want to use ALM. But which tooling do we actually use for it? Going back to the on-premises days, there was TFS (or Team Foundation Server, to give its full name). This was a full source control repository, allowing developers to check in/check out code, build solutions, deploy them, etc.

With the move to cloud-based systems, the TFS replacement is Azure DevOps (or ADO, as it’s usually referred to). ADO works in essentially the same way as TFS did (there are some differences, but they’re not really relevant here), but does so through the cloud.

When it comes to Power Platform solutions, ADO uses the ‘Power Platform Build Tools’ capabilities to hook into Dataverse & pick up solutions. The toolset essentially gives ADO the ability to connect to a Power Platform environment, build/export solutions, deploy solutions, etc.

More information on the toolset can be found at Microsoft Power Platform Build Tools for Azure DevOps – Power Platform | Microsoft Docs
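To make that a bit more concrete, here’s a rough sketch of what a simple pipeline using the current tasks could look like – exporting a solution from an environment and unpacking it for source control. Please treat this as indicative only: the service connection and solution names are placeholders, and the exact task names/inputs should be checked against the Docs link above.

```yaml
# Indicative sketch only – exports a solution from a Dataverse environment and
# unpacks it into source control format. Names below are placeholders.
steps:
  - task: PowerPlatformToolInstaller@0

  - task: PowerPlatformExportSolution@0
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'MyServiceConnection'       # placeholder service connection
      SolutionName: 'MySolution'                    # placeholder solution name
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'

  - task: PowerPlatformUnpackSolution@0
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'
      SolutionTargetFolder: '$(Build.SourcesDirectory)/solutions/MySolution'
```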

Now, there are some limitations to the Power Platform Build Tools. In fact, I’d be so bold as to say that they’re currently not in a fully mature state. It’s not possible to do everything that you can do manually (well, not with the inbuilt capabilities – there are some ‘hacks’ around that can extend them). At the moment, it’s essentially 1.0.

Well, Microsoft is announcing that they’re now releasing 2.0 of the Power Platform Build Tools this week!

In fact, this is so new that at the time of writing, there’s no Microsoft Docs available for this! So what does version 2.0 bring, and why is Microsoft releasing a new version?

So Microsoft has actually had this in planning for a while. There’s a lot going on with GitHub, as we well know, and Microsoft wants to drive forward a consistent experience for users. At the moment, the two work in somewhat different ways, and the aim is to bring them to parity.

The main change in the new version is that instead of the tasks being PowerShell-based (as they currently are), they will now be Power Platform CLI-based. So Microsoft is changing the underlying working method from PowerShell to the CLI. Some of us will, of course, already be familiar with the way that the CLI works, and it’s really nice to see that its capabilities will now be part of ADO.

Now, don’t start worrying that your current ADO pipelines (v0) will suddenly stop working. Microsoft is not doing anything with v0 at this point in time (though they may potentially deprecate it in the future). So all of your existing ADO pipelines using the Power Platform Build Tools will continue to work, but no new features will be released for that version.

In terms of switching to using v2, it’s really quite simple – you’ll need to change the task version on each of the tasks (in the classic pipeline editor, each task has a version selector).

If you are currently using YAML (as so many wonderful developers do) to author pipelines, you’ll need to do the following in the YAML code:
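The snippet below is just a rough sketch of that change (assuming the tasks simply move from version 0 to version 2 – the tasks shown are examples from the Build Tools set, with placeholder input values):

```yaml
steps:
  # Sketch only: bump each Power Platform Build Tools task reference from @0 to @2;
  # the inputs themselves stay the same as in the existing pipeline
  - task: PowerPlatformToolInstaller@2      # previously: PowerPlatformToolInstaller@0
  - task: PowerPlatformExportSolution@2     # previously: PowerPlatformExportSolution@0
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'MyServiceConnection'   # placeholder service connection name
      SolutionName: 'MySolution'                # placeholder solution name
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'
```

The same version bump applies to every other Build Tools task that the pipeline uses.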

It’s very important to note that it’s not possible to mix and match task versions. If you do this, the ADO pipeline will fail, so please don’t try this!

I’m really excited about this, and to see that the CLI capabilities are being brought into play for ADO. I’ll admit that I’m wondering what else will be released (in the fullness of time), as I’m sure that this is just the start of some great new stuff!

One of the things that I’m REALLY hoping for is the ability to use ADO pipelines to migrate Power Apps Portals (or Power Pages), as currently it’s only possible to do this using the Power Platform CLI or the Configuration Migration Tool. It would be amazing to be able to do these with ADO pipelines as well!
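For context, here’s roughly what that looks like today with the CLI, wrapped in an ordinary ADO script step as a workaround. This is only a sketch: the command names/parameters are to the best of my knowledge (do check the pac CLI docs), and the org URL, pipeline variables and website ID are all placeholders.

```yaml
# Sketch of a workaround: running the Power Platform CLI from a plain ADO script step,
# as there's no dedicated Build Tools task for portals yet.
# Assumes the pac CLI is available on the agent (e.g. via the Power Platform Tool Installer).
# Placeholders: the org URL, the pipeline variables, and the website ID below.
steps:
  - script: |
      pac auth create --url https://myorg.crm.dynamics.com --applicationId $(ClientId) --clientSecret $(ClientSecret) --tenant $(TenantId)
      pac paportal download --path $(Build.SourcesDirectory)/portal --webSiteId 00000000-0000-0000-0000-000000000000
    displayName: 'Download portal configuration with the pac CLI (sketch)'
```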

Hillel Fuld on The Oops Factor

Talking to Hillel about his love of drones – how he got into it to start off with, the joys of it, and why he thinks that they’re so amazing! Also talking about entrepreneurs, launching new products/services, and the KEY things to be aware of when bringing something new to the market!


If you’d like to come appear on the show, please sign up at http://bit.ly/2NqP5PV – I’d love to have you on it!

Click here to take a look at the other videos that are available to watch.

PL-500: Microsoft Power Automate RPA Developer

RPA (or Robotic Process Automation) is a capability that Microsoft has been developing for a while within the Power Platform space. Whilst cloud flows can be used to interact with any system that has an API in place, many organisations have (legacy) systems that have no API, so interacting with them can be challenging. RPA capabilities allow organisations to interact with any system overall, thereby enabling & empowering businesses holistically.

I’ve been aware for a while that there’s been an exam coming out for RPA, though it’s taken a bit of time to land. That’s fine – I can’t really think of any absolute rush to have it in place. I do think that over time, just as with some of the other certifications, it will become a requirement for solution or specialisation status.

The official page for it is at https://docs.microsoft.com/en-us/certifications/exams/pl-500. The specification for it is:

Candidates for this exam automate time-consuming and repetitive tasks by using Microsoft Power Automate. They review solution requirements, create process documentation, and design, develop, troubleshoot, and evaluate solutions.

Candidates work with business stakeholders to improve and automate business workflows. They collaborate with administrators to deploy solutions to production environments, and they support solutions.

Additionally, candidates should have experience with JSON, cloud flows and desktop flows, integrating solutions with REST and SOAP services, analyzing data by using Microsoft Excel, VBScript, Visual Basic for Applications (VBA), HTML, JavaScript, one or more programming languages, and the Microsoft Power Platform suite of tools (AI Builder, Power Apps, Dataverse, and Power Virtual Agents).

Now here’s the thing. I occasionally work in the automation space, either on customer projects, or when training users in the technologies. I wouldn’t describe myself as an advanced automation developer (whether cloud or RPA capabilities). I’m most definitely NOWHERE near the level of legends such as Matt Collins-Jones, for example (go check him out if you don’t know about him!).

So I knew that I might be a bit challenged when taking the exam, especially in the more ‘pro dev’ space (aka JSON etc). In fact, I didn’t actually realise that the exam specification included that sort of thing. I know, I should have – it’s aimed at developers overall… which shows that I need to brush up on reading things properly!

Also, there’s still quite a bit of a focus on Power Automate cloud flows – it’s not JUST about RPA capabilities.

Now, really nicely, there are already Microsoft Learn pathways available (which have been around for a while, and have been updated appropriately). This really is a big help, I feel, especially for people who are new’ish to RPA.

Of course, there’s a lovely shiny two star badge awarded when passing the exam, along with the title of ‘Microsoft Certified: Power Automate RPA Developer Associate’:

As with previous exams, I sat it from home (the proctored experience). Learning from previous times that I’ve taken exams, I ensured that my workspace was entirely clear from everything. As a result, the check-in process happened automatically, and I didn’t need to engage with any proctors at all (which was quite nice actually).

As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) I’ve tried to group things together as best as possible for the different subject areas.

  • Cloud flows vs RPA flows
    • Capabilities of each
    • When to use each (ie how to handle different scenarios)
    • How to trigger each one
  • Cloud flows
    • Different types of triggers, & when each type should be used
    • Different types of actions, and the capabilities of them (at a high’ish level – expected to know common Microsoft actions, but not need to know all of the hundreds of different ones!)
    • Controls/operators. What they are, how they can be used to accomplish different requirements
    • JSON formatting & syntax
  • Business Process flow vs Business Rules
    • What each is
    • When to use each one
    • Capabilities
  • RPA flows
    • Common actions, how they work, capabilities of them
    • How expression syntax works within them
    • Debugging capabilities, and what to use when
    • How to interact with desktop applications
    • How to interact with websites
      • How data values can be used
      • How data tables can be used
      • How to use data that’s extracted from a website
    • Troubleshooting functionality
  • Usage of automation capabilities from Office 365 applications such as Excel & Visio
  • Loops
    • How they work for cloud & RPA flows
    • Troubleshooting
    • Implementing success/fail criteria
    • Error handling
  • Process Advisor
    • What it is
    • What it does
    • How it can help organisations
    • Limitations
    • What it cannot do
    • Process Mining vs Task Mining, & the important differences between them
  • Variables
    • How to handle variables across different environments
    • How to declare them (cloud flow vs RPA flow)
  • Runtime operations
    • How flows are triggered (async vs sync)
    • How flows are queued (cloud vs RPA)
    • How RPA flows are carried out when using machine groups
  • Artificial Intelligence (AI) capabilities
    • How AI can be used within flows
    • Different AI capability types (what each one can be used for)
    • AI within Power Platform, & AI within Azure Cognitive Services
  • Sharing flows
    • Different ways to share cloud flows
    • Different ways to share RPA flows
  • Application Lifecycle Management (ALM)
    • Solutions (managed vs unmanaged). Capabilities of each, when to use each type
    • Azure DevOps (ADO). What it is, when/how to use it, capabilities
    • Solution imports
    • Solution layers. What these are, troubleshooting functionality
    • Upgrade/Stage for Upgrade/Update. What each is, what each does, how/when to use each one
    • Moving desktop flows between users
  • Security
    • Security roles needed to create
    • Security roles needed to share/modify
    • Security roles needed to register machine for RPA
    • Security roles needed to register machine groups for RPA
    • Security requirements to run different types of RPA flows (how it interacts with desktop/s)
    • Data Loss Prevention (DLP) – how it affects creation & runtime of flows

Overall, I had 46 questions, with a single case study. I’m used to having at least two case studies, so it was nice to have just one of them this time.

So… it’s a lot of stuff. It’s definitely targeted much more at the ‘pro-developer’ end of the scale than someone who might occasionally automate things. It’s absolutely necessary to understand coding conventions, ALM, etc.

It’s definitely an exam where, if you’re not already hands-on with the skills needed, I’d highly recommend getting a decent amount of experience before taking it! Make sure that you have an environment in which you’re able to be hands-on with all types of automation (cloud & desktop flows), and that you really understand how they can be handled with an eye on enterprise scale!

If you’re aiming to take it – I wish you the very best of luck, and let me know your experience!