Developer environments – new capabilities to create for users

Developer environments are awesome. There – I’ve said it for the record. Formerly known as the ‘Community Plan’, developer environments let users play around with things, get up to speed, test out new functionality, etc. They’re free to use – even with premium capabilities & connectors, users do not need premium licensing in place (caveat – if the environment is enabled as a Managed Environment, it will require premium licensing).

Originally, users were only able to create a single developer environment. However, earlier this year Microsoft lifted this restriction – users are now able to create up to THREE developer environments for their own usage (which makes it even easier for users to get used to ALM capabilities, and try them out for themselves).

Now, the ability for users to create developer environments is controlled at the tenant level, and it’s either On or Off. It requires a global tenant admin to modify this setting, but it’s not possible to say ‘User Group A will not be able to create developer environments for themselves, but User Group B will be able to’.
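For admins who prefer to manage this setting through scripts, the tenant-level toggle can also be read & flipped with the Power Apps administration PowerShell module. Below is a minimal sketch; note that the exact property path for the setting is my assumption, so check it against what Get-TenantSettings actually returns in your tenant before relying on it:

```powershell
# Sketch: inspecting & flipping the tenant-wide developer environment setting.
# Requires the Microsoft.PowerApps.Administration.PowerShell module & a tenant admin account.
# NOTE: the property path below is an assumption - check the object returned by
# Get-TenantSettings in your own tenant before relying on it.

Import-Module Microsoft.PowerApps.Administration.PowerShell
Add-PowerAppsAccount

$settings = Get-TenantSettings

# Assumed property: when $true, non-admin users cannot create developer environments
$settings.powerPlatform.governance.disableDeveloperEnvironmentCreationByNonAdminUsers = $true

Set-TenantSettings -RequestBody $settings
```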

Organisations have differing viewpoints on whether they should allow their users the ability to create developer environments or not. I know this well, as usually I’m part of conversations with them when they’re debating this.

One of the main challenges when organisations don’t allow users to create their own developer environments has been that, historically, it hasn’t been possible for someone else to create the environment on their behalf. If we think of ‘traditional IT’, if we’re not able to do something due to locked-down permissions, we can usually ask ‘IT’ to do it for us, and grant us access. This hasn’t been the case with developer environments though – well, not until recently.

Something that I do from time to time is chat with the Microsoft Product Engineering groups, to provide feedback that (hopefully!) helps iterate the products forward. One of the conversations I had in the summer was with the team responsible for developer environments. I was able to share experiences & conversations that I’d been having with large-scale enterprise organisations, and (very politely!) asked if they could look to open up the ability to do something around this.

Around a month ago, the first iteration of this dropped – in the Power Platform Admin Centre interface, it became possible to specify the user for whom an environment was to be created!

This was an amazing start, and it would definitely begin unblocking Power Platform IT teams to enable their users, in circumstances where their organisations had decided to turn off the ability for users to create their own developer environments.

However, this still needed to be done manually. Unless an RPA process was used (which, let’s face it, would be clunky & undesirable), someone with appropriate privileges would need to go & actually create the environment, and associate it to the user.

However, this has now taken another MASSIVE step forward – I’m delighted to announce that this capability has been implemented in the Power Platform CLI, and is live RIGHT NOW (you’ll need to upgrade to the latest version – it’s present in 1.28.3 onwards).

So, with this in place, it’s now possible to use PowerShell commands to create developer environments on behalf of users, and assign them to those users. Organisations usually already have PowerShell scripts to handle new joiners, and will therefore be able to integrate this capability into them, to automatically set up developer environments for users. Alternatively, existing users could look to raise internal requests, and have them automated through PowerShell (along with appropriate approval processes, of course!).
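To give an idea of how this could slot into a joiner script, here’s a minimal PowerShell sketch that loops through a list of users and calls the CLI for each one. The user principal names are made up, and the exact parameter names (particularly the one that specifies the target user) should be checked against ‘pac admin create --help’ on your installed CLI version:

```powershell
# Sketch: creating developer environments on behalf of users via the Power Platform CLI.
# Assumes pac CLI 1.28.3 or later is installed, and that you're already authenticated
# (pac auth create) with an account that has the necessary admin privileges.
# The parameter names (especially the one for the target user) are assumptions -
# verify them with 'pac admin create --help'.

$newJoiners = @(
    'jane.doe@contoso.com',    # hypothetical users
    'john.smith@contoso.com'
)

foreach ($upn in $newJoiners) {
    Write-Host "Creating developer environment for $upn..."

    pac admin create `
        --name "Developer - $upn" `
        --type Developer `
        --user $upn

    if ($LASTEXITCODE -ne 0) {
        Write-Warning "Environment creation failed for $upn"
    }
}
```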

So this is really nice to see. However, I think it can still go one step further (at least!), and I’m trying to use my network of connections to raise this with the right people.

See, we have the Power Platform for Admins connector within Power Platform already. One of the functions available in this is to be able to create Power Platform environments:

However, if we look at the action (& the advanced settings within this action), there’s no ability to set this:

Interestingly enough, the API version listed by default is actually several years old. By doing some digging around, I can see that there are multiple later API versions, so I’m not sure why it’s using an older one by default:

What would be really amazing is to have these capabilities surfaced directly within Power Platform, using this connector. Then we could look to have everything handled directly within Power Platform. Given that the CoE toolkit already includes an Environment Request feature, I would see this as building on top & enabling it even further. Obviously organisations wouldn’t need the CoE toolkit itself, as they could look to build out something custom to handle this.

What are your thoughts on this – how do you see these features enabling your organisation? If your organisation HAS locked down the ability for users to provision developer environments, are you able to share some insights as to why? I’d love to hear more – drop a comment below!

MB-910: Microsoft Dynamics 365 Fundamentals Customer Engagement Apps

So here’s the thing. There used to be the MB-900 exam, which was the Microsoft Dynamics 365 Fundamentals exam. This was aimed at people who had a basic knowledge of Dynamics 365, and it was really the base/entry-level exam into the qualifications for it.

However, Dynamics 365 actually comprises two ‘parts’. There’s the ‘front office’ part that’s usually referred to as Customer Engagement (well, depending on how Microsoft wish to refer to it, which can change from time to time!), and there’s the ‘back office’ part, which is the ERP side of things. This is the finance & operations sphere, where those functions take place.

The MB-900 was a slightly strange exam, in my opinion, because it covered both. There were questions around things like Sales, Customer Service, etc, but there were also Supply Chain Management questions, for example. Now I’m not saying that people shouldn’t know about both ‘sides’ of the equation, but people usually (for the most part) handle one or the other. It’s generally unusual to find someone knowledgeable about both.

Furthermore, if we take a look at the more in-depth exams in the MB range, we find that there’s a definitive split there. The MB-2xx series covers Customer Engagement, whereas the MB-3xx series covers the ERP side of things. So it’s definitely not the norm to have both sides included in a single exam.

Microsoft came to the same realisation, and have therefore decided to update the Fundamentals space. In doing this, they’ve split things out. There’s the MB-910 exam (which is what this post is about), and the MB-920 exam, which focuses specifically on the ERP space. A good move, in my opinion.

The MB-910 launched this past weekend, and I took it around a day after it went live. Let’s go take a look at it, and recap my experience with it.

The official description of the exam is:

This exam covers the features and capabilities of Microsoft Dynamics 365 customer engagement apps.

Candidates for this exam should have general knowledge of or relevant working experience in an Information Technology (IT) environment. They should also have a fundamental understanding of customer engagement principles and business operations.

Taking it leads to the qualification for ‘Microsoft Certified: Dynamics 365 Fundamentals Customer Engagement Apps (CRM)’.

The description around the qualification is:

If you’re familiar with business operations, customer relationship management (CRM), and are IT savvy—either generally or through work experience—take advantage of this certification to highlight those skills. Validate your broad exposure to the customer engagement capabilities of Dynamics 365 to enhance your career journey.

People in different roles and at various stages in their careers can benefit from this fundamentals certification. Here are some examples:

IT professionals who want to show a general understanding of the applications they work with

Business stakeholders and others who know Dynamics 365 and who want to validate their skills and experience

Developers who want to highlight their understanding of business operations and CRM

Students, recent graduates, and people changing careers who want to leverage Dynamics 365 customer engagement capabilities to move to the next level

The official page for the exam is at https://docs.microsoft.com/en-us/learn/certifications/exams/mb-910 where it gives quite a good overview of things. Go take a look at it, and also take a look at the associated learning paths.

Once again, I sat the exam through the proctored option (ie from home). This is the way that I now usually take exams (even if I could go to an exam centre, I think that I’d be unlikely to, given the travel/time needed!). Checking in for the exam went without issues (the process definitely seems to be getting smoother each time), and I was ready to go within a few minutes.

As in my previous exam posts, I’m going to stress that it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) I’ve tried to group things together as best as possible for the different subject areas.

  • Project Operations
    • Scheduling resources
    • Entering project time/costs
    • Skills
    • Roles
    • Different types of project costings
  • Customer Service
    • SLAs, what they are, which ones to use
    • Omnichannel, including capabilities and channel functions/availabilities
    • Power Virtual Agents
  • Sales
    • Lead processes, deactivating & reactivating
    • Opportunity processes
    • LinkedIn Sales Navigator. How it interacts, which capabilities it has within it, how it works
    • Quotes. How they work, what’s required to handle them, document generation
  • Marketing
    • Website forms
    • Automation around responses
    • A/B testing
    • Event management
  • Field Service
    • Work orders
    • Route optimization
    • Scheduling boards
  • Document options
    • Attachments that users can access within the system, as well as outside of Dynamics 365
    • File collaboration tools, and integration with them
  • Timelines & activities
  • System currencies, default options, additional currencies, and updating them
  • Understanding different types of tables, and when you’d use each one
  • Reporting capabilities
    • How data is able to be reported on
    • Report Builder Wizard
    • Reporting on data held in Dataverse
    • Reports in dashboards
    • Usage of Power BI, including data gateways

I was slightly surprised with the level of detail in some of the areas. I wasn’t, for example, expecting the emphasis on Project Operations and Field Service that came up for me. Some of the level of detail seemed more fitting for an MB-2xx exam than this Fundamentals exam.

In a similar vein, I also wasn’t expecting Power BI and Power Automate to feature so much. Perhaps that’s just my own perspective, though obviously with the Power Platform involved they would be there. However, there is the PL-900 exam, which covers Power Platform capabilities, and that’s where I’d expect those sorts of questions to be, rather than in this exam.

Otherwise I think that it was generally on point for what I’d expect to find at this level of exam. The questions have definitely evolved over time, and I found myself giving more consideration to answers than I would have on the previous version.

It’s a good place to start for people who are looking to get qualified around Dynamics 365! If you do decide to take it, please drop a comment below to let me know how it was for you – I’d love to hear about your experience!

Managed Solutions, & replacing a field

Well to start with, I’m sure that I’m going to get pulled up by some people for my use of the word ‘field’ in the title. After all, officially it’s now a ‘column’! But I (still) can’t let go of calling them what I’ve called them for over a decade, so field it is.

Now to the actual topic of this blog post, which is centred around Managed Solutions. Leaving aside the whole debate about whether we should be using managed or unmanaged solutions (& when/where to do each), there is one definitive benefit of using a managed solution.

See, unmanaged solutions are additive in nature. Work is done in the development environment, then deployed. Further work is done (additional items added, etc), and deployed, and they then appear in the downstream environments. However, if you delete an item in the development environment, it’s not removed when the solution is deployed downstream.

Managed solutions, on the other hand, are both additive & detractive. As with unmanaged solutions, items added in the development environment are also added downstream when deployed. However, if an item is removed from the solution in the development environment, it will also be removed when the solution is deployed downstream. It’s one of the useful ways to ensure that you don’t end up with random unused items just lying around in Production (which have a habit then of popping up in the Advanced Find window, for example). So it’s really quite handy for a lot of reasons to go down this route.

Well, I found myself going down this route recently, but with slightly unexpected results, I’ll freely admit…

The scenario was that we had deployed a managed solution to the UAT (test) environment on a client project. Then the client changed their mind (shock & horror!!) as to a specific item, and we needed to change it from a text item to a lookup item. Obviously (as per best practice, of course) this would need to be done in the development environment, and then released downstream. Given that this is a managed solution, I’d expect this to work without any issues. Well, it didn’t…

The change was made in the development environment (we deleted the old item, and ‘re-created’ it as a lookup with the same system name), we exported the solution as managed, and then went to import it into the UAT environment. It took the solution file, thought about it for a while (it’s somewhat of a large solution), & then errored:

Exception type: System.ServiceModel.FaultException`1[Microsoft.Xrm.Sdk.OrganizationServiceFault] Message: Attribute mdm_field is a String, but a Lookup type was specified.

Now I was somewhat confused by this message occurring. It’s not the first time I’ve seen it over the years, but in my previous experience I’ve seen it when handling unmanaged solutions. It happens when you delete an item in the development environment, re-create it as a different item type (with the same underlying system name), and then deploy it as unmanaged. The solution import in the second environment fails due to the difference in type (as it sees the same name). This, of course, is to be expected.

But here we’ve been using managed solutions for deployment, and as mentioned above, they’re detractive as well. The expected behaviour (at least from my side of things) would be that the system would note that the item type has changed, remove the old item, & import the new item. In my mind, that’s logical, but apparently not?

See, even managed solutions have their limitations, of which this is one. Having checked with several other people who I reached out to around this, I’ve discovered that it can’t work in the way that I was expecting it to. Instead, a specific process has to be followed:

  1. In the development environment, remove the item, & export the solution as managed
  2. In the downstream environment(s), deploy this (interim) managed solution. This will remove the item from the environments
  3. In the development environment, re-create the item with the different system type. Then export it as managed
  4. In the downstream environments, deploy this solution. This will then add the item (with the new system type) into the environment.

This means that development & deployment teams (if separate ones) need to co-ordinate around this, to ensure it’s done in the right way. It could also be developed/exported in succession, and then imported in succession as well (either manually, or through an Azure DevOps Pipeline, for example).
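As an illustration of what that succession could look like when scripted, here’s a rough PowerShell sketch using the Power Platform CLI. The solution name and auth profile names are placeholders, and the exact flag syntax (particularly for --managed) can vary between CLI versions, so treat this as a starting point rather than a finished pipeline:

```powershell
# Sketch of the two-stage deployment described above, using the Power Platform CLI.
# 'Dev' & 'UAT' auth profiles are assumed to already exist (pac auth create),
# and 'ClientSolution' is a placeholder solution name.
# Check 'pac solution export --help' for the exact --managed syntax on your version.

$solution = 'ClientSolution'

# Stage 1: the field has been REMOVED in dev - export & deploy the interim managed solution
pac auth select --name Dev
pac solution export --name $solution --path ".\${solution}_interim.zip" --managed

pac auth select --name UAT
pac solution import --path ".\${solution}_interim.zip"

# Stage 2: the field has been RE-CREATED in dev as the new type - export & deploy again
pac auth select --name Dev
pac solution export --name $solution --path ".\${solution}_final.zip" --managed

pac auth select --name UAT
pac solution import --path ".\${solution}_final.zip"
```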

This worked wonderfully for us, and to be honest, I was quite relieved after several hours of frustration with things. Even better, it was a Friday, so meant that the week could end well!

Have you ever come across this, and been frustrated as well? Have you got a similar story with something else that happened to you around solutions? Drop a comment below – I’d love to hear!

AI-900: Microsoft Azure AI Fundamentals

One of my recent decisions has been to explore the Azure space. There are several reasons behind this. CDS, as we (hopefully!) know, sits on top of Azure, and it’s useful to know the broader digital estate available on the platform.

I’ve also been looking into some of the Cognitive Services functions that are available within Power Platform. These all live in Azure, and are surfaced into Power Apps etc. It’s therefore good to know what can be done outside of the ‘Power Platform bubble’, and the options there.

Incidentally, a year ago I even built a canvas app that allowed you to take a picture of a motorbike tyre. Using AI Builder functionality, it then analysed if the tyre tread was legal or not! That was a really cool proof of concept.

So a good place to start, I thought, would be with the AI-900. This covers the fundamentals of the AI offerings that are in Azure. I had forgotten though that with fundamentals exams, there’s only 60 minutes available! Seeing the timer ticking down from that gave me a little surprise, though I managed to get through it (& pass!) in good time.

The official description of the exam is:

Candidates for this exam should have foundational knowledge of machine learning (ML) and artificial intelligence (AI) concepts and related Microsoft Azure services.

This exam is an opportunity to demonstrate knowledge of common ML and AI workloads and how to implement them on Azure.

This exam is intended for candidates with both technical and non-technical backgrounds. Data science and software engineering experience are not required; however, some general programming knowledge or experience would be beneficial.

The official page for the exam is at https://docs.microsoft.com/en-us/learn/certifications/exams/ai-900, where it gives quite a good overview of things. Go take a look at it, and also take a look at the associated learning paths.

Once again, I sat the exam through the proctored option (ie from home). Honestly, I think that my experience this time has probably been the best so far. I went through the usual system checks for signing in. The proctor came along, and within 30 seconds they had released the exam!

So, as before, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) I’ve tried to group things together as best as possible for the different subject areas.

  • Image recognition types
    • What each one is, what it’s used for
    • When to use for a specific scenario
  • Facial recognition
    • Different types available
    • What each one is, what it’s used for, when to use for a specific scenario
    • Limitations & issues that can occur when using it
  • Text:
    • Different recognition types
    • What each one is, what it’s used for, when to use for a specific scenario
    • Analytics. How this works, how to set up & use
    • Translation. Different options available, how they work, when to use for a specific scenario
    • Sentiment analysis. How it works, limitations, what’s needed to train a model
  • QnA Maker
    • What this does, how to set it up, how to train it
    • Generating material with it
    • Use with chatbots
  • Machine Learning
    • What this actually is, and what it does
    • How it works
    • Different types that are available, how they work, how to train a model
    • Classification options
  • Machine Learning Designer
    • How to use & set up
    • Different types of data/options used within it
    • Training & evaluation models. The steps needed for this, how to set it up correctly
    • Types of modules available
    • Validation sets
  • Chatbots
    • What they are
    • How/where they can be used
    • Limitations
    • Integration with other systems
  • Charts
    • Different charts that are available for use
    • Reading them correctly
    • Model types shown on them
    • Metrics!
  • Microsoft AI Principles
    • The different principles that are published
    • What each one means/refers to

Overall, it was quite good. The Microsoft AI Principles were new to me, and I had to guess at those (I went to look them up afterwards!). Other than that, some bits I breezed through, other parts I took careful stock of.

This is definitely an area that I’m going to continue exploring, and will be writing up further exams that I take in it. I’m curious what your experience of it has been – please drop a comment below to let me know!

PL-400: Microsoft Power Platform Developer Exam

I’ve been continuing with taking new exams as they come out. Having recently taken the MB-400 exam (see MB-400 Power Apps & Dynamics 365 Developer Exam), I was slightly surprised to see the announcement that it was going to be replaced!

Admittedly, I was also surprised (in a good way) that I passed the MB-400, not being a developer! It’s been quite amusing to tell people that I’m a certified Microsoft Dynamics Developer. It definitely puts a certain look on their faces, which always cracks me up.

Then again, the general approach seems to be to move all of the ‘traditional’ Dynamics 365 exams to the new Power Platform (PL) format. This obviously includes re-doing the exams to be more Power Platform centric, covering the different parts of the platform rather than just the ‘first party apps’. It’s going to be interesting to see how this landscape extends & matures over time.

The learning path came out in the summer, and is located at https://docs.microsoft.com/en-us/learn/certifications/exams/pl-400. It’s actually quite good. There’s quite a lot that overlaps with the MB-400 exam material, as well as the information that’s recently been covered by Julian Sharp & Joe Griffin.

The official description of the exam is:

Candidates for this exam design, develop, secure, and troubleshoot Power Platform solutions. Candidates implement components of a solution, including application enhancements, custom user experience, system integrations, data conversions, custom process automation, and custom visualizations.

Candidates must have strong applied knowledge of Power Platform services, including in-depth understanding of capabilities, boundaries, and constraints. Candidates should have a basic understanding of DevOps practices for Power Platform.

Candidates should have development experience that includes Power Platform services, JavaScript, JSON, TypeScript, C#, HTML, .NET, Microsoft Azure, Microsoft 365, RESTful web services, ASP.NET, and Microsoft Power BI.

So the PL-400 was announced on the Wednesday of Ignite this year (at least in my timezone). Waking up to hear of the announcement, I went right ahead to book it! Unfortunately, there seemed to be some issues with the Pearson Vue booking system. It took around 12 hours to be sorted out, & I then managed to get it booked Wednesday evening, to take it Thursday.

So, as before, it’s not permitted to share any of the exam questions. This is in the rules/acceptance for taking the exam. I’ve therefore put together an overview of the sorts of questions that came up during my exam. (Note: exams are composed from question banks, so there could be many things that weren’t included in my exam, but could be included for someone else!) It’s also in beta at the moment, which means that things can obviously change.

There were a few glitches during the actual exam. One or two questions had answers that didn’t make sense (eg line 30 does X, but the code sample finished at line 18), and question numbers seemed to jump back & forth (the first time that’s happened to me). I guess that I’ve gotten used to at least ONE glitch happening somewhere, so this was par for the course.

I’ve tried to group things together as best I can (from my recollection), to make it easier to revise.

  • Model-driven apps.
    • Charts. How they work, what drives them, what they need in order to actually work, configuring them
    • Visualisation components for forms. What they are, examples of them, what each one does, when to use each one
    • Custom ribbon buttons. What these are, different tools able to be used to create/set them up, troubleshooting them
    • Entity alternate keys. What these are, when they should be used, how to set them up & configure them
    • Business Process Flows. What these are, how they can be used across different scenarios, limitations of them
    • Business Rules. What these are, how they can be used across different scenarios, limitations of them
  • Canvas apps
    • Different code types, expressions, how to use them & when to use them
    • Network connectivity, & how to handle this correctly within the app for data capture (this was an interesting one, which I’ve actually been looking at for a client project!)
    • Power Apps solution checker. How to run it, how to handle issues identified in it
  • Power Automate
    • Connectors – what these are, how to use them, security around them, querying/returning results in the correct way
    • Triggers. What is a trigger, how do they work, when to use/not use them
    • Actions. What these are, how they can be used, examples of them
    • Conditions. What these are, how to use them, types of conditions/expressions/data
    • Timeouts. How to use them, when to use them, how to configure
  • Power Virtual Agents. How to set them up, how to configure them, how to deploy them, how to connect them to other systems
  • Power Apps Portals. Different types, how to set them up, how to configure them, how they can work with underlying data & users
  • Solutions
    • Managed, unmanaged, differences between them, how to use each one.
    • Deploying solutions. Different methods that can be used to do it, best practice for each, when to use each one
    • Package Deployer & how to use it correctly
  • Security.
    • All of the different security types within Dynamics 365/Power Platform. Roles/Teams/Environment/Field level. How to set up, configure, use in the right way.
    • Hierarchy security
    • Wider platform security. How to use Azure Active Directory for authentication methods, what to know around this, how to set it up correctly to interact with CDS/Dynamics 365
    • What authentication methods are allowed, when/how they can be used, how to configure them
  • ‘Development type stuff’
    • APIs. The different APIs that can be used, methods that are valid with each one, the Organisation service
    • Discovery URLs. What these are, which ones are able to be used, how they’d be used/queried
    • Plugins. How to set up, how to register, how to deploy. Steps needed for each
    • Plugin debugging/troubleshooting. Synchronous vs asynchronous
    • Component types. Actions/conditions/expressions/data operations. What these are, when each is used
    • Custom ribbon buttons. What these are, different tools able to be used to create/set them up, troubleshooting them
    • Javascript web resources. How to use these correctly, how to set them up on entities/forms/fields
    • Power Apps Component Framework (PCF). What these are, how to develop them, how to use them in the right way
  • System Design
    • Entity relationship types. What they are, what each one does, how they work, when to use them appropriately. Tools that can be used to display them for system design purposes
    • Storage considerations across different types, including CDS & Azure options
  • Azure items
    • Azure Consumption API. How to monitor, how to handle, how to change/update
    • Azure Event Grid. What it is, the different ways in which it can be used, when each source should be used
  • Dynamics 365 for Finance. Native functionality included in it

The biggest surprise that I had, when thinking back over things, was the inclusion of Dynamics 365 for Finance. Generally the world is split into ‘front of house’ (being Dynamics 365/Power Platform), and ‘back of house’ (Dynamics 365 for Finance & Supply Chain Management). The two don’t really overlap, though they’re supposed to be coming closer together over time. Given that this is going to happen, I guess it’s only natural that exam questions around each will come up!

Overall it was quite a good exam. Some of the more ‘code-style’ questions were somewhat out of my comfort zone, and I’ll freely admit to guessing some of the answers around them! Time will tell, as they say, to see how I’ve done in it.

I hope that this is helpful for anyone who’s thinking of taking it – good luck, and please do drop a comment below to let me know how you found it!

Keeping belief in oneself

Although I usually post around technical matters & such, occasionally I digress into personal reflection. After all, this is my personal blog, & I feel it’s sometimes good/relevant to share certain personal things. Today’s post is along those lines, though it does relate to a technical matter.

Let’s set the scene. As many of you know (either from knowing me personally, or from reading my blog posts), I’m from the ‘model-driven app’ background. Canvas apps are really cool, but I wouldn’t say that I’m a very advanced creator of them. I’m learning about them the whole time (well, when I have a free minute here & there). There are many people in the community who are far more advanced than I am, and I love being able to learn from them.

I’m also considered to be in ‘Delivery’. This is the fancy word for those who run/are involved in projects, rather than selling concepts to clients. I’d run a mile if someone tried to put me in a Sales role (though I do admire the power suits that Sales have, occasionally). I’ve done a bit of Pre-Sales (where I’m helping out from a technical perspective), but haven’t been heavily involved. It’s actually something that I’m trying to work on, with being a tech evangelist. After all, if people already know & rave about the tech, what’s left to evangelise to them about?!


So last week I get a call from our Sales team. They’re really nice, and know their stuff. However they’re not ‘techies’. They had a situation – we’d been talking to a client about a potential project, and the client told us to pitch for it. Brilliant, right? Well…

The client told us that we had 4 days until the pitch deadline. Not only did we need to pitch with the usual presentation pack (however would Sales operate without PowerPoint…?), we also had to do a live demo. Not of a completed product, but rather a Proof of Concept (PoC).

The only person available was….yes, you guessed it…me. There wasn’t anyone else around with the necessary knowledge/skills to create the PoC in the time-frame needed. I’ll freely admit that I was absolutely slammed with existing projects, but I wanted to be able to help out.

However, things then got ‘better’. And by ‘better’, I meant ‘interesting’. I got told who else was pitching to the client. Obviously I’m not going to mention any specific details here, but I knew who they were. More importantly, I figured that I had a very good idea of who from their side would be creating the tech, & doing the pitch.

Now as I’m not mentioning any identifiable details, I’m feeling free to say this. They’re not at my level of tech skills. They’re nowhere NEAR my level of tech skills. This is NOT because I’m better than they are. Totally the opposite – they’re SO far ahead of me with their knowledge of things, I can barely see the dust that they kick up in a race.

Knowing this, I knew that I couldn’t build a model-driven app (though it would have worked perfectly for the scenario/s we were given). I HAD to do a canvas app. But even with doing that, it wasn’t going to be anywhere near as good as what the other side would be able to put on.

The phrase ‘gibbering in fear’ does come to mind with my reaction to finding all of this out. I did feel slightly like a deer caught in the headlights. I wanted to do well, both for myself & my company, but I honestly had no idea how we could stack up.

A deer in the headlights: how I felt I looked!

Thankfully, my company has an extremely open culture, and I was therefore able to talk to my manager about it. He understood where I was coming from, but encouraged me to go for it & do what I would be able to create.

My wife also encouraged me to go for it. Well, actually her words were ‘it’s not sexy when a husband says that he can’t do it, so man up and go for it!’. Ha…after that I couldn’t very well NOT do it.

So I applied myself, and with some VERY late nights (I did have other projects on, as I mentioned above), managed to get something in place. Not only did I create it, I think it looked really good. There was some really nice (canvas app) functionality, and it all came together pretty well.

Everything was in place in time (including some last minute tweaks). I even decided to spice up the demo a bit, and borrowed some dinosaurs from the kids to use for personas. We were using live camera feeds for part of the demo, and suddenly the demo was joined by ‘Rexy’, the ‘Customer Service Representative’ T-Rex! They were quite amused by it (thankfully!), and our team thought it was absolutely hilarious.

‘Good afternoon, how may I be of assistance?’

I have no idea how the other partner pitched to the client, or what the decision will be from the client. It’s way too early for that.

What I do know is that sometimes we can lose track of ourselves. I’m not going to go into the subject of ‘Imposter Syndrome’ (check out Em D’Arcy if you want to read up about that). Rather, having others around to encourage us, even when they may be more skilled than we are, can really make the difference.

In life, we can often face challenges. How we handle them, and how we decide to move forward, can define who we are. When dealing with technology items such as the Power Platform, where there’s constant change, it can sometimes feel very daunting, but we still need to push ahead.

Yesterday I was listening to Lisa Crosbie talking about her journey into technology (and canvas apps). As she put it – ‘there is no comfort zone here – you need to find a place to feel comfortable with this level of discomfort, and ride it to be successful’. It’s really so true. It’s not just about pushing ourselves in the traditional way, but about keeping up our own confidence in our skills & abilities. With this, we can continue to drive forward, keep on learning, and continue our journey of greatness!

I’m really glad that I was able to do this, and hope that I can keep this with me. By doing so, I’ll be able to continue along my own journey.

Have you ever had a time when a challenge seemed insurmountable? How did you cope with it? Drop a comment below – I’d love to hear!

Microsoft Dynamics 365 Certifications

Having recently completed several exams, including the new MB-900 Fundamentals for Dynamics 365, I thought it would be useful to set out how the new exam structure works, and what paths can be taken within it.
This post is meant to be for D365 CE, not for F&O (I’m hoping to do a separate post on that at another time).


The first question that usually comes up around certifications is ‘why should I take the exams – I know how to use/configure/deploy the system!’.
The answer to this is actually quite easy – if you know the stuff, then the exams won’t be too hard for you. They’ll also give you a better overview of things, especially due to the new curriculum (eg including cloud offerings, etc).

Not only is it rewarding for you to take (and pass!) them, it shows that you’re able to do so (and you get cool badges…thanks Microsoft for gamifying things lol).
Additionally it can also help your company to qualify for different Microsoft Partner tiers, which can be quite important in the grand scheme of things (I am NOT going to talk about the recent IUR situation…)

It can also help when applying for a job position, as recruiters will check to see if you’re current with the latest exams. Experience is great of course, but they’ll want to know why you may not have any (recent) exams to show your knowledge.

The first exams in the series that I’d recommend to take are:

The MB-900, as per the name, goes over the fundamentals of Dynamics 365, and also gets you used to the new format (it’s now 60 minutes, with approx 25 questions). There are now drag’n’drop questions, multiple choice answers, and ‘journey style’ questions (these are when the question presented depends on the answer given for the previous question)

The MB-200 exam covers the different deployment types, configurations and integrations, and click-based customisations. It expands on the base that’s set out in the MB-900. 

The next question usually asked is ‘what area/app should I specialise in’?
That’s ALSO quite simple to answer – there are (currently) 4 options available for exams (after the MB-900). These are the MB-210 (Sales), MB-220 (Marketing), MB-230 (Customer Service), and MB-240 (Field Service) exams.

So, pick which one you think would be most suitable to your role, and take them. Of course, that’s not stopping you taking some of the OTHER exams as well – why not try to get the whole set in!

Study tips:

  1. Read the syllabus! Microsoft doesn’t just draw them up randomly – they cover the material needed. They’ve also been through Beta phases where feedback has been given (which Microsoft usually take some note of). It will give you an idea of where the focus is, what’s needed to check, etc
  2. Practise – hands on experience. You really DO need this now. Fire up a trial, start playing around. Use the syllabus as a guide for this – if it says that you need to know about cases (eg case management, case routing, case rules, parent/child cases), then make sure that you DO know how to do these!
  3. Talk to others who are studying at the same time – perhaps try to make a study group. I was fortunate enough to join a twice-weekly session for one of my exams, hosted by an amazing Microsoft Trainer.
  4. When taking the exam, if you come across something that you don’t know, and are guessing the answer to – DON’T CHANGE THE ANSWER LATER ON. In this sort of scenario the gut reaction is usually 85% correct, and it’s better to leave it than try to second guess yourself.

Also, don’t stress out about the exams. They’re not the Big Bad Wolf – once you do them, you’ll see that they’re not absolutely crazy. Sure, you may have to guess a question or two, but even very experienced people do that.

Useful resources: