Data Export Service Connection Issues

This is a slightly different post from the usual stuff that I talk about. It's much more 'techy/developer' focused, but I thought it would still be quite useful for people to keep in mind.

The background to this comes from a project that I've been working on with some colleagues. Part of the project involves setting up an Azure SQL database, and replicating CDS data to it. Why, I hear you ask? Well, there are some downstream systems that may be heavy users of the data, and as we well know, CDS isn't specifically built to handle a large number of queries against it. In fact, if you start hammering the CDS layer, Microsoft is likely to reach out to ask what exactly you're trying to do!

Therefore (as most people would do), we're putting in a database layer (or layers) within Azure to handle the volume of data requests that we're expecting to occur.

So when setting up things like databases, we need to create names for them, along with access credentials. All regular 'run of the mill' stuff – no surprises there. For adequate security, we usually use one of a handful of password generators that we keep to hand. These have many advantages, such as ensuring that the password isn't something we (as humans) are dreaming up, which might be easier to guess. I've used password generators over the years for many different professional & personal projects, and they really are quite good overall.

Example of a password generation tool

Once we had the credentials & everything set up, we then logged in (using SQL Server Management Studio), and all was good. Everything that we needed was in place, and it was looking superb (from the front end, at least).

OK – on to getting the data actually loaded in. To do this, we’re using the Data Export Service (see https://docs.microsoft.com/en-us/power-platform/admin/replicate-data-microsoft-azure-sql-database for further information around this). The reason for using this is that the Data Export Service intelligently synchronises the entire database initially, and thereafter synchronises on a continuous basis as changes occur (delta changes) in the system. This is really good, and means we don’t need to build anything custom to handle it. Wonderful!

Setting up the Data Export Service takes a little bit of time. I’m not going to go into the details of how to set it up – instead there’s a wonderful walkthrough by the AMAZING Scott Durow at http://develop1.net/public/post/2016/12/09/Dynamic365-Data-Export-Service. Go take a look at it if you’re needing to find out how to do it.

So we were going through the process. Part of this involves copying the Azure connection string into a script that you run. When you do this, you need to re-insert the password (as Azure doesn't include it in the string). For our purposes (as we had generated this), we copied/pasted the password, and ran things.

However all we were getting was a red star, and the error message ‘Unable to validate profile’.

As you'd expect, this was HIGHLY frustrating. We started to dig down to see what error logs were available (hopefully with more information in them), but didn't make much progress there. We logged in through the front end again – yes, no problems there, all was working fine. Back to the Export Service & scripts, but again the error. As you can imagine, we weren't very positive about this, and were really trying to find out what could possibly be causing it. Was it a system error? Was there something that we had forgotten to do, somewhere, during the initial setup process?

It’s at these sorts of times that self-doubt can start to creep in. Did we miss something small & minor, but that was actually really important? We went over the deployment steps again & again. Each time, we couldn’t find anything that we had missed out. It was getting absolutely exasperating!

Finally, after much trial & error, we narrowed the issue down to one source. It’s something we hadn’t really expected, but had indeed caused all of this to happen!

What happened was that the password that we had auto-generated had a semi-colon (‘;’) in it. In & of itself, that’s not an issue (usually). As we had seen, we were able to log into SSMS (the ‘front-end’) successfully, with no issues at all.

However, when placed into the connection string, the semi-colon is a special character – it's the separator between the key/value pairs. The parser therefore treated everything after the semi-colon in the password as the start of a new keyword, so the full password was never recognised, which was causing the entire thing to fail! To resolve this was simple – we regenerated the password to ensure that it didn't include a semi-colon character within it!
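Just to illustrate the mechanics, here's a minimal sketch in Python with pyodbc – this is not the actual Data Export Service script, and the server, database, user and password values are all made up – showing how a semi-colon breaks a standard SQL connection string, and how it could be escaped if regenerating the password weren't an option:

```python
import pyodbc  # pip install pyodbc

# Made-up values - replace with your own server, database, user and password.
server = "tcp:yourserver.database.windows.net,1433"
database = "yourdb"
user = "youruser"
password = "Xy7;raNd0m!"  # auto-generated password containing a semi-colon

# Broken: ';' is the key/value separator in a connection string, so everything
# after "Xy7" is read as a new keyword and authentication fails.
broken = (
    "Driver={ODBC Driver 17 for SQL Server};"
    f"Server={server};Database={database};"
    f"UID={user};PWD={password};Encrypt=yes;"
)

# Workaround: ODBC allows a value to be wrapped in braces so special characters
# (like ';') are taken literally. Our actual fix was simpler - we just
# regenerated the password without a semi-colon in it.
escaped = (
    "Driver={ODBC Driver 17 for SQL Server};"
    f"Server={server};Database={database};"
    f"UID={user};PWD={{{password}}};Encrypt=yes;"
)

# conn = pyodbc.connect(escaped)  # this version parses the password correctly
```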

Now, this is indeed something quite simple, and should be core programming knowledge. Most password generators have an option to exclude characters like this, but not all of them do. Unfortunately we had fallen foul of it, but thankfully all was resolved in the end.

The setup then carried on successfully, and we were able (after all of the effort above) to achieve what we had set out to do initially.

Have you ever had a similar issue? Either with passwords, or where something worked through a front-end system, but not in code? Drop a comment below – I’d love to hear!

Thoughts around the Connection entity

I decided to write this post because I'm currently looking at the Connections entity for a project with a very specific purpose. When this came up, my thoughts went back to a previous project some years back, when we also looked at using the Connections entity. I therefore thought it would be good to recap & share my experience.

What are Connections?

Now, the Connections entity truly is a wonderful piece of work. It’s one of the core features that doesn’t actually get much time or effort devoted to it! However, it underpins a lot of the way that Dynamics 365 has been built to work over time.

The best way to summarise Connections is:

Connections are a very easy way to connect records without needing to create a custom relationship in the system. Connections can be used between records from the same entity, or from different entities.

See, you are able to connect one record to another record within the system. This could be account to account, account to contact, or contact to a custom entity. There are practically no limits, apart from the extent of your mind! All of this is done by leveraging the functionality that Connections brings to the table.

Note that I’m not talking about lookup fields here, which are also great, but work differently, and require creating a relationship between entities (or even within the same entity).

Just a quick reminder here that custom entities need to be enabled for connections – it doesn't happen as standard when creating them. You can either do this when creating the entity, or you can edit its settings later.

How to use Connections

In order to connect one record to another, you need to open the first record & click the Connect button on the toolbar:

You’ll then be presented with the New Connection screen, where you’ll select the record that you want. Click the ‘All Records’ item at the top & then ‘Change View’ to select the actual entity that you’re wanting to look for:

You then select the record that you’re wanting, and save. Hey presto, the two records are now connected! To see the connected records, look at the associated ‘Connections’ setting from either record:
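As an aside, if you ever need to do this programmatically rather than through the UI, connections are just records of the 'connection' entity, so you can create them through the Dataverse Web API. Here's a rough sketch – the org URL, token and GUIDs are all placeholders:

```python
import requests

org_url = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
headers = {
    "Authorization": "Bearer <OAuth token for the environment>",
    "OData-Version": "4.0",
    "OData-MaxVersion": "4.0",
    "Content-Type": "application/json",
}

# Connect a contact (record1) to an account (record2) - the two 'ends' of the
# connection are polymorphic lookups, bound to whichever entities you're linking.
payload = {
    "record1id_contact@odata.bind": "/contacts(11111111-1111-1111-1111-111111111111)",
    "record2id_account@odata.bind": "/accounts(22222222-2222-2222-2222-222222222222)",
}

resp = requests.post(f"{org_url}/api/data/v9.2/connections", headers=headers, json=payload)
resp.raise_for_status()
print("Created:", resp.headers.get("OData-EntityId"))
```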

OK, so this is really all brilliant. For the absolute majority of situations, it works, and works well – there's nothing better for it. There are a few small issues, such as the fact that you can't use Business Process Flows or Business Rules for custom logic on connections, and instead need to use JavaScript, but for the most part they work well.

Edge case scenarios & issues

However, there are some edge case scenarios that I've come up against, which are the whole reason for writing this blog post.

What happens if you're trying to use Connections to establish a hierarchy of records, eg where one record is a parent of another? Well, you could use a lookup field instead, but if you wanted to define specific attributes for the actual relationship, that wouldn't work.

Here’s the scenario. You’re needing to capture the relationship between different people, along with certain attributes (eg if they’re a legal guardian, or a trustee, or have power of attorney, etc). You’d think that Connections would work brilliantly for this. After all, you can modify the actual Connections entity to add custom fields onto it. So for example, you could have something like the following:

Note: I’m not referencing Connection Roles, as you can only have a single connection role per connection. In the scenarios I’m handling, I’m needing to have multiple attributes per connection.
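To make that concrete, creating one of those connections could look something like the sketch below. The custom field names (new_islegalguardian, new_haspowerofattorney, new_istrustee) are purely illustrative – they're whatever fields you've actually added to the Connections entity in your own solution:

```python
import requests

org_url = "https://yourorg.crm.dynamics.com"  # same placeholder setup as the earlier sketch
headers = {
    "Authorization": "Bearer <OAuth token>",
    "OData-Version": "4.0",
    "Content-Type": "application/json",
}

# Joe Bloggs (record1) connected to Helen Sommers (record2), with the
# relationship attributes captured as hypothetical custom fields.
payload = {
    "record1id_contact@odata.bind": "/contacts(11111111-1111-1111-1111-111111111111)",
    "record2id_contact@odata.bind": "/contacts(33333333-3333-3333-3333-333333333333)",
    "new_islegalguardian": True,
    "new_haspowerofattorney": True,
    "new_istrustee": True,
}

resp = requests.post(f"{org_url}/api/data/v9.2/connections", headers=headers, json=payload)
resp.raise_for_status()
```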

So you create the connection between the two records, and you set the attributes that you require. All good. What's also good to remember is that Connections are bi-directional: you can view them from either 'side' of the connection, ie from Record 1 or from Record 2.

That’s actually really helpful & useful in the normal scheme of things. You can easily see connections from either side.

But there’s a catch, or even (in our case above), an issue. If we open up each of the two Connection records, we’ll see the following data.

Joe Bloggs connecting to Helen Sommers:

Helen Sommers connecting to Joe Bloggs:

Can you spot the issue? Of course you can! On BOTH of the connection records, the custom fields that we set have the same values. We originally connected Joe Bloggs to Helen Sommers as the Legal Guardian, Power of Attorney & Trustee. Well, if we open up the connection record from Helen Sommers, we’re seeing the same values set, just in the opposite direction!

This is actually due to how Connections work. When you create a connection from Record A to Record B, the system automatically creates a mirror Connection record from Record B to Record A. When it does this, it copies all of the values that you've set over to this mirror record.
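You can see this for yourself by querying the connection records directly: for the one connection we created, two rows come back – the record we created and the mirror the system created – and both carry the same custom field values. A quick sketch (again using the illustrative field names and placeholder values from above):

```python
import requests

org_url = "https://yourorg.crm.dynamics.com"  # same placeholder setup as before
headers = {"Authorization": "Bearer <OAuth token>", "OData-Version": "4.0"}

joe_id = "11111111-1111-1111-1111-111111111111"  # placeholder GUID for Joe Bloggs

# Pull every connection with Joe Bloggs on either end - both the record we
# created and the system-generated mirror are returned, with identical values.
query = (
    f"{org_url}/api/data/v9.2/connections"
    "?$select=connectionid,new_islegalguardian,new_haspowerofattorney,new_istrustee"
    f"&$filter=_record1id_value eq {joe_id} or _record2id_value eq {joe_id}"
)
for row in requests.get(query, headers=headers).json()["value"]:
    print(row)
```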

So when you look at the data alone, you can't actually tell which direction the relationship runs in. It's an issue, especially if you're passing the data to other systems that need to evaluate it. They just can't interpret it properly, and you'll get some VERY unwanted results out of this.

Now, there is actually a field within Connections that shows which record is the ‘master’ (ie the one you actually created), and which one is the ‘mirror’ that the system created:

However even with this in place, we’ve found issues when using it:

  • If you're relying on people looking at the record to see the information, they're going to make mistakes (ie not checking this value). Because the values are also displayed on the mirror record, this is very prone to user error, and isn't a good way to do things
  • If passing information to another system (ie the record & the values), you need to program it to only pass records with this flag set correctly (there's a short sketch of this below). If the other system is writing back data, it also needs to be configured to write back to the same record.
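On that second point, this is roughly what that filtering could look like against the Web API. The flag is the ismaster field on the connection entity (true on the record you created, false on the system-generated mirror); the custom field names are still the illustrative ones used above:

```python
import requests

org_url = "https://yourorg.crm.dynamics.com"  # same placeholder setup as before
headers = {"Authorization": "Bearer <OAuth token>", "OData-Version": "4.0"}

# Only hand the 'master' rows to the downstream system, so the mirror records
# don't get double-counted or read in the wrong direction.
query = (
    f"{org_url}/api/data/v9.2/connections"
    "?$select=connectionid,new_islegalguardian,new_haspowerofattorney,new_istrustee"
    "&$filter=ismaster eq true"
)
master_rows = requests.get(query, headers=headers).json()["value"]

# If the downstream system writes anything back, it should update the same
# (master) record - eg a PATCH against /connections(<connectionid>) for that row.
```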

Summary

With all of this in mind (& especially considering that users may well create connections from the 'wrong direction', which can easily happen), it's important to think about the best way to architect systems for regulatory purposes. Financial, legal & other regulatory requirements need a system that can handle them properly & accordingly, and not leave room for error.

Therefore, if you're looking to handle these sorts of scenarios, I'd recommend looking at implementing a custom entity for those specific connections.

Another benefit of this is separating these connections out from the general Connections entity. That way, you'll also be able to handle security appropriately, which is usually needed in these sorts of situations. It will allow you to easily grant only a subset of users access (read and/or write) to this data, rather than trying to apply that to Connections (which is going to be a major headache!)
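For what it's worth, such a custom entity doesn't need much: two lookups plus the attribute fields, with the direction of the relationship explicit in which lookup is which – no mirror record, no ambiguity. Everything in this sketch is hypothetical (the entity set, navigation property and field names are whatever you create in your own solution):

```python
import requests

org_url = "https://yourorg.crm.dynamics.com"  # same placeholder setup as before
headers = {
    "Authorization": "Bearer <OAuth token>",
    "OData-Version": "4.0",
    "Content-Type": "application/json",
}

# Hypothetical custom entity 'new_personrelationship' with two contact lookups.
# Direction is explicit: new_Person is the subject, new_RelatedPerson is the
# related party, and the attribute flags live on this record alone.
payload = {
    "new_Person@odata.bind": "/contacts(11111111-1111-1111-1111-111111111111)",
    "new_RelatedPerson@odata.bind": "/contacts(33333333-3333-3333-3333-333333333333)",
    "new_islegalguardian": True,
    "new_haspowerofattorney": True,
    "new_istrustee": True,
}

resp = requests.post(f"{org_url}/api/data/v9.2/new_personrelationships", headers=headers, json=payload)
resp.raise_for_status()
```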