Data Export Service and Versioning of Azure Key Vault URLs

The Microsoft Data Export Service (DES) is a well-documented integration used to replicate CRM data to an external Azure SQL database. Even with its easy-to-use design, there are some limitations on what can be modified after the DES profile is created. One such example is the key vault secret URL.

When the initial profile is created and the key vault string has been input, it can no longer be changed without recreating the profile (which, in turn, requires the connected SQL tables to be dropped in order to restart the sync). If the key vault URL is copied directly, you are left with a static, un-editable connection string. However, one small change to the DES profile setup will allow the secret to be updated without needing to recreate the profile. Here's how it works:

A standard key vault secret URL is formatted as such:

https://{vault-name}.vault.azure.net/secrets/{secret-name}/{version-guid}

Once a key vault secret is created, it is locked and cannot be edited – instead, it must be versioned. Azure Key Vault supports secret versioning, and multiple versions of the same secret can exist in the vault at one time. The primary issue with simply versioning the secret is that the new version gets a new GUID, which means the URL that was entered in the profile still points at the old version. In the example below, the GUID of the secret version (A) directly corresponds to the GUID at the end of the secret URL (B). This prevents standard operational updates to the connection string, such as password renewals.

To account for this, simply remove the GUID at the end of the secret URL. The shortened URL will always point at the newest version of the secret, no matter how many times the secret is changed. This little trick can save several hours of work if a change to the secret is ever needed in the future. Be sure to subscribe to our blog for more tips and tricks.
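
To make the trick concrete, here is a small illustration (the vault and secret names are hypothetical) of stripping the version GUID from a secret URL:

```python
def strip_secret_version(url: str) -> str:
    """Drop the trailing version GUID from a Key Vault secret URL.

    The versionless URL always resolves to the latest secret version.
    """
    parts = url.rstrip("/").split("/")
    # .../secrets/<secret-name>/<version-guid>  ->  .../secrets/<secret-name>
    if len(parts) >= 2 and parts[-2] != "secrets":
        parts = parts[:-1]
    return "/".join(parts)

print(strip_secret_version(
    "https://myvault.vault.azure.net/secrets/sqlconn/0123456789abcdef0123456789abcdef"))
```

Entering the shortened form in the DES profile means later secret versions are picked up automatically.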

Happy Dynamics 365'ing!

Integrating to Dynamics 365 and Record Ownership

At PowerObjects, we often find that clients desire the option to perform data migration on their own. This may be due to the sensitive nature of the data or because of an effort to ensure internal resources know the migration/integration packages well. Regardless of the reason – and no matter the experience and skill levels of your organization's data integration teams – pitfalls exist for those not experienced with Dynamics 365. As such, we insist on the allocation of time and resources to provide education and training to the internal teams responsible for the ongoing integration. Otherwise, your business may face multiple trial-and-error data runs to get it right.

One common mistake we see stems from failure to set the Owner of records when creating them in Dynamics 365. By default, Dynamics will set the Owner of each record to whichever User record was used during the migration. Often, this is an admin user with the System Administrator role – we'll call it "Admin" for the sake of this blog post. The records will import without issue and record counts will yield expected results. Great, right? Not so fast, as this is where a full understanding of record ownership in Dynamics is crucial. When a user logs in from a different business unit – or even the same unit but with lesser permissions – they may not see all the data! Not good.

See, records are owned by users; each user resides in a single Business Unit and is assigned Security Role(s). The combination of Business Unit and Security Role(s) determines which records each user can see. Of course, if all users are granted Organizational access to records, you’ll likely never have this issue. Likewise, if your organization has only a single Business Unit, this potential issue is not applicable. However, many businesses – especially larger organizations – require multiple Security Roles and Business Units.

Let's consider a scenario in which an organization has a Business Unit structure with a top-level unit named "Contoso." A child Business Unit of Contoso is "United States," and the children of United States are "East" and "West." If records are imported with their owner set to a system administrator (Admin) who resides in the Contoso Business Unit, all of the imported records will live in the top-level Contoso unit. When other users are added to East, West, or United States and are given Business Unit (or lower) read access to the data, they will not see those records in the system.
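
To see why, here is a tiny simulation of Business Unit read access in that Contoso structure (a simplified, hypothetical model of CRM's access levels):

```python
# Parent of each Business Unit in the example hierarchy.
BU_PARENT = {"United States": "Contoso", "East": "United States", "West": "United States"}

def can_read(user_bu, record_bu, access="bu"):
    """Return True if a user in user_bu can read a record owned in record_bu."""
    if access == "org":                      # Organization-level read: everything
        return True
    if access == "bu":                       # Business Unit read: own unit only
        return user_bu == record_bu
    if access == "parent-child":             # Own unit plus all units beneath it
        bu = record_bu
        while bu is not None:
            if bu == user_bu:
                return True
            bu = BU_PARENT.get(bu)
        return False

# A user in East with Business Unit read cannot see records owned in Contoso.
print(can_read("East", "Contoso"))
```

Only Organization-level read (or re-parenting the records) makes the Admin-owned data visible to users in the child units.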

Without testing data visibility with roles other than Admin, you will not know you have this issue. And considering that your data migration will likely contain multiple jobs for multiple entities, as well as references to other entities, you could end up having to reload all of your data. With the possibility of millions of records in your migration, that reload could set you back several days or even weeks. Also consider that failing to load all of your users into Dynamics 365 up front could lead to a scramble to get all personnel added as users and to update the migration jobs.

Now, if you’re thinking No worries… I can just update the owner of the records with a subsequent job, you’re correct in that being a possibility. But… when you go to load your production environment, you will be adding time to your migration to make a second pass on all the records. Why not just avoid that problem entirely by setting the Owner appropriately from the very beginning?

Hopefully this post helps you avoid having to learn this lesson the hard way. For more Dynamics 365 tips and tricks, be sure to subscribe to our blog!

Happy D365’ing!

Join Us for our Upcoming Webinar: CRM On-the-Go with Outlook

One of the reasons Microsoft Dynamics CRM has become the go-to software for customer relationship management is its natural, streamlined integration with Outlook. If you're already using this integration, have you ever wondered if you're utilizing it to its full potential? Join one of our SMB business experts, Brady Garvin, on Friday, April 15th at 10 a.m. CST for a webinar showcasing all the features and functionalities of the Microsoft Dynamics CRM Outlook integration.

By utilizing the Dynamics CRM Outlook client, you don't have to leave Outlook to access CRM, saving you time in your day to focus on other tasks. Tracking emails from Outlook into CRM is also a breeze, but how do you know if you're only syncing relevant information from Outlook into your Dynamics CRM? Brady will show you how!

We'll include a demo of CRM for Outlook, including the out-of-the-box functionality that's included in the software. We'll also discuss use cases and scenarios that can help your organization streamline business operations, plan your day, and be more productive. Find out how easy it is to incorporate CRM into your everyday work life without the steep learning curve of a new product.

Don't miss this chance to learn more about Outlook and CRM.

Happy CRM'ing!

Integrating with the Perfect Pair: webMethods and Dynamics CRM

If you’ve scoured the net lately looking for information on using Microsoft Dynamics CRM with the popular enterprise data integration platform webMethods, you already know that the results are few and far between, and any information that one may find ends up being mostly unanswered questions in forums. Ugh the frustration! In today’s blog, we aim to fix that – at least somewhat! We’ll be answering some of your burning questions about webMethods and Dynamics CRM, so let’s get started!

Integrating Dynamics CRM with an application/API using webMethods is not terribly straightforward for older versions of webMethods, but the process becomes much simpler when using the latest and greatest version. For example, webMethods 6.5, a popular older version, has significant issues with connecting to Dynamics CRM 2015/2016. This version cannot properly consume CRM's SOAP-based legacy web service, and because it has no support for RESTful resources, it cannot consume the new Web API preview either.

The solution that must be implemented here comes in the form of an intermediate console application/service proxy. This console application must supply a WSDL for the methods it contains so that webMethods can connect to the console application via the SOAP protocol. This intermediate service proxy should then be coded to contain just the methods that your business anticipates requiring for whatever resource is being integrated with Dynamics CRM via webMethods. The service proxy must then reside on a server within your environment (or in a fancy cloud VM like Azure). This server must have open connectivity with Dynamics CRM Online or your on-premises environment and must also have open connectivity to the webMethods Integration Server.

If you are able to start with or upgrade to the latest version of webMethods (version 9.9 as of this post) then you should be all set to connect to either the CRM legacy SOAP API or the new REST web API preview without any issues.

Want to learn more about the web API in Dynamics CRM 2016? Check out our blog on the subject! As always, if you need additional help, we have extensive experience integrating CRM with web services using webMethods and would love to help you out! Contact our team today!

Happy CRM’ing!

Get Faster and Simpler Update/Insert Transactions Using SSIS

When you are building an integration using SSIS, chances are you will need to update or insert data that has been created or changed in the source system on a regular basis. The SSIS job must be able to identify the newly created records versus the changes to existing records and perform the updates and inserts accordingly. The mechanism needed is fairly simple, and in today's blog, we will walk you through all the steps needed to perform this action!

The traditional mechanism needed to deal with update and insert transactions consists of four steps:

1. Identify the collection of records to be processed by using the create/modify date of the record or other criteria that business rules may dictate.

2. Compare the records to the target database and identify updates vs. inserts.

3. Route the records to the update or insert step accordingly.

4. Perform the updates and inserts.
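
In plain-code terms (the row and key names here are hypothetical), steps 2 and 3 amount to a key lookup against the target followed by routing:

```python
def route_rows(source_rows, target_keys):
    """Split changed source rows into inserts (key absent from the target)
    and updates (key already present in the target)."""
    inserts = [r for r in source_rows if r["id"] not in target_keys]
    updates = [r for r in source_rows if r["id"] in target_keys]
    return inserts, updates

rows = [{"id": 1, "name": "A"}, {"id": 2, "name": "B"}]
print(route_rows(rows, target_keys={1}))
```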

Using the approach above coupled with a merge join transformation, your data flow may look something like this:


Though this approach will work, several of its steps carry design, maintenance, and performance costs: the sorting, the merge join, the conditional split, and, most importantly of all, having to map your data twice.

There is an alternative mechanism that can achieve the same results with simpler steps and will yield better performance.

Before you start, you will need to ensure that you have a temporary table that contains all of the attributes of the target table named identically and, ideally, using the same data types to ensure smooth transactions.

You can create the temporary table quickly using this SQL statement:

select *
into yourtable_tmp
from yourtable
where 1=2

Now that you have the temporary table, you are ready to build the SSIS job using the steps below:

1. Add a dataflow and two Execute SQL tasks to the canvas.


2. In the dataflow task, add your data source (in this case, MSCRM), and then add an ADO/OLEDB/ODBC destination step. Map your source to insert the records into the temporary table.


3. In the first Execute SQL task, you will merge the records into the permanent table by using a merge statement.

The merge statement works by matching the records in the two tables to the selected key and executing an update or an insert based on the success of the matching criteria.

MERGE INTO investments stg
USING investments_tmp tmp ON (stg.investmentid = tmp.investmentid)
WHEN MATCHED THEN
    UPDATE SET
        stg.CODE = tmp.CODE,
        stg.ASSETS = tmp.ASSETS
        -- ...repeat for the remaining columns...
WHEN NOT MATCHED THEN
    INSERT (investmentid, CODE, ASSETS /* ...remaining columns... */)
    VALUES (tmp.investmentid, tmp.CODE, tmp.ASSETS /* ... */);
4. In the second Execute SQL task, you will truncate the temporary table by using a statement similar to the one below.

TRUNCATE TABLE investments_tmp

Note: You can delete on a row-by-row basis with a more complex merge statement that does not require the truncate step, however, for today we will be keeping it separate for simplicity.
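
As a sanity check on the semantics, here is a small simulation of the merge-then-truncate flow in plain code (the key and row shapes are hypothetical):

```python
def merge_and_truncate(target, staging, key="investmentid"):
    """Simulate the two Execute SQL tasks: MERGE staging rows into the
    target (update on key match, insert otherwise), then empty staging."""
    for row in staging:
        if row[key] in target:
            target[row[key]].update(row)      # WHEN MATCHED ... UPDATE
        else:
            target[row[key]] = dict(row)      # WHEN NOT MATCHED ... INSERT
    staging.clear()                           # TRUNCATE TABLE ..._tmp
    return target

target = {1: {"investmentid": 1, "CODE": "old"}}
staging = [{"investmentid": 1, "CODE": "new"}, {"investmentid": 2, "CODE": "x"}]
print(merge_and_truncate(target, staging))
```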

Now we have a complete job that will perform update and insert transactions much faster and with fewer components to maintain.

We hope you found today's blog helpful. Keep reading our PowerObjects blog for more tips and tricks each week! If you are looking for more information on SSIS, check out our other blogs on the topic!

Happy CRM'ing!

Utilizing Alternate Keys in Dynamics CRM Integrations

One of the challenges when integrating external data with Dynamics CRM is that in order to update a record via the API, you have to know the GUID of the record. Now with the introduction of alternate keys in the CRM 2015 Online Update 1 release, you can use a unique value in your data set as a key for updating records as part of an integration. This will work as long as the data type is an integer (whole number), a decimal number, or a string (single line of text). In today's blog, we will show you how to utilize these alternate keys in your integrations. Let's begin!

Say you have a scenario where you have an external system that you want to integrate to your Account entity, and the system uses an email address as its key. You need to define the email address on the Account entity as an alternate key to make the integration easier to write and run faster. In order to do this, follow the steps below.

1. First, navigate to your solution and add the existing Account entity (if applicable). Then navigate to the Keys attribute of the Account entity and click New.

2. Next, select the field you want to use as a key, click Add, and give it a Display Name. Click Ok.

3. The system will then create a database index on that field to ensure fast querying and enforce the uniqueness of the values. Depending on how much data you have in your database, this could take a while. While this is processing, the key's status will be either Pending or In Progress.

4. When the indexing operation is complete, the key's status will change to Active. Congratulations! Your new key is ready for use!

You can use this key for Update and Upsert operations via the SDK. For further information on alternate keys, check out the MSDN article here:
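
The behavior that an upsert keyed on the alternate key gives you can be sketched like this (a plain-code illustration of the semantics, not the SDK itself; the field names are assumptions):

```python
def upsert_account(accounts, record, alt_key="emailaddress1"):
    """Upsert keyed on an alternate key instead of a record GUID."""
    key = record[alt_key]
    if key in accounts:
        accounts[key].update(record)   # match on the alternate key -> update
        return "updated"
    accounts[key] = dict(record)       # no match -> create
    return "created"

accounts = {}
print(upsert_account(accounts, {"emailaddress1": "jane@contoso.com", "name": "Jane"}))
print(upsert_account(accounts, {"emailaddress1": "jane@contoso.com", "name": "Jane D."}))
```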

That's all for today, readers! If you're interested in learning more about Dynamics CRM, consider attending this year's PowerUp conference November 10-11, with a PowerUp Hands-On Day on November 12. This educational extravaganza is the CRM event of the year and is open to beginners and advanced users. PowerUp is a can't-miss CRM event! We hope to see you there!

Happy CRM'ing!

Manually Editing the Scribe QETXT.ini File for Microsoft Dynamics CRM

When migrating large text files with many columns into Microsoft Dynamics CRM using Scribe Insight, sometimes it's more efficient to manually edit the QETXT.ini file. A good example of this would be when migrating text file exports to Microsoft Dynamics CRM.

How many times have you set up a connection to a text file using a sample file only to later get the full data set and realize all of your columns are too small or of the wrong data type? You can save yourself time and energy by manually editing the QETXT.ini file. The QETXT.ini is a text file that is created automatically when using the Scribe Text Wizard to create a connection to a text file.

To see an example of the QETXT.ini, you can navigate to the TextData samples that get installed with Scribe Insight.


In this example, we will be using the sample Company.dts that Scribe provides to illustrate how to change the QETXT.ini. When the Scribe connection wizard defines a text connection, it sets the length of any character field, by default, to the longest value in the current file. We want to change all character columns to a length of 100.

  1. First let's look at the column definitions prior to editing. Notice the column lengths vary and were likely set automatically by the column lengths at time of first connection.

  2. Now open up the QETXT.ini and edit the column lengths. Notice each new text connection is started with the connection name in brackets [ ].

  3. For each VARCHAR field, change the length to 100 in two locations.

  4. Repeat until all VARCHAR fields have been updated.



  5. That's it! Once you re-open the Scribe job, you will see that all of the column lengths have been updated.
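
If you have many connections to widen, the edit can even be scripted. Here is a minimal sketch (the exact QETXT.ini field layout shown in the sample text is an assumption – check your own file first):

```python
import re

def widen_varchars(ini_text, new_len=100):
    """Set every VARCHAR length in QETXT-style INI text to new_len."""
    return re.sub(r"(VARCHAR),\s*\d+", r"\g<1>,%d" % new_len, ini_text)

sample = "[Company]\nField1=Name,VARCHAR,37\nField2=City,VARCHAR,12\n"
print(widen_varchars(sample))
```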


As you can see, editing the QETXT.ini file manually can save you time while importing data into Microsoft CRM using Scribe. Try it out for editing column lengths, data types, or even data masks.

Happy CRM'ing!

Microsoft Dynamics CRM-Driven SharePoint Security

Microsoft Dynamics CRM supports out-of-the-box integration with SharePoint, described in detail in two previous blogs: SharePoint Integration in Microsoft Dynamics CRM 2011 and CRM 2011 and SharePoint 2010: Configuration and Folder Structure.

One of the features of the out-of-the-box integration is that folders are created as needed – that is, when a user navigates to the Documents area of a record for the first time.

This is usually a good approach, but this method can have shortcomings with some more complex requirements, such as:

Out-of-the-box integration will create a folder using the configured structure, using the name of the record:


In SharePoint, the security is inherited from parent folder:


Microsoft Dynamics CRM has a robust security architecture, with business units, teams, roles, and sharing. With the out-of-the-box SharePoint integration, however, users who cannot view a record in CRM would still be able to view its documents directly in SharePoint.

You could maintain the security manually in SharePoint, but this becomes cumbersome with a large number of records. Also, SharePoint is often treated as an invisible add-on, with users working only in the CRM interface. Ideally, you want to programmatically mirror the CRM security in SharePoint.

Our approach to solving this is to use Dynamics CRM plug-ins and create SharePoint folders programmatically, break the inheritance, and add permissions where needed.

Here's the basic functionality of the plug-in:

At this point, all you've done is duplicate the out-of-the-box integration and force it to run at the creation of the CRM record.

The big challenge now is that the required functionality – BreakRoleInheritance and RoleAssignments – is not available through the SharePoint 2010 web services. You'll have to use the SharePoint Client Object Model, which requires the code to reference Microsoft.SharePoint.Client.dll and Microsoft.SharePoint.Client.Runtime.dll. This means the assemblies must be deployed to the CRM server GAC and the plug-in cannot be isolated. Therefore, this approach is not applicable for CRM Online. For CRM Online, you may use similar code, but create your own web service that the isolated plug-in can call.

Here's a step-by-step approach to breaking inheritance and adding access to a specific SharePoint group:

1. Set Client Context and find the List:

ClientContext clientContext = new ClientContext(siteUrl); // siteUrl: your SharePoint site URL

clientContext.Credentials = new NetworkCredential(_connectionInfo["Username"], _connectionInfo["Password"], "");

List oList = clientContext.Web.Lists.GetByTitle(listname);

2. Use CAML Query to find specific folder using URL:

CamlQuery query = new CamlQuery();

query.ViewXml = "<View Scope='RecursiveAll'><Query><Where><Contains><FieldRef Name='FileRef'/><Value Type='Lookup'>"+FolderURL+"</Value></Contains></Where></Query></View>";

ListItemCollection collListItem = oList.GetItems(query);

clientContext.Load(collListItem);

clientContext.ExecuteQuery();



3. If a ListItem is found, we can break the inheritance:

ListItem oListItem = collListItem[0];

oListItem.BreakRoleInheritance(true, false);

clientContext.ExecuteQuery();

4. If you want to grant certain access to a group, we can do that against the ListItem as well. Here is how to add the Contributor role to a group:

RoleDefinitionBindingCollection collRoleDefinitionBinding = new RoleDefinitionBindingCollection(clientContext);

RoleDefinition oRoleDefinition = clientContext.Web.RoleDefinitions.GetByType(RoleType.Contributor);

collRoleDefinitionBinding.Add(oRoleDefinition);

oListItem.RoleAssignments.Add(siteGroup, collRoleDefinitionBinding);

clientContext.ExecuteQuery();


With this plug-in, as soon as a record is created, the folders are created and security is set:


Note that all the items including the subfolders underneath this folder will inherit the permissions from this folder, so you generally only have to do this at one point of each branch.


Extending from these basic building blocks, you can build a robust, CRM-driven security architecture in SharePoint.

If you want more information regarding CRM and SharePoint Integration, make sure to check also our other blogs:

Happy CRM'ing!

CRM 2013 and SharePoint Integration New Feature

The Microsoft Dynamics CRM 2013 integration with Microsoft SharePoint 2010/2013 remains much as it was in CRM 2011 and is not among the refreshed entities. However, there is a nice improvement in the document location and folder name.

The setup is exactly the same in CRM 2013 as it was in CRM 2011, which you can reference here.

Here is a summary of the setup for a new CRM 2013 deployment:

  1. Download the CRM List Component SharePoint solution and upload the appropriate version (SharePoint 2010 or 2013) in the SharePoint solution gallery.
  2. After successfully uploading and activating the CRM list component in SharePoint, navigate to Settings > Document Management in your CRM 2013 organization.
  3. Follow the same steps for 2013 as completed in CRM 2011 to enable the CRM 2013-SharePoint integration. Your new valid SharePoint site should be listed under your active SharePoint sites.
  4. Navigate to a CRM account record and locate Documents in the command bar. You'll be prompted that the folder will be created.

Now for the cool new part. The folder name is no longer simply the account record name—the CRM record's GUID is appended to the folder! You can see that the CRM document location record displays the folder name.
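
A sketch of the new naming convention (the exact sanitization rules CRM applies to the name and GUID are assumptions here):

```python
def crm2013_folder_name(record_name, record_id):
    """Build a CRM 2013-style folder name: record name plus record GUID.

    Exact casing/sanitization rules are assumptions for illustration.
    """
    guid = record_id.strip("{}").replace("-", "").upper()
    return "%s_%s" % (record_name, guid)

print(crm2013_folder_name("Contoso Ltd", "{9b3a1c2d-0000-1111-2222-333344445555}"))
```

Because the GUID travels with the folder name, external code can resolve a SharePoint folder back to its CRM record without a separate lookup.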

Navigate to SharePoint and see the corresponding folder.

This improvement in the CRM 2013 and SharePoint integration simplifies additional development. It should also expedite further integration between the two applications, because now SharePoint knows the unique ID of the CRM record!

For other CRM 2013 information, see our full list of CRM 2013 events and educational offerings.

Happy CRM'ing!

Scribe Queue Publisher vs. Query Publisher for Dynamics CRM

There are multiple ways you can connect to Microsoft Dynamics CRM as a source from the Scribe integration tool. The ones that are frequently used are the CRM Publisher or Query Publisher. Today we'll go over the difference between these two and when to use them.

CRM Publisher

The CRM Publisher uses plug-in logic to push data from CRM to the Scribe server. Scribe then picks up the XML file and processes it. With this process, data moves from the CRM server to the Scribe server almost instantaneously. This method is most useful if you need to process every single record separately and integrate the data immediately. At the same time, be careful with the data load, since a large volume of data can cause the Scribe server to stop responding properly.

Here is a screenshot of the CRM publisher selection:

Query Publisher

With this option, you will probably use the CRM filtered views to run your SQL queries as the source. You'll need to schedule the Scribe processor to run at defined times, which means these are time-based integrations. You can use this method when you only need to integrate the summary and can integrate the data during off-peak hours. Doing this can give you better CRM performance.
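
The time-based nature of the Query Publisher boils down to pulling only the rows modified since the last scheduled run. A minimal illustration (the column name modifiedon matches CRM's filtered views; the rest is hypothetical):

```python
def rows_since(rows, last_run):
    """Return only the rows modified after the previous scheduled run."""
    return [r for r in rows if r["modifiedon"] > last_run]

rows = [{"id": 1, "modifiedon": "2016-01-01"}, {"id": 2, "modifiedon": "2016-03-01"}]
print(rows_since(rows, "2016-02-01"))
```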

Hope this helps! For more information about using Scribe, check out some of our other blogs:

Happy CRM'ing!

Copying Scribe Target Steps Using DTSEdit

There may be a scenario when working with Scribe Insight and Microsoft Dynamics CRM in which you have to integrate data to multiple locations at once – for example, if you're syncing data from an external system to both a production and a test instance of Microsoft Dynamics CRM. To save time and ensure identical data in each environment, the best practice is to create one step and then make a copy of it, rather than building each step twice by hand.

Here's how:

1. Once you have completed the design and build of the first step, you can make a copy of it by going into Flow Control under the Configure Steps section and using the copy step button.

2. After copying the step, rename it with a relevant name so you can identify it later. Notice the Connection says MS CRM Test Environment. There is no way to change this from within the Scribe Workbench.

3. At this point you have one step for your MS CRM Test Environment and another step ready for your MS CRM Production Environment, but they are both still pointing to the same test instance of CRM. This is where we can use DTSEdit to modify the CRM instance for the second step. First we need to make sure we have a connection defined, so let's take a look at our connections.

As you can see, we have two connections to Microsoft Dynamics CRM 2011. The MS CRM Test Environment is currently being used as the source and target. We are going to use DTSEdit to adjust step two of our Scribe job to point to MS CRM Production Environment. Go ahead and close out of the Scribe Workbench at this point.

4. DTSEdit is a tool provided by Scribe Software that allows you to open and edit Scribe DTS files in a spreadsheet-like fashion. It can be found in the Scribe folder under Program Files.

5. Double click on dtsedit.exe to open the application and open up the Scribe DTS file we have been working with.

6. As you can see, DTSEdit is laid out much like Excel – columns and rows inside each sheet, and multiple sheet tabs across the bottom of the document.

There are four sheets that will need to be modified to adjust the connection for step two of our job.

  1. Data Providers
  2. Target Tables
  3. Keys
  4. Links

We will cover the details of each sheet below. There is a ton of data available in the DTSEdit tool. Feel free to play around and try different things – just be sure to have a backup of your DTS files before you do!

7. Now it's time to work with the data providers sheet. The data providers sheet lists the connections stored in the Scribe job. We have already defined our new connection, so at this point we just have to tell it that our connection will now be used as a target. Locate the Data Providers Sheet and the connection we defined as MS CRM Production Environment.

Put a "T" in the Connection Value Column next to Usage to denote this data provider is now being used as a target.

8. Locate the Target Tables sheet. This sheet lists all target steps contained in the job. Locate step two and the Data Providers column that denotes MS CRM Test Environment. Change that column to MS CRM Production Environment.


9. Locate the Keys sheet. Keys are synonymous with lookup links in the Scribe Workbench. Find any key that is part of step two.

Update the data provider columns to match the MS CRM Production Environment.

10. Locate the links sheet. Update the Data Provider column to reflect the new MS CRM Production Environment provider where necessary. There will typically be many more rows to update on this sheet.

Keep updating until you've completed all the links for step two.

11. At this point, that's it! You can now re-open the Scribe job in the Scribe Workbench. You will know very quickly if you missed anything, as you will see errors upon opening. If you open your Connections or Configure Steps, you will notice that they have been updated to reflect the MS CRM Production Environment.

As you can see, DTSEdit can be a very powerful tool for saving time and reducing errors when creating integrations and migrations with Scribe Insight!

Happy CRM'ing!

Minimizing Data Storage in Microsoft Dynamics CRM

Dynamics CRM is a great tool for many things: tracking sales opportunities, managing cases, and tracking activities. One of the few things that should not be done in CRM is document management and storage.

Many Dynamics CRM customers use SharePoint to store and manage documents, as it integrates smoothly with CRM. In some cases, CRM and SharePoint are hosted in two separate domains. For example, CRM may be Online and SharePoint may be on-premises. Or CRM may be partner-hosted while SharePoint is Online. In these situations, every time a user is in CRM and wants to view the SharePoint documents, they have to enter their SharePoint username and password. For a company that relies heavily on various documents to operate daily, this is tedious. PowerObjects to the rescue!

We recently implemented a custom SharePoint plug-in that functions as such:

  1. A user logs into CRM and opens an Account or Contact record. (This plug-in can be implemented on any entity within CRM, including custom entities, as long as they have Notes enabled.)
  2. In the ribbon, the user will click on the Add tab and then click Attach File. Then he or she will select the document from the PC.
  3. After the attachment is added, the plug-in kicks off behind the scenes, moving the document to the corresponding SharePoint folder by logging in as an administrator user.
  4. A link to the SharePoint folder is added to the Notes section of the Account/Contact record.


  5. The document can also be accessed from the Documents folder in the left navigation of the Account/Contact record.
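
Conceptually, the plug-in's move-and-link behavior looks like this (a simplified sketch with hypothetical names, not the actual plug-in code):

```python
def offload_attachment(note, sharepoint, folder):
    """Move a note's attachment body into a SharePoint store and leave
    a link to the folder behind in the note."""
    sharepoint.setdefault(folder, {})[note["filename"]] = note.pop("body")
    note["text"] = "Document moved to SharePoint: /%s/%s" % (folder, note["filename"])
    return note

store = {}
note = offload_attachment({"filename": "contract.pdf", "body": b"%PDF..."}, store, "Contoso Ltd")
print(note["text"])
```

The heavy attachment lives in SharePoint, while CRM keeps only a lightweight note pointing at it – which is exactly what keeps CRM data storage small.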

Not only does this solution prevent the user from having to enter additional credentials each time they need to add or access a document, it also allows them to stay within CRM and use out-of-the-box CRM functionality (the Attach File button), which takes less clicking than adding a document through the SharePoint integration.

If you're interested in a demo, or would like to see if we can provide a solution to meet your needs, contact us.

Happy CRM'ing!