How to Identify D365 Processes that Reference a Subject using XrmToolBox

At times, you will find it useful to know which Dynamics 365 Processes (Workflows, Actions, Business Rules) currently reference a given Subject in the Subject Tree. This is especially true if you intend to rename a Subject and want to keep the Process references accurate and up to date. You can do this using the FetchXML Builder plugin in the XrmToolBox utility. This tool, built by developer Jonas Rapp, provides an easy, user-friendly way to generate queries for views, reports, and code. It's provided free by Jonas, and it's a great tool to get to know.

Assumptions

To illustrate how it's done, we'll walk through an example that is based on three important assumptions:

1. You have the XrmToolBox tool with the FetchXML Builder Plugin already installed on your computer and have created a connection within XrmToolBox to your Microsoft Dynamics 365 Org. For information/documentation on downloading and installing XrmToolBox, connecting it to your Microsoft Dynamics 365 deployment, and adding XrmToolBox plugins, go to Download XrmToolBox.

2. The Subject Tree (navigate to Settings > Business Management > Subjects) is defined as shown below:

[Screenshot: the example Subject Tree]

3. The example Subjects are referenced in these Processes (Entity = Case) that have been defined and created in a D365 Org with conditions as follows:

[Screenshots: the example Process definitions and their conditions]

Now you want to identify all Processes that reference these Subjects from the Subject Tree.

Steps

Assuming a Subject Tree and Processes as shown above for this example, follow the steps below to determine which Processes reference "Test Subject Parent" and which reference each of its two Child Subjects:

1. First, in your D365 Org, open the Advanced Find tool.

2. Create an Advanced Find query (Look for: Cases) where the Subject field equals Test Subject Parent, then click the Download Fetch XML icon to save a FetchXML file (an 'XML Document') to your Downloads folder.

3. Find and open the downloaded FetchXML file with Notepad or another preferred text editor. In this file, take note of the Test Subject Parent's globally unique identifier (GUID) embedded in the FetchXML. Be ready to copy the GUID from this file in a later step.
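The downloaded file will look roughly like the example below. The exact attribute list depends on the columns in your view, and the GUID shown here is only a placeholder for the one in your own file:

```xml
<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="incident">
    <attribute name="title" />
    <attribute name="incidentid" />
    <filter type="and">
      <!-- The value below is the Subject's GUID; yours will differ -->
      <condition attribute="subjectid" operator="eq" uiname="Test Subject Parent"
                 uitype="subject" value="{00000000-0000-0000-0000-000000000001}" />
    </filter>
  </entity>
</fetch>
```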

4. Repeat Steps 2 and 3 for Test Subject Child #1 and Test Subject Child #2.

Test Subject Child #1

[Screenshots: Advanced Find query and downloaded FetchXML GUID]

Test Subject Child #2

[Screenshots: Advanced Find query and downloaded FetchXML GUID]

5. Now open the XrmToolBox utility and Connect to your D365 Org.

6. Open the FetchXML Builder Plugin by clicking on Plugins, then on the FetchXML Builder icon, and then on the Continue button.

7. Once FetchXML Builder is open, in the top-left Query Builder pane, right-click on the fetch top:50 query statement. In the bottom-left Quick Actions pane, you can change the Top value to set the maximum number of rows returned by the fetch, but 50 will work for our purposes, since the number of Processes we are trying to find is far fewer than 50.

8. Next, in the Query Builder, click on entity. In Quick Actions, select workflow in the Entity name dropdown.

9. In Query Builder click on entity workflow. In Quick Actions click on attribute.

10. Now go back to Query Builder and click attribute. In Quick Actions select name in the Attribute name dropdown.

11. Click again on entity workflow in Query Builder. In Quick Actions click on filter.

12. With filter still selected in Query Builder, click on condition in Quick Actions.

13. With condition still selected in Query Builder, return to the Quick Actions pane and select xaml in the Attribute dropdown and Like in the Operator dropdown. Then copy the Test Subject Parent GUID from its FetchXML file (noted in Step 3) and paste it into the Value field between two "%" characters.

14. Select filter once again in the Query Builder query structure and again click on condition in Quick Actions (this is done just as shown in Step 12).

15. With a second condition now placed into and still selected in the Query Builder structure, go to the Quick Actions pane and select type in the Attribute dropdown, Equal in the Operator dropdown, and Definition (1) in the Value dropdown.

16. The query is now ready to run! You can use the Save button to save this query for future use – the Open button will let you browse to it and open it again when you return to FetchXML Builder later. Now click the Execute button (or press F5) to run the query and see the results in the Result View pane to the right of Query Builder. Per our Process definitions (assumption #3 above), the three Processes listed (one Action, two Workflows) all reference the Test Subject Parent Subject.
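Assembled through Steps 7–15, the finished query should look roughly like this, with the placeholder GUID below replaced by the Test Subject Parent GUID from Step 3:

```xml
<fetch top="50">
  <entity name="workflow">
    <attribute name="name" />
    <filter type="and">
      <!-- xaml contains the Subject's GUID; workflow type 1 = Definition -->
      <condition attribute="xaml" operator="like"
                 value="%00000000-0000-0000-0000-000000000001%" />
      <condition attribute="type" operator="eq" value="1" />
    </filter>
  </entity>
</fetch>
```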

17. To identify Processes that reference other Subjects (e.g., Test Subject Child #1 or Test Subject Child #2), you could build a query for each Subject by clicking New to start a new query, repeating Steps 7–16, and taking care to use the correct GUID (noted in Step 4) for the Subject reference you want to find. However, it is easier to reuse the query you have already constructed: for each of the two Subjects, simply replace the Test Subject Parent GUID with that Subject's GUID and Execute again. Here are the results when querying the other two Subjects:

Test Subject Child #1 (referenced in one Action and two Workflows)

Test Subject Child #2 (referenced in one Business Rule, one Action, and one Workflow)

Note that you can easily repeat these steps for any Subject whose Process references you want to find.

That's it for today, readers. For more information on Dynamics 365 and other useful tips, feel free to contact us! Also, make sure you check out our blog weekly for the latest news and educational materials. Happy D365'ing!

*Update: PowerObjects published an earlier version of this blog post that heavily references an open-source, community-driven set of tools for Dynamics 365. We inadvertently did not give credit to the tool's author and we sincerely apologize. We have updated this post so it properly cites the creator, Jonas Rapp. It is not the policy of PowerObjects or the employees who contribute to our blog content to withhold credit where credit is due. This was purely a publishing error and oversight by the team. The Dynamics community and its diverse members are sincerely a treasured part of the day-to-day life of everyone at PowerObjects. We are a corporate entity, but we're made up of human beings who deeply care about technology and the Dynamics framework. We live and breathe it every day and it is what drives us to be the best at what we do. We sincerely apologize for the error and would like to thank the community for keeping us honest and calling us on our mistake. We will be reviewing our publishing procedures and policies in order to avoid this error in the future. - Joe D365 and The PowerObjects Team

Dynamics 365 Customer Engagement Application Portability

The Customer Engagement Application, formerly Microsoft Portals (and before that, ADXStudio Portals), is a Software-as-a-Service (SaaS) web application run by Microsoft in the cloud. It uses Microsoft CRM as the persistence layer for both business data and application metadata; the latter defines the web application's look, feel, behavior, and restrictions. Deploying such Portals in version-controlled scenarios can present substantial challenges that prevent efficient testing of the target environments and make it difficult to roll back deployed changes. In today's blog, we will review the issues, the available tools and options, and the methodology for achieving positive results.

Let's start by discussing the challenges. Typical Portal deployment consists of three steps:

  1. Deployment of Portal solutions into the target CRM
  2. Deployment of business customizations related to the Portal (entities, forms, views, option sets)
  3. Transfer of the data stored in Portal-related entities

In the typical scenario, Microsoft performs step #1 by installing a new Portal into the target CRM instance. Once this is done, Microsoft also populates the Portal configuration entities with sample data and makes the Portal available for modification. This approach assumes a single production environment and does not anticipate development anywhere other than the target CRM. Such development may involve business customizations to CRM objects, as well as modification and expansion of the Portal configuration data.

In the development workflow, where business customizations are made in a DEV environment and then ported to the target PROD or QA CRM, step #2 above is managed consistently and presents no challenge. However, when both the source and target CRM environments are bound to their own SaaS Portal applications, any development related to the source Portal is data that must be transferred to the target CRM, which presents its own set of challenges.

Now, let's discuss our options. Presently, there are only two tools available to transfer Portal data: the standard CRM Configuration Migration application and an XrmToolBox plugin for Portals, both discussed below.

Next, let's examine our solution. The happy path for Portal data deployment must follow this sequence of events:

  1. In the target environment, the Portal record is renamed for archival purposes and to avoid name collision with the new Portal.
  2. The source data is extracted and stored into a file.
  3. In the source data file, all GUIDs identifying Portal records are replaced with the new ones to avoid collision with any of the target entities.
  4. The modified source data is imported to the target CRM – the new Portal record and dependent data set is created.
  5. In the target environment, the Portal SaaS application is switched from the archived Portal record to the newly imported one.
  6. If testing of the new Portal fails, the SaaS application may be switched back to the archived Portal record.

Step #3 is a simple regular-expression replacement of each GUID found. For example, a GUID can be matched with:

[a-fA-F0-9]{8}-([a-fA-F0-9]{4}-){3}[a-fA-F0-9]{12}

However, we must not forget that many unique record identifiers may be referenced multiple times within the source data file (linked entities), and each repeated GUID must be replaced with the same new GUID as its first occurrence. A further complication is that we want to replace only GUIDs defined as record IDs in the data file, not GUIDs that are merely referenced, because many records refer to entities outside the scope of the Portal schema.

The solution is a simple command-line tool developed internally: a universal regex replacer whose default patterns search for record IDs, build a dictionary mapping them to new GUIDs, and then replace all GUIDs in the file according to that dictionary. GUIDs not in the dictionary are left unchanged.
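As a rough illustration (this is not the internal tool itself), here is a minimal Python sketch of such a two-pass replacer. It assumes, hypothetically, that record IDs appear inside `id="..."` attributes of the data file; the pattern would need adjusting to your actual schema:

```python
import re
import uuid

# Regex for a hyphenated GUID (braces, if any, are matched separately).
GUID_RE = r"[a-fA-F0-9]{8}-(?:[a-fA-F0-9]{4}-){3}[a-fA-F0-9]{12}"

# Hypothetical default pattern: a GUID counts as a "record ID" only when
# it appears inside an id="..." attribute. Adjust to your file's schema.
RECORD_ID_RE = re.compile(r'id="\{?(' + GUID_RE + r')\}?"')

def remap_record_guids(text: str) -> str:
    """Replace every record-ID GUID with a fresh one, consistently.

    Pass 1 builds a dictionary mapping each record ID to a new GUID.
    Pass 2 replaces every occurrence of a mapped GUID (including lookup
    references to it), leaving GUIDs not in the dictionary untouched.
    """
    mapping = {}
    for m in RECORD_ID_RE.finditer(text):
        old = m.group(1).lower()
        if old not in mapping:
            mapping[old] = str(uuid.uuid4())

    def swap(m: "re.Match") -> str:
        # Only GUIDs known to be record IDs are replaced.
        return mapping.get(m.group(0).lower(), m.group(0))

    return re.sub(GUID_RE, swap, text)
```

The two-pass design is what guarantees that a record ID and every lookup reference to it receive the same new GUID, while GUIDs pointing outside the Portal schema pass through unchanged.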

This tool produces a data file that, when imported, creates brand-new target Portal entities and brand-new dependent entities, while any imported entities linked to existing data outside the Portal schema stay properly linked.

Since the new record set is created in the target environment, there is no issue with deleting obsolete configuration records: those stay linked to the archived versions of configurations and will be deleted automatically when the old Portal records are purged.

The only minor complication in the suggested deployment process concerns contacts assigned web roles. We do not port or update contacts during deployment, and we cannot assume that contacts in the target environment will match those in the source, so web role assignments would be lost if we attempted to transfer them. For a clean data transition, it is recommended to drop all web role assignments in the source environment before exporting the Portal data. These assignments may be backed up beforehand and restored in the source afterward to preserve the ability to unit-test in DEV.

In summary, with a simple command-line tool developed internally, we can use the standard CRM Configuration Migration application to transfer Portal data from the source environment to the target in such a way that a new Portal record and its dependent child records are created, making it available for the SaaS application to switch to. At the same time, the previous versions of the Portal records in the target environment are not destroyed or overwritten, making it possible to roll back the SaaS application assignment to any previously released state.

A special note: with a slight modification of the default search-replace regular expressions, it is possible to use our tool to condition the data file extracted by the XrmToolBox plugin for Portals. While that data transfer experiment was successful, we still do not recommend this route because of internal issues found in the plugin, as well as its inability to migrate many-to-many (M:M) relationships properly.

Don't forget to subscribe to our blog for more!

Happy D365'ing!

How to Bulk Update CRM Records Using the Bulk Workflow Execution Tool

That's right – no more being limited to mass updating only 250 records at once in Dynamics 365! Bulk Workflow Execution is a great tool that allows users to run an On-Demand Workflow against a set of records pulled from a System or Personal View, all in bulk! You can find the "Bulk Workflow Execution" tool in the Plug-in Store of the XrmToolBox application.

Let's look at a quick example of how the tool works. Say that you need to change the Owner of 10,000+ Case records. Follow the steps below:

1. Create an On-demand Workflow.

2. Create a Personal View or use an existing System View to pull your data set.

3. Open XrmToolBox.

4. Scroll to and click Bulk Workflow Execution.

5. Click Yes to connect to your Dynamics 365 organization.

6. Type your Organization URL.

7. Type your Username/Password and click Connect.

8. Type a Name for your organization after it successfully connects. NOTE: once the organization connection is established, XrmToolBox will open a new tab with the tool name and organization name you provided.

9. Select the On-Demand Workflow you created from the dropdown and wait for the tool to retrieve the applicable Views.

10. From the View list, select the System View or Personal View you are running the On-Demand Workflow against. Note: you should see the FetchXML Query populated in the area to the right after you select the View from the list.

11. Click Validate Query to verify the number of records being pulled into the data set.

12. Determine your Batch Size and Interval Delay.
Note: if unsure, keep the default Batch Size of "200" and Interval Delay of "0" seconds.

13. When ready, click Start Workflows.

14. The tool will run and will inform you of progress.

15. When finished, the tool will display a small window indicating the time it completed, how many records the workflow ran against, and number of errors as a result of the execution.

A good rule of thumb is to allow about 2 seconds per record, which works out to 30 records per minute, or 1,800 records per hour. Remember that this tool is completely automated, so while the job runs you can leave your computer unattended or let it run in the background while you multitask.

That's it! For more Tips & Tricks and other educational materials on Dynamics 365, check out our blog.

Happy Dynamics 365'ing!