Cloud Migration Made Easy with Online Accelerator

Deciding on Microsoft Dynamics 365 as your business solution is a no-brainer, but how to implement it? That’s another story. Not only must you decide which ERP and CRM applications are right for your business, you must also decide which type of deployment works best for you. Dynamics 365 can be hosted online, on-prem, or using a hybrid model. Microsoft and PowerObjects take a cloud-first strategy, as the cloud offers several distinct advantages.

Cloud deployment offers a ton of advantages, but how about getting there?

Migrating to the cloud is a significant initiative that should not be taken lightly… or alone. Migration has traditionally been done manually, which, as you can imagine, is extremely challenging, error-prone, expensive, and time-consuming.

But based on years of experience and hundreds of cloud migrations for our clients, PowerObjects knows exactly how to take your organization through the process step by step. We leverage a Microsoft tool called the Online Accelerator, and the engine behind it was designed specifically to deliver an ultra-fast migration of all your Dynamics 365 data.

The Online Accelerator offers huge advantages over manual migration, including…

If you’ve been contemplating a move to the cloud, we can help! Learn more about the Online Accelerator by reading this fact sheet or reach out to one of our support engineers. Cloud migration is a huge undertaking – don’t go it alone!

Move to the Cloud with the Online Accelerator by PowerObjects

Organizations hosting Microsoft Dynamics 365 have options. It can be hosted online, on-prem, or using a hybrid model. Microsoft and PowerObjects strongly recommend online. Based on years of experience and hundreds of cloud migrations for our clients, PowerObjects created an offering that takes your organization through the process step by step by leveraging the power of both Microsoft Azure and the Microsoft Online Accelerator, a tool designed specifically to deliver an ultra-fast migration of all your Dynamics 365 data. Watch this video to learn more!


Migrating Databases with HashByte

Hey, what should we do tonight? The same thing we do every night… try to migrate all the data in the world!

Once, we were creating a migration process for a database that had no fields to verify or control changes. Thus, we had to consider a different option, and someone mentioned using HashBytes. A-ha!

HashBytes is a SQL Server function that generates a hash code for whatever information is passed in its string parameter, and it lets you decide which algorithm to use in the process.

What is the syntax of this function?

This function receives two parameters: the algorithm to use ('MD5', 'SHA1', 'SHA2_256', 'SHA2_512', and so on) and the input to hash (a varchar, nvarchar, or varbinary expression).

When I send these parameters, the function returns a varbinary (maximum 8000 bytes).

Example:
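A minimal illustration (the input string and the SHA2_256 algorithm choice here are just placeholders):

DECLARE @p_string NVARCHAR(MAX) = N'John;Doe;jdoe@example.com';
-- Hash the string; SHA2_256 always produces a 32-byte varbinary value.
SELECT HASHBYTES('SHA2_256', @p_string) AS PO_hash;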

OK, so we know what this function can do, but how can we use it in our migration process? The solution was to separate the main issues into four items and attack each of them separately.

  1. How can we identify when a record was created or modified?
  2. How can we validate which fields of the data source to include, given that it is not always necessary to move all the information to the target system?
  3. How can we build the string to generate the hash with HashBytes and make it an automatic process?
  4. How can we use the hashed information in our process?

To solve the first question, we created two new fields in our cross-reference table to validate and control changes. We did this so we would know whether to create a new record or update an existing record in the target system.

The new fields are PO_HashBytes_ORI, where you save the hash code generated from the source record when the migration process is executed or when the record is new/modified, and PO_HashBytes_DES, where you save the hash generated when an existing record is updated (more on these in the stored procedure below). Great! We can now identify our records, but this is just a small part of our bigger issue.
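As a sketch, adding the control fields could look like this (the cross-reference table name, PO_cross_reference, is hypothetical):

-- Add the two hash control fields to the cross-reference table.
ALTER TABLE dbo.PO_cross_reference
    ADD PO_HashBytes_ORI VARBINARY(8000) NULL, -- hash generated from the source record
        PO_HashBytes_DES VARBINARY(8000) NULL; -- hash generated when the record is updated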

For the second item, the solution was to identify all the fields we need to move in the migration/integration process and track when they had been modified in the records. Once this was done, we could create a new table holding the names of the fields of each entity that we needed to move from the source system to the target system.


Next, we populate this table with all the fields that we need to use in the process. To do this, we create dynamic SQL that builds all the INSERT statements necessary to insert the required rows.

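A sketch of what this can look like (the metadata table shape and the source table name are assumptions):

-- Metadata table listing the fields of each entity that take part in the hash.
CREATE TABLE dbo.PO_structure_to_update (
    table_name VARCHAR(128) NOT NULL,
    field_name VARCHAR(128) NOT NULL,
    field_type VARCHAR(50) NOT NULL
);

-- Dynamic SQL that generates one INSERT statement per column of the source
-- table; review the output and drop any columns that should not be moved.
SELECT 'INSERT INTO dbo.PO_structure_to_update VALUES (''' + t.name +
       ''', ''' + c.name + ''', ''' + ty.name + ''');'
FROM sys.tables t
JOIN sys.columns c ON c.object_id = t.object_id
JOIN sys.types ty ON ty.user_type_id = c.user_type_id
WHERE t.name = 'account'; -- hypothetical source table name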

We now have the validation/control fields needed in our process. Next, we need to build strings to execute the HashBytes function.

We need to build this string automatically and with good performance, so that the next time someone needs to modify the process they can reuse it rather than going through the more difficult process of building these strings manually. To make this go smoothly, we created a function in our database. Below is an example of the function we created (PO_get_hash_fields):
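Here is a minimal sketch of what such a function can look like, assuming the PO_structure_to_update metadata table described above (the original implementation may differ):

CREATE FUNCTION dbo.PO_get_hash_fields (@p_table_name VARCHAR(128))
RETURNS VARCHAR(MAX)
AS
BEGIN
    DECLARE @v_fields VARCHAR(MAX) = '';

    -- Concatenate every registered field, converting non-character types
    -- so the whole record can be fed to HashBytes as a single string.
    SELECT @v_fields = @v_fields + ' + ISNULL(' +
           CASE WHEN field_type IN ('char', 'varchar', 'nvarchar')
                THEN field_name
                ELSE 'CONVERT(VARCHAR(MAX), ' + field_name + ')'
           END + ', '''')'
    FROM dbo.PO_structure_to_update
    WHERE table_name = @p_table_name;

    -- Drop the leading ' + ' so the result is a valid SQL expression.
    RETURN STUFF(@v_fields, 1, 3, '');
END;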

This function has one parameter (@p_table_name), which receives the name of the table for which we need to build the string that we later pass to the HashBytes function. With the table name, we read all the necessary fields from PO_structure_to_update to build the string, converting any field whose type is not char, varchar, or nvarchar.

The result of this function is a new string with all fields concatenated, ready to be converted into a hash code.
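For instance, for a hypothetical table registered with the fields firstname, lastname, and a datetime column modifiedon, the function would return something like:

ISNULL(firstname, '') + ISNULL(lastname, '') + ISNULL(CONVERT(VARCHAR(MAX), modifiedon), '')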

We’re getting close. But how can we use all the previous steps to identify whether a record is new or updated? The solution was to create a stored procedure that uses the functions to generate the hash and updates the control field that tells us whether the record is new or updated.

This stored procedure has four input parameters, which are needed to build the dynamic SQL that identifies whether a record is new or modified.

These parameters are:

  1. @p_table_name_ori: the name of the table from which we get all the fields to generate the hash.
  2. @p_table_name_PO_Des: the name of the cross-reference table that stores the information for all the records moved from the source system to the target system. We generate the hash for each record and update it in this cross-reference table.
  3. @p_table_type: identifies which table we need to process (e.g., Account, Contact, and/or other entities).
  4. @p_process_type: identifies whether we are validating new or updated records. When the process is for updated records, we place the generated hash in the PO_HashBytes_DES field. When the record is new, we place the generated hash in the PO_HashBytes_ORI field.

The stored procedure generates dynamic SQL that updates the cross-reference table with the hash code generated by HashBytes from all the fields required in the migration/integration process for each record.
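A sketch of the stored procedure under the same assumptions (the join keys between the cross-reference table and the source table are hypothetical):

CREATE PROCEDURE dbo.PO_generate_record_hash
    @p_table_name_ori VARCHAR(128),
    @p_table_name_PO_Des VARCHAR(128),
    @p_table_type VARCHAR(50),
    @p_process_type VARCHAR(10) -- 'NEW' or 'UPDATED'
AS
BEGIN
    -- Build the concatenated field expression for the source table.
    DECLARE @v_fields VARCHAR(MAX) = dbo.PO_get_hash_fields(@p_table_name_ori);

    -- New records store their hash in PO_HashBytes_ORI;
    -- updated records store it in PO_HashBytes_DES.
    DECLARE @v_hash_field VARCHAR(30) =
        CASE WHEN @p_process_type = 'NEW'
             THEN 'PO_HashBytes_ORI' ELSE 'PO_HashBytes_DES' END;

    DECLARE @v_sql NVARCHAR(MAX) =
        N'UPDATE x SET x.' + @v_hash_field +
        N' = HASHBYTES(''SHA2_256'', ' + @v_fields + N') ' +
        N'FROM ' + @p_table_name_PO_Des + N' x ' +
        N'JOIN ' + @p_table_name_ori + N' s ON s.id = x.source_id ' + -- hypothetical keys
        N'WHERE x.table_type = ''' + @p_table_type + N'''';

    EXEC sp_executesql @v_sql;
END;

A call for the account entity might then look like:

EXEC dbo.PO_generate_record_hash 'account', 'PO_cross_reference', 'Account', 'NEW';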

Excellent! We have our new verification/control fields for our migration/integration process. We can now use the HashBytes function in the migration/integration packages by generating the hash for each record with the new stored procedure. We can also identify whether a record is new or modified by comparing the original field to the new field using the hash codes generated through HashBytes. Now we can migrate databases that do not have any control fields. Wow!

Happy migrating!

CRM 2016 Preview Feature: Data Migration with the Data Loader Service

In today's blog, we'll be giving you a brief overview of the Data Loader feature, which is a Microsoft Dynamics CRM 2016 Preview Feature available only to North American-based Dynamics CRM Online organizations. So without further ado, let's begin!

Let's walk through the steps needed for a data migration using the Data Loader. Since it is a Preview Feature, Microsoft does encourage use and feedback for future feature development. The objective of the Data Loader is to move high volumes of data into CRM Online quickly, securely, and cost effectively without needing custom code.

Once a user logs in to the Lifecycle Services Data Loader Dashboard, there are five data import options, numbered below:

1. Deploy Runtimes – Connects CRM instances to the Data Loader for data migrations.

2. Configure File Format – Create the format of the file to be uploaded.

3. New Import – Set data migration characteristics.

4. Data Jobs – View all data jobs for a given CRM instance.

5. Data Projects – View all data projects for a given CRM instance.


The next step is to briefly break down the five data import options:

Deploy Runtime

1. First, enter the CRM Username and CRM Password.

2. Then, select Fetch CRM instances.

3. All the CRM instances will load; from the drop-down box, select the instance for the data migration.


Configure File Format

1. Select a CRM Instance for the data migration.

2. Create & Save a file format.

3. Select the + sign to add a file.

4. Enter a File format name.

5. Set the Delimiter and Regional settings.


New Import

1. First, create a new Data Migration Project.


2. Upload data tables.


3. Next, map the fields (auto-mapping is applied, and fields can also be mapped manually).


4. Finally, click Start Job.


Data Jobs: A view of all the data jobs for a given CRM instance


Data Projects: A view of all the data projects for a given CRM instance


That's all for the blog today, but make sure you check back with us daily for blogs, events, trainings, and webinars, all focused on Dynamics CRM 2016. It's going to be a busy year. You won't want to miss a thing! Want to learn more about CRM 2016 Online? Check out our blog Top New Features in Microsoft Dynamics CRM 2016 Online.

Until next time, happy CRM'ing!

Bust a Move with Data Migration in CRM 2016

The January update for Microsoft Dynamics CRM 2016 has so much to offer! The update introduced cool new features to make a developer's life a little bit easier, including the Data Loader, a one-click document generator, the ability to hide and show sub-grids, and many more. In an earlier blog, we showed you how to use the Data Loader feature, but in today's blog, we'll be talking specifically about how to access the Data Loader service. So let's dive in!

The Data Loader service is an answer to data migration scenarios where a user would otherwise have to rely on third-party solutions or write custom code to import/export large volumes of data. Essentially, it's a painless way to handle bulk data loading into a CRM system.

Bulk data can be imported into and exported from CRM systems using the Data Loader service. The feature enables bulk data to be uploaded to a staging area where you can perform light data-quality functions and then push the data into your CRM system.

The Data Loader service feature is now supported in Dynamics CRM 2016 and Dynamics CRM 2015 Update 1. Here is a list of some key features:

The Data Loader preview is enabled by default for all CRM administrators. Use the following steps to access the service:

1. First, navigate to this link: https://lcs.dynamics.com/DataLoader/Index

2. Click on Sign In and enter your CRM administrator credentials.

3. Upon login, if you see an error screen instead of the dashboard, you don't have permissions to access the resource.


Please note that only CRM administrators who are Global or Service administrators in Azure Active Directory are allowed to access the Data Loader service.

Once you have all of the required permissions, you will be all set to use the Data Loader service and will be able to see the Data Loader's multiple options on the dashboard.


That's all for the blog today! Check out our video on new features in Microsoft Dynamics CRM 2016 to see our five favorite features!

Until next time, readers, happy CRM'ing!

Maximizing Data Integration and Migration Performance in Dynamics 365

Data integration and data migration have always been among the most challenging tasks in any Dynamics 365 implementation. The most common and popular method used within the community is SQL Server Integration Services (SSIS) along with a third-party connector/toolkit from KingswaySoft.

While this is the most popular method, you can sometimes encounter performance issues during the implementation. In this blog, we will show you how to tackle performance issues by optimizing the configuration.

Optimizing Multiple Connection Settings

It's no secret that utilizing ExecuteMultipleRequest (which was introduced when Dynamics CRM 2011 UR12 was released) can drastically improve the performance for bulk data load. Instead of using a single connection to create or update a single record, ExecuteMultipleRequest allows you to create or update more than one record per connection.

By default, the platform accepts only up to two concurrent connections, and each connection accepts only up to 1,000 records. In some cases, depending on the type of contract or license you have with Microsoft, you may be able to request additional connections. If you exceed the number of concurrent connections available, the platform will reject the call and return an error message indicating that the server is busy. Fortunately, ExecuteMultipleRequest is built in to KingswaySoft, and we just have to choose the optimal settings. So what are the best settings?

In KingswaySoft, ExecuteMultipleRequest is controlled by the following settings:

1. Number of concurrent connections: Use up to X threads in total

2. Number of records per connection (batch): Batch Size


In the example above, you would use five concurrent ExecuteMultipleRequest connections with a batch size of 30 records. While in theory the maximum batch size is 1,000, in real-world scenarios you will want to keep the number between 50 and 100 to maintain performance and stability and avoid timeouts. Ideally, you will want the number of connections to be lower than the maximum your environment allows, since other integration or migration jobs may be running at the same time and sharing the available connections with your SSIS job.
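As a rough illustration of how these two numbers interact: with five concurrent connections and a batch size of 100, each round of ExecuteMultipleRequest calls can submit up to 500 records, so a one-million-record load works out to roughly 2,000 rounds of requests. Tuning the two settings together is what determines your overall throughput.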

Optimizing .NET Client Settings

Although the SSIS job has its own connection settings, it runs on top of the .NET Framework and is therefore limited by the .NET maximum-connections setting. This setting determines the number of open connections the SSIS job can establish, and it may throttle the overall SSIS job performance. You can remove this limitation by adding the following configuration section:

<configuration>
  <system.net>
    <connectionManagement>
      <add address="*" maxconnection="100" />
    </connectionManagement>
  </system.net>
</configuration>

Add this to your machine.config files located in C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Config and C:\Windows\Microsoft.NET\Framework\v4.0.30319\Config.


This method should help optimize your configuration. For more Dynamics 365 tips – check out our blog!

Happy Dynamics 365'ing!

How to Prepare for a Dynamics 365 Global Data Migration

You have decided to implement Microsoft Dynamics 365 across your global enterprise. You know the value this will bring to your organization, but you also know there are going to be challenges, especially in a global implementation.

For data stewards, architects, and product owners, data migration can be a daunting task and just one pillar of a large global implementation. The migration will affect every part of the project, from end users and developers, to testers and trainers. There are lots of questions to ask and decisions to make. How much data are we going to bring from our legacy systems? Does Germany have the same data as the UK or Singapore? What languages are represented in my master data? Do all countries sell the same products? Are all the product names and numbers the same in each country?

Undertaking a large-scale, global data migration can take months or even years. To help you execute it successfully, let's talk through some initial thoughts and questions to prepare for in a global data migration.

Develop a Strategy

You know your destination: a single, consolidated, global system. Sounds awesome! It's quite a path to travel in order to get there. Let's put together a roadmap.

1. Survey Systems and People
It's important to understand what systems, countries, and user groups are going to be part of the migration. If you envision a single global system, you need to start sharing that vision, and understanding all the pieces that will make up that system. Do we have three countries, ten, fifty? How many systems per country are we dealing with? Does everyone have their own customized system or shared systems? What users are going to be impacted? Just the sales team? Or are the call center, marketing, and IT groups impacted too?

2. Put Up Some Boundaries and Define Scope
You likely are not going to migrate every object and every field for every system that is part of your organization. If we do that, we will never be done. What are the critical data points that drive your business? What master data needs to be saved, what data is coming from other systems, and what data is garbage? If you ask these questions, you can identify the core data elements to focus on during the migration, saving time and money.

3. Define a Governance System
Already have a data governance group? Great! Don't have one set up yet? Now is a good time to think about it. You are going to have many hard questions raised by the business. "Why can't we keep all this data?" "Why does France get to determine the values in the drop-down list?" "I absolutely 'NEED' this field to do my job." All of these questions and more will be asked, and you need someone who is both informed and empowered to make these decisions quickly and definitively. Without governance, you can wind up bogged down in endless meetings, changes, and revisions with no end in sight to the migration.

4. Make a Plan for Old Data
What is your data warehousing strategy? Do you want to archive all historical data? Are you going to keep your old systems around? As previously mentioned, not everything will be migrated and you must determine whether to preserve historical data for compliance/legal reasons. Archiving can be expensive if not planned properly, but by putting some thought into this, you can potentially save historical data without breaking the bank and keep the new system lean and clean.

There are a thousand other questions that are going to be racing through your mind. This is just a framework to get started in thinking about your migration project. As we can see with just these basic points, you are going to need some help in order to get this project done.

Assemble Your Team

You have your roadmap in your pocket, or at least the outline of one. Who is going to own this migration? Who is going to answer all the questions from the users and leadership?

The migration team is going to be as diverse as your organization. You will need representation from every country and every system. Here are some thoughts on the kind of people you might need:


One of the key aspects to having an effective team is keeping them engaged in the planning process, and sharing the vision of the future system. Keep the team invested by keeping them informed and involved, and you will be able to drive towards a successful global data migration.

Be sure to subscribe to our blog for more Dynamics 365 content!

Happy Dynamics 365'ing!

Some Quick Data Migration Tips

We all know how much work and effort goes into a Data Migration, so in this blog we share a few tips that should help reduce some common issues when performing this work.

The worst thing that can happen during a data migration is migrating bad or incorrect data into your new system. By taking a few of the items below into consideration, you can prevent this from happening. Of course, as with all advice, these may not apply to every situation, but in general, these tips should help mitigate some of the common issues.

1. Combine into as few files as possible

If you have complex and extensive data to migrate, consider making one large file with multiple tabs as opposed to multiple separate files. By doing this, you typically:

Reduce your risk - Every time you upload a file, there is a risk of that upload failing to complete properly. Therefore, the more files you have, the more risk you have. Reducing the file count ultimately reduces your risk.

Make Validation Easier - If you have one larger file, usually validation is faster because you can validate more data without having to change or open other files.

Simplify Sequencing - With one file, you can organize the tabs in upload order instead of trying to organize many separate files and ensure they are processed correctly.

2. Excel is not always your Friend

Check your cell format - When migrating data using Excel, Excel sometimes "helps" you by changing your data. For example, a size of 5/6 may be changed to a date or a numeric value. Either change corrupts your data, so validate your cell formats and data in the file before you upload.

3. Validate the File before upload

Often the extract file is not validated prior to upload.

The assumption is that the data loaded into Excel properly and, therefore, it is ready to be uploaded. As noted above, Excel can corrupt your data so don't assume it is the same as in the old system.

Ensure that the person who validates is different from the person who extracted and updated the data. If the same person does both, they often miss errors because they are looking at the same data the same way. Get "another set of eyes" to ensure it is correct and ready for upload.

Validating that the data is correct in the file is usually easier than validating it after upload. If you find issues with your data file, you can fix them before upload, eliminating the time needed to fix the data afterward or to perform another upload and validation.

4. If you fail, don't just keep trying.

First, always test your data uploads prior to your final or Production uploads. This may seem obvious, but we have all seen people upload without testing and have issues. The last thing you want is for your final upload to hang or crash and then you must try to figure out how to ensure all the data is there correctly.

Remember, sometimes performing additional data loads only further corrupts your data and introduces more issues by creating duplicate records or affecting other data sets.

Hopefully these tips help you with your next data migration!

Be sure to subscribe to our blog for more Dynamics 365 tips and tricks.

Happy Dynamics 365'ing!

Data Migration: Dos and Don’ts

Data migration is when data is moved from one place to another. In many projects, the data is moved from a legacy system to a current and updated system.

There are three main parts of a data migration process, and they are:

  1. Understanding the data
  2. Understanding the data source
  3. Understanding the data destination


For Dynamics 365, it is very important that we understand the source and destination very well, especially if the source data comes from different countries and different source systems. Every business user wants their data to be migrated in the same form as it is in the legacy system, for ease of use, training, and understanding of the new technology. In this blog, we will share the dos and don'ts for an enterprise-level data migration project.

First, we must identify the correct technology to migrate the data. Most teams use technologies such as KingswaySoft and SSIS instead of free tools such as Talend or MuleSoft, for which there is less support.

For such open source technologies:


With this in mind, if we introduce a concept of requirement inspection, we can surface issues during the initial phase of the project. We may be able to identify the missing pieces of the data migration during the requirements phase and save the project costs in later stages.

For more guides to Dynamics 365, check out our blog!

Happy Dynamics 365'ing!

Engaging Businesses Effectively for Data Migration Mapping

Data migration can turn out to be a key component of any CRM implementation, especially for enterprise organizations that are switching to a new CRM application. A CRM system without appropriate data may end up becoming a prominent reason for program/project failure, as users may not adopt an application in which they can't perform their job functions.

Many would consider data migration a technical activity of mapping data elements from a source system to a target, or moving them from a legacy system to a new one. However, there are many critical business/stakeholder aspects that need to be considered during the Data Analysis and Mapping phase.

The Data Analysis and Mapping phase cannot be successfully completed without engaging the right stakeholders and business users at different levels. Here are some of the aspects to consider for engaging business stakeholders to lay a strong foundation for successful data migration.

Understand the Difference Between Business and Data Owners

1. Business owners are users who work in the legacy system and will use the CRM application to conduct day-to-day business for the organization. Data owners are the technical resources who manage and maintain the legacy application.

2. Business stakeholders are key to understanding the legacy data and to identifying how it was used to carry out business, along with its critical components.

3. Data owners should be engaged only to the extent of getting source data files and understanding the technical dependencies of the source dataset. They are the ones who extract data into the required format from a legacy system.

Effective Communication

1. Introduce the various teams involved in data migration to the business, and publish a formal communication plan that includes a SPOC (Single Point of Contact) for each group or team.

2. Publish a RACI (Responsible, Accountable, Consulted, Informed) matrix for data migration, clearly outlining where and in what capacity business engagement is required.

3. Prepare a business org chart with contact details, outlining whom to reach out to for help or escalation if you encounter any blockers.

4. Publish an escalation hierarchy and matrix for all teams.

Define the Degree of Business Engagement during the Data Analysis and Mapping Phase

1. Data Analysis should be an inclusive exercise in which a Data Migration Analyst/Lead lays out a detailed plan with the project/program manager, covering details like the following:

2. Set up all meetings with a clear agenda and outcome from discussions, in advance.

3. Explain all templates that the business may have to validate or sign off on (for example: Query Tracker, Lookup Mapping, Data Mapping Specification).

4. Secure commitment from business leadership for a quick turnaround on queries, reviews, and sign-off for data mappings.

Being Agile with Data Mapping (Creation, Review, and Sign-off)

1. The Data Analyst needs to make sure all analysis and mappings created are discussed and reviewed in real-time with business stakeholders.

2. All business feedback and inputs are incorporated to have accurate data mappings.

3. The Data Analyst should connect with any downstream and upstream integration teams about impacts on data migration and keep the business informed about the same.

4. Discuss any risk items and showstoppers with all key stakeholders in a joint meeting to address them faster, rather than logging them in an isolated system and following up every week over email. This will help reduce the back and forth between the project and business teams.

5. This also leads to faster resolution of issues between the multiple teams engaged, including the business.

6. The business will be more confident to sign off on the project, as they will have been part of the data analysis and mapping journey.

7. All of this ultimately leads to successful data migration execution with minimal issues.

The Data Analysis and Mapping phase of a migration project cannot be successfully completed without engaging the right stakeholders and business users at different levels. By taking these points into consideration, you'll be on your way to success!

For more helpful Dynamics 365 tips and tricks – check out our blog!

Happy Dynamics 365'ing!

Dynamics 365 Data Migration Testing Best Practices

If you are moving from Dynamics 365 on-premises or any other platform to Dynamics 365 Online or an earlier version of CRM, you will need to migrate your data. In this blog, we share a checklist for creating a Test Plan and Test Execution strategy for data migration testing.

Test Plan for Data Migration Testing

Scope of Migration

Analyze the scope of migration for your project or instance. Migrating the entire data set in Dev/Test/UAT environments and getting a sign-off at each phase is time-consuming. Hence, check with stakeholders/business owners and agree on full or partial data migration (a subset of data per entity) in lower environments like Dev/Test, with full data migration in UAT/Production.

Identify Entities

Before jumping into migration testing, get a checklist of all entities that need to be migrated. Review it with business owners and add any entities that were missed.

Determine the order of validating entities

It is crucial that you validate the entities in order, so that the data referenced in lookup fields is present when the record is validated. For example, you need to have your accounts in the system before you validate opportunities.

Determine the Cut-off Date

Check with business owners on the cutoff date for old history records. Consider whether history records over a certain number of years old should be migrated from the old system into the new Dynamics 365 system. For example, do history records over five years old need to be in the new Dynamics 365 system?

Field to Field Mapping sheet

Get the field-to-field mapping sheet from source to destination. Ask the following questions:

Test Execution Strategy

Your test cases should cover the below scenarios for each entity:

Service Account for Master Data
Ensure that the owner of master data is not a specific user on the team; it should be a service account.

Lookups and References
Ensure all the lookups on the Dynamics 365 CRM forms have the correct data.

Perform a Limited Test Migration

Start validating with a limited set of sample data for each entity, and ensure data is migrated for all the fields from source to destination.

Unique identifier in CRM

To identify a record uniquely in Dynamics 365, carry the GUID of the record from the source instance over to the destination Dynamics 365 instance.
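If you stage SQL extracts of both systems (the database and table names below are hypothetical), an anti-join on the GUID is a quick way to spot records that failed to migrate:

-- Source records whose GUID never arrived in the target extract.
SELECT s.accountid
FROM SourceDb.dbo.account s
LEFT JOIN TargetDb.dbo.account t ON t.accountid = s.accountid
WHERE t.accountid IS NULL;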

Created On date

The Created On date of the CRM record should be the actual Created On date from the source, not the date the record was migrated into Dynamics 365. (In Dynamics 365, this is typically achieved by mapping the source date to the overriddencreatedon attribute during import.)

Calculated fields

Ensure calculated fields in Dynamics 365 are correctly populated.

Opportunities

While validating opportunities, ensure that the actualclosedate is set to the date the opportunity was closed in the source system.

Activities

Individual activity entities such as emails, appointments, phone calls, and tasks should be migrated individually, with the Owner correctly migrated.

Case/Activity Status

Ensure the statuses of cases and activities are correctly migrated.

Record Validation

Ensure you open at least one record for each entity and check whether any errors are displayed. Chances are you might have missed importing a mandatory field, and opening a record makes that kind of error easy to catch.

Console App to validate count of records

Some entities, like accounts, opportunities, and contacts, might have lots of records. From Advanced Find, we can get only 5,000 records at a time, and it is time-consuming to review data in such small sets. Hence, check with the dev team for a console application that can run against each entity, so you can find the record count easily and compare the source with the destination in Dynamics 365.
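As a sketch, if you also stage an extract of the migrated data in SQL Server (Dynamics 365 Online does not expose direct SQL access, so the database names below are hypothetical), the count comparison itself is a one-liner per entity:

SELECT 'account' AS entity,
       (SELECT COUNT(*) FROM SourceDb.dbo.account) AS source_count,
       (SELECT COUNT(*) FROM TargetDb.dbo.account) AS target_count;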

Record Status

Ensure you validate the status field for CRM records. In the source application, there might be different statuses, such as Active, Pending Closure, or Closed. In this scenario, validate that all status types are available in the destination, especially if you have created unique status reasons.

Owner

Validate that the Owner is not the system user used to migrate the records, but rather the user who created the record or owns it in the source environment.

Field-to-Field Mapping

Validate field-to-field mapping with the source application to Dynamics 365 against the data-mapping sheet.

Involve the Business Users Early in the Cycle

In addition to checking the data and matching counts between the source and target data, have business users review and test the data. The users know the data. It's important to have testing scripts identified for data testing rather than "eyeballing" the data and assuming it looks right.

Lastly, we recommend executing smoke test cases and business scenarios against the system before providing a sign-off on the Dynamics 365 data migration testing.

For more Dynamics 365 tips and tricks – subscribe to our blog!

Happy Dynamics 365'ing!