From Disaster to Diligence: Mastering Salesforce Integration and Migration

Salesforce integration and migration are high-stakes procedures, and a single oversight can lead to data breaches, sync issues, or project delays. In this blog, we go over seven real-world pitfalls that even well-seasoned developers fall into, and how to avoid them.

This blog is geared toward Developers, Analysts, QAs, and other Salesforce Consultants. We'll walk through reasonable steps that can protect you from committing offenses in Salesforce integration and migration projects. And no, you didn't read that wrong, we really did say offenses: any ignorance or carelessness when dealing with data can lead to catastrophic outcomes and never ends well.
Besides covering the pitfalls of implementing stories that run up against SFDC limitations, this blog also covers basic principles of data migration and integration that can help mitigate costly issues down the road.
We've provided links to relevant Salesforce documentation wherever possible, but some of the basic principles we highlight here aren't found in the Salesforce documentation; they're best practices drawn from real-world experience, specifically with integration and migration efforts.
Due Diligence: ‘Due’ means “proper” or “required.” In legal contexts, ‘diligence’ means “the degree of care required in a given situation.” Due diligence, then, is the level of care or caution that a specific situation calls for. The term has many definitions across many books, but in this article we cover due diligence from an SFDC developer's standpoint, not from a business or investor standpoint. In simple software terms, a developer should meet legal standards in addition to customer requirements. To be precise, we will concentrate on the due diligence a developer should maintain while working on migration and integration modules.
It is hard to define a common framework for due diligence because every project is somewhat unique, with different integration and data-migration requirements. But the pitfalls below are common yet costly, based on the experience we've gathered over the years, and we're sharing them here to help our peers avoid such scenarios.
In previous projects I have seen free software used to implement or test migrations without the client's knowledge. Some software is free for one month, with only one user per domain allowed to use it, yet multiple developers in the same domain download and use it without reading the terms and conditions. After the free trial is up, a bill is sent to the unsuspecting organization, which leads to legal issues or a heavy fee.
Even if the software is completely free, it may have limitations that surface at delivery time, with no fallback option. That leads to searching for a workaround, which can reduce the quality of the project and increase the delivery time.
There are also free browser plugins/add-ons that provide quick utilities, such as advanced code search or data export/import, which would take longer using SFDC standard features alone. But some of these free add-ons require your credentials to log in, which can lead to a data breach. Always use Salesforce-authorized plugins, read reviews before installing any, and check the Salesforce developer community for recommended add-ons/plugins.
We all know Salesforce is hosted on a multitenant architecture. It enforces governor limits to make sure no single tenant can exploit shared resources: SOQL, DML, query-row, execution-time, and CPU-time limits. In integration, we also have callout limits: you cannot make more than 100 callouts in a single transaction, so design the integration so the endpoint can receive bulk records in a single HTTP callout rather than making a callout for every record. Another problem arises when you parse a request into multiple objects in Salesforce: make sure the SOQL, DML, and query rows combined across those objects stay well under the governor limits. You can also hit the Apex CPU time limit when implementing a bulk load through standard or custom APIs; synchronous transactions get 10 seconds of CPU time and asynchronous ones get 60 seconds, with an overall cap of 10 minutes of execution time per transaction. Reducing the batch size helps. All these limits are well explained in the Salesforce Docs that you can refer to HERE.
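To make the bulking idea concrete, here is a minimal Apex sketch, assuming the receiving endpoint accepts a JSON array in one POST; the URL and payload shape are illustrative placeholders, not a real API:

```apex
// Minimal sketch: one callout for a whole batch of records instead of
// one callout per record. Endpoint URL and payload shape are placeholders.
public class BulkContactSync {
    public static void syncContacts(List<Contact> contacts) {
        // Serialize the entire list into a single JSON array payload.
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://example.com/api/contacts/bulk'); // placeholder
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(contacts));

        // One HTTP callout for the batch keeps us far below the
        // 100-callouts-per-transaction governor limit.
        HttpResponse res = new Http().send(req);
        if (res.getStatusCode() != 200) {
            System.debug(LoggingLevel.ERROR,
                'Bulk sync failed with status ' + res.getStatusCode());
        }
    }
}
```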
Not knowing the transactional limits of the systems you integrate with can blow up in your face during integrations. Besides Salesforce's SOQL and DML limits, you should also know the limitations of the other systems. Some integration points, such as consent-validation websites, charge you for every request, even from a sandbox. Another example: when you integrate AWS S3 with Apex, you can upload only one file per API request, and some other systems still do not support RESTful services at all. Bearing all that in mind, going through all the system limitations will definitely make you a smarter developer.
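To illustrate the one-object-per-request constraint, here is a hedged Apex sketch that uploads files to S3 one at a time; 'AWS_S3' is an assumed Named Credential configured for S3, not something provided out of the box:

```apex
// Sketch: S3 accepts one object per PUT, so N files mean N callouts
// (and the 100-callout limit caps N per transaction).
public class S3Uploader {
    public static void uploadFiles(Map<String, Blob> filesByKey) {
        for (String key : filesByKey.keySet()) {
            HttpRequest req = new HttpRequest();
            req.setEndpoint('callout:AWS_S3/' + key); // assumed Named Credential
            req.setMethod('PUT');
            req.setBodyAsBlob(filesByKey.get(key));
            HttpResponse res = new Http().send(req);
            System.debug('Uploaded ' + key + ' -> ' + res.getStatusCode());
        }
    }
}
```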
I could have covered this in the transactional-limits section, but it is important enough to deserve its own. Beyond knowing transactional limits, we need to write safe, quality code that avoids loopholes, especially a transactional loop. When integrated systems are tightly coupled with SFDC, inbound as well as outbound, to keep data in sync, any update in another system triggers an update in SFDC and vice versa, leading to the dreaded loop. Best practice is to check for a loop: when you receive a request from an integrated system, don't make a callout for the same update back. One easy implementation is a Custom Label listing the integration users to bypass when making callouts to other systems.
Since Agile teams often integrate with systems owned by others, don't hesitate to check with stakeholders on how they handle loops; even though SFDC has checks, other systems may not. Otherwise SFDC may consume an API call for the same request that SFDC itself triggered, an update it does not expect back.
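Here is a minimal sketch of the Custom Label bypass described above; the label name Integration_Usernames and its comma-separated format are assumptions for illustration:

```apex
// Sketch: skip outbound callouts for updates that arrived from an
// integrated system, so the same change is never echoed back.
public class SyncLoopGuard {
    public static Boolean isInboundIntegrationUpdate() {
        // 'Integration_Usernames' is an assumed Custom Label holding a
        // comma-separated list of integration-user usernames.
        Set<String> integrationUsers =
            new Set<String>(Label.Integration_Usernames.split(','));
        return integrationUsers.contains(UserInfo.getUserName());
    }
}
```

A trigger handler would consult this guard before enqueuing any outbound callout.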
To integrate with other systems, we configure inbound and outbound connections to and from Salesforce. To allow inbound requests, you need to share credentials with third parties. I have seen people share credentials in chats or emails in plain text, which is very insecure and can lead to unauthorized access. Companies generally have IT operations teams who handle sharing credentials with third parties using encryption; another option is to handle the encryption yourself before sharing. Moreover, external applications should not store Salesforce user credentials (usernames, passwords, or session IDs) in external databases; to integrate an external application with Salesforce user accounts, use an OAuth flow.
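On the outbound side, a Named Credential keeps credentials out of code, configuration, and chat threads entirely; in this sketch, 'Partner_API' is an assumed Named Credential configured with an OAuth flow:

```apex
// Sketch: the platform injects authentication at runtime, so no username,
// password, or session ID ever appears in the code.
HttpRequest req = new HttpRequest();
req.setEndpoint('callout:Partner_API/v1/orders'); // assumed Named Credential
req.setMethod('GET');
HttpResponse res = new Http().send(req);
```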
Furthermore, be careful using debug logs, or setting the finest log level, when debugging connections. Debug logs in Apex code should not contain any sensitive data (usernames, passwords, names, contact information, opportunity information, etc.). This applies to standard Salesforce logs written via System.debug() as well as custom debug logs created by the application. Sensitive information should also not be sent to third parties by email or other means when reporting possible errors. You can refer to the Salesforce documentation HERE for more details on storing sensitive data.
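One way to enforce this is a small logging helper that masks sensitive values before they reach the log. Here is a hypothetical sketch that masks only email addresses; real code would also cover phone numbers, tokens, and other PII:

```apex
// Sketch of a hypothetical helper that scrubs responses before logging.
public class SafeLogger {
    public static void logResponse(HttpResponse res) {
        // Mask anything that looks like an email address; extend the
        // pattern for phone numbers, tokens, etc. as needed.
        String masked = res.getBody().replaceAll(
            '[\\w.+-]+@[\\w.-]+\\.[A-Za-z]{2,}', '***@***');
        System.debug(LoggingLevel.INFO, 'Response: ' + masked);
    }
}
```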
When doing data migration, we are given database access, and we may need to stage data in middleware for transformation before loading it into SFDC. Middleware is typically equipped with security features to protect the data, but there are also scenarios where we receive client data in Excel sheets. During the migration phase, you are the owner of that data, and any leak or unwanted transformation during this time is unacceptable. Do not share the data in emails or chats, even though I have seen a lot of developers make this mistake. Use the client's domain to share and receive data; it can be shared in a drive with restricted access.
One last scenario: developers testing with production data in a staging environment. This is a very common mistake I have seen everywhere, and it leads to emails and calls going to actual customers from sandboxes. Make sure all relevant contact information is masked before testing; this error is not at all acceptable to the client, because customer data is of the utmost priority to the business. You can also go through this Salesforce Doc HERE on data protection to stay up to date on the standard rules.
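Salesforce's SandboxPostCopy interface can mask data automatically on every sandbox refresh. A minimal sketch, assuming contact emails are the data to scramble; the 10,000-row cap respects the DML governor limit, so larger orgs would need a batch job instead:

```apex
// Sketch: runs automatically after a sandbox create/refresh when this
// class is selected in the sandbox setup screen.
global class MaskContactData implements SandboxPostCopy {
    global void runApexClass(SandboxContext context) {
        List<Contact> contacts =
            [SELECT Id, Email FROM Contact WHERE Email != null LIMIT 10000];
        for (Contact c : contacts) {
            // Appending '.invalid' guarantees no real customer gets mail.
            c.Email = c.Email + '.invalid';
        }
        update contacts;
    }
}
```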
How often have you come across storage issues? Everyone has probably seen one at least once, usually while importing test records into a sandbox. How would you feel if you got the same error in production when someone is trying to create an opportunity? A disaster, right?! We need to assess data storage in SFDC before migration and keep monitoring it afterwards. Each record holds, on average, 2 KB of storage, so at a high level we can estimate how many records fit under a given storage limit. Since late March 2019, Contact Manager, Group, Essentials, Professional, Enterprise, Performance, and Unlimited Editions are allocated 10 GB for data storage, plus incrementally added user storage. For example, a Professional Edition org with 10 users receives 10 GB of data storage plus 200 MB, for 10.2 GB of total data storage. There are separate limits for file storage.
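As a quick worked example: at roughly 2 KB per record, a 10 GB allocation holds about 5 million records (10 GB ÷ 2 KB ≈ 5,242,880). For ongoing monitoring, the REST Limits resource reports data storage, and the sketch below reads it from Apex; the API version is an assumption, and the org's own domain URL must be permitted as a remote site:

```apex
// Sketch: read remaining data storage from the org's REST Limits resource.
HttpRequest req = new HttpRequest();
req.setEndpoint(Url.getOrgDomainUrl().toExternalForm()
    + '/services/data/v60.0/limits'); // API version is an assumption
req.setMethod('GET');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
HttpResponse res = new Http().send(req);

Map<String, Object> allLimits =
    (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
Map<String, Object> storage =
    (Map<String, Object>) allLimits.get('DataStorageMB');
System.debug('Data storage remaining (MB): ' + storage.get('Remaining'));
```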
Fixing storage issues after migration is not strictly part of data migration or integration; it is an SFDC housekeeping concern, but as a developer I'd like to discuss some quick fixes. Storage is mainly consumed by custom logging for exceptions and successful transactions, so it is good to have a batch job that runs daily and deletes older logs (see the sketch below). We also create Notes and Tasks under Account or Opportunity records every time we interact with customers; those belonging to inactive accounts can be deleted on a schedule. So as a quick fix, we can delete everything related to inactive accounts. Preferably, archive such data in Big Objects or some secure external data store before deletion, if possible. You can check HERE for a more detailed read on the limits and maintenance.
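A sketch of such a nightly cleanup job follows; Integration_Log__c and the 90-day retention window are assumptions to adapt to your own logging object:

```apex
// Sketch: delete logs older than 90 days, then purge them from the
// recycle bin so the storage is actually released.
global class LogCleanupBatch implements Database.Batchable<SObject>, Schedulable {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id FROM Integration_Log__c WHERE CreatedDate < LAST_N_DAYS:90');
    }
    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        delete scope;
        Database.emptyRecycleBin(scope);
    }
    global void finish(Database.BatchableContext bc) {}
    // Schedulable hook so the same class can run nightly, e.g.:
    // System.schedule('Nightly log cleanup', '0 0 2 * * ?', new LogCleanupBatch());
    global void execute(SchedulableContext sc) {
        Database.executeBatch(new LogCleanupBatch(), 2000);
    }
}
```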
At Cloud Peritus, we’ve had the privilege of leading numerous large-scale Salesforce integration and migration initiatives across industries, and the one truth that has stood the test of every project is that due diligence is never optional; it is mission-critical. The pitfalls outlined in this blog aren’t hypothetical; they’re real-world scenarios we’ve encountered, mitigated, and, in many cases, been brought in to fix after the damage was done.
We’ve witnessed first-hand how something as seemingly minor as an unchecked plugin, a forgotten API limit, or an insecure credential-sharing method can snowball into data loss, system downtime, or even reputational risk. That’s why we embed these best practices into our DNA, ensuring that every engagement we take on is designed for long-term scalability, data integrity, and a seamless user experience. Whether you’re preparing for a simple data load or a complex, multi-system integration, the principles of due diligence we’ve outlined here should form the backbone of your approach. It’s not just about avoiding risk; it’s about building with foresight, precision, and accountability.
To wrap it up:
- Always try to use Salesforce out-of-the-box functionality to its full potential before using other utility tools.
- Use Data Loader for insert, update, or delete operations.
- Use Workbench to test APIs and run quick queries.
- Use the Developer Console to review debug logs.
If your organization is navigating the challenges of Salesforce data migration or system integration and wants a partner who’s seen the trenches and knows how to guide you safely through them, we're right here. Let’s build it right the first time.
Reach out to us at info@cloudperitus.com
Cloud Peritus has been at the forefront of innovation since its inception, playing an integral role in delivering game-changing solutions to its clients time and again, resulting in 5-star reviews across the board.
Feel free to check us out on the Salesforce AppExchange.