
Imagine you’re seeking out the best Salesforce data protection solution for your company. If your manager initiated this project, you might wonder why you need to worry about backing up your company’s data in the first place. Isn’t that Salesforce’s responsibility? 

Salesforce has outstanding availability and uptime. They do an excellent job of keeping the house, or the org where you store your company’s data, secure. But as with most SaaS vendors, you have a responsibility too. You’re responsible for protecting the items inside the house, that is, your data, metadata, and attachments, from accidents and from data loss that admins and users cause in the org.

The Impact of Data Loss and Corruption

Data loss isn’t something to take lightly because, contrary to popular belief, it does happen in both Salesforce production and sandbox environments. The leading causes are human error, integration errors, bad code, and migration errors.

Human errors can occur in production environments if you have too many “Marie Kondos,” whether they’re admins or users, cleaning up a little data here or merging a few accounts there. One wrong move can cause a cascade delete. All of a sudden, ten accounts accidentally merged by a less experienced salesperson take 32 contacts and 69 opportunities with them. Data loss or corruption caused by just one user in production could disrupt your business in many ways. First, both the salesperson and the Salesforce admin waste time diagnosing the data loss. Then comes the time spent restoring the data, which varies depending on the data protection solution you have in place. While all of that’s going on, other departments are noticing the impact. Orders and deliveries come to a halt because the accounts no longer exist. Customer support can’t find contact information to help resolve complaints. What a mess!

In both sandbox and production environments, data and metadata loss or corruption could quickly happen when admin and development teams work on transformational projects, such as transitions to Lightning, org merges, org splits, migrations, and application implementations. For example, merging two orgs is a complex process that can take months to complete. These orgs often have hundreds of thousands to millions of records, thousands of users, differing configurations, automations, workflows, custom objects, Apex code, Visualforce pages, third-party apps, integrations, differing functionality, custom code, and more. With all of this going on, it’s easy to delete or corrupt data in sandboxes or production during an org merge or other transformational projects. Losing or corrupting sandbox data during these complex projects could cause bugs and errors to enter production, creating technical debt, elongating timelines, and increasing costs. Fortunately, all of this is avoidable with the right data protection solution.

Now that you understand the importance of protecting your Salesforce data, it’s essential to realize that not all solutions are equally effective. In this article, we’ll explore building a backup tool yourself by exporting data through the Salesforce API, vs. purchasing a Salesforce-recommended AppExchange partner solution, like OwnBackup.

Why Building Your Own Tool May Be Risky

Salesforce provides Application Programming Interfaces (APIs) for running programmatic operations on data and metadata. If you’re not familiar with the term, think of an API as a library of commands for interacting with Salesforce data. You can use those APIs to extract, update, upsert, and insert data to or from Salesforce. In theory, those commands are enough to build your own backup tool; however, this is not a recommended approach.
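To make that concrete, here’s a minimal sketch of what such an export looks like, assuming the open-source simple-salesforce Python client and placeholder credentials (both are illustrative choices, not part of any official tool):

```python
import csv
from simple_salesforce import Salesforce  # third-party client, used here for illustration

# Placeholder credentials; replace with your own org's values.
sf = Salesforce(username="user@example.com", password="password",
                security_token="token")

# Pull every Account record via the SOQL query API.
result = sf.query_all("SELECT Id, Name, Industry FROM Account")

# Write the records to a CSV "backup" file.
fields = ["Id", "Name", "Industry"]
with open("Account_backup.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    for record in result["records"]:
        writer.writerow({field: record.get(field) for field in fields})
```

Even this toy example covers just three fields on a single object. Multiply it across hundreds of objects, plus metadata and attachments, and the scope of a real backup tool starts to become clear.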

At first, it may seem like a brilliant idea to build your own Salesforce backup tool. But in reality, creating your own tool consists of many moving parts and unexpected considerations that could lead to a high opportunity cost. You’re better off focusing your company’s valuable resources on your core business rather than creating an in-house backup tool.

The most reasonable way to decide whether to build or buy is to ask the right questions. We call this the “Backup and Recovery Test.” The questions should be the same for every company, but not everyone has the same answers.

Backup and Recovery Test Questions: Building Your Own Tool

Whether you’ve already developed an in-house tool or are investigating the feasibility of doing so, here are six questions to think through as you weigh the best option for your company.

1. Does the tool meet your recovery point objective?

As a measure of how much data a company can afford to lose before business operations are impacted, recovery point objective (RPO) indicates how often a company should back up its data. RPO won’t be the same for every company. Typically, companies aim to recover to a point no more than a day in the past. Many companies back up data daily, or better yet, multiple times per day for critical objects.

When building your own backup tool, you’ll often start with data exported as Weekly Export .CSV files. Doing so gives you an RPO of up to seven days.

Another option is to export data via the Salesforce API into a SQL database. Depending on your ETL tool, this may or may not pull daily. It could even run hourly or every 15 minutes. Be cautious with tools that overwrite previous backups rather than saving each backup individually. Overwriting previous backups will drastically increase your RPO.
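To illustrate that last point, a home-grown export job should, at a minimum, write every run to its own timestamped file instead of overwriting the previous one. Here’s a hypothetical sketch that builds on the client from the earlier example:

```python
import csv
from datetime import datetime, timezone

def export_object(sf, object_name, fields):
    """Export one object to a timestamped CSV so earlier backups are never overwritten."""
    timestamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    filename = f"{object_name}_{timestamp}.csv"  # e.g., Account_20240101T060000Z.csv

    soql = f"SELECT {', '.join(fields)} FROM {object_name}"
    records = sf.query_all(soql)["records"]

    with open(filename, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for record in records:
            writer.writerow({field: record.get(field) for field in fields})
    return filename

# Run on a schedule (cron, a task scheduler, etc.):
# hourly runs give an RPO of roughly one hour, daily runs roughly one day.
```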

2. Does the tool meet your recovery time objective?

Recovery time objective (RTO) is the timeframe within which you must restore data after loss or corruption occurs. The goal is to determine in advance how quickly you need to recover. Keep in mind that recovery time starts when you first become aware of the data loss. Recovery time is variable; it depends on multiple factors, including your approach to the following recovery tasks.

  • Backing up your data: The type of backup you’re performing can significantly impact your RPO and RTO. If you’re running incremental backups instead of full backups, point-in-time recovery will require extra data processing, because some of the lost records may be saved in separate locations and will need to be pieced together. The same goes for partial backup methods. If you’re not conducting a comprehensive backup of all your data, metadata, and attachments, you’ll piece together as much as you can, but some of the data may be lost forever.
  • Identifying data loss or corruption: How do you find out about data loss? Does someone notify you? Do you have an alert? Most in-house tools don’t have this critical functionality. So instead of learning about an incident yourself and immediately resolving it, you’ll find out from your users, or you’ll never find out at all. If you’re alerted by users, you’ll have to drop everything and figure out what happened. Never finding out about a data loss or corruption could be even worse in the long run, depending on the type of missing data. Maybe your accounting department will miss out on billing a few customers. Or perhaps some opportunities will go missing from your CRO’s dashboards, making this quarter appear short on pipeline.
  • Finding and preparing all the records that were affected: Precisely locating the related child objects, metadata, or attachments can be a nightmare with in-house backup tools that don’t run a comprehensive backup, as mentioned earlier. You’ll either 1) leave out data because you can’t find it or 2) spend hours to weeks piecing together related objects involved in a cascade delete.
  • Restoring the lost or corrupted data to Salesforce: Once you’ve done the initial investigation and identified the last set of valid data in your backups, you’ll have to update or insert the lost or corrupted data back into Salesforce. Depending on the sophistication of the in-house backup tool you’ve built, this will most likely be a manual process involving Salesforce’s Data Loader or a similar tool. For more advanced in-house tools, it may be possible to restore through Salesforce’s API, as sketched after this list.
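For that last task, here’s a rough sketch of pushing records from a backup CSV back into Salesforce, again assuming the simple-salesforce client. The bulk upsert call and its arguments are an assumption to verify against the client’s documentation, and a real restore would also have to contend with lookups, validation rules, triggers, and duplicate rules:

```python
import csv

def restore_from_csv(sf, object_name, backup_file):
    """Upsert records from a backup CSV back into Salesforce, matching on Id."""
    with open(backup_file, newline="") as f:
        records = list(csv.DictReader(f))

    # Bulk upsert keyed on Id: rows whose Id still exists are updated in place.
    # Hard-deleted rows can't keep their old Ids; they must be re-inserted and
    # will receive new Ids, which is part of what makes relationship restores
    # so difficult (see question 3 below).
    results = getattr(sf.bulk, object_name).upsert(records, "Id")

    failures = [r for r in results if not r.get("success")]
    print(f"Restored {len(records) - len(failures)} of {len(records)} records; "
          f"{len(failures)} failed.")
    return failures
```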


The best way to ensure a data protection solution will match your defined RTO is to test various data loss and corruption scenarios in your sandbox.

3. Does the tool allow you to recover data from every point in time precisely like before?

One of the primary ways to ensure data integrity remains intact is to back up all data, metadata, and attachments daily, or even more frequently. That way, you’ll preserve as much of your information as possible. Most in-house backup methods are not comprehensive, meaning you’ll have to pick and choose objects and fields to back up rather than simply backing up everything. Can you predict the exact data you’ll need to restore next week, month, or year? No one can, so it’s best practice to back everything up.

You’ll also need the ability to compare today’s data to historical data quickly. Without compare functionality, it’s challenging to tell correct changes from incorrect ones, and without granular comparison capabilities, your custom-built tool will likely overwrite unaffected data.
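As an illustration of what “compare” means in practice, here’s a hypothetical sketch that diffs two CSV snapshots of the same object (the file names are made up), so you restore only the records that actually changed or disappeared instead of overwriting everything:

```python
import csv

def load_snapshot(path):
    """Load a CSV backup into a dict keyed by record Id."""
    with open(path, newline="") as f:
        return {row["Id"]: row for row in csv.DictReader(f)}

def diff_snapshots(old_path, new_path):
    """Return the pre-change versions of records that were deleted or modified."""
    old, new = load_snapshot(old_path), load_snapshot(new_path)
    deleted = [old[rid] for rid in old if rid not in new]
    changed = [old[rid] for rid in old if rid in new and new[rid] != old[rid]]
    return deleted, changed

deleted, changed = diff_snapshots("Account_20240101.csv", "Account_20240102.csv")
print(f"{len(deleted)} records deleted and {len(changed)} changed since the last backup.")
```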

Furthermore, maintaining parent-child relationships during restoration can be challenging for a custom-built tool, because such tools typically don’t take a holistic backup. A cascade delete often follows deletion or corruption in Salesforce, which means the omission of a single parent object (such as an account) could also corrupt, orphan, or remove all of the child records related to it. And because you cannot set a record’s ID via the Salesforce API, and data relationships are based on record IDs, restoring relationships can be very complicated.
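To see why, consider this hypothetical sketch of restoring deleted parent accounts and their child contacts with the same simple-salesforce client. Because the re-inserted parents come back with new Ids, every child’s lookup field has to be remapped before the children can be restored; the assumption that bulk results come back in input order is something you’d have to verify:

```python
def restore_parents_and_children(sf, parents, children, lookup_field="AccountId"):
    """Re-insert deleted Accounts, then point each Contact's lookup at the new parent Id."""
    old_ids = [p["Id"] for p in parents]
    payload = [{k: v for k, v in p.items() if k != "Id"} for p in parents]  # Ids can't be set on insert

    # Assumption: bulk insert results are returned in the same order as the input records.
    results = sf.bulk.Account.insert(payload)
    id_map = {old: res["id"] for old, res in zip(old_ids, results) if res.get("success")}

    remapped = []
    for child in children:
        new_parent = id_map.get(child.get(lookup_field))
        if new_parent:
            child = dict(child, **{lookup_field: new_parent})
        remapped.append({k: v for k, v in child.items() if k != "Id"})
    return sf.bulk.Contact.insert(remapped)
```

And that only covers one parent-child pair; a real cascade delete can span dozens of related objects.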

4. Is the tool secure enough?

Unless you’re a cybersecurity firm, you may not be able to count on an in-house backup tool’s security. Securing data backups is crucial not only for your confidential data, but also for your customers’ personally identifiable information (PII). Wrongful access to this information could incur regulatory fines for non-compliance with GDPR, CCPA, and other regulations that require data privacy measures to be in place. Furthermore, most industry regulations, like HIPAA, SEC 17a-4, and CFR Part 11, require backups to be in place.

If you attempt to build your own tool, make sure that you’re securing your backups with:

  • Encryption in transit and at rest to protect data at all times (a minimal sketch follows this list),
  • Role-based Access Controls (RBAC) to restrict who has access to backups,
  • IP whitelisting to control which networks can reach your backups,
  • Two-factor authentication to ensure only authorized users have access, and
  • Single sign-on (SSO) to reduce the number of attack surfaces hackers could exploit.
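As a minimal illustration of the first item, the sketch below encrypts a backup file at rest using the Python cryptography package’s Fernet recipe. Key management (keeping the key in a vault or KMS rather than next to the backups) is the hard part a real tool still has to solve:

```python
from cryptography.fernet import Fernet  # third-party package: pip install cryptography

# Generate the key once and store it in a secrets manager; never keep it alongside the backups.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt the CSV backup at rest.
with open("Account_backup.csv", "rb") as f:
    ciphertext = fernet.encrypt(f.read())
with open("Account_backup.csv.enc", "wb") as f:
    f.write(ciphertext)

# Decrypt only when a restore is actually needed.
with open("Account_backup.csv.enc", "rb") as f:
    plaintext = fernet.decrypt(f.read())
```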

5. Can you count on the tool when you need it most?

You deserve automated and dynamic backups, proactive monitoring, and world-class support to ensure backup and recovery are as stress-free as possible. It’s almost impossible to achieve all of this when you build your own backup tool. It would require you to:

  • Implement auto-discovery to support custom objects (sketched after this list),
  • Keep up with new API versions while supporting metadata, attachments, table data, Chatter, and custom objects,
  • Implement a notification system for when a backup fails or finishes with errors,
  • Troubleshoot backup failures and errors,
  • Manage backup runtimes and API consumption, and
  • Keep all the related code up to date.
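The first and third items alone are non-trivial. Here’s a hypothetical sketch, again using the simple-salesforce client, that auto-discovers which objects exist in the org (including new custom objects) and flags any export that fails; wiring the alert into email or Slack would be yet more code to own:

```python
def discover_backupable_objects(sf):
    """List every queryable object in the org so new custom objects are picked up automatically."""
    describe = sf.describe()  # global describe of all objects
    return [obj["name"] for obj in describe["sobjects"] if obj.get("queryable")]

def backup_all(sf):
    failures = []
    for object_name in discover_backupable_objects(sf):
        try:
            # A real tool would derive each object's field list from its own describe call;
            # export_object is the helper sketched earlier.
            export_object(sf, object_name, ["Id", "Name"])
        except Exception as exc:
            failures.append((object_name, str(exc)))
    if failures:
        # Placeholder notification; a real tool would page or email an admin here.
        print(f"Backup finished with {len(failures)} errors: {failures}")
```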

6. Does the tool enable you to provide the necessary data accessibility?

What happens when your Salesforce data is temporarily unavailable? Your company must still have access to its information through a user-friendly, controlled interface outside of Salesforce. Backing the data up to an external database alone is not the answer. Yes, the data is accessible outside of Salesforce, but only a handful of people will have access to that database, and granting access to everyone who might need the data poses a significant security risk.

Data availability is also an essential requirement of GDPR and CCPA. In support of these regulatory requirements, you need to give Data Subjects full transparency into the data you have about them, including backed-up data. To do this, your in-house tool needs search functionality that can find where a Data Subject’s information is located, including within attachments. Under GDPR and CCPA, you’ll also need to incorporate retention controls so data is stored only as long as legally necessary.
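To give a sense of what that entails, here’s a hypothetical sketch that searches every backup CSV in a folder for a Data Subject’s email address and purges backup files older than a retention window. A real implementation would also need to search inside attachments and document every deletion:

```python
import csv
import os
import time

def find_data_subject(backup_dir, email):
    """Report every backup file and record Id that contains a Data Subject's email address."""
    hits = []
    for name in os.listdir(backup_dir):
        if not name.endswith(".csv"):
            continue
        with open(os.path.join(backup_dir, name), newline="") as f:
            for row in csv.DictReader(f):
                if email in row.values():
                    hits.append((name, row.get("Id")))
    return hits

def purge_expired_backups(backup_dir, retention_days=365):
    """Delete backup files older than the legally required retention window."""
    cutoff = time.time() - retention_days * 86400
    for name in os.listdir(backup_dir):
        path = os.path.join(backup_dir, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
```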

How did your in-house backup tool fare against the above test? Are you ready to try a better solution?

Eliminate Data Downtime with OwnBackup Data Protection

OwnBackup has been on the Salesforce AppExchange since 2012 and has over 300 five-star reviews.

With 10+ years of development behind it, our solution is continuously enhanced to keep pace with Salesforce’s constant advancements, making it enterprise-ready for your specific data protection needs.

Request a Demo