5 practical tips for putting backup data in the public cloud
Here's how you can adjust your data management strategies for the hybrid cloud future.


Overwhelmingly, organizations of all shapes and sizes are moving at least some of their workloads to the cloud. As cloud adoption continues, I predict more and more organizations will take a hybrid approach to the public cloud, choosing to keep some key services on-premises, migrate some to the public cloud, and spread others across on-premises and the public cloud.

When it comes to backup data, the public cloud – with its nearly unlimited capacity, support for multiple accounts, and geographic distribution – is a natural choice. However, as organizations adopt this approach, they often fail to adjust their data management and data protection strategies accordingly.

In this post, I’ll break down five ways to better manage and protect your backup data in the public cloud.

1. Use different accounts for the backup data

You don’t want other cloud work to interfere with your backup infrastructure. For instance, if developers are testing a configuration of cloud services, they might delete all of the components when they’re finished – including the S3 or Blob resources associated with backup data.

Avoid this kind of disaster by keeping backups in their own dedicated accounts. With separate accounts, a developer cannot acquire administrative access to the storage accounts managed by the backup administrator.
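
If you're on AWS, one way this separation can be enforced is sketched below: a bucket policy in the dedicated backup account that denies deletion to everyone except a backup-admin role. The account ID, role, and bucket names are hypothetical placeholders, and your cloud provider and backup software may offer their own controls for this.

```python
# A minimal sketch (assuming AWS + boto3) of locking an S3 bucket in a
# dedicated backup account so only a backup-admin role can delete anything.
# The account ID, role name, and bucket name are hypothetical placeholders.
import json
import boto3

BACKUP_BUCKET = "example-backup-bucket"                             # hypothetical
BACKUP_ADMIN_ROLE = "arn:aws:iam::111111111111:role/backup-admin"   # hypothetical

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Deny deletion to every principal except the backup-admin role.
            "Sid": "DenyDeleteExceptBackupAdmin",
            "Effect": "Deny",
            "Principal": "*",
            "Action": ["s3:DeleteObject", "s3:DeleteBucket"],
            "Resource": [
                f"arn:aws:s3:::{BACKUP_BUCKET}",
                f"arn:aws:s3:::{BACKUP_BUCKET}/*",
            ],
            "Condition": {
                "StringNotEquals": {"aws:PrincipalArn": BACKUP_ADMIN_ROLE}
            },
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=BACKUP_BUCKET, Policy=json.dumps(policy))
```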

2. Get policies and retention right

Too many times in the on-premises world, retention policies are set around the capacity of the underlying backup target (NAS, tape, appliances, SAN, etc.). This isn’t necessarily wrong, but as environments change and data grows, organizations either over-purchase up front or under-serve their actual needs because the allocated capacity is fixed.

Imagine an on-premises data center where retention is set by what the storage can hold. If you have 10 TB of secondary storage for backups and it can only hold one month of the data you care about, you have found your limit.

In the public cloud, however, storage is effectively infinite. From a compliance perspective, its elastic, scale-out nature lets you retain data for a year or longer, whatever your requirements call for. Organizations therefore need a different approach to retention policies, one built around that elasticity, to meet the governance and compliance requirements of HIPAA, PCI DSS, GDPR, and more.

Ask the application teams and stakeholders about their expectations for retention. By comparing them to the on-premises capabilities, you might identify a gap.
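
Once you have agreed on a retention target with those stakeholders, it can be expressed as policy rather than as raw capacity. The sketch below (assuming AWS and boto3) applies a lifecycle rule that tiers backups to colder storage after 30 days and expires them after a year; the bucket name and the 30/365-day values are illustrative assumptions, not recommendations.

```python
# A minimal sketch (assuming AWS + boto3) of expressing retention as policy:
# keep backup objects for a year, tiering them to colder storage after 30 days.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",   # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "backup-retention-one-year",
                "Filter": {"Prefix": "backups/"},
                "Status": "Enabled",
                # Move aging backups to a cheaper storage class...
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                # ...and expire them once the agreed retention window passes.
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```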

3. Do your research on ingress and egress

It’s important to know how data is sent to and retrieved from the public cloud. Is it moved, or is it copied?

For example, restoring 1 KB of data from two years ago out of a 5 TB backup file sitting on an on-premises storage system is not a big issue. However, if you have to retrieve that 5 TB of backup data from the cloud, you’ll incur egress charges from the cloud provider.
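
A quick back-of-the-envelope calculation makes the difference concrete. The sketch below assumes an illustrative egress rate of $0.09 per GB; check your provider's current price list for real numbers.

```python
# A back-of-the-envelope sketch of what a large restore could cost in egress
# fees. The $0.09/GB rate is only an assumed example figure, not a quote.
def egress_cost(data_gb: float, price_per_gb: float = 0.09) -> float:
    """Estimate the data-transfer-out charge for a restore."""
    return data_gb * price_per_gb

# Restoring a full 5 TB backup set vs. pulling back a single small file:
print(f"5 TB restore: ~${egress_cost(5 * 1024):,.2f}")   # roughly $460 at the assumed rate
print(f"1 GB restore: ~${egress_cost(1):,.2f}")
```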

Research how specific backup software products write to the cloud and, more importantly, how they retrieve from it. This is where you discover whether a product charges extra to use the public cloud or uses it inefficiently. You don’t want to be surprised later.

4. Price matters, choose wisely

Different regions charge different prices for the same service. For object storage, it may make sense to use a different region – even one far away – for cost savings. Besides, this storage is meant to be a capacity tier, not the tier you use for day-to-day operational restores.

Consider the governance and compliance of the data in question as well. It may not be worth sending your data out of the country to save 2% if that means introducing a compliance issue with how the data is being handled.

One thing to remember if you’re simply trying to find the best price for a cloud service is that countries and regions are being added to the global offering list at an astounding rate. You never know when a new, closer one might pop up. (AWS US East (Ohio), also known as us-east-2, was added right in my backyard in 2016!)

When new cloud regions come online, they may bring a different cost profile as well as lower latency and better performance. Check your cloud provider’s latest region map before making a decision; Azure and AWS, for instance, keep running lists on their sites.
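
When you do compare regions, factor in the compliance boundary before the price. The sketch below uses hypothetical per-GB prices and an assumed list of regions your governance rules allow; substitute figures from your provider's pricing page.

```python
# A sketch of weighing price against compliance when picking a region.
# The per-GB prices and region lists are hypothetical placeholders.
MONTHLY_PRICE_PER_GB = {          # hypothetical object-storage prices
    "us-east-1": 0.023,
    "us-east-2": 0.023,
    "eu-west-1": 0.024,
    "ap-south-1": 0.025,
}

# Regions your governance rules allow backup data to live in (assumed).
COMPLIANT_REGIONS = {"us-east-1", "us-east-2"}

candidates = {r: p for r, p in MONTHLY_PRICE_PER_GB.items() if r in COMPLIANT_REGIONS}
cheapest = min(candidates, key=candidates.get)
print(f"Cheapest compliant region: {cheapest} at ${candidates[cheapest]}/GB-month")
```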

5. Encrypt

The backup data in the cloud is yours. Make sure it stays that way. Use backup software encryption to ensure the confidentiality of your data in the cloud.
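
If your backup software doesn’t already encrypt data before it leaves your site, a client-side pass can look something like the sketch below, which uses the Python cryptography package’s Fernet recipe. The file name is a hypothetical example, and the key should be kept somewhere the cloud account storing the data can’t reach.

```python
# A minimal sketch of encrypting a backup file client-side before it ever
# leaves for the cloud. Assumption: your backup software may handle this
# for you; this is only an illustration of the principle.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # store this securely, e.g. in an on-prem vault
cipher = Fernet(key)

with open("backup-2024-01.vbk", "rb") as f:      # hypothetical backup file name
    ciphertext = cipher.encrypt(f.read())

with open("backup-2024-01.vbk.enc", "wb") as f:
    f.write(ciphertext)
# Upload backup-2024-01.vbk.enc to object storage; only the key holder can read it.
```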

Bonus tip: Integrate with the public cloud

Cloud gateway technologies were once attractive as a quick way to add capacity or an off-site element to existing practices. While they may have made sense 10 years ago, organizations today should consider a different approach.

Many services are now cloud-native, and protecting and storing their data should feel native alongside other cloud services. Native object storage support, down to the metadata, and seamless backup software integration are now table stakes. A solution that moves backup data into, and out of, the cloud in an integrated, efficient way is the right choice for today and tomorrow.
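
To make the difference tangible, the sketch below (assuming AWS and boto3) writes a backup object with its own metadata and storage class, rather than dumping a file onto a bucket treated as a dumb share. The bucket, key, and metadata values are hypothetical examples.

```python
# A sketch of what "native" object storage integration looks like at the API
# level: a backup object written with searchable metadata and an appropriate
# storage class. All names and values are hypothetical examples.
import boto3

s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-backup-bucket",
    Key="backups/sql01/2024-01-15-full.vbk",
    Body=open("2024-01-15-full.vbk", "rb"),     # hypothetical local backup file
    StorageClass="STANDARD_IA",                  # a capacity tier, not hot storage
    Metadata={                                   # metadata travels with the object
        "source-host": "sql01",
        "backup-type": "full",
        "retention-days": "365",
    },
)
```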

Before you leverage the public cloud for backups

The public cloud is the perfect place for storing your backup data. But don’t just jump into it. Take certain precautions to ensure a smooth transition.

These five tips are a good place to start.

About the author

Rick Vanover (Cisco Champion, vExpert) is Senior Director of Product Strategy for Veeam Software. Rick’s IT experience includes system administration and IT management, with virtualization being the central theme of his career recently. Follow Rick on Twitter @RickVanover or @Veeam.