In this webinar, we’ll dive deep into the specifics of each component of a backup policy, providing you with a clear roadmap for creating a resilient backup strategy.
Size: 29.77 MB
Language: en
Added: Jul 11, 2024
Slides: 23 pages
Slide Content
Setting Up Backup Policies for Success
Speakers:
- Kurt Abrahams, CMO, MSP360
- Dmitry Evdokimov, Manager, Technical Sales and Services
- Andrey Anushin, Senior Onboarding Engineer
MSP360 At-A-Glance: Backup, RMM, Connect
Agenda
- Identify Critical Data
- Define Backup Objectives
- Choose the Right Backup Solutions
- Implement and Enforce Policies
- Monitor and Test
Goals
The main goal is to configure backups that have:
- A cost-effective storage footprint
- An optimal number of restore points
- A fast restore in case of disaster
Identify Critical Data
When coming up with a plan to secure machines with backups, answer the following questions:
- What is the endpoint's role?
- What kind of data needs to be protected?
- How will the restore be performed in case of a disaster?
Identify Critical Data
Types of data to protect:
- Files: file backup; SQL / Exchange database backup
- System: image-based backup; VM backup
- Cloud data: Google Workspace / Microsoft 365
Create the Backup Strategy

Workstation:
- File backup: %USERPROFILE% user folders
- Image-based: system partitions; exclude user data (%USERPROFILE%\Desktop, etc.)

File server:
- File backup: network shares stored on this computer
- Image-based: system-required / boot partitions; exclude shared folders

Domain controller:
- File backup: no
- Image-based: all partitions, or system-required only

NAS devices:
- File backup: NAS network shared folders
- Image-based: no
Create the Backup Strategy
Image-based backups are used to restore to bare metal, or as a VM, with the operating system and all applications. File backup is a faster way to restore individual files; they become available as the restore progresses.
Key points for choosing the right strategy:
- If the PC has user data, consider setting up file-level backups
- If the entire system should be protected, configure an image-based backup
- If the PC has a small amount of data, using only image-based backups might make more sense
- If the PC has a large amount of data (1+ TB), a combination of image-based and file-level backups is recommended
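The decision points above can be sketched as a small helper function. This is an illustrative sketch only; the function name, parameters, and the 1 TB threshold mirror the slide's rules, not any MSP360 API:

```python
def choose_backup_plans(has_user_data: bool, protect_full_system: bool,
                        data_size_tb: float) -> list[str]:
    """Pick backup plan types per the slide's rules (thresholds illustrative)."""
    plans = []
    if protect_full_system:
        plans.append("image-based")       # whole-system / bare-metal recovery
    if has_user_data:
        plans.append("file-level")        # fast restore of individual files
    if data_size_tb >= 1:
        # Large datasets: combine both so individual files restore quickly
        # while the image still covers bare-metal recovery.
        for plan in ("image-based", "file-level"):
            if plan not in plans:
                plans.append(plan)
    return plans
```

For example, a 2 TB workstation with user data would get both plan types, while a small appliance needing only system protection would get just an image-based plan.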
RTO and RPO
- RTO (Recovery Time Objective): the maximum amount of downtime your organization can tolerate before experiencing severe consequences. RTO depends mostly on connection speed, the size of the data being restored, and the target disk speed.
- RPO (Recovery Point Objective): the maximum acceptable amount of data loss before encountering significant disruptions. RPO is defined by the frequency of your backup schedule; however, more frequent backups mean more storage consumption.
MSP360 offers several ways to reduce the backup footprint:
- Built-in compression (LZ4); efficiency depends on the file types, with an optimal compression-to-performance ratio
- Built-in deduplication
- Advanced filters to exclude files and folders from backups
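A rough restore-time estimate follows directly from the RTO factors the slide lists. This sketch assumes the slowest of the network link and target disk is the bottleneck, and ignores overhead such as decompression and API latency (the function and its parameters are hypothetical, not from MSP360):

```python
def estimate_restore_hours(data_gb: float, link_mbps: float,
                           disk_mbps: float) -> float:
    """Rough restore time from data size, connection speed, and disk speed."""
    bottleneck_mbps = min(link_mbps, disk_mbps)  # slowest stage dominates
    data_megabits = data_gb * 1000 * 8           # GB -> megabits
    return data_megabits / bottleneck_mbps / 3600
```

For instance, restoring 500 GB over a 1 Gbps link to a faster disk works out to roughly 1.1 hours, which you can compare against the RTO your organization can tolerate.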
Schedule and Retention Planning
Depending on the role of the source PC, the data that needs to be backed up will have the following parameters:
- Initial size
- Data change pattern (and size)
- Access method (e.g. databases need to be dumped before backup)
To set up a valid schedule and retention policy for the data to back up, the following approaches can be used:
- Create several backup plans with different scheduling and retention policies
- Set the right frequency for running these backups and chain them to run in sequence
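Chaining plans to run in sequence, as described above, can be modeled as a simple linked list of plans. This is a hypothetical sketch (the `BackupPlan` type and field names are invented for illustration, not MSP360's data model); the example chains a database dump ahead of the file backup, matching the note that databases need to be dumped first:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BackupPlan:
    name: str
    schedule: str                             # e.g. "daily", "weekly"
    retention_days: int
    next_plan: Optional["BackupPlan"] = None  # chained plan runs after this one

def run_chain(plan: Optional[BackupPlan]) -> list[str]:
    """Return the execution order of a chained sequence of plans."""
    order = []
    while plan is not None:
        order.append(plan.name)
        plan = plan.next_plan
    return order
```

Chaining guarantees ordering (dump before backup) without having to guess at safe time offsets between independently scheduled plans.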
Different Backup Modes for Success
Forever forward incremental:
- Best for cost efficiency (saves up to 50% of storage space)
- For short-term retention (up to 3 months)
Recurring incrementals with periodic fulls:
- Suitable for all cases
- Fast backups and restores
Grandfather-Father-Son (GFS):
- Designed for long-term backups
- Supports native object lock (immutability)
Backup Retention
- Storage cost depends on the backup retention
- Different cases require different retention policies:
  - Accidental deletion: 2 weeks
  - Ransomware: 3 months
  - Compliance: up to 7 years
- Use object lock (immutability)
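When a machine needs protection against several of these scenarios at once, the retention policy must satisfy the longest requirement. A minimal sketch, using the slide's periods (the mapping and function names are illustrative):

```python
# Slide's protection goals mapped to retention periods (in days).
RETENTION_DAYS = {
    "accidental_deletion": 14,   # 2 weeks
    "ransomware": 90,            # 3 months
    "compliance": 365 * 7,       # up to 7 years
}

def required_retention_days(goals: list[str]) -> int:
    """Keep backups long enough to satisfy every selected goal."""
    return max(RETENTION_DAYS[g] for g in goals)
```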
Retention Policy Settings
Object lock (immutability) is a feature that locks backup datasets for the period specified by the GFS retention policy. Within this period, backup data cannot be removed or modified.
Object lock (immutability) is supported for the following storage providers:
- Amazon S3
- Wasabi
- Backblaze B2
How Does It Work? Regular Schedule with GFS
- Backup schedule: full backup every month, incremental backups weekly
- Retention period: keep backups for 1 month
- GFS: keep 6 monthly full backups
- Immutability: enabled
(The slide illustrates this schedule on a 13-week timeline.)
How Does It Work? Forever Forward Incremental (FFI)
- Storage: AWS S3 Standard / Azure Hot / Backblaze B2
- Backup schedule: every day
- Retention period: 8 days
(The slide illustrates this schedule on a two-week calendar.)
How Does It Work? FFI with Intelligent Retention (IR)
- Storage: MSP360 Wasabi (minimum storage retention period: 30 days)
- Backup schedule: every day
- Retention period: 30 days
- The number of restore points varies between the 30 required and 60 (retention policy plus minimum storage retention period)
(The slide illustrates this schedule on a 14-week timeline.)
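The 30-to-60 range above follows from combining the two periods. A minimal sketch of that arithmetic, assuming expired restore points can linger until the provider's minimum storage period elapses (to avoid early-deletion fees); the function is illustrative, not an MSP360 API:

```python
def ffi_restore_point_range(retention_days: int,
                            min_storage_days: int) -> tuple[int, int]:
    """Bounds on retained restore points for daily FFI backups.

    Lower bound: the retention policy's required points are always kept.
    Upper bound: points past retention may remain stored until the
    storage tier's minimum retention period has elapsed.
    """
    return retention_days, retention_days + min_storage_days
```

With a 30-day retention policy on storage enforcing a 30-day minimum, this gives the slide's range of 30 to 60 restore points.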
What's Next
We have defined:
- What to back up (and which filters to apply)
- How often to back up
- How long to store the backups
Next steps:
- Configure reports
- Run periodic test restores
- Enjoy
Reports and Notifications
- Triggered notifications: sent upon backup plan completion
- Scheduled: group report for all your computers, backup status, storage usage
- Live / on-demand: backup history, Capacity Report, Storage Usage, license usage
Summary
- Define the type of machine
- Define what should be backed up
- Separate image-based and file backups
- Define RTO and RPO
- Exclude data you don't need from the backups
- Configure the backup schedule accordingly
- Select the type of backup
- Define what you are protecting against
- Use object lock
- Configure the reports you need to receive
Tips for Successful Backup Management
- Always test your backups
- Develop your documentation
- Enable email or push notifications
- Use lifecycle rules where applicable
- Periodically review the settings of your plans
- Always use the latest version of the agent