Monday, December 5, 2022

04 - Cloud Backup - Duplicati - Backup Jobs - Setup & Configuration - Part 4 of X


Intro:

Ok so after all our hard work to set up the Wasabi backend and email account(s), we are finally ready to create our backup jobs: one for the user files on the computer and the other to back up the Duplicati database.

In the event that the computer you are restoring has a failed hard drive and the Duplicati database file is missing, the restore process will take significantly longer since Duplicati needs to rebuild the database from the files it backed up to Wasabi. This is especially noticeable on backups over 100gb.

If we have the database handy we can spin up a virtual machine, load Duplicati quickly from the PowerShell script, drop in the existing database, download the encrypted backup files from Wasabi, and do a restore. I'd only suggest going this route if the backup is overly large, but it's better to have the database than not.

You cannot include the database as part of the same backup job. The reason for this is simple. Let's say you run a backup of all files in "C:\Users\". As Duplicati does the backup it is writing to its database. If you include the database file along with the files in "C:\Users" you'll have backed up an incomplete database, since Duplicati won't finish writing to the database until the backup job is done.

We will be using the example computer name we used in the previous 3 articles, "duplicati-desktop-488gd38".

We're also going to be setting up an email just for the reporting side of Duplicati. This email will receive job reports whenever the backup(s) run. In the future I may combine this with the original email we created for use with Wasabi and just use one email address per cx location. In the end, using Thunderbird, you can create a filter and automatically move XYZ email into XYZ folder, so that option exists.

For now just create a new email with the following format: "s3-backups.cxSiteLocation". Using our previous example, John's Seafood Shack located at 1500 Green Road, we can create an email as follows:
  • s3-backups.jss1500@xyz.com
  • Generate a 20 character password and save it.



Initial Setup of Duplicati:

1. On the machine you just did all this configuration on, pull up Chrome, Firefox, or Edge 🙄 and navigate to http://127.0.0.1:8200/. (There's no place like 🏡). If this is the first time opening this link you'll see the "First Run Setup" box shown below. Click, "Yes" to set up a password for the interface.


2. Enable, set, and/or verify the following settings:

Access to User Interface:
  • Enable and set a password
  • Enable Allow Remote Access (restart service to apply setting)
    • Hostnames:
      • Leave blank if accessing pc directly
      • "*" if you want to access Duplicati's web gui from another pc
      • "pcname.vertigoisabitch.com" is accessing from web
  • Enable Prevent tray icon automatic log-in
Pause After Startup or Hibernation:
  • Zero Seconds (Use this to delay start on systems that are slow to boot)
User Interface Settings:
  • Display and color theme = The dark theme (by Michal)
Donation Messages:
  • Show or hide
Update Channel:
  • Default (beta)
Usage Statistics:
  • Anonymous usage reports = System default (Information) or Usage statistics, warnings, errors, and crashes
Default Options (edit as text):

--accept-any-ssl-certificate=true
--snapshot-policy=Required

When done press, "Ok" at the bottom and login again.




Backup Job 1 of 2:

Add Backup:

1. Once logged back in, the interface will be blank as seen below:


2. Click, "Add Backup".


3. Select, "Configure a new backup" then click, "Next".


General Backup Settings (Part 1 of 5):


1. Input the following information into the fields as seen below:
  • Name = duplicati-desktop-488gd38 - Users Folder
  • Description = Backup of Users folder to Wasabi s3. Job started on (month.day.year), e.g. 10.11.22
  • Encryption = AES-256 encryption, built in
  • Passphrase = Press the generate button a few times, then copy and save the password. This password is for this job only and is needed for restoring the files in this job. Don't lose it or the backup is worthless!
Click, "Next" when done.



Backup Destination (Part 2 of 5):

1. On this page the backup destination is set up. Enter the information exactly as seen below:

Backup Destination:
  • Storage Type = s3 Compatible
  • Enable Use SSL
  • Server = Wasabi Hot Storage (s3.wasabisys.com)
  • Bucket Name = duplicati-desktop-488gd38
  • Bucket Create Region = (default) ()
  • Storage Class = (default) ()
  • Folder Path = duplicati
  • AWS Access ID = This is your Key that was created in Wasabi during user creation
  • AWS Access Key = This is your Access Key that was created in Wasabi during user creation
  • Client Library to Use = Amazon AWS SDK
Just to be clear, the "Access ID" and "Access Key" referenced above are the Wasabi sub-account user's keys. In this case they are the keys for the user named "duplicati-desktop-488gd38" that were generated at the end of user creation in the sub-account Wasabi console, not the WACM console.

The AWS Access ID is the user's Key in Wasabi and is 20 characters long.
The AWS Access Key is the user's Access Key in Wasabi and is 40 characters long.

Also the "Server" mentioned above must match the region you created the bucket on in the Wasabi sub-account console. In our case we created it in the "Wasabi US East 1 (N. Virginia)" region. Use, "Custom Server URL ()" in the drop down to key in another region.

"Folder" is the actual folder inside of the bucket. In our case there are two, "duplicati" and "duplicati-database". We'll create the latter in the next backup job.


The Wasabi Management Console is always reached at console.wasabisys.com (regardless of which region you are using)
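
For what it's worth, Duplicati combines all of the destination fields above into a single destination URL (you'll see it if you export the job as a command line later). From memory it looks roughly like the line below; the exact parameter names can vary by Duplicati version and the key values are obviously placeholders, so treat this as a sketch rather than something to paste in:

s3://duplicati-desktop-488gd38/duplicati?s3-server-name=s3.wasabisys.com&use-ssl=true&aws-access-key-id=YOUR-20-CHAR-KEY&aws-secret-access-key=YOUR-40-CHAR-SECRET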


2. Scroll down to the bottom of the page to find, "Advanced Options". Click on the word, "Advanced Options" in green. It should drop down like below.


3. Click on the Hamburger icon on the right in light blue and click, "Edit as text".


4. Paste the following into the, "Advanced Options" box.

--accept-any-ssl-certificate=true
--s3-ext-forcepathstyle=true
--s3-ext-signatureversion=2

5. When done click, "Test Connection". It will ask you to prepend the bucket name with your username. Click, "No".

If everything went right it will say, "Connection Worked!" within 20 seconds or so.


*If the page hangs for more than 30-40 seconds and doesn't say anything, something is wrong. Wait until the page fails and comes back with an error. Don't refresh it to try and get the prompts back. Go back and make sure everything is selected and entered correctly.


Source Data (Part 3 of 5):

1. We're going to back up the Windows Users folder in the first backup. In Windows 10 the location is as follows: "C:\Users". In some instances this folder will be larger than 100gb. You can divide it up into multiple backup jobs, which will make for faster restores.


2. Use the folder tree to either navigate to "C:\Users\" or use the Hamburger icon on the right in light blue to manually add, "C:\Users\".


or


Below is what the Advanced Editor looks like:


3. Next we need to exclude Temporary Files and System Files. Click on the drop down for, "Exclude" in green near the bottom of the page.


4. Select, "Hidden Files" and "System Files" then click, "Next".



Schedule (Part 4 of 5):

The schedule is going to run our backup automatically.

1. Verify that, "Automatically Run Backups" is enabled.

2. Change the runtime to a time when the client will not be in the office. I generally like to stagger them between 12:00am and 6:00am. For locations like bars or clubs I'll bump it to between 3:00am and 9:00am.

3. Verify that every day is selected. If the client shows up on a weekend to catch up on work, you don't want to miss that backup. If the location is clearly closed on certain days then you can omit those days.

4. You can also set the repeat interval to minutes, hours, days, weeks, months, or years.

5. Click, "Next" when done.



Options (Part 5 of 5):

1. Under General Options set the, "Remote Volume Size" to a higher number such as 100-200mb if the client has a fast upload on their end; otherwise leave it at 50mb. Keep in mind a smaller remote volume size will ultimately result in a larger number of files on the Wasabi backend.
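
Rough math: a 100gb backup at the default 50mb volume size ends up as roughly 2,000 dblock files sitting in the bucket (102,400mb ÷ 50mb), versus roughly 500 files at 200mb.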

2. I am currently using, "Keep All Backups" for sites with less than 100gb per computer. This can also double as an audit tool: recall a given day's backup to see what employee XYZ did that day. Essentially it's versioning at that point, just retained longer than most cloud providers offer.

At the end of the year I dump the encrypted files to an external hard drive and hand it to the customer. It's up to them to safeguard it from here on out. They receive the decryption key and a copy of Duplicati is installed to recall the files. Then we start fresh on Wasabi for the new year.

3. Ok so we're going to try something different here for this massive 100gb+ backup for a client. We're going to run a custom backup retention consisting of saving one backup a day for a month (30 copies), then once a week for a month (4 copies), and once a month for 3 years (36 copies).

Select, "Custom Backup Retention" and enter:

1M:1D,1M:1W,3Y:1M
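
If you'd rather set this as text under Advanced Options instead of using the drop down, I believe the equivalent entry is the single line below (same value, just written as an option):

--retention-policy=1M:1D,1M:1W,3Y:1M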


4. Click on, "Advanced Options" below in green to expand it. Click on the Hamburger icon in light blue to the right, then click on, "Edit As Text".



5. Copy the following into the text block, update the email address and credentials to match yours, then click, "Save" when done:

--accept-any-ssl-certificate=true
--snapshot-policy=Required
--send-mail-url=smtp://smtp.dynu.com:587/?starttls=when-available
--send-mail-any-operation=true
--send-mail-subject=%OPERATIONNAME% - %backup-name% - %PARSEDRESULT%
--send-mail-to=s3-backups.jss1500@xyz.com
--send-mail-username=s3-backups.jss1500@xyz.com
--send-mail-password=Random 20 character password
--send-mail-from=s3-backups.jss1500@xyz.com
The above settings will send an email every time a backup runs, completes, fails, has problems, etc. The email will be sent from "s3-backups.jss1500@xyz.com" and be delivered to itself.
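
For reference, with those placeholders the subject line of a successful nightly run comes out looking something like the example below (the exact operation and result text depends on what Duplicati reports):

Backup - duplicati-desktop-488gd38 - Users Folder - Success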

Add this email address into Thunderbird. This way we can create a filter later to automatically move the emails into their own folder named after the computer name. If we have multiple computers being backed up at this location, all emails relating to backups will come into this mailbox.


6. Make sure you copied the Duplicati job AES passphrase from Part 1. This is what the following warning is referring to:


7. Once the setup of the backup job is complete you'll be dumped back at the main screen. If the backup is on a schedule, you're done at this point with the exception of backing up your jobs which is covered in the next step. Since we installed Duplicati as a Windows service, no one needs to be logged into the machine to run the backup. You can come back to http://127.0.0.1:8200/ periodically to see the results or current job running on the main screen.
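
If you ever want to confirm the service side of things is alive without opening the web GUI, a quick PowerShell check works. I'm assuming the service is registered under the name "Duplicati" as it was set up in the earlier install article; adjust the name if yours differs:

# Check that the Duplicati Windows service exists and is running
Get-Service -Name "Duplicati" | Select-Object Name, Status, StartType

# Start it back up if it somehow stopped
Start-Service -Name "Duplicati"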



Export Your Backup Job(s) to JSON File:

1. From the main screen click on the name of the backup in blue and a drop down menu with more options will appear:



2. Under "Configuration" click on "Export".


3. Make sure "To File" and "Export Passwords" is selected and "Encrypt File" is disabled then click on "Export".


4. Click "Yes, I understand the risk" and your download will start.


5. Above is what the file name will look like for the backup. It's a plain text file and not encrypted, so protect it as it may contain sensitive passwords and/or other credentials.



Backup Job 2 of 2:
  • Create another backup job. Only this time you're going to back up the Duplicati database (there's a quick PowerShell check after this list to confirm the folder contents). Its location is as follows:
    • "C:\ProgramData\Duplicati\"
  • In step 1 give the backup the following name (be sure to replace desktop-488gd38 with the computer's real name):
    • "duplicati-desktop-488gd38 - Duplicati Database"
  • Keep all the settings the same as the first backup with the exception of the two changes above.
  • Make sure to schedule the database backup for a few hours later and run it every day. This gives the first backup time to complete. The initial backup of the Windows Users folder will take forever, but subsequent backups are much faster since they only back up the parts of files that have changed. Duplicati cannot be writing to the database while it is being backed up or the backup will be corrupt.
  • Export the backup job and save the JSON file.
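
And here's the quick check mentioned above. Before the database job runs for the first time it doesn't hurt to confirm the .sqlite files are actually in that folder (path assumed from the service install in the earlier parts):

# List the Duplicati database files the second job will back up
Get-ChildItem -Path "C:\ProgramData\Duplicati" -Filter "*.sqlite"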



TL;DR:

I'll come back to this later.



Conclusion:
  • In part 5 I'll explain how to do a restore of the data if need be.


👽
