Monday, December 19, 2022

Windows - Reset TCP/IP Stack

 

Intro:

This is just a quick list of commands you can use to reset the TCP/IP stack in Windows. Run them from an elevated Command Prompt and reboot when you're done so the resets take full effect.



Commands (CMD run as Admin):

ipconfig /flushdns

netsh winsock reset
netsh int ip reset
netsh interface ipv4 reset
netsh interface ipv6 reset
netsh interface tcp reset
netsh int reset all 

nbtstat -R
nbtstat -RR

netsh advfirewall reset



PowerShell (run as admin):

Get-NetAdapter | Restart-NetAdapter
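
If you'd rather do the whole reset from one elevated PowerShell window, here's a minimal sketch that just wraps the same commands listed above (reboot afterwards, since the winsock/IP resets don't fully apply until then):

# Flush DNS, reset the stack, and bounce the adapters from an elevated PowerShell prompt
Clear-DnsClientCache                    # PowerShell equivalent of ipconfig /flushdns
netsh winsock reset
netsh int ip reset
netsh advfirewall reset
Get-NetAdapter | Restart-NetAdapter     # restart every network adapter
Write-Host "Reboot the machine to finish applying the resets."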



Conclusion:

I'll add to this list as I either remember forgotten commands or I come across new ones.



👽

Monday, December 5, 2022

04 - Cloud Backup - Duplicati - Backup Jobs - Setup & Configuration - Part 4 of X


Intro:

OK, so after all our hard work to set up the Wasabi backend and email account(s), we are finally ready to create our backup jobs: one for the user files on the computer and the other to back up the Duplicati database.

In the event that the computer you are restoring has a failed hard drive and the Duplicati database file is missing, the restore process will take significantly longer, since Duplicati needs to rebuild the database from the files it backed up to Wasabi. This mainly matters for backups over 100GB.

If we have the database handy, we can spin up a virtual machine, load Duplicati quickly from the PowerShell script, drop in the existing database, download the encrypted backup files from Wasabi, and do a restore. I'd only suggest going this route if the backup is overly large, but it's better to have the database than not.

You cannot include the database in the same backup job as the files it tracks. The reason for this is simple. Let's say you run a backup of all files in "C:\Users\". While Duplicati is doing the backup, it is writing to its database. If you include the database file along with the files in "C:\Users", you'll have backed up an incomplete database, as Duplicati won't finish writing to the database until the backup job is done.

We will be using the example computer name we used in the previous 3 articles, "duplicati-desktop-488gd38".

We're also going to set up an email address just for the reporting side of Duplicati. This address will receive job reports whenever the backup(s) run. In the future I may combine this with the original email we created for use with Wasabi and just use one email address per customer location. In the end, using Thunderbird, you can create a filter and automatically move XYZ email into XYZ folder, so that option exists.

For now just create a new email address with the following format: "s3-backups.cxSiteLocation". Using our previous example, John's Seafood Shack located at 1500 Green Road, we can create an address as follows:
  • s3-backups.jss1500@xyz.com
  • Generate a 20-character password and save it (see the sketch below for one way to do it).
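
If you'd rather script the password instead of using a password manager's generator, here's a minimal PowerShell sketch (alphanumeric only; add symbols to the character set if your mail provider requires them):

# Generate a random 20-character alphanumeric password
$chars    = [char[]]((48..57) + (65..90) + (97..122))   # 0-9, A-Z, a-z
$password = -join (1..20 | ForEach-Object { $chars | Get-Random })
$password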



Initial Setup of Duplicati:

1. On the machine you just did all this configuration on, pull up Chrome, Firefox, or Edge 🙄 and navigate to http://127.0.0.1:8200/. (There's no place like 🏡). If this is the first time opening this link you'll see the "First Run Setup" box shown below. Click, "Yes" to set up a password for the interface.


2. Enable, set, and/or verify the following settings:

Access to User Interface:
  • Enable and set a password
  • Enable Allow Remote Access (restart service to apply setting)
    • Hostnames:
      • Leave blank if accessing the PC directly
      • "*" if you want to access Duplicati's web GUI from another PC
      • "pcname.vertigoisabitch.com" if accessing from the web
  • Enable Prevent tray icon automatic log-in
Pause After Startup or Hibernation:
  • Zero Seconds (Use this to delay start on systems that are slow to boot)
User Interface Settings:
  • Display and color theme = The dark theme (by Michal)
Donation Messages:
  • Show or hide
Update Channel:
  • Default (beta)
Usage Statistics:
  • Anonymous usage reports = System default (Information) or Usage statistics, warnings, errors, and crashes
Default Options (edit as text):

--accept-any-ssl-certificate=true
--snapshot-policy=Required

When done press, "OK" at the bottom and log in again.




Backup Job 1 of 2:

Add Backup:

1. Once logged back in the interface will be blank as seen below:


2. Click, "Add Backup".


3. Select, "Configure a new backup" then click, "Next".


General Backup Settings (Part 1 of 5):


1. Input the following information into the fields as seen below:
  • Name = duplicati-desktop-488gd38 - Users Folder
  • Description = Backup of Users folder to Wasabi s3
  • Description = Job started on (month.day.year), e.g. 10.11.22
  • Encryption = AES-256 encryption, built in
  • Passphrase = Press the generate button a few times, then copy and save the password. This password is for this job only and is needed for restoring the files in this job. Don't lose it or the backup is worthless!
Click, "Next" when done.



Backup Destination (Part 2 of 5):

1. The backup destination is picked first. Enter the information exactly as seen below:

Backup Destination:
  • Storage Type = s3 Compatible
  • Enable Use SSL
  • Server = Wasabi Hot Storage (s3.wasabisys.com)
  • Bucket Name = duplicati-desktop-488gd38
  • Bucket Create Region = (default) ()
  • Storage Class = (default) ()
  • Folder Path = duplicati
  • AWS Access ID = The Access Key that was created in Wasabi during user creation
  • AWS Access Key = The Secret Key that was created in Wasabi during user creation
  • Client Library to Use = Amazon AWS SDK
Just to be clear, the "Access ID" and "Access Key" referenced above are the keys for the Wasabi sub-account user. In this case the keys belong to the user named, "duplicati-desktop-488gd38" and were generated at the end of user creation in the sub-account Wasabi console, not the WACM console.

The AWS Access ID is the user's Access Key in Wasabi and is 20 characters long.
The AWS Access Key is the user's Secret Key in Wasabi and is 40 characters long.

Also, the "Server" selected above must match the region you created the bucket in from the Wasabi sub-account console. In our case we created it in the "Wasabi US East 1 (N. Virginia)" region. Use, "Custom Server URL ()" in the drop-down to key in a different region.

"Folder" is the actual folder inside of the bucket. In our case there are two, "duplicati" and "duplicati-database". We'll create the latter in the next backup job.


The Wasabi Management Console is always reached at console.wasabisys.com (regardless of which region you are using)


2. Scroll down and at the bottom of the page is, "Advanced Options". Click on the word, "Advanced Options" in green. It should drop down like below.


3. Click on the Hamburger icon on the right in light blue and click, "Edit as text".


4. Paste in the following into the, "Advanced Options" box.

--accept-any-ssl-certificate=true
--s3-ext-forcepathstyle=true
--s3-ext-signatureversion=2

5. When done click, "Test Connection". It will ask you to prepend the bucket name with your username. Click, "No".

If everything went right it will say, "Connection Worked!" in less than 20 seconds or so.


*If the page hangs for more than 30-40 seconds and doesn't say anything, something is wrong. Wait until the page fails and comes back with an error. Don't refresh it to try and get the prompts back. Go back and make sure everything is selected and entered correctly.
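
If the test keeps failing, it's worth ruling out basic DNS/connectivity problems before blaming the settings; a quick PowerShell check against the endpoint we picked above:

# Verify the Wasabi us-east-1 endpoint resolves and port 443 is reachable
Test-NetConnection -ComputerName s3.wasabisys.com -Port 443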


Source Data (Part 3 of 5):

1. We're going to back up the Windows Users folder in the first backup. In Windows 10 the location is "C:\Users". In some instances this folder will be larger than 100GB; you can divide it up into multiple backup jobs, which will give you faster restores.


2. Use the folder tree to either navigate to "C:\Users\" or use the Hamburger icon on the right in light blue to manually add, "C:\Users\".


or


Below is what the Advanced Editor looks like:


3. Next we need to exclude Temporary Files and System Files. Click on the drop down for, "Exclude" in green near the bottom of the page.


4. Select, "Hidden Files" and "System Files" then click, "Next".



Schedule (Part 4 of 5):

The schedule is going to run our backup automatically.

1. Verify that, "Automatically Run Backups" is enabled.

2. Change the runtime to a time when the client will not be in the office. I generally like to stagger them between 12:00am and 6:00am. For locations like bars or clubs I'll bump it to between 3:00am and 9:00am.

3. Verify that every day is selected. If the client shows up on a weekend to catch up on work, you don't want to miss that backup. If the location is clearly closed on certain days, you can omit those days.

4. You can also set the repeat interval in minutes, hours, days, weeks, months, or years.

5. Click, "Next" when done.



Options (Part 5 of 5):

1. Under General Options set the, "Remote Volume Size" to a higher number such as 100-200MB if the client has a fast upload on their end; otherwise leave it at the default 50MB. Keep in mind a smaller remote volume size will ultimately result in a larger number of files on the Wasabi backend.
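
As a rough back-of-the-napkin example of why the volume size matters (ignoring compression and deduplication, which will lower the real numbers):

# Approximate number of remote volumes for a 100 GB backup at two volume sizes
$backupGB = 100
[math]::Ceiling($backupGB * 1024 / 50)     # 50 MB volumes  -> ~2048 files
[math]::Ceiling($backupGB * 1024 / 200)    # 200 MB volumes -> ~512 files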

2. I am currently using, "Keep All Backups" for sites with less than 100GB per computer. This can also double as an audit tool to see what employee XYZ did on a given day when files are recalled. It's essentially versioning at that point, but kept longer than most cloud providers offer.

At the end of the year I dump the encrypted files to an external hard drive and hand it to the customer. It's up to them to safeguard it from here on out. They receive the decryption key and a copy of Duplicati is installed to recall the files. Then we start fresh on Wasabi for the new year.

3. OK, so we're going to try something different here for this massive 100GB+ backup for a client. We're going to run a custom backup retention consisting of one backup a day for a month (30 copies), then one a week for a month (4 copies), and one a month for 3 years (36 copies).

Select, "Custom Backup Retention" and enter:

1M:1D,1M:1W,3Y:1M


4. Click on, "Advanced Options" below in green to expand it. Click on the Hamburger icon in light blue to the right, then click on, "Edit As Text".



5. Modify the email address and credentials then copy the following into the text block and click, "Save" when done:

--accept-any-ssl-certificate=true
--snapshot-policy=Required
--send-mail-url=smtp://smtp.dynu.com:587/?starttls=when-available
--send-mail-any-operation=true
--send-mail-subject=%OPERATIONNAME% - %backup-name% - %PARSEDRESULT%
--send-mail-to=s3-backups.jss1500@xyz.com
--send-mail-username=s3-backups.jss1500@xyz.com
--send-mail-password=Random 20 character password
--send-mail-from=s3-backups.jss1500@xyz.com
The above settings will send an email every time a backup runs, completes, fails, has problems, etc. The email will be sent from, "s3-backups.jss1500@xyz.com" and delivered to that same address.
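
If the reports never arrive, a quick reachability check against the SMTP server from the settings above can save some head scratching (swap in your own SMTP host if you're not on smtp.dynu.com):

# Confirm the SMTP submission port used by --send-mail-url is reachable
Test-NetConnection -ComputerName smtp.dynu.com -Port 587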

Add this email address into Thunderbird. This way we can create a filter later to automatically move the emails into their own folder named after the computer. If multiple computers are being backed up at this location, all backup-related emails will come into this one mailbox.


6. Make sure you copy the Duplicati Job AES Password from step 1. This is what the following warning is referring to:


7. Once the setup of the backup job is complete you'll be dumped back at the main screen. If the backup is on a schedule, you're done at this point with the exception of exporting your jobs, which is covered in the next step. Since we installed Duplicati as a Windows service, no one needs to be logged into the machine to run the backup. You can come back to http://127.0.0.1:8200/ periodically to see the results or the currently running job on the main screen.



Export Your Backup Job(s) to JSON File:

1. From the main screen click on the name of the backup in blue and a drop down menu with more options will appear:



2. Under "Configuration" click on "Export".


3. Make sure "To File" and "Export Passwords" are selected and "Encrypt File" is disabled, then click on "Export".


4. Click "Yes, I understand the risk" and your download will start.


5. Above is what the file name will look like for the backup. It's a plain-text file and not encrypted, so protect it, as it contains sensitive passwords and/or other credentials.



Backup Job 2 of 2:
  • Create another backup job. Only this time you're going to back up the Duplicati database. Its location is as follows:
    • "C:\ProgramData\Duplicati\"
  • In step 1 give the backup the following name (be sure to update desktop-488gd38 with the computer's real name):
    • "duplicati-desktop-488gd38 - Duplicati Database"
  • Keep all the settings the same as the first backup with the exception of the two changes above.
  • Make sure to schedule the database backup for a few hours later and run it every day. This gives the first backup time to complete. The initial backup of the Windows Users folder will take forever, but subsequent backups are much faster since they only back up the parts of files that have changed. Duplicati cannot be writing to the database while this job runs or the backed-up copy will be corrupt.
  • Export the backup job and save the JSON file (a quick way to sanity-check the exports is shown below).
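
Once both jobs are exported, a quick sketch like the one below at least confirms the files parse as JSON before you file them away (the Downloads path and the duplicati-*.json pattern are assumptions; point it at wherever your exports actually landed):

# Validate that each exported Duplicati job file is well-formed JSON
Get-ChildItem "$env:USERPROFILE\Downloads\duplicati-*.json" | ForEach-Object {
    $null = Get-Content $_.FullName -Raw | ConvertFrom-Json   # throws if the file is not valid JSON
    "OK: $($_.Name)"
}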



TL;DR:

I'll come back to this later.



Conclusion:
  • In part 5 I'll explain how to do a restore of the data if need be.


👽

03 - Cloud Backup - Wasabi - Setup MFA, Bucket, Sub Folders, Users, and Policy Setup - Part 3 of 5


Intro:

This is Part 3 of 5 on how to setup and use Duplicati with Wasabi's Hot Cloud Storage.



This Article Explains How To:
  • Enable Multi-Factor Authentication on the Sub-Account
  • Create a Bucket(s) (each pc being backed up goes in its own bucket)
  • Create two folders inside the bucket. One for the Duplicati files and one for a backup of the Duplicati database.
  • Create a user (each pc being backed up should have its own user)
  • Create and apply a policy to the user (not the bucket)
  • Test the Wasabi policy


Enable Multi-Factor Authentication on the Sub-Account:

1. Login to the Wasabi Console at: https://console.wasabisys.com/




2. The most important thing to do immediately after setting up a sub-account is to turn on Multi-factor authentication. It can be found in settings. Click on the person icon in the top right then click Settings.


3. Click on MFA Settings:


4. Leave, "MFA Recovery Codes" disabled since you can reset the root user's password from the WACM if you forget it. Use something like Authy to save this token. Enter one code, then wait 30 seconds for the next code to be generated and enter that one. Click on, "Activate Virtual MFA" when finished.


5. If the multi-factor authentication setup was successful you should see the following under MFA Settings:





Pay attention going forward.



Things to Accomplish:

1. Create a bucket with the following naming convention: "duplicati-pcname" where "pcname" is the actual name of the computer.

2. Create two folders inside the bucket. One is for the Duplicati backups of files on the computer and the other is for a backup of the database. This will come into play later when we do the restore for very large backups (100GB and up). Instead of pointing Duplicati at the files and telling it to recover them, which is slow because it has to build a new database, we can copy the existing database over and the restore will finish much faster.

3. Create a Wasabi user and save the Access and Secret key (one user and one set of keys per computer to be backed up)

4. Create a policy and apply it to the Wasabi user.


If you remember from the previous article, our sample customer was, "John's Seafood Shack". This customer is located at 1500 Green Road and the email we set up for the Wasabi account is, "cx-jss1500@xyz.com".

Since we're logged into the Wasabi console as the root user of John's Seafood Shack, we need to create a simple bucket naming convention when adding new computers to backup (replace "pcname" with the actual name of the computer):

"duplicati-pcname" 

Wasabi buckets need to be globally unique in their system. If another Wasabi user is already using the bucket name you picked then the system will prevent you from creating that bucket. The odds of someone using the same naming convention and having the same customer as you are slim.



1 - Create a Bucket (one per pc to be backed up):

Before creating a bucket, make sure you are editing the correct account! After logging in to the sub-account you can check this by clicking, "Settings" on the left and looking at the 3rd line down; it will say, "Account Name". I sent Wasabi a feature request for this: something on the main screen that tells you which account you are logged into, so you don't screw up an account you've accidentally signed into.

*The bucket name must be globally unique across Wasabi and use all lowercase characters.

1. On the main screen after logging in click, "Create" in the top right corner. Create a bucket with the following naming convention, "duplicati-pcname" where "pcname" is the actual name of the pc. Pick the region that you want to store the data in. In this case we will select, "N. Virginia us-east-1 (s3.wasabisys.com)".

When done click, "Create Bucket".



2. You should see the following after bucket creation:




Create Duplicati Folders Inside the Bucket:

1. Click on the new bucket name then on the next page click, "Create Folder" in the top right. In the screenshot below we are already in this bucket. Proceed to step 2.


2. Create the following folders in the root of the bucket:
  1. duplicati
  2. duplicati-database


3. It should look like this when you're done:




Create a User:

1. From the menu on the left, click, "Users".


2. Create a new user with the same name as the bucket name, "duplicati-desktop-488gd38". Select "Programmatic (create API key)". DO NOT SELECT, "CONSOLE"! Click, "next" when done.


3. Skip group creation for now:


4. Skip attaching a policy for now:


5. Verify the username is correct, API is selected, and console access is set to NO. Click, "Create User" when done.


6. After you click, "Create User" the prompt will change and it will have a Secret Key and Access Key. These are the keys that will be used to perform the backups. Make sure you download these. If you lose them you need to destroy the existing keys and regenerate a new set for the user.


7. Once you've downloaded them you can close the above box and proceed.



Create a Policy:

The policy we're about to create gives the user the ability to write to two folders, "duplicati" and "duplicati-database". For every new folder you create inside the bucket, you either need to add a new policy for it or review the code below and duplicate the blocks of code responsible for that folder's access.

Here are the 3 blocks that need to be duplicated if you add another folder (only one policy at a time can be enabled on Wasabi's end). Pay attention to the curly brackets and make sure you properly close each block of code:
  • AllowRootAndHomeListingOfCompanyBucket
  • AllowListingOfUserFolder
  • AllowAllS3ActionsInUserFolder

1. On the menu to the left click on, "Policies" then click, "Create Policy" on the top right.


2. Create a new policy with the same name as the user and the bucket, "duplicati-desktop-488gd38". Copy the following into the description, "Allows Duplicati user to read and write in its own bucket and nothing else".


3. Before you copy the below code into the, "Policy Document" section as seen in the screenshot above, we need to change some variables.

Using our bucket name of, "duplicati-desktop-488gd38", we're going to replace every instance of, "duplicati-pcname" below with that bucket name. There are 8 instances that need to be replaced (a quick way to script this is shown after the template).

You don't have to mess with the policy itself as I've already written that out to work with the two sub folders, "duplicati" and "duplicati-database".

START OF TEMPLATE:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowUserToSeeBucketListInTheConsole",
      "Effect": "Allow",
      "Action": [
        "s3:ListAllMyBuckets",
        "s3:GetBucketLocation",
        "s3:GetBucketCompliance"
      ],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Sid": "AllowRootAndHomeListingOfCompanyBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::duplicati-pcname",
      "Condition": {
        "StringEquals": {
          "s3:delimiter": "/",
          "s3:prefix": [
            "",
            "duplicati"
          ]
        }
      }
    },
    {
      "Sid": "AllowRootAndHomeListingOfCompanyBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::duplicati-pcname",
      "Condition": {
        "StringEquals": {
          "s3:delimiter": "/",
          "s3:prefix": [
            "",
            "duplicati-database"
          ]
        }
      }
    },
    {
      "Sid": "AllowListingOfUserFolder",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::duplicati-pcname",
      "Condition": {
        "StringLike": {
          "s3:prefix": "duplicati/*"
        }
      }
    },
    {
      "Sid": "AllowListingOfUserFolder",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::duplicati-pcname",
      "Condition": {
        "StringLike": {
          "s3:prefix": "duplicati-database/*"
        }
      }
    },
    {
      "Sid": "AllowAllS3ActionsInUserFolder",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::duplicati-pcname/duplicati*"
    },
    {
      "Sid": "AllowAllS3ActionsInUserFolder",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::duplicati-pcname/duplicati-database*"
    },
    {
      "Effect": "Deny",
      "Action": "s3:DeleteBucket",
      "Resource": [
        "arn:aws:s3:::duplicati-pcname",
        "arn:aws:s3:::duplicati-pcname/*"
      ]
    }
  ]
}

END OF TEMPLATE
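
If you'd rather not hand-edit the 8 occurrences in Notepad, a one-liner like the following handles the substitution (the file names template-policy.json and policy.json are just placeholders for wherever you saved the template):

# Swap the placeholder bucket name for the real one in every instance
(Get-Content "template-policy.json" -Raw) -replace 'duplicati-pcname', 'duplicati-desktop-488gd38' |
    Set-Content "policy.json"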

Using our example it should now look like:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowUserToSeeBucketListInTheConsole",
      "Effect": "Allow",
      "Action": [
        "s3:ListAllMyBuckets",
        "s3:GetBucketLocation",
        "s3:GetBucketCompliance"
      ],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Sid": "AllowRootAndHomeListingOfCompanyBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::duplicati-desktop-488gd38",
      "Condition": {
        "StringEquals": {
          "s3:delimiter": "/",
          "s3:prefix": [
            "",
            "duplicati"
          ]
        }
      }
    },
    {
      "Sid": "AllowRootAndHomeListingOfCompanyBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::duplicati-desktop-488gd38",
      "Condition": {
        "StringEquals": {
          "s3:delimiter": "/",
          "s3:prefix": [
            "",
            "duplicati-database"
          ]
        }
      }
    },
    {
      "Sid": "AllowListingOfUserFolder",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::duplicati-desktop-488gd38",
      "Condition": {
        "StringLike": {
          "s3:prefix": "duplicati/*"
        }
      }
    },
    {
      "Sid": "AllowListingOfUserFolder",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::duplicati-desktop-488gd38",
      "Condition": {
        "StringLike": {
          "s3:prefix": "duplicati-database/*"
        }
      }
    },
    {
      "Sid": "AllowAllS3ActionsInUserFolder",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::duplicati-desktop-488gd38/duplicati*"
    },
    {
      "Sid": "AllowAllS3ActionsInUserFolder",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::duplicati-desktop-488gd38/duplicati-database*"
    },
    {
      "Effect": "Deny",
      "Action": "s3:DeleteBucket",
      "Resource": [
        "arn:aws:s3:::duplicati-desktop-488gd38",
        "arn:aws:s3:::duplicati-desktop-488gd38/*"
      ]
    }
  ]
}


*The policy pretty much says what it's doing and not doing. With the above policy the, "duplicati-desktop-488gd38" user cannot write to the root of the bucket, "duplicati-desktop-488gd38", but it can write inside the two folders sitting in the bucket root, "duplicati" and "duplicati-database".

*The user cannot delete the bucket itself, but it can remove the files inside the folders as well as delete the folders themselves. I'm looking for a way to stop this, but I don't think it's possible because folders don't really exist per the Amazon S3 specification.



4. Once you are sure you've changed all 8 variables marked in light red from the template, copy the updated template and paste it into the, "Policy Document" section as seen below. If the policy is valid and has no syntax errors it will say, "Policy is valid". Once that happens click, "Create Policy" at the bottom. If not go back and make sure you didn't accidentally remove a character.


5. You'll see the following message after a successful creation and will be dropped back off at the policies page. Look closely and you'll see there are 11 policies but the page is only showing you policies 1-10. To see the one you just created you need to change, "Rows per page" to 25 near the bottom. *After you attach the policy to the user you can click on the icon below to filter out only policies applied to users:





Apply the Policy to the User We Created Earlier:

* Still on the Policies page, notice there's a zero to the right of the policy name we just created, in the column labeled, "Number Attached". This is simply the number of users the policy is attached to as of the last page refresh.

1. Menu on the left, click on, "Users".

2. Click on our username, "duplicati-desktop-488gd38".


3. Select, "Policies".


4. Notice by default there are no policies attached to this user.


5. To add one, click to the right of the magnifying glass 🔍 and select the policy name, "duplicati-desktop-488gd38" we created earlier.


6. Note there is no save button on this page. After you select the policy it will show up as a bubble near the bottom of the policy tab.

At this point you're done. You can close the page.




Testing the Wasabi Policy:

1. Download Wasabi's tailored version of CloudBerry Explorer. It's free for Wasabi users and is basically a stripped-down build of CloudBerry Explorer that can only connect to Wasabi accounts. Scroll down the page and the download for the app is under, "Installation Instructions".

2. Install the application then open it.

3. We're going to use the Secret and Access Keys for the Wasabi user, "duplicati-desktop-488gd38". Do not use the root credentials for this test!

*Remember root can do anything including deleting the folders we created earlier. Our policy stops the sub users from doing this.

4. Open Wasabi Explorer.


5. File --> Wasabi


6. In the box that pops up, enter the following (these are the credentials that were generated when you created the user earlier):
  • Display Name = duplicati-desktop-488gd38
  • Access Key = Your access key
  • Secret Key = Your secret key
  • Enable - Use SSL

Click, "Test Connection". If everything works you'll get a green check mark like below. If you don't double check your credentials.



When done, press, "OK" on the above box, then, "Close" on the box behind it labeled, "Registered Accounts".


7. Click on the, "Source:" dropdown and select, "duplicati-desktop-488gd38".


8. Double click on the bucket labeled, "duplicati-desktop-488gd38" below.


9. You should now see the two folders you created earlier, "duplicati" and "duplicati-database".


10. If you want to check to make sure the policy is correct, try to drag a small file from your Desktop into one of those folders. If you can upload and delete that file from the Wasabi Explorer then you're ready to go.

*Be careful since you can still remove the folders themselves.
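
If you'd rather test from the command line instead of (or in addition to) Wasabi Explorer, something like the sketch below exercises the same permissions. Treat it as a rough sketch only: it assumes the AWS.Tools.S3 PowerShell module is installed and that your module version accepts the -AccessKey, -SecretKey, and -EndpointUrl parameters on these cmdlets; check Get-Help on your version before relying on it.

# Upload, list, and delete a small test object in the "duplicati" folder using the sub-account user's keys
$common = @{
    AccessKey   = "YOUR_ACCESS_KEY"            # the duplicati-desktop-488gd38 user's Access Key, not root's
    SecretKey   = "YOUR_SECRET_KEY"            # the matching Secret Key
    EndpointUrl = "https://s3.wasabisys.com"   # must match the bucket's region endpoint
}
Write-S3Object  -BucketName "duplicati-desktop-488gd38" -Key "duplicati/policy-test.txt" -Content "policy test" @common
Get-S3Object    -BucketName "duplicati-desktop-488gd38" -KeyPrefix "duplicati/" @common
Remove-S3Object -BucketName "duplicati-desktop-488gd38" -Key "duplicati/policy-test.txt" -Confirm:$false @common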



TL;DR:


1. Settings --> MFA Settings --> Enable

2. Create a new bucket named, "duplicati-pcname" on "s3.wasabisys.com" --> Create Bucket

3. Create two folders inside this bucket:
  • duplicati
  • duplicati-database
4. Create a user named, "duplicati-pcname" --> Programmatic (Create API Key).
  • WAIT FOR THE KEYS TO BE GENERATED AND COPY THEM!!!
5. Create a policy using the above template and name it, "duplicati-pcname". Edit the 8 variables in notepad.
  • Use, "Allows Duplicati user to read and write in its own bucket and nothing else" as the description.
6. Apply the policy to the user. Users --> Pick Username --> Policies --> Start typing to find new policy --> Select (There's no save on this page, the policy is applied instantly).

7. Test using Wasabi Explorer

8. Update documents.




Next step is creating the backup job in Duplicati itself.



👽
I'm using http://hilite.me/ to reproduce the beautiful policy code above