
3rd

A script to automate the 3rd "off-site copy" step in the 3-2-1 backup strategy. Each directory has an independent configuration of compression level, encryption password, AWS S3 destination, and temporary storage location used while the archive is uploaded to S3.

This script is a wrapper for the AWS CLI (aws) and the 7-Zip CLI (7z) and is meant to be run on Linux. Other operating systems are untested.

Installation

Make sure the aws and 7z CLIs are installed and available on your PATH, then follow these steps:

  1. Clone this repository
git clone https://codeberg.org/vlw/3rd
  2. Copy .example.config.json to .config.json
cp -p .example.config.json .config.json
  3. Set the configuration values in .config.json

See the config file documentation for more information.

  4. Run run.py with your config file
python3 run.py -i .config.json

See the CLI section for a list of all available arguments.

Optional cron

Schedule this backup script to run with a crontab entry, for example:

30 2 * * 3 cd /opt/3rd && /usr/bin/python3 run.py -i .config.json

This entry runs the backup at 02:30 every Wednesday.

Config

The config file (.config.json by default) defines which directories to archive and with which parameters.

Directories are specified as an array of objects; each object has key-value properties in the following format:

[
	{
		"password": "mypassword", // AES-256 encryption password. Set to false to disable encryption
		"compression": 10, // Compression level between 0-10, where 0 is STORE and 10 is max compression. Set to 0 or false/null to disable compression
		"abspath_temp": "/tmp", // Directory to store the created archive while it's being uploaded to S3. Set to false/null to use the system temp-directory
		"abspath_target": "<replace with ABSOLUTE path to a target directory>", // An ABSOLUTE path to the directory or file to archive
		"abspath_destination": "s3://<replace with bucket>/<replace with destination>" // A fully qualified AWS S3 URL
	},
    // etc..
]
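Loading such a config might look like the following sketch. This is an illustration only, not the script's actual internals: the helper name `load_config` and the fallback handling are assumptions based on the documented semantics above (0, false, and null all disable compression). Note that real JSON does not allow // comments, so a working .config.json must omit them.

```python
import json

# Keys every config entry is documented to carry.
REQUIRED_KEYS = {"password", "compression", "abspath_temp",
                 "abspath_target", "abspath_destination"}

def load_config(path):
    """Load a JSON array of archive-job objects and normalize fallbacks."""
    with open(path) as fh:
        entries = json.load(fh)
    for entry in entries:
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"config entry missing keys: {sorted(missing)}")
        # 0, false and null all mean "no compression" per the docs above.
        if not entry["compression"]:
            entry["compression"] = 0
    return entries
```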

Common parent directories

One of the key features of this script is that it can perform different archiving procedures for subdirectories and their parent directories.

If you have the directory /my/archive with the following config:

{
	"password": "mypassword",
	"compression": 10,
	"abspath_temp": null,
	"abspath_target": "/my/archive",
	"abspath_destination": "s3://my-bucket/archive.7z"
}

And a subdirectory /my/archive/subdirectory with the following config:

{
	"password": "mypassword",
	"compression": 10,
	"abspath_temp": null,
	"abspath_target": "/my/archive/subdirectory",
	"abspath_destination": "s3://my-bucket/subdirectory.7z"
}

The /my/archive/subdirectory directory will be excluded from the /my/archive archive, since it has its own overriding archive configuration.
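The exclusion rule above can be sketched as follows. This is a simplified illustration of the idea, not the script's actual implementation: for each configured target, any other configured target nested beneath it is archived separately and therefore excluded from the parent's archive.

```python
from pathlib import PurePosixPath

def excluded_subtargets(target, all_targets):
    """Return configured targets nested inside `target`; these get their
    own archive and are excluded from the parent's archive."""
    parent = PurePosixPath(target)
    return [t for t in all_targets
            if t != target and parent in PurePosixPath(t).parents]

targets = ["/my/archive", "/my/archive/subdirectory"]
# /my/archive/subdirectory is configured separately, so it is left out
# of the /my/archive archive.
```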

CLI

Available command line arguments for run.py:

| Arg | Name | Default | Description |
| --- | --- | --- | --- |
| -s | --sleep | 2 | Set a global sleep duration between commands |
| -i | --input | None | Path to a config file to load |
| -d | --dryrun | False | Perform a dry run; archives will not be uploaded to S3 |
| -l | --log-level | StdoutLevel.STANDARD | Set a custom log level when printing to the console. See /src/Enums.py#StdoutLevel |
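An argparse definition matching the table above could look like this sketch. It is an assumption about the parser's shape, not the script's real code; in particular, the description string and the log-level handling here are illustrative.

```python
import argparse

def build_parser():
    """Build a parser mirroring the documented run.py arguments."""
    parser = argparse.ArgumentParser(
        description="Automate the 3rd off-site copy in a 3-2-1 backup")
    parser.add_argument("-s", "--sleep", type=int, default=2,
                        help="global sleep duration between commands")
    parser.add_argument("-i", "--input", default=None,
                        help="path to a config file to load")
    parser.add_argument("-d", "--dryrun", action="store_true",
                        help="dry run: archives are not uploaded to S3")
    parser.add_argument("-l", "--log-level", default="STANDARD",
                        help="console log level (see src/Enums.py#StdoutLevel)")
    return parser
```

For example, `build_parser().parse_args(["-i", ".config.json", "-d"])` yields a namespace with `input=".config.json"`, `dryrun=True`, and the defaults `sleep=2` and `log_level="STANDARD"`.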