# 3rd
A script to automate the 3rd "off-site copy" step of the 3-2-1 backup strategy. Each directory has its own configuration for compression level, encryption password, AWS S3 destination, and the temporary storage location used while the archive is uploaded to S3.
This script is a wrapper around the AWS CLI `aws` and the 7-Zip CLI `7z` and is meant to be run on Linux; other operating systems are untested.
# Installation
Make sure you have the following prerequisites before starting:
- Python 3
- The [7zip CLI](https://www.7-zip.org/download.html)
- The [AWS CLI](https://aws.amazon.com/cli/)
- Write permission to an AWS S3 bucket
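To confirm the prerequisites are in place before continuing, a quick check along these lines should work (this assumes `7z` and `aws` are on your `PATH` and that your AWS credentials are already configured, e.g. with `aws configure`):
```
python3 --version            # Python 3.x
7z | head -n 2               # prints the 7-Zip version banner
aws --version                # AWS CLI version
aws sts get-caller-identity  # confirms your AWS credentials resolve to an account
```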
1. **Clone this repository**
```
git clone https://codeberg.org/vlw/3rd
```
2. **Copy `.example.config.json` to `.config.json`**
```
cp -p .example.config.json .config.json
```
3. **Set the configuration values in `.config.json`**
[See the config file documentation for more information](#config).
4. **Run `run.py` with your config file**
```
python3 run.py -i .config.json
```
[See the CLI section for a list of all available arguments](#cli).
## Optional cron
Schedule this backup script to run with a crontab entry, for example:
```
30 2 * * 3 cd /opt/3rd && /usr/bin/python3 run.py -i .config.json
```
This entry runs at 02:30 every Wednesday.
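By default cron mails any output to the local user; if you would rather keep a log, you can append stdout and stderr to a file of your choosing (the log path below is only an example):
```
30 2 * * 3 cd /opt/3rd && /usr/bin/python3 run.py -i .config.json >> /var/log/3rd-backup.log 2>&1
```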
# Config
The config file `.config.json` is used to define parameters and which directories to archive (in autorun mode).
```json
[
  {
    "password": "mypassword", // AES-256 encryption password. Set to false to disable encryption
    "compression": 10, // Compression level between 0-10, where 0 is STORE and 10 is max compression. Set to 0 or false/null to disable compression
    "abspath_temp": "/tmp", // Directory to store the created archive while it's being uploaded to S3. Set to false/null to use the system temp-directory
    "abspath_target": "<replace with ABSOLUTE path to a target directory>", // An ABSOLUTE path to the directory or file to archive
    "abspath_destination": "s3://<replace with bucket>/<replace with destination>" // A fully qualified AWS S3 URL
  },
  // etc..
]
```
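Note that the comments above are documentation only; standard JSON does not allow comments, so keep your real `.config.json` as plain JSON. Assuming it is, you can sanity-check that it parses with Python's built-in `json.tool`:
```
python3 -m json.tool .config.json
```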
## Common parent directories
One of the key features of this script is that it can apply different archiving procedures to subdirectories and their parent directories.
If you have the directory `/my/archive` with the following config:
```json
{
"password": "mypassword",
"compression": 10,
"abspath_temp": null,
"abspath_target": "/my/archive",
"abspath_destination": "s3://my-bucket/archive.7z"
}
```
And a subdirectory `/my/archive/subdirectory` with the following config:
```json
{
"password": "mypassword",
"compression": 10,
"abspath_temp": null,
"abspath_target": "/my/archive/subdirectory",
"abspath_destination": "s3://my-bucket/subdirectory.7z"
}
```
Then `/my/archive/subdirectory` will be **excluded** from the `/my/archive` archive, since it has its own overriding archive configuration.
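For intuition only: this kind of exclusion can be expressed with 7-Zip's `-x`/`-xr` switches. The sketch below is not taken from the script itself, just an illustration of an equivalent manual invocation:
```
# Archive /my/archive while excluding anything named "subdirectory" (illustrative, not the script's actual command)
7z a -mx=9 -p'mypassword' -mhe=on /tmp/archive.7z /my/archive '-xr!subdirectory'
```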
# CLI
Available command line arguments for `run.py`:
Arg|Name|Default|Description
--|--|--|--
`-s`|`--sleep`|2|Set a global sleep duration between commands
`-i`|`--input`|*None*|Path to a config file to load
`-d`|`--dryrun`|False|Perform a dry run. Archives will not be uploaded to S3.
`-l`|`--log-level`|`StdoutLevel.STANDARD`|Set a custom log level when printing to the console. See `/src/Enums.py#StdoutLevel`
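As a usage example, a dry run with a longer pause between commands might look like this (assuming `--sleep` takes a numeric value in seconds and `--dryrun` is a simple flag):
```
python3 run.py -i .config.json --dryrun --sleep 5
```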