Automatic, managed and scalable file backups to Google Cloud Storage, AWS S3 and Azure Blob Storage with fewer network roundtrips.

# Cloud Backup

Back up and archive ordinary files and folders to Google Cloud, AWS, or Azure.

## Get started

This program requires Python 3.6 or newer with pip.

1. Clone this repo

   ```sh
   git clone https://github.com/VictorWesterlund/cloud-backup
   ```

2. Install dependencies

   ```sh
   python3 -m pip install -r requirements.txt
   ```

3. Copy the example environment variables file

   ```sh
   cp .env.example .env
   ```

4. Edit environment variables

   Open `.env` in your text editor of choice and fill out these required variables:

   ```sh
   # Path to the local folder to back up
   SOURCE_FOLDER=

   # Name of the remote bucket (destination)
   TARGET_BUCKET=

   # Cloud provider (gcs, s3, azure)
   SERVICE_NAME=

   # Path to service account key file
   SERVICE_KEY=
   ```

5. Run the backup script

   ```sh
   python3 backup.py
   ```
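For reference, the required variables above could be validated with a minimal `.env` parser along these lines. This is only a sketch, not the repo's actual loading code (which may well use a library such as python-dotenv); `load_env` and `REQUIRED_KEYS` are hypothetical names:

```python
REQUIRED_KEYS = ("SOURCE_FOLDER", "TARGET_BUCKET", "SERVICE_NAME", "SERVICE_KEY")

def load_env(path: str = ".env") -> dict:
    """Parse KEY=VALUE lines from a .env file, skipping blanks and # comments."""
    config = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            config[key.strip()] = value.strip()
    # Fail early if any required variable is absent or left empty
    missing = [k for k in REQUIRED_KEYS if not config.get(k)]
    if missing:
        raise ValueError(f"Missing required variables: {', '.join(missing)}")
    return config
```

Failing fast on an empty variable here avoids a half-finished backup run that only errors once the cloud client is first used.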

Second-level files and folders should now start uploading to your destination bucket as zip archives. Subsequent runs of the `backup.py` script will only upload changed files and folders. In fact, modified state is cached locally, so change detection doesn't send any requests to your cloud provider.
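A local change cache like the one described could be sketched as below. This is an illustration of the idea, not the repo's actual `resolve.py` logic; `folder_fingerprint`, `changed_items`, and the cache filename are hypothetical. It fingerprints each item from file paths, sizes, and modification times (never reading file contents or the network), and compares against a JSON cache from the previous run:

```python
import hashlib
import json
from pathlib import Path

def folder_fingerprint(folder: Path) -> str:
    """Digest every file's path, size, and mtime -- cheap change detection
    that never reads file contents."""
    digest = hashlib.sha256()
    for path in sorted(folder.rglob("*")):
        if path.is_file():
            stat = path.stat()
            digest.update(f"{path}:{stat.st_size}:{stat.st_mtime_ns}".encode())
    return digest.hexdigest()

def changed_items(source: Path, cache_file: Path) -> list:
    """Return the second-level items whose fingerprint differs from the cache,
    then persist the updated fingerprints."""
    cache = json.loads(cache_file.read_text()) if cache_file.exists() else {}
    changed = []
    for item in sorted(source.iterdir()):
        if item.is_dir():
            fp = folder_fingerprint(item)
        else:
            stat = item.stat()
            fp = hashlib.sha256(
                f"{item}:{stat.st_size}:{stat.st_mtime_ns}".encode()
            ).hexdigest()
        if cache.get(item.name) != fp:
            changed.append(item)  # only these would be re-zipped and uploaded
            cache[item.name] = fp
    cache_file.write_text(json.dumps(cache))
    return changed
```

Since the fingerprints come from `stat()` metadata alone, a no-op run costs one directory walk and zero cloud API calls.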