# Cloud Backup

Automatic, managed, and scalable backups of ordinary files and folders to Google Cloud Storage, AWS S3, or Azure Blob Storage, with fewer network roundtrips.

## Get started

This program requires Python 3.6 or newer with pip.

1. Clone this repo

   ```sh
   git clone https://github.com/VictorWesterlund/cloud-backup
   ```

2. Install dependencies

   ```sh
   python3 -m pip install -r requirements.txt
   ```

3. Copy the example environment file

   ```sh
   cp .env.example .env
   ```

4. Edit environment variables

   Open `.env` with your text editor of choice and fill out these required variables (a filled-out example follows these steps):

   ```
   # Path to the local folder to back up
   SOURCE_FOLDER=
   # Name of the remote bucket (destination)
   TARGET_BUCKET=

   # Cloud provider (gcs, s3, azure)
   SERVICE_NAME=
   # Path to service account key file
   SERVICE_KEY=
   ```

5. Run the backup script

   ```sh
   python3 backup.py
   ```
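
For reference, here is what a filled-out `.env` might look like. The folder, bucket name, and key file path below are hypothetical placeholders; substitute your own values.

```
# Path to the local folder to back up
SOURCE_FOLDER=/home/alice/Documents
# Name of the remote bucket (destination)
TARGET_BUCKET=my-backup-bucket

# Cloud provider (gcs, s3, azure)
SERVICE_NAME=gcs
# Path to service account key file
SERVICE_KEY=/home/alice/keys/gcs-service-account.json
```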

Second-level files and folders should now start uploading to your destination bucket as zip archives. Subsequent runs of `backup.py` only upload files and folders that have changed. In fact, modified state is cached locally, so nothing is requested from your cloud provider when deciding what to upload.
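
To illustrate the idea behind the local change cache, here is a minimal sketch, not the project's actual implementation. It assumes each second-level folder is fingerprinted from its file paths, sizes, and modification times, and that fingerprint is compared against a local SQLite cache, so deciding what to re-upload never touches the cloud provider. The names `folder_digest`, `changed_since_last_run`, and the `upload_to_bucket` placeholder are hypothetical.

```python
import hashlib
import sqlite3
import zipfile
from pathlib import Path

CACHE_DB = "backup_cache.db"  # hypothetical local cache location


def folder_digest(folder: Path) -> str:
    """Fingerprint a folder from its file paths, sizes, and mtimes."""
    digest = hashlib.sha256()
    for path in sorted(folder.rglob("*")):
        if path.is_file():
            stat = path.stat()
            digest.update(str(path.relative_to(folder)).encode())
            digest.update(f"{stat.st_size}:{stat.st_mtime_ns}".encode())
    return digest.hexdigest()


def changed_since_last_run(con: sqlite3.Connection, name: str, digest: str) -> bool:
    """Compare against the locally cached digest; no cloud requests involved."""
    row = con.execute("SELECT digest FROM folders WHERE name = ?", (name,)).fetchone()
    return row is None or row[0] != digest


def zip_folder(folder: Path, staging: Path) -> Path:
    """Pack a folder into a zip archive ready for upload."""
    archive = staging / f"{folder.name}.zip"
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in folder.rglob("*"):
            if path.is_file():
                zf.write(path, str(path.relative_to(folder)))
    return archive


def run_backup(source: Path, staging: Path) -> None:
    con = sqlite3.connect(CACHE_DB)
    con.execute("CREATE TABLE IF NOT EXISTS folders(name TEXT PRIMARY KEY, digest TEXT)")
    for entry in source.iterdir():  # second-level entries of SOURCE_FOLDER
        if not entry.is_dir():
            continue  # a real implementation would also handle loose files
        digest = folder_digest(entry)
        if changed_since_last_run(con, entry.name, digest):
            archive = zip_folder(entry, staging)
            # upload_to_bucket(archive)  # provider-specific upload (gcs, s3, azure)
            con.execute("INSERT OR REPLACE INTO folders(name, digest) VALUES(?, ?)",
                        (entry.name, digest))
            con.commit()
```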