Automatic, managed and scalable file backups to Google Cloud Storage, AWS S3 and Azure Blob Storage with fewer network roundtrips.

Cloud Backup

Backup and archive ordinary files and folders to Google Cloud, AWS or Azure.

Get started

This program requires Python 3.6 or newer with pip installed.

  1. Clone this repo
git clone https://github.com/VictorWesterlund/cloud-backup
  2. Install dependencies
python3 -m pip install -r requirements.txt
  3. Copy the environment variables file
cp .env.example .env
  4. Edit environment variables: open .env in your text editor of choice and fill in these required variables
# Path to the local folder to back up
SOURCE_FOLDER=
# Name of the remote bucket (destination)
TARGET_BUCKET=

# Cloud provider (gcs, s3, azure)
SERVICE_NAME=
# Path to service account key file
SERVICE_KEY=
  5. Run the backup script
python3 backup.py

Second-level files and folders should now start uploading to your destination bucket as zip archives. Subsequent runs of the backup.py script upload only files and folders that have changed since the last run. In fact, modified state is cached locally, so change detection makes no requests to your cloud provider.
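The local change cache described above can be sketched roughly as follows. This is a minimal illustration, not the actual backup.py implementation: the function name and schema are assumptions, and it compares modification times, where the real script may use a different fingerprint. The idea is that each second-level entry's state is stored in SQLite, and an entry is flagged for re-upload only when its current state differs from the cached one.

```python
# Hypothetical sketch of a local change cache (names and schema are
# assumptions, not taken from backup.py). Each second-level entry's
# mtime is stored in SQLite; an entry is reported as changed only when
# its current mtime differs from the cached value.
import os
import sqlite3

def changed_entries(source_folder, con=None):
    """Return paths of second-level entries that changed since the last run."""
    con = con or sqlite3.connect("cache.db")
    con.execute(
        "CREATE TABLE IF NOT EXISTS cache (path TEXT PRIMARY KEY, mtime REAL)"
    )
    changed = []
    for entry in os.scandir(source_folder):
        mtime = entry.stat().st_mtime
        row = con.execute(
            "SELECT mtime FROM cache WHERE path = ?", (entry.path,)
        ).fetchone()
        if row is None or row[0] != mtime:
            # New or modified entry: record it and update the cache.
            changed.append(entry.path)
            con.execute(
                "REPLACE INTO cache (path, mtime) VALUES (?, ?)",
                (entry.path, mtime),
            )
    con.commit()
    return changed
```

Because the comparison happens entirely against the local database, deciding what to upload costs no cloud API calls; the provider is only contacted for the actual uploads.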