Mirror of https://codeberg.org/vlw/cloud-backup.git (synced 2025-09-13 17:43:42 +02:00)
Automatic, managed and scalable file backups to Google Cloud Storage, AWS S3 and Azure Blob Storage with fewer network roundtrips.
# Cloud Backup
Backup and archive ordinary files and folders to Google Cloud, AWS or Azure.
## Get started
This program requires Python 3.6 or newer with pip.
- Clone this repo

  ```
  git clone https://github.com/VictorWesterlund/cloud-backup
  ```

- Install dependencies

  ```
  python3 -m pip install -r requirements.txt
  ```

- Copy the example environment variables file

  ```
  cp .env.example .env
  ```

- Edit environment variables

  Open `.env` with your text editor of choice and fill out these required variables:

  ```
  # Path to the local folder to back up
  SOURCE_FOLDER=
  # Name of the remote bucket (destination)
  TARGET_BUCKET=
  # Cloud provider (gcs, s3, azure)
  SERVICE_NAME=
  # Path to service account key file
  SERVICE_KEY=
  ```
- Run the backup script

  ```
  python3 backup.py
  ```
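For reference, a filled-out `.env` targeting Google Cloud Storage might look like this (all values below are hypothetical placeholders, not defaults):

```
SOURCE_FOLDER=/home/alice/Documents
TARGET_BUCKET=my-backup-bucket
SERVICE_NAME=gcs
SERVICE_KEY=/home/alice/keys/service-account.json
```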
The files and folders directly inside `SOURCE_FOLDER` (second-level items) should now start uploading to your destination bucket as zip archives.
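The archive-then-upload step can be sketched roughly as below. This is an illustration, not the repo's actual code: `zip_entry` and `upload_blob` are made-up names, and the upload is shown only for the `gcs` backend using the standard `google-cloud-storage` client.

```python
import zipfile
from pathlib import Path


def zip_entry(entry: Path, out_dir: Path) -> Path:
    """Zip one second-level file or folder into out_dir and return the archive path."""
    archive = out_dir / f"{entry.name}.zip"
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        if entry.is_dir():
            # Store files with paths relative to the entry root.
            for f in sorted(entry.rglob("*")):
                if f.is_file():
                    zf.write(f, f.relative_to(entry))
        else:
            zf.write(entry, entry.name)
    return archive


def upload_blob(archive: Path, bucket_name: str) -> None:
    """Upload one archive to GCS (assumes `pip install google-cloud-storage`)."""
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket(bucket_name)
    bucket.blob(archive.name).upload_from_filename(str(archive))
```

Zipping each entry before upload is what keeps the number of network roundtrips low: one upload per second-level item rather than one per file.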
Subsequent runs of the `backup.py` script will only upload files and folders that have changed. In fact, the modified state is cached locally, so change detection doesn't make any requests to your cloud provider.
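The local change-detection idea can be sketched as follows: keep a content fingerprint per second-level entry in a local SQLite database and compare against it on the next run, so deciding what to upload never touches the network. This is a simplified illustration, not the repo's actual schema; the names `fingerprint` and `changed_entries` are made up.

```python
import hashlib
import sqlite3
from pathlib import Path


def fingerprint(entry: Path):
    """Hash an entry's contents (relative paths + bytes) into one digest."""
    h = hashlib.sha256()
    files = sorted(entry.rglob("*")) if entry.is_dir() else [entry]
    for f in files:
        if f.is_file():
            h.update(str(f.relative_to(entry.parent)).encode())
            h.update(f.read_bytes())
    return h.hexdigest()


def changed_entries(source: Path, db_path: Path):
    """Return second-level entries whose fingerprint differs from the local cache."""
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS state (name TEXT PRIMARY KEY, digest TEXT)")
    changed = []
    for entry in sorted(source.iterdir()):
        digest = fingerprint(entry)
        row = db.execute("SELECT digest FROM state WHERE name = ?", (entry.name,)).fetchone()
        if row is None or row[0] != digest:
            changed.append(entry)
            # Record the new state so the next run sees this entry as unchanged.
            db.execute("INSERT OR REPLACE INTO state VALUES (?, ?)", (entry.name, digest))
    db.commit()
    db.close()
    return changed
```

On the first run every entry is "changed" and gets uploaded; afterwards only entries whose digest differs from the cached one are returned, and the comparison is entirely local.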