Update README.md

Victor Westerlund 2022-03-15 07:06:40 -08:00 committed by GitHub
parent cdeeb47278
commit 43cee9e285


@@ -1,43 +1,87 @@
# Cloud Backup

Backup and archive ordinary files and folders to Google Cloud, AWS or Azure.

## Why this exists in a world with Rclone (and similar)

This script was created to solve the specific task of archiving a folder to the cloud without read access to the bucket/container and its objects.
Cloud Backup keeps an internal database of second-level files and folders which were successfully uploaded to the cloud.
If a file or folder changes on disk, that specific file or folder is compressed, uploaded to the cloud, and the database gets updated. What happens to the object in the bucket afterwards is invisible to the program.
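
As a rough illustration of that bookkeeping (a conceptual sketch only, not this repo's actual code; the cache file name and the cksum-based check are assumptions), the flow looks roughly like this:
```bash
# Conceptual sketch: only compress and upload second-level entries whose checksum
# differs from a locally cached value. "cache.db" and cksum are stand-ins for
# whatever backup.py actually uses internally.
SOURCE_FOLDER="/home/me/backup"
touch cache.db
for entry in "$SOURCE_FOLDER"/*; do
    name=$(basename "$entry")
    checksum=$(find "$entry" -type f -exec cat {} + | cksum | cut -d' ' -f1)
    cached=$(grep "^$name " cache.db | cut -d' ' -f2)
    if [ "$checksum" != "$cached" ]; then
        zip -qr "/tmp/$name.zip" "$entry"       # compress the changed file or folder
        # ...upload /tmp/$name.zip to the bucket with your provider's SDK or CLI...
        sed -i "/^$name /d" cache.db            # replace the cached checksum
        echo "$name $checksum" >> cache.db
    fi
done
```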
## Get started

This program requires Python 3.6 or newer with PIP.
Cloud Backup supports uploading to Google Cloud Storage, Azure Blob Storage and AWS S3.

1. **Clone this repo**
   ```bash
   git clone https://github.com/VictorWesterlund/cloud-backup
   ```
2. **Install dependencies**
   ```bash
   python3 -m pip install -r requirements.txt
   ```
3. **Copy environment variables file**
   ```bash
   cp .env.example .env
   ```
4. **Edit environment variables in `.env`**

   Open `.env` with your text editor of choice and fill out these required variables (one way to obtain the credentials with each cloud provider's own CLI is sketched after this list):
   ```bash
   # Remember to double-slash escape paths on Windows 'E:\\path\\to\\something'

   # Absolute path to the folder whose contents should be backed up
   SOURCE_FOLDER="/home/me/backup/"
   # Name of the bucket (or "container" in Azure)
   TARGET_BUCKET="home_backup"
   # Cloud provider. "gcloud", "aws" or "azure"
   SERVICE_NAME="aws"
   # IAM authentication
   # GCS: Path to keyfile or string (GOOGLE_APPLICATION_CREDENTIALS)
   # Azure: "Connection string" from the Access Key to the container
   # AWS: Access key ID and secret separated by a ";"
   SERVICE_KEY="SDJSBADYUAD;asid7sad123ebasdhasnk3dnsai"
   ```
5. **Run backup script**
   ```bash
   python3 backup.py
   ```
   Second-level files and folders should now start uploading to your destination bucket as zip archives.
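
The `SERVICE_KEY` value in step 4 differs per provider. The commands below are one common way to obtain such credentials with each provider's own CLI; the account names, users and file paths are placeholders, and none of this is specific to Cloud Backup, so treat it as a sketch rather than a required procedure.
```bash
# Google Cloud Storage: create a service account key file and point SERVICE_KEY at it
gcloud iam service-accounts keys create key.json \
    --iam-account=backup@my-project.iam.gserviceaccount.com

# Azure Blob Storage: print the storage account's connection string (use it as SERVICE_KEY)
az storage account show-connection-string --name mystorageaccount --resource-group my-rg

# AWS S3: create an access key for an IAM user, then join the ID and secret with ";"
aws iam create-access-key --user-name backup-user
```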
Subsequent runs of the `backup.py` script will only upload changed files and folders.
In fact, the modified state is cached locally, so checking for changes doesn't request anything from your cloud provider.

----
You can also run `backup.py` on a schedule with cron or an equivalent scheduler for your system. No requests will be sent to the cloud unless a file or folder has actually changed.
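
For example, a crontab entry along these lines would run the backup every night at 02:00 (the clone path and log file are placeholders; adjust them to your setup):
```bash
# Open the current user's crontab for editing
crontab -e

# Run backup.py from the cloned repo nightly at 02:00 and append its output to a log
0 2 * * * cd /home/me/cloud-backup && /usr/bin/python3 backup.py >> /home/me/cloud-backup.log 2>&1
```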
## More stuff
Here are some additional settings and commands you can try.
### Back up a second-level file
```bash
python3 backup.py file 'relative/path/from/.env'
```
### Resolve CRC32 to path or vice versa
```bash
python3 resolve.py '587374759'
# output: 'hello_world.txt'
python3 resolve.py 'hello_world.txt'
# output: '587374759'
```
### Optional flags in `.env`
```bash
# The following intbool flags can be added to .env to override default behavior
# The values shown here are the defaults
# Archive files and folders before uploading
COMPRESS="1"
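
# Assumption (not verified against backup.py): intbool means "1"/"0", so setting a
# flag to "0" flips it off. For example, to upload files and folders without zipping:
# COMPRESS="0"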
```