
s3sync


Overview

s3sync.py is a utility for syncing files to/from S3 as a continuously running process, without having to manage the sync manually. It internally uses the aws s3 sync command to do the sync, and uses the Python watchdog module to listen for filesystem events on the monitored path and push changes to S3. For pull there is no listener; the utility simply polls S3 at a fixed interval. If you only need pull, it is therefore recommended to use s3fs instead and simply mount the S3 bucket on your filesystem.
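
To make the push mechanism concrete, here is a minimal sketch (not the actual implementation) of the same idea: a watchdog handler that shells out to aws s3 sync on every filesystem event. The bucket and paths are placeholders, and the real utility additionally applies the configured patterns and rate limit.

import subprocess
import time

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

class SyncHandler(FileSystemEventHandler):
    """Trigger a push whenever anything changes under the watched path."""

    def __init__(self, localpath, s3path):
        self.localpath = localpath
        self.s3path = s3path

    def on_any_event(self, event):
        # The actual diffing and transfer is delegated to the AWS CLI.
        subprocess.run(["aws", "s3", "sync", self.localpath, self.s3path])

observer = Observer()
observer.schedule(SyncHandler("./", "s3://my-bucket/my-path"), "./", recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)
finally:
    observer.stop()
    observer.join()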

Features

  • Rate limiting using the Python token-bucket module. You can set max_syncs_per_minute in config.yaml and the filesystem-watcher-triggered pushes will be throttled to that limit (see the sketch after this list).
  • Optional reporting of runtime stats for the sync operation using the pyformance module
  • Ability to filter by include_patterns and exclude_patterns, to exclude_directories completely, or to make the filter case_sensitive
  • Automated setup of the AWS CLI config by creating a separate named profile for the utility, with the ability to tune performance by setting max_concurrent_requests, max_queue_size, etc.
  • Setuptools integration and a Python click based command-line interface
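
As an illustration of the rate limiting feature, the sketch below shows how max_syncs_per_minute could map onto a token bucket, assuming the token-bucket package on PyPI (the exact wiring inside s3sync may differ):

import token_bucket

# max_syncs_per_minute: 10  ->  refill one token every 6 seconds,
# with a capacity of 10 tokens to absorb short bursts of events.
storage = token_bucket.MemoryStorage()
limiter = token_bucket.Limiter(rate=10 / 60.0, capacity=10, storage=storage)

def on_filesystem_event():
    # consume() returns False once the bucket is empty, i.e. we are throttled.
    if limiter.consume(b"s3sync"):
        print("would run `aws s3 sync` now")
    else:
        print("throttled: over max_syncs_per_minute")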

Requirements

Requires AWS CLI version 2 to be installed and available on the PATH.

Installation

pip install pys3sync

Usage

s3sync --help

Usage: s3sync.py [OPTIONS] COMMAND [ARGS]...

  A utility created to sync files to/from S3 as a continuously running
  process, without having to manually take care of managing the sync.  It
  internally uses the aws s3 sync command to do the sync and uses python's
  watchdog listener to get notified of any changes to the watched folder.

Options:
  --config PATH        Path to the config.yaml file containing configuration
                       params for this utility  [required]

  -v, --verbosity LVL  Either CRITICAL, ERROR, WARNING, INFO or DEBUG
  --help               Show this message and exit.

Commands:
  init  Initial setup.
  pull  One-way continuous sync from s3 path to local path (based on
        polling...

  push  One-way continuous sync from localpath to s3 path (uses a file...

s3sync --config config.yaml push --help

Usage: s3sync.py push [OPTIONS]

  One-way continuous sync from localpath to s3 path (uses a file watcher
  called watchdog)

Options:
  --s3path PATH     Full s3 path to sync to/from  [required]
  --localpath PATH  Local directory path which you want to sync  [required]
  --help            Show this message and exit.

s3sync --config config.yaml pull --help

Usage: s3sync.py pull [OPTIONS]

  One-way continuous sync from s3 path to local path (based on polling on an
  interval)

Options:
  --s3path PATH       Full s3 path to sync to/from  [required]
  --localpath PATH    Local directory path which you want to sync  [required]
  --interval INTEGER  S3 polling interval in seconds  [required]
  --help              Show this message and exit.

First run/setup

s3sync --config config.yaml init

This utility creates a named profile for your AWS CLI so that the parameters it sets for the S3 CLI are isolated from your regular AWS CLI profile. The first time, you need to run the init command, which creates the named profile s3sync in your local AWS config (~/.aws/config), with the parameters configured in config.yaml and credentials copied from your default AWS credentials file.
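
A rough sketch of what init amounts to (the profile name comes from the docs above; the exact keys copied are an assumption, not the utility's actual code):

import subprocess

def set_option(key, value):
    subprocess.run(["aws", "configure", "set", key, value, "--profile", "s3sync"],
                   check=True)

# Copy credentials from the default profile into the s3sync profile
# and pin the region configured in config.yaml.
for key in ("aws_access_key_id", "aws_secret_access_key"):
    value = subprocess.check_output(["aws", "configure", "get", key], text=True).strip()
    set_option(key, value)
set_option("region", "ap-south-1")  # s3.region from config.yaml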

Push

You run one instance of this utility per localpath <> s3path combination that you want to continuously sync.

s3sync --config config.yaml -v DEBUG push --s3path s3://<bucket>/<path> --localpath ./

[screencast: push demo]

Pull

s3sync --config config.yaml -v DEBUG pull --s3path s3://<bucket>/<path> --localpath ./sync --interval 2

[screencast: pull demo]
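
Since pull has no filesystem listener, it boils down to polling. A minimal sketch of the loop, with placeholder paths matching the command above:

import subprocess
import time

def pull_forever(s3path, localpath, interval):
    # No change detection on the S3 side: just re-run the sync every interval.
    while True:
        subprocess.run(["aws", "s3", "sync", s3path, localpath])
        time.sleep(interval)

pull_forever("s3://my-bucket/my-path", "./sync", interval=2)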

Rate limiting in action

[screencast: rate limiting demo]

Configuration

global:
  max_syncs_per_minute: 10
  report_stats: False
watcher:
  include_patterns: 
  exclude_patterns: ["*.git/*"]
  exclude_directories: False
  case_sensitive: False
s3:
  max_concurrent_requests: 20
  max_queue_size: 2000
  multipart_threshold: 8MB
  multipart_chunksize: 8MB
  max_bandwidth: 
  use_accelerate_endpoint: "false"
  region: ap-south-1
Include/exclude patterns

Include/exclude patterns are implemented using pathtools.match_any_path, which supports unix glob pattern syntax. You can test your patterns using the provided script patternhelper.py. The patterns are passed both to watchdog and to the aws cli, which uses the same syntax. Both properties accept a list of patterns.
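
For a quick manual check (roughly what patternhelper.py presumably wraps), pathtools can also be called directly; match_path below is from the pathtools package that watchdog builds on:

from pathtools.patterns import match_path

kwargs = dict(included_patterns=["*"],
              excluded_patterns=["*.git/*"],
              case_sensitive=False)

print(match_path("src/app.py", **kwargs))   # True: not excluded
print(match_path(".git/HEAD", **kwargs))    # False: matches *.git/*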

Advanced Configuration

Please change these values carefully. They depend on your machine and your internet connection. Read more about improving s3 sync transfer speeds here.

max_concurrent_requests

Passed through to your ~/.aws/config via aws configure set default.s3.max_concurrent_requests command. Read about the parameter here

max_queue_size

Passed through to your ~/.aws/config via aws configure set default.s3.max_queue_size command. Read about the parameter here

multipart_threshold

Passed through to your ~/.aws/config via aws configure set default.s3.multipart_threshold command. Read about the parameter here

multipart_chunksize

Passed through to your ~/.aws/config via aws configure set default.s3.multipart_chunksize command. Read about the parameter here

max_bandwidth

Passed through to your ~/.aws/config via aws configure set default.s3.max_bandwidth command. Read about the parameter here

use_accelerate_endpoint

Passed through to your ~/.aws/config via aws configure set default.s3.use_accelerate_endpoint command. Read about the parameter here
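
Taken together, the parameters above plausibly reach ~/.aws/config like this (a sketch issuing one aws configure set per key, following the default.s3.* form named in the sections above; the init command presumably targets the s3sync profile instead):

import subprocess

s3_settings = {
    "max_concurrent_requests": "20",
    "max_queue_size": "2000",
    "multipart_threshold": "8MB",
    "multipart_chunksize": "8MB",
    "use_accelerate_endpoint": "false",
}

for key, value in s3_settings.items():
    # e.g. `aws configure set default.s3.max_concurrent_requests 20`
    subprocess.run(["aws", "configure", "set", f"default.s3.{key}", value],
                   check=True)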


Performance tests for the aws s3 sync command

Environment

Network: home/BLR (Airtel 1 Gbps Xtreme Fiber)
WiFi: 5 GHz, RSSI: -38 dBm, Tx rate: 1300 Mbps (802.11ac)
Upload speed to S3: 18 MB/s
Download speed from S3: 15 MB/s
Number of threads for the s3 sync command: 10 (default)

Sync from local to S3 (upload)

Test artifact 1: Source code directory

Size: 224M
Number of files: 3571

Test 1 (full sync):
time aws s3 sync --storage-class REDUCED_REDUNDANCY ./ s3://psm-poc-dmp-temp/codesync

real	0m45.543s
user	0m14.755s
sys	0m3.685s
Test 2 (added 39 files, 168k):
cp -rf ../intSDK .
time aws s3 sync --storage-class REDUCED_REDUNDANCY ./  s3://psm-poc-dmp-temp/codesync

real	0m3.141s
user	0m1.887s
sys	0m0.405s
Test 3 (removed 398 files, 2.1M):
rm -rf examples/
time aws s3 sync --storage-class REDUCED_REDUNDANCY --delete ./  s3://psm-poc-dmp-temp/codesync

real	0m3.436s
user	0m2.276s
sys	0m0.406s
Test 4 (change timestamp of single file):
touch README.markdown 
time aws s3 sync --storage-class REDUCED_REDUNDANCY --delete --exact-timestamps ./  s3://psm-poc-dmp-temp/codesync

real	0m2.602s
user	0m1.492s
sys	0m0.296s
Test 5 (no change):
time aws s3 sync --storage-class REDUCED_REDUNDANCY --delete --exact-timestamps ./  s3://psm-poc-dmp-temp/codesync

real	0m2.442s
user	0m1.469s
sys	0m0.294s
Test artifact 2: Bunch of PNGs

Size: 400M
Number of files: 577

Test 1 (full sync):
time aws s3 sync --storage-class REDUCED_REDUNDANCY --delete --exact-timestamps ./  s3://psm-poc-dmp-temp/codesync

real	0m22.015s
user	0m5.972s
sys	0m2.516s

Sync from S3 to local (download)

Test artifact 1: Source code directory

Size: 224M
Number of files: 3571

Test 1 (full sync):
time aws s3 sync --storage-class REDUCED_REDUNDANCY --delete --exact-timestamps s3://psm-poc-dmp-temp/codesync ./

real	0m26.448s
user	0m14.544s
sys	0m3.794s
Test artifact 2: Bunch of PNGs

Size: 400M
Number of files: 577

Test 1 (full sync):
time aws s3 sync --storage-class REDUCED_REDUNDANCY --delete --exact-timestamps s3://psm-poc-dmp-temp/codesync ./

real	0m29.268s
user	0m6.131s
sys	0m2.855s