Getting S4cmd working for Wasabi or Other object storage services #107

Closed
varunbtr opened this issue Sep 19, 2017 · 5 comments

@varunbtr

I noticed that s4cmd was not working with object stores other than Amazon. The reason is that the endpoint URL was not being passed when creating the boto client, so I changed the Python file to grab the endpoint from the s3cmd config file, when specified, and use it.
Just wanted to put it here in case a contributor wants to add this to the tool; otherwise, someone can use it if needed.

So the basic diff is as follows:

index 3c0d683..f39b5a5 100755
--- a/s4cmd.py
+++ b/s4cmd.py
@@ -61,7 +61,6 @@ S3_ACCESS_KEY_NAME = "S3_ACCESS_KEY"
 S3_SECRET_KEY_NAME = "S3_SECRET_KEY"
 S4CMD_ENV_KEY = "S4CMD_OPTS"

@@ -377,12 +375,16 @@ class BotoClient(object):
        "If the bucket is configured as a website, redirects requests for this object to another object in the same bucket or to an external URL. Amazon S3 stores the value of this header in the object metadata."),
   ]
 
-  def __init__(self, opt, aws_access_key_id=None, aws_secret_access_key=None):
+  def __init__(self, opt, aws_access_key_id=None, aws_secret_access_key=None, aws_access_endpoint=None):
     '''Initialize boto3 API bridge class. Calculate and cache all legal parameters
        for each method we are going to call.
     '''
     self.opt = opt
-    if (aws_access_key_id is not None) and (aws_secret_access_key is not None):
+    if (aws_access_key_id is not None) and (aws_secret_access_key is not None) and (aws_access_endpoint is not None):
+      self.client = self.boto3.client('s3',
+                                      aws_access_key_id=aws_access_key_id,
+                                      aws_secret_access_key=aws_secret_access_key, endpoint_url=aws_access_endpoint)
+    elif (aws_access_key_id is not None) and (aws_secret_access_key is not None):
       self.client = self.boto3.client('s3',
                                       aws_access_key_id=aws_access_key_id,
                                       aws_secret_access_key=aws_secret_access_key)
@@ -624,6 +626,7 @@ class S3Handler(object):
   '''
 
   S3_KEYS = None
+  S3_ENDPOINT = None
 
   @staticmethod
   def s3_keys_from_env():
@@ -664,12 +667,33 @@ class S3Handler(object):
     except Exception as e:
       info("could not read S3 keys from %s file; skipping (%s)", s3cfg_path, e)
       return None
+  
+  @staticmethod
+  def s3_endpoint_from_s3cfg(opt):
+    '''Retrieve S3 endpoint settings from s3cmd's config file, if present; otherwise return None.'''
+    try:
+      if opt.s3cfg != None:
+        s3cfg_path = "%s" % opt.s3cfg
+      else:
+        s3cfg_path = "%s/.s3cfg" % os.environ["HOME"]
+      if not os.path.exists(s3cfg_path):
+        return None
+      config = ConfigParser.ConfigParser()
+      config.read(s3cfg_path)
+      endpoint = "https://" + config.get("default", "host_base")
+      debug("read S3 endpoint from $HOME/.s3cfg file")
+      debug(endpoint)
+      return endpoint
+    except Exception as e:
+      info("could not read S3 keys from %s file; skipping (%s)", s3cfg_path, e)
+      return None
 
   @staticmethod
   def init_s3_keys(opt):
     '''Initialize s3 access keys from environment variable or s3cfg config file.'''
     S3Handler.S3_KEYS = S3Handler.s3_keys_from_cmdline(opt) or S3Handler.s3_keys_from_env() \
                         or S3Handler.s3_keys_from_s3cfg(opt)
+    S3Handler.S3_ENDPOINT = S3Handler.s3_endpoint_from_s3cfg(opt)
 
   def __init__(self, opt):
     '''Constructor, connect to S3 store'''
@@ -684,7 +708,9 @@ class S3Handler(object):
   def connect(self):
     '''Connect to S3 storage'''
     try:
-      if S3Handler.S3_KEYS:
+      if S3Handler.S3_KEYS and S3Handler.S3_ENDPOINT:
+        self.s3 = BotoClient(self.opt, S3Handler.S3_KEYS[0], S3Handler.S3_KEYS[1], S3Handler.S3_ENDPOINT)
+      elif S3Handler.S3_KEYS:
         self.s3 = BotoClient(self.opt, S3Handler.S3_KEYS[0], S3Handler.S3_KEYS[1])
       else:
         self.s3 = BotoClient(self.opt)
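
For reference, the patch reads the endpoint from the host_base entry in s3cmd's config file. A minimal ~/.s3cfg that it would pick up looks roughly like this (the Wasabi host below is only an example endpoint; use your own provider's):

[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
host_base = s3.wasabisys.com
host_bucket = s3.wasabisys.com

The patch prepends https:// to host_base and passes the result as endpoint_url to boto3.client('s3', ...).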
@takaidohigasi

👍

@jhagberg

jhagberg commented Nov 8, 2017

Can you create this as a pull request? I just tested it and it works with my non-Amazon S3 endpoint!

@navinpai
Contributor

Hey,
A pull request doing just this was merged into master a couple of days back (#82). It should work with any endpoint using --endpoint-url. We're in the process of updating our test suite to use minio to ensure this works E2E, but you can play around with it in the meantime! (You'll have to build from the latest code until we update pip et al.)

Closing this issue for now, but let me know if you face any issues and I'll reopen it! :D
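
For example, pointing s4cmd at a non-AWS store looks roughly like this (the Wasabi endpoint is only an illustration; substitute your provider's URL, and supply credentials via S3_ACCESS_KEY/S3_SECRET_KEY or ~/.s3cfg as usual):

s4cmd --endpoint-url https://s3.wasabisys.com ls s3://your-bucket/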

@jamshid

jamshid commented Jun 27, 2018

Thanks for this feature! FYI to others trying to use s4cmd with a non-AWS S3 endpoint now...

The current s4cmd 2.0.1 release in pip does not support --endpoint-url yet.
You'll need to run from source:

$ git clone https://github.com/bloomreach/s4cmd.git && cd s4cmd && pip install pytz boto3 && python setup.py install
$ s4cmd --endpoint-url ${HTTPS_PROTOCOL}://${DOMAIN}:${S3_PORT} ls

s4cmd doesn't seem to use host_base / host_bucket from ~/.s3cfg; it requires that command-line arg even with these set:

host_bucket = ${DOMAIN}:${S3_PORT}
host_base = ${DOMAIN}:${S3_PORT}
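
Until host_base is honored, one workaround is a small shell alias so the endpoint doesn't have to be retyped every time (a sketch; the alias name and variables are just placeholders):

alias s4w='s4cmd --endpoint-url ${HTTPS_PROTOCOL}://${DOMAIN}:${S3_PORT}'
$ s4w ls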

@gadelkareem

It seems this is still a problem. I installed it with apt-get, though. Any update on this?
