Stuck in 'Uploading LFS objects' #1

Closed
lordcoppetti opened this issue Jun 7, 2023 · 4 comments

@lordcoppetti

I've been trying to use the tool recently. My environment:

  • macOS Ventura 13.4
  • git version 2.39.2 (Apple Git-143)
  • git-lfs/3.3.0 (GitHub; darwin arm64; go 1.19.3)
  • go1.20.4 darwin/arm64

I have an S3 bucket configured with programmatic access, which I tested through boto and Python 3; it works fine and I can upload to the bucket, so there are no networking issues either.
My environment variables are set as described.
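
In case it helps, a roughly equivalent access check with the AWS SDK for Go v2 (which I believe is what the tool uses) looks like the sketch below; I'm assuming the bucket name is passed via an S3_BUCKET variable.

package main

import (
	"context"
	"log"
	"os"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	// Credentials and region are picked up from the environment
	// (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION).
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatalf("loading AWS config: %v", err)
	}
	client := s3.NewFromConfig(cfg)

	// HeadBucket is a cheap way to confirm the credentials can reach the bucket.
	_, err = client.HeadBucket(context.TODO(), &s3.HeadBucketInput{
		Bucket: aws.String(os.Getenv("S3_BUCKET")), // assumed variable name
	})
	if err != nil {
		log.Fatalf("cannot access bucket: %v", err)
	}
	log.Println("bucket is reachable")
}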

However, when I run test.sh it gets stuck at:
Uploading LFS objects: 0% (0/2), 1.0 MB | 0 B/s

So my way of reproducing is simply running test.sh with an env file that sets those variables.

Do you know why this could be or where to dig into the code? Thanks!

@nicolas-graves
Owner

It's normal for it to hang for a couple of seconds; IIRC it needs some feedback from the bucket before it can report a transfer rate (B/s). I assume this is not what you're experiencing.

If you can run test.sh, can you provide the contents of input.log, output.log, and error.log (filter out any sensitive data), which should be generated when running it? That would help in understanding what's going on.

Also, which S3 provider do you use? Maybe they're running a more up-to-date API.

@nicolas-graves
Owner

nicolas-graves commented Jun 29, 2023

It's possible that your error was related to the usePathStyle option of the S3 client, especially if you use a different hosting provider. @jheiselman made it possible to change this option. In addition to my previous message, you can also try changing it.
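
For reference, enabling it on an aws-sdk-go-v2 client looks roughly like this (a sketch, not the exact code in this repository):

package main

import (
	"context"
	"log"

	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatal(err)
	}
	// Path-style requests address objects as https://endpoint/bucket/key
	// instead of https://bucket.endpoint/key; several non-AWS
	// S3-compatible providers only accept the former.
	_ = s3.NewFromConfig(cfg, func(o *s3.Options) {
		o.UsePathStyle = true
	})
	log.Println("client configured with path-style addressing")
}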

@phoenixlzx

phoenixlzx commented Jun 30, 2023

I just ran into a similar issue, and the error log may help:

Received upload request for f3ae196696594ede29453d12b42796e19a38912f65ab32b4723e1823325c7e58
Sent message {"event":"progress","oid":"f3ae196696594ede29453d12b42796e19a38912f65ab32b4723e1823325c7e58","bytesSoFar":1048576,"bytesSinceLast":1048576}
Error uploading file: operation error S3: PutObject, serialization failed: serialization failed: input member Bucket must not be empty

My envrc looks like this:

AWS_REGION="ap-northeast-2"
AWS_ACCESS_KEY_ID="[redacted]"
AWS_SECRET_ACCESS_KEY="[redacted]"
AWS_S3_ENDPOINT="s3.wasabisys.com"
S3_BUCKET="[redacted]"
S3_USEPATHSTYLE=1
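
The "input member Bucket must not be empty" part suggests the Bucket field handed to PutObject ends up as an empty string, i.e. something roughly like the following happens inside the agent (my guess at the logic, not the actual lfs-s3 code):

package main

import (
	"context"
	"log"
	"os"
	"strings"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatal(err)
	}
	client := s3.NewFromConfig(cfg)

	// If S3_BUCKET is not exported to the agent's process, this is "".
	bucket := os.Getenv("S3_BUCKET")

	_, err = client.PutObject(context.TODO(), &s3.PutObjectInput{
		Bucket: aws.String(bucket), // empty -> "input member Bucket must not be empty"
		Key:    aws.String("test-object"),
		Body:   strings.NewReader("test"),
	})
	if err != nil {
		log.Fatalf("upload failed: %v", err)
	}
}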

Update:

From further debugging I found that the environment variables are not correctly set; hard-coding the values in place of the os.Getenv calls resulted in a correct upload. However, I'm still getting an error when downloading the object:

Error downloading object: blob1.bin (9283ca9): Smudge error: Error downloading blob1.bin (9283ca917229220b973de47968c54c36086bbf34a6a7d3237cc3f1e8e71e5bda): error transferring "9283ca917229220b973de47968c54c36086bbf34a6a7d3237cc3f1e8e71e5bda": [0] remote missing object 9283ca917229220b973de47968c54c36086bbf34a6a7d3237cc3f1e8e71e5bda

Update 2:

Turns out the error on clone is expected and should be fixed manually afterwards with corrected LFS settings. I just filed a PR to address the env issue.

@nicolas-graves
Owner

Thanks for debugging this issue @phoenixlzx.

I'll assume the error indeed came from an ill-defined environment. I've added error handling for this case in the latest version of the package.
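
Roughly, the check is of this kind (a simplified sketch; the variable list follows the envrc above, and the actual code in the package may differ):

package main

import (
	"fmt"
	"log"
	"os"
	"strings"
)

// requireEnv returns an error naming every variable in keys that is unset
// or empty, so the agent can fail fast with a clear message instead of a
// confusing S3 serialization error later on.
func requireEnv(keys ...string) error {
	var missing []string
	for _, k := range keys {
		if os.Getenv(k) == "" {
			missing = append(missing, k)
		}
	}
	if len(missing) > 0 {
		return fmt.Errorf("missing required environment variables: %s", strings.Join(missing, ", "))
	}
	return nil
}

func main() {
	if err := requireEnv("AWS_REGION", "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "S3_BUCKET"); err != nil {
		log.Fatal(err)
	}
}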

If that was not your issue @lordcoppetti, feel free to reopen an issue with more details.
