Hi,
I am using this uploader on our local cluster, and I initially tested it out on one file interactively and it worked perfectly fine.
Since I have multiple files, I created a bash script to run it in a for loop, and now I am getting the error below (even when I try to run it interactively now). What did I do wrong, and is there a way to remedy this?
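For reference, the loop is essentially of this shape (a minimal sketch - the server URL, dataset DOI, API key, and jar version are placeholders, and -server/-did/-key are the usual DVUploader options):

```bash
# Placeholder values - substitute the real server, dataset DOI, and API token.
SERVER="https://dataverse.example.edu"
DOI="doi:10.xxxx/XXXXXX"
KEY="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"

# One DVUploader call per fastq.gz file in the run directory.
for f in /n/data1/cores/bcbio/PIs/stuart_orkin/h3k9me3_hyeji/*.fastq.gz; do
    java -jar DVUploader-v1.1.0.jar -server="$SERVER" -did="$DOI" -key="$KEY" "$f"
done
```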
***Starting to Process Upload Requests:***
Error response when checking for existing item at B_pa_in_I2A_2L_R1.fastq.gz : Unauthorized
PROCESSING(F): B_pa_in_I2A_2L_R1.fastq.gz
Does not yet exist on server.
Error response when processing /n/data1/cores/bcbio/PIs/stuart_orkin/h3k9me3_hyeji/B_pa_in_I2A_2L_R1.fastq.gz : Forbidden
{"status":"ERROR","message":"The API key is invalid."}
Not uploaded due to error during processing: B_pa_in_I2A_2L_R1.fastq.gz
***Execution Complete.***
Thanks so much for any help!
Have you checked that the API key is still valid in Dataverse (no one has hit the Recreate or Revoke buttons)?
Another guess: is the API key for a user who has permissions on the dataset(s) you're accessing? (It's possible that the 'API key is invalid' message is being sent for this case instead of a clearer message - I'd have to check in the source.)
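One quick way to sanity-check the key itself is to call the Dataverse native API with it (a sketch, assuming a reasonably recent Dataverse; the server URL and token are placeholders):

```bash
# Returns the account associated with the token if the key is valid;
# an error response means the key is invalid or expired.
curl -H "X-Dataverse-key: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" \
     https://dataverse.example.edu/api/users/:me
```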
If that's not it, I'd suggest checking the DVUploader log files - they have more info than the console in some cases. I'm not aware of any bug like this, and I don't know why repeated use would cause a problem, but if it's repeatable, let me know and send any log info you have.
Lastly - one suggestion to avoid having to loop: DVUploader can upload all the files in a directory, so you can just call it on a dir 'data' and it will upload 'data/*'. If you have subdirs and add the -recurse flag, it will also go into them ('data/subdir1/file2' will be given the directoryPath 'subdir1' in Dataverse, so the relative paths are preserved). If you later add files to 'data', you can just run it again to pick up the new ones (same thing if you hit an error or stop uploading at some point - just restart to upload any files that hadn't yet been processed).
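For example, something along these lines (a sketch - the jar version, server URL, DOI, and key are placeholders; the flags are the same as for a single-file call, plus -recurse):

```bash
# A single call uploads everything under 'data'; -recurse descends into
# subdirectories and preserves their relative paths as directoryPath values.
java -jar DVUploader-v1.1.0.jar -recurse \
     -server="https://dataverse.example.edu" \
     -did="doi:10.xxxx/XXXXXX" \
     -key="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" \
     data
```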