102 changes: 96 additions & 6 deletions hadoop-ozone/dist/src/main/smoketest/s3/MultipartUpload.robot
@@ -48,17 +48,13 @@ Wait Til Date Past
${sleepSeconds} = Subtract Date From Date ${date} ${latestDate}
Run Keyword If ${sleepSeconds} > 0 Sleep ${sleepSeconds}

*** Variables ***
${ENDPOINT_URL} http://s3g:9878
${BUCKET} generated

*** Test Cases ***

Test Multipart Upload With Adjusted Length
[Arguments] ${BUCKET}
Perform Multipart Upload ${BUCKET} multipart/adjusted_length_${PREFIX} /tmp/part1 /tmp/part2
Verify Multipart Upload ${BUCKET} multipart/adjusted_length_${PREFIX} /tmp/part1 /tmp/part2

Test Multipart Upload
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli create-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey
${uploadID} = Execute and checkrc echo '${result}' | jq -r '.UploadId' 0
Should contain ${result} ${BUCKET}
@@ -87,6 +83,7 @@ Test Multipart Upload


Test Multipart Upload Complete
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli create-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey1 --metadata="custom-key1=custom-value1,custom-key2=custom-value2,gdprEnabled=true"
${uploadID} = Execute and checkrc echo '${result}' | jq -r '.UploadId' 0
Should contain ${result} ${BUCKET}
@@ -139,12 +136,14 @@ Test Multipart Upload Complete
Compare files /tmp/part2 /tmp/${PREFIX}-multipartKey1-part2.result

Test Multipart Upload with user defined metadata size larger than 2 KB
[Arguments] ${BUCKET}
${custom_metadata_value} = Execute printf 'v%.0s' {1..3000}
${result} = Execute AWSS3APICli and checkrc create-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/mpuWithLargeMetadata --metadata="custom-key1=${custom_metadata_value}" 255
Should contain ${result} MetadataTooLarge
Should not contain ${result} custom-key1: ${custom_metadata_value}

Test Multipart Upload Complete Entity too small
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli create-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey2
${uploadID} = Execute and checkrc echo '${result}' | jq -r '.UploadId' 0
Should contain ${result} ${BUCKET}
@@ -168,6 +167,7 @@ Test Multipart Upload Complete Entity too small


Test Multipart Upload Complete Invalid part errors and complete mpu with few parts
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli create-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey3
${uploadID} = Execute and checkrc echo '${result}' | jq -r '.UploadId' 0
Should contain ${result} ${BUCKET}
@@ -219,6 +219,7 @@ Test Multipart Upload Complete Invalid part errors and complete mpu with few par
Compare files /tmp/part3 /tmp/${PREFIX}-multipartKey3-part3.result

Test abort Multipart upload
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli create-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey4 --storage-class REDUCED_REDUNDANCY
${uploadID} = Execute and checkrc echo '${result}' | jq -r '.UploadId' 0
Should contain ${result} ${BUCKET}
@@ -228,15 +229,18 @@ Test abort Multipart upload
${result} = Execute AWSS3APICli and checkrc abort-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey4 --upload-id ${uploadID} 0

Test abort Multipart upload with invalid uploadId
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli and checkrc abort-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey5 --upload-id "random" 255

Upload part with Incorrect uploadID
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli create-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey
Execute echo "Multipart upload" > /tmp/testfile
${result} = Execute AWSS3APICli and checkrc upload-part --bucket ${BUCKET} --key ${PREFIX}/multipartKey --part-number 1 --body /tmp/testfile --upload-id "random" 255
Should contain ${result} NoSuchUpload

Test list parts
[Arguments] ${BUCKET}
#initiate multipart upload
${result} = Execute AWSS3APICli create-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey5
${uploadID} = Execute and checkrc echo '${result}' | jq -r '.UploadId' 0
@@ -279,13 +283,15 @@ Test list parts
${result} = Execute AWSS3APICli and checkrc abort-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/multipartKey5 --upload-id ${uploadID} 0

Test Multipart Upload with the simplified aws s3 cp API
[Arguments] ${BUCKET}
Create Random file 22
Execute AWSS3Cli cp /tmp/part1 s3://${BUCKET}/mpyawscli
Execute AWSS3Cli cp s3://${BUCKET}/mpyawscli /tmp/part1.result
Execute AWSS3Cli rm s3://${BUCKET}/mpyawscli
Compare files /tmp/part1 /tmp/part1.result

Test Multipart Upload Put With Copy
[Arguments] ${BUCKET}
Run Keyword Create Random file 5
${result} = Execute AWSS3APICli put-object --bucket ${BUCKET} --key ${PREFIX}/copytest/source --body /tmp/part1

@@ -308,6 +314,7 @@ Test Multipart Upload Put With Copy
Compare files /tmp/part1 /tmp/part-result

Test Multipart Upload Put With Copy and range
[Arguments] ${BUCKET}
Run Keyword Create Random file 10
${result} = Execute AWSS3APICli put-object --bucket ${BUCKET} --key ${PREFIX}/copyrange/source --body /tmp/part1

@@ -335,6 +342,7 @@ Test Multipart Upload Put With Copy and range
Compare files /tmp/part1 /tmp/part-result

Test Multipart Upload Put With Copy and range with IfModifiedSince
[Arguments] ${BUCKET}
Run Keyword Create Random file 10
${curDate} = Get Current Date
${beforeCreate} = Subtract Time From Date ${curDate} 1 day
@@ -388,6 +396,7 @@ Test Multipart Upload Put With Copy and range with IfModifiedSince
Compare files /tmp/part1 /tmp/part-result

Test Multipart Upload list
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli create-multipart-upload --bucket ${BUCKET} --key ${PREFIX}/listtest/key1
${uploadID1} = Execute and checkrc echo '${result}' | jq -r '.UploadId' 0
Should contain ${result} ${BUCKET}
@@ -406,3 +415,84 @@ Test Multipart Upload list

${count} = Execute and checkrc echo '${result}' | jq -r '.Uploads | length' 0
Should Be Equal ${count} 2

*** Variables ***
${ENDPOINT_URL} http://s3g:9878
${BUCKET} generated
${BUCKET1} generated
Contributor:
Could we improve the naming of variables here? BUCKET1 is very ambiguous; it could be changed to BUCKET_FSO or something similar.

Contributor:
+1
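
For illustration, a minimal sketch of how the variables block might read if the suggested name were adopted (${BUCKET_FSO} is the reviewer's proposal, not what this patch defines):

*** Variables ***
# Sketch only: ${BUCKET_FSO} follows the reviewer's naming suggestion.
${ENDPOINT_URL}    http://s3g:9878
${BUCKET}          generated
${BUCKET_FSO}      generated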


*** Test Cases ***
Test Multipart Upload With Adjusted Length with OBS
Test Multipart Upload With Adjusted Length ${BUCKET}
Test Multipart Upload With Adjusted Length with FSO
Test Multipart Upload With Adjusted Length ${BUCKET1}

Test Multipart Upload with OBS
Test Multipart Upload ${BUCKET}
Test Multipart Upload with FSO
Test Multipart Upload ${BUCKET1}

Test Multipart Upload Complete with OBS
Test Multipart Upload Complete ${BUCKET}
Test Multipart Upload Complete with FSO
Test Multipart Upload Complete ${BUCKET1}

Test Multipart Upload with user defined metadata size larger than 2 KB with OBS
Test Multipart Upload with user defined metadata size larger than 2 KB ${BUCKET}
Test Multipart Upload with user defined metadata size larger than 2 KB with FSO
Test Multipart Upload with user defined metadata size larger than 2 KB ${BUCKET1}

Test Multipart Upload Complete Entity too small with OBS
Test Multipart Upload Complete Entity too small ${BUCKET}
Test Multipart Upload Complete Entity too small with FSO
Test Multipart Upload Complete Entity too small ${BUCKET1}

Test Multipart Upload Complete Invalid part errors and complete mpu with few parts with OBS
Test Multipart Upload Complete Invalid part errors and complete mpu with few parts ${BUCKET}
Test Multipart Upload Complete Invalid part errors and complete mpu with few parts with FSO
Test Multipart Upload Complete Invalid part errors and complete mpu with few parts ${BUCKET1}

Test abort Multipart upload with OBS
Test abort Multipart upload ${BUCKET}
Test abort Multipart upload with FSO
Test abort Multipart upload ${BUCKET1}

Test abort Multipart upload with invalid uploadId with OBS
Test abort Multipart upload with invalid uploadId ${BUCKET}
Test abort Multipart upload with invalid uploadId with FSO
Test abort Multipart upload with invalid uploadId ${BUCKET1}

Upload part with Incorrect uploadID with OBS
Upload part with Incorrect uploadID ${BUCKET}
Upload part with Incorrect uploadID with FSO
Upload part with Incorrect uploadID ${BUCKET1}

Test list parts with OBS
Test list parts ${BUCKET}
Test list parts with FSO
Test list parts ${BUCKET1}

Test Multipart Upload with the simplified aws s3 cp API with OBS
Test Multipart Upload with the simplified aws s3 cp API ${BUCKET}
Test Multipart Upload with the simplified aws s3 cp API with FSO
Test Multipart Upload with the simplified aws s3 cp API ${BUCKET1}

Test Multipart Upload Put With Copy with OBS
Test Multipart Upload Put With Copy ${BUCKET}
Test Multipart Upload Put With Copy with FSO
Test Multipart Upload Put With Copy ${BUCKET1}

Test Multipart Upload Put With Copy and range with OBS
Test Multipart Upload Put With Copy and range ${BUCKET}
Test Multipart Upload Put With Copy and range with FSO
Test Multipart Upload Put With Copy and range ${BUCKET1}

Test Multipart Upload Put With Copy and range with IfModifiedSince with OBS
Test Multipart Upload Put With Copy and range with IfModifiedSince ${BUCKET}
Test Multipart Upload Put With Copy and range with IfModifiedSince with FSO
Test Multipart Upload Put With Copy and range with IfModifiedSince ${BUCKET1}

Test Multipart Upload list with OBS
Test Multipart Upload list ${BUCKET}
Test Multipart Upload list with FSO
Test Multipart Upload list ${BUCKET1}
17 changes: 15 additions & 2 deletions hadoop-ozone/dist/src/main/smoketest/s3/awss3.robot
@@ -25,10 +25,11 @@ Suite Setup Setup s3 tests
*** Variables ***
${ENDPOINT_URL} http://s3g:9878
${BUCKET} generated
${BUCKET1} generated
Contributor:
Same for this variable as well; all other files that use BUCKET1 can be changed to something more sensible.


*** Test Cases ***

*** Keywords ***
File upload and directory list
[Arguments] ${BUCKET}
Execute date > /tmp/testfile
${result} = Execute AWSS3Cli cp /tmp/testfile s3://${BUCKET}
Should contain ${result} upload
@@ -48,9 +49,21 @@ File upload and directory list
Should contain ${result} file

File upload with special chars
[Arguments] ${BUCKET}
Execute date > /tmp/testfile
${result} = Execute AWSS3Cli cp /tmp/testfile s3://${BUCKET}/specialchars/a+b
Should contain ${result} upload
${result} = Execute AWSS3Cli ls s3://${BUCKET}/specialchars/
Should not contain ${result} 'a b'
Should contain ${result} a+b

*** Test Cases ***
File upload and directory list with OBS
File upload and directory list ${BUCKET}
File upload and directory list with FSO
File upload and directory list ${BUCKET1}

File upload with special chars with OBS
File upload with special chars ${BUCKET}
File upload with special chars with FSO
File upload with special chars ${BUCKET1}
11 changes: 10 additions & 1 deletion hadoop-ozone/dist/src/main/smoketest/s3/boto3.robot
@@ -27,9 +27,18 @@ Suite Setup Setup s3 tests
*** Variables ***
${ENDPOINT_URL} http://s3g:9878
${BUCKET} generated
${BUCKET1} generated
${S3_SMOKETEST_DIR} /opt/hadoop/smoketest/s3

*** Test Cases ***
*** Keywords ***

Boto3 Client Test
[Arguments] ${BUCKET}
${result} = Execute python3 ${S3_SMOKETEST_DIR}/boto_client.py ${ENDPOINT_URL} ${BUCKET}

*** Test Cases ***

Boto3 Client Test with OBS
Boto3 Client Test ${BUCKET}
Boto3 Client Test with FSO
Boto3 Client Test ${BUCKET1}
11 changes: 9 additions & 2 deletions hadoop-ozone/dist/src/main/smoketest/s3/buckethead.robot
@@ -25,16 +25,23 @@ Suite Setup Setup s3 tests
*** Variables ***
${ENDPOINT_URL} http://s3g:9878
${BUCKET} generated
${BUCKET1} generated

*** Test Cases ***
*** Keywords ***

Head Bucket
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli head-bucket --bucket ${BUCKET}

*** Test Cases ***
Head Bucket with OBS
Head Bucket ${BUCKET}
Head Bucket with FSO
Head Bucket ${BUCKET1}

Head Bucket not existent
[tags] no-bucket-type
${randStr} = Generate Ozone String
${result} = Execute AWSS3APICli and checkrc head-bucket --bucket ozonenosuchbucketqqweqwe-${randStr} 255
Should contain ${result} 404
Should contain ${result} Not Found

17 changes: 16 additions & 1 deletion hadoop-ozone/dist/src/main/smoketest/s3/bucketlist.robot
@@ -25,21 +25,36 @@ Suite Setup Setup s3 tests
*** Variables ***
${ENDPOINT_URL} http://s3g:9878
${BUCKET} generated
${BUCKET1} generated

*** Test Cases ***
*** Keywords ***

List buckets
[Arguments] ${BUCKET}
${result} = Execute AWSS3APICli list-buckets | jq -r '.Buckets[].Name'
Should contain ${result} ${BUCKET}

Get bucket info with Ozone Shell to check the owner field
[Arguments] ${BUCKET}
Pass Execution If '${SECURITY_ENABLED}' == 'false' Skipping this check as security is not enabled
${result} = Execute ozone sh bucket info /s3v/${BUCKET} | jq -r '.owner'
Should Be Equal ${result} testuser
# In ozonesecure(-ha) docker-config, hadoop.security.auth_to_local is set
# in the way that getShortUserName() converts the accessId to "testuser".
# Also see "Setup dummy credentials for S3" in commonawslib.robot

*** Test Cases ***

List buckets with OBS
List buckets ${BUCKET}
List buckets with FSO
List buckets ${BUCKET1}

Get bucket info with Ozone Shell to check the owner field with OBS
Get bucket info with Ozone Shell to check the owner field ${BUCKET}
Get bucket info with Ozone Shell to check the owner field with FSO
Get bucket info with Ozone Shell to check the owner field ${BUCKET1}

List buckets with empty access id
[setup] Save AWS access key
Execute aws configure set aws_access_key_id ''
12 changes: 10 additions & 2 deletions hadoop-ozone/dist/src/main/smoketest/s3/commonawslib.robot
@@ -23,7 +23,9 @@ ${ENDPOINT_URL} http://s3g:9878
${OZONE_S3_HEADER_VERSION} v4
${OZONE_S3_SET_CREDENTIALS} true
${BUCKET} generated
${BUCKET1} generated
${BUCKET_LAYOUT} OBJECT_STORE
${BUCKET_LAYOUT1} FILE_SYSTEM_OPTIMIZED
Comment on lines +26 to +28
Contributor:
The naming of ${BUCKET_LAYOUT} and ${BUCKET_LAYOUT1} can also be changed to ${BUCKET_LAYOUT_OBS} and ${BUCKET_LAYOUT_FSO} respectively.

Contributor:
+1
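
A sketch of the layout variables under the proposed names (again, these are the reviewers' suggested names, not part of the patch as posted):

# Sketch only: renamed per the review comment above.
${BUCKET_LAYOUT_OBS}    OBJECT_STORE
${BUCKET_LAYOUT_FSO}    FILE_SYSTEM_OPTIMIZED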

${KEY_NAME} key1
${OZONE_S3_TESTS_SET_UP} ${FALSE}
${OZONE_AWS_ACCESS_KEY_ID} ${EMPTY}
@@ -141,7 +143,8 @@ Setup s3 tests
Run Keyword Generate random prefix
Run Keyword Install aws cli
Run Keyword if '${OZONE_S3_SET_CREDENTIALS}' == 'true' Setup v4 headers
Run Keyword if '${BUCKET}' == 'generated' Create generated bucket ${BUCKET_LAYOUT}
Run Keyword if '${BUCKET}' == 'generated' Create generated bucket with OBS ${BUCKET_LAYOUT}
Run Keyword if '${BUCKET1}' == 'generated' Create generated bucket with FSO ${BUCKET_LAYOUT1}
Comment on lines +146 to +147
Contributor:
Referring to the comment below, the BUCKET_LAYOUT variables wouldn't need to be passed here.
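
Under that suggestion, the two setup lines could reduce to something like the following sketch, which assumes the bucket-creation keywords select their layout internally:

# Sketch only: no layout argument passed; the keywords pick OBS/FSO themselves.
Run Keyword if    '${BUCKET}' == 'generated'     Create generated bucket with OBS
Run Keyword if    '${BUCKET1}' == 'generated'    Create generated bucket with FSO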

Run Keyword if '${BUCKET}' == 'link' Setup links for S3 tests
Run Keyword if '${BUCKET}' == 'encrypted' Create encrypted bucket
Run Keyword if '${BUCKET}' == 'erasure' Create EC bucket
@@ -154,11 +157,16 @@ Setup links for S3 tests
Execute ozone sh bucket create --layout ${BUCKET_LAYOUT} o3://${OM_SERVICE_ID}/legacy/source-bucket
Create link link

Create generated bucket
Create generated bucket with OBS
[Arguments] ${layout}=OBJECT_STORE
${BUCKET} = Create bucket with layout ${layout}
Set Global Variable ${BUCKET}

Create generated bucket with FSO
[Arguments] ${layout}=FILE_SYSTEM_OPTIMIZED
${BUCKET1} = Create bucket with layout ${layout}
Set Global Variable ${BUCKET1}

Comment on lines +160 to +169
Contributor:
As we have two different keywords for creating FSO and OBS buckets, there is no need to pass an argument to these keywords; [Arguments] ${layout}=FILE_SYSTEM_OPTIMIZED can be omitted.
Instead of passing ${layout} to Create bucket with layout, we can directly mention the layouts ${BUCKET_LAYOUT_OBS} and ${BUCKET_LAYOUT_FSO} respectively.
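
Combining both suggestions, the two keywords might be simplified roughly as follows (${BUCKET_LAYOUT_OBS}, ${BUCKET_LAYOUT_FSO}, and ${BUCKET_FSO} are the reviewers' proposed names, not what the patch defines):

Create generated bucket with OBS
    # Sketch only: layout referenced directly instead of being passed as an argument.
    ${BUCKET} =    Create bucket with layout    ${BUCKET_LAYOUT_OBS}
    Set Global Variable    ${BUCKET}

Create generated bucket with FSO
    ${BUCKET_FSO} =    Create bucket with layout    ${BUCKET_LAYOUT_FSO}
    Set Global Variable    ${BUCKET_FSO}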

Create encrypted bucket
Return From Keyword if '${SECURITY_ENABLED}' == 'false'
${exists} = Bucket Exists o3://${OM_SERVICE_ID}/s3v/encrypted
7 changes: 6 additions & 1 deletion hadoop-ozone/dist/src/main/smoketest/s3/freon.robot
@@ -25,6 +25,7 @@ Default Tags no-bucket-type
*** Variables ***
${ENDPOINT_URL} http://s3g:9878
${BUCKET} generated
${BUCKET1} generated

*** Keywords ***
# Export access key and secret to the environment
@@ -40,6 +41,10 @@ Freon S3BG
Should contain ${result} Successful executions: ${n}

*** Test Cases ***
Run Freon S3BG
Run Freon S3BG with OBS
[Setup] Setup aws credentials
Freon S3BG s3bg-${BUCKET}

Run Freon S3BG with FSO
[Setup] Setup aws credentials
Freon S3BG s3bg-${BUCKET1}