[feature][python] Support aliasing of API keys #6469

Conversation

@jirikuncar (Contributor) commented May 28, 2020

Adds support for API key aliasing as it's done in Go and Java:

PR checklist

  • Read the contribution guidelines.
  • If contributing template-only or documentation-only changes which will change sample output, build the project beforehand.
  • Run the shell script(s) under ./bin/ (or Windows batch scripts under .\bin\windows) to update Petstore samples related to your fix. This is important, as CI jobs will verify all generator outputs of your HEAD commit, and these must match the expectations made by your contribution. You only need to run ./bin/{LANG}-petstore.sh, ./bin/openapi3/{LANG}-petstore.sh if updating the code or mustache templates for a language ({LANG}) (e.g. php, ruby, python, etc).
  • File the PR against the correct branch: master, 4.3.x, 5.0.x. Default: master.
  • Copy the technical committee to review the pull request if your PR is targeting a particular programming language.

@taxpon (2017/07) @frol (2017/07) @mbohlool (2017/07) @cbornet (2017/09) @kenjones-cisco (2017/11) @tomplus (2018/10) @Jyhess (2019/01) @arun-nalla (2019/11) @spacether (2019/11)

@spacether (Contributor)

Can you please add a test that sets configuration values using a shared vendor extension key?
When we add features without verification tests, we have no way of confirming that they work now, and no automatic way of telling if and when they break in the future.
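
For illustration, a minimal sketch of the kind of test being requested, assuming the generated Configuration keeps API keys in its api_key dict and that the schemes share the hypothetical alias "secret" (illustrative only, not the PR's actual test):

    import unittest

    import petstore_api


    class TestApiKeyAliasing(unittest.TestCase):
        def test_value_set_once_under_shared_alias(self):
            config = petstore_api.Configuration()
            # Store the key once under the shared alias...
            config.api_key["secret"] = "TOKEN123"
            # ...and expect any scheme declaring x-auth-id-alias: secret
            # to resolve to the same value.
            self.assertEqual(config.get_api_key_with_prefix("secret"), "TOKEN123")


    if __name__ == "__main__":
        unittest.main()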

@jirikuncar (Author)

@spacether can you point me to an example test?

@jirikuncar requested a review from spacether on June 1, 2020 18:41
@spacether (Contributor) commented Jun 2, 2020

@spacether can you point me to an example test?

Sure, this file instantiates the Configuration class.

If you want to call API endpoints, you can add those tests to a file like:
https://github.com/OpenAPITools/openapi-generator/blob/master/samples/openapi3/client/petstore/python-experimental/test/test_pet_api.py
Looking in the docs here, you can see a code sample in get_pet_by_id that uses api_key auth.
How about adding a new endpoint that has two parameters with the variable-name collision you are trying to prevent?
Do you want to add code demonstrating that we can assign a value once using this #vendorExtensions.x-auth-id-alias value and have that single value applied to multiple keys? I get that you are changing the key name here, but I'm not sure how you want to use it.

With your examples:

components:
  securitySchemes:
    apiKeyAuth:
      type: apiKey
      name: api_key
      x-auth-id-alias: secret
      in: query
      description: API Key
    apiKeyAuthHeader:
      type: apiKey
      name: SECURITY-API-KEY
      x-auth-id-alias: secret
      in: header
      description: API Key

Where do the values {{name}} and {{keyParamName}} come from in your examples above?
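
To make the intent of the YAML above concrete: with both schemes aliased to "secret", a caller would set the key once and have it sent under both parameter names. A sketch under that assumption (the values and the request shapes shown are illustrative, not the PR's verified output):

    import petstore_api

    config = petstore_api.Configuration()
    # One assignment under the shared alias...
    config.api_key["secret"] = "MY-KEY"
    config.api_key_prefix["secret"] = "Token"  # optional prefix, also shared

    # ...would then be sent by both schemes despite their different names:
    #   ?api_key=Token%20MY-KEY           (query scheme "apiKeyAuth")
    #   SECURITY-API-KEY: Token MY-KEY    (header scheme "apiKeyAuthHeader")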

@jirikuncar (Author)

@spacether I have added tests. It should work together with prefixes too. I had to modify the lookup method.
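
For readers, a sketch of the lookup idea (an assumption for illustration, not the PR's exact diff): try the scheme's own identifier first, then fall back to the shared alias, resolving prefixes the same way.

    # Toy, self-contained version of the generated Configuration lookup.
    class Configuration:
        def __init__(self):
            self.api_key = {}         # identifier or alias -> key value
            self.api_key_prefix = {}  # identifier or alias -> prefix, e.g. "Token"

        def get_api_key_with_prefix(self, identifier, alias=None):
            # Prefer a key stored under the scheme's own id; otherwise fall
            # back to the shared alias from x-auth-id-alias.
            fallback = self.api_key.get(alias) if alias is not None else None
            key = self.api_key.get(identifier, fallback)
            if key is None:
                return None
            prefix = self.api_key_prefix.get(
                identifier,
                self.api_key_prefix.get(alias) if alias is not None else None,
            )
            return "%s %s" % (prefix, key) if prefix else key


    config = Configuration()
    config.api_key["secret"] = "MY-KEY"
    # Both schemes resolve through alias="secret" even though their ids differ:
    assert config.get_api_key_with_prefix("apiKeyAuth", alias="secret") == "MY-KEY"
    assert config.get_api_key_with_prefix("apiKeyAuthHeader", alias="secret") == "MY-KEY"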

@jirikuncar requested a review from spacether on June 3, 2020 10:03
@jirikuncar changed the title from "[python] Support aliasing of API keys" to "[feature][python] Support aliasing of API keys" on Jun 3, 2020
@jirikuncar (Author)

@spacether I have updated the spec to show different values for api key ID and name. I hope the test is clear now.

@@ -1159,11 +1162,13 @@ components:
         'read:pets': read your pets
     api_key:
       type: apiKey
-      name: api_key
+      name: X-Api-Key
@jimschubert (Member)

It may not be obvious, but some generators' samples have "live" tests which run pre-written tests against compiled code (see jersey2-java8). The tests aren't overwritten on generation (as a rule within DefaultGenerator).

This is documented in the description at the top of the file:

This spec is mainly for testing Petstore server and contains fake endpoints, models. Please do not use this for any other purpose.

If you update this api key, it would require changing those other "live tests".

In this case, I'd recommend creating a separate smaller spec or adding another auth type outside of api_key which allows you to test the aliasing. That way you wouldn't have the conflict in the test.

@spacether (Contributor) commented Jun 4, 2020

Jim, does this point apply to this cordoned-off spec file, which is used for python-experimental only?
Is there another tag where we could add this security scheme and use it there?
Should we use a new api tag?

@jimschubert (Member)

@spacether I don't know why a spec for python-only would be used by Java. My comment above is about changing a spec which is shared elsewhere and would require changes to hand-written Java tests.

I don't really follow your other questions. My recommendation was to add to the spec in a way that avoids breaking those Java tests, or to create a standalone spec which demonstrates the aliasing behavior. If it were me, I'd create a separate minimal spec that demonstrates the behavior in unit tests, then add the new alias parts to an existing spec to verify samples generation.

An issue with modifying existing specs that target Swagger Petstore (as the one modified here does) is that Swagger Petstore defines the header as api_key, so someone would no longer be able to use this spec to generate a valid client against that live API. If we instead have another spec which doesn't target a real system's design (see the pony, fruits, and minimal "ping" specs), we don't risk breaking existing examples.

Eventually, I think we need to better organize and document the test specs so we know which are "fakes", which are intended to be exemplar full APIs, and which are only meant to evaluate full feature support.

@jirikuncar (Author)

I will create a separate spec to test the API key aliasing feature and generate a new Python client with tests for it.

@spacether (Contributor) commented Jun 4, 2020

So the modules/openapi-generator/src/test/resources/3_0/python-experimental/petstore-with-fake-endpoints-models-for-testing-with-http-signature.yaml spec is used in 2 locations to generate samples:

  • bin/openapi3/java-petstore-jersey2-java8.sh
  • bin/openapi3/python-experimental-petstore.sh

So I agree with Jim's assessment that we need a separate spec for python-experimental in this case, because I do not see a PR adding support for API key aliasing in java-jersey2.
If this spec were only used by python-experimental, then we could just:

  • add new security schemes
  • use them in a non-live server api like the fake tag api

How about this fix?

  • saving off the existing modules/openapi-generator/src/test/resources/3_0/python-experimental/petstore-with-fake-endpoints-models-for-testing-with-http-signature.yaml as a resource that java-petstore-jersey2-java8.sh uses, maybe in a java-jersey2-java8 folder
  • adding your new changes to the existing python-experimental test file in the fake api, and adding the new security schemes so they are only used in that one endpoint on the fake api

That fix meets these needs:

  • it keeps feature isolation between these 2 generators
  • it has no impact on live server testing based on specs because the fake api is not live
  • it keeps it clear which sample our users should look at when they look for python-experimental v3.0 OpenAPI samples

@spacether (Contributor) commented Jun 4, 2020

@jimschubert does this proposed fix meet your needs or do you still prefer the small feature only spec?

@jirikuncar (Author)

@spacether @jimschubert the whole idea behind having a separate spec for each feature is to have a support matrix (with test status):

language \ extensions | x-auth-id-alias | ...
----------------------------------------------
python-experimental   | x               | ...
java/jersey2          | ?               | ...
go-experimental       | ?               | ...

x - tested
? - supported (not tested)

@jimschubert (Member)

@jirikuncar I completely agree. It seems like there might be some misunderstanding in this PR, so I'll reframe.

We have two needs for specs:

  • testing
  • samples

What you've done with the custom spec to make this testable is correct. We also follow the same practice elsewhere to generate feature-specific or option-specific samples.

It sounds like what Justin is aiming for is the "golden" example, which we really don't currently define or practice. By "golden", I mean a spec that represents all features and can be used across the mature generators. I feel like we're a little way from being able to do that consistently. My project (number 5) aims to get us there.

I don't see any issue with the approaches in this PR.

@spacether (Contributor)

@jirikuncar can you run bin/utils/ensure-up-to-date?
The PR looks good; it is only blocked by these CI errors:

Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)

	modified:   samples/client/petstore/python-asyncio/petstore_api/configuration.py
	modified:   samples/client/petstore/python-tornado/petstore_api/configuration.py
	modified:   samples/client/petstore/python/petstore_api/configuration.py
	modified:   samples/openapi3/client/extensions/x-auth-id-alias/python-experimental/.openapi-generator/FILES
	modified:   samples/openapi3/client/petstore/python-experimental/.openapi-generator/FILES
	modified:   samples/openapi3/client/petstore/python-experimental/README.md
	modified:   samples/openapi3/client/petstore/python-experimental/petstore_api/models/__init__.py
	modified:   samples/openapi3/client/petstore/python/petstore_api/configuration.py

Running ensure-up-to-date and committing the changes will allow CI to pass.

@jirikuncar (Author)

@spacether I have followed:

Run the shell script(s) under ./bin/ (or Windows batch scripts under .\bin\windows) to update Petstore samples related to your fix. This is important, as CI jobs will verify all generator outputs of your HEAD commit, and these must match the expectations made by your contribution. You only need to run ./bin/{LANG}-petstore.sh, ./bin/openapi3/{LANG}-petstore.sh if updating the code or mustache templates for a language ({LANG}) (e.g. php, ruby, python, etc).

@jirikuncar (Author) commented Jun 8, 2020

@jimschubert @spacether I have merged master into my feature branch and re-run the ./bin/*/python*.sh scripts.

@jirikuncar (Author)

@spacether can you please re-review?

@spacether (Contributor) left a comment

Looks good. Thank you for the PR!

@spacether merged commit 046ba7d into OpenAPITools:master on Jun 11, 2020
@jirikuncar deleted the jirikuncar/python-experimental/key-aliasing branch on June 12, 2020 09:17
@wing328 added this to the 5.0.0 milestone on Jul 3, 2020