[data-tables] Manual port of data-tables package to corev2#12548

Merged
xirzec merged 11 commits into Azure:master from xirzec:tablesCoreV2
Feb 1, 2021

Conversation

@xirzec
Member

@xirzec xirzec commented Nov 13, 2020

This PR is a quick and dirty modification of the generated code inside data-tables to work with core-https, core-client, and core-xml.

All tests pass. The changes to core-client have been pulled out into #12547.

@xirzec xirzec added Client This issue points to a problem in the data-plane of the library. Tables labels Nov 13, 2020
@xirzec xirzec self-assigned this Nov 13, 2020
@check-enforcer

This pull request is protected by Check Enforcer.

What is Check Enforcer?

Check Enforcer helps ensure all pull requests are covered by at least one check-run (typically an Azure Pipeline). When all check-runs associated with this pull request pass then Check Enforcer itself will pass.

Why am I getting this message?

You are getting this message because Check Enforcer did not detect any check-runs being associated with this pull request within five minutes. This may indicate that your pull request is not covered by any pipelines, and so Check Enforcer is correctly blocking the pull request from being merged.

What should I do now?

If the check-enforcer check-run is not passing and all other check-runs associated with this PR are passing (excluding license-cla) then you could try telling Check Enforcer to evaluate your pull request again. You can do this by adding a comment to this pull request as follows:
/check-enforcer evaluate
Typically evaluation only takes a few seconds. If you know that your pull request is not covered by a pipeline and this is expected, you can override Check Enforcer using the following command:
/check-enforcer override
Note that using the override command triggers alerts so that follow-up investigations can occur (PRs still need to be approved as normal).

What if I am onboarding a new service?

Often, new services do not have validation pipelines associated with them. To bootstrap pipelines for a new service, you can issue the following command as a pull request comment:
/azp run prepare-pipelines
This will run a pipeline that analyzes the source tree and creates the pipelines necessary to build and validate your pull request. Once the pipeline has been created you can trigger the pipeline using the following comment:
/azp run js - [service] - ci

Comment on lines 12 to 13
Member


hmm, why?

Member Author


This can be left in the codegen, I think I just took it out because the convenience clients always provide this info so it's technically not used:

https://github.com/Azure/azure-sdk-for-js/blob/ac4f023ffc601366c81bc4f3d3a057d210f94065/sdk/tables/data-tables/src/TableClient.ts#L139

Tactically, I think I omitted it because I don't have getDefaultUserAgentValue() exposed right now, though perhaps we should add that back.

Member


I like the changes here!

Member


I wonder, are these changes by hand or do we have support for core-https in the code generator?

Member Author


@joheredi is going to help with updating the generator. 😄

@xirzec
Member Author

xirzec commented Jan 26, 2021

Marking this PR as ready for review. The only remaining piece of work is that the recorder is currently choking on URL parts no longer being aggressively encoded. For example, before it would look like:

 .delete('/Tables(%27testTablenode%27)')

And now it ends up generating:

.delete('/Tables('testTablenode')')

which isn't valid JS. I think we should consider switching out the single quotes for backticks (`).
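To illustrate the parsing problem (a sketch, not the recorder's code): the single quotes inside the path terminate the generated string literal early, while a backtick template literal tolerates them:

```typescript
// Generated fixture line (invalid JS): .delete('/Tables('testTablenode')')
// The quote before "testTablenode" closes the string literal after "/Tables(",
// so the rest of the line fails to parse. Wrapping the path in backticks
// keeps it a single literal:
const path = `/Tables('testTablenode')`;
console.log(path); // /Tables('testTablenode')
```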

@xirzec xirzec changed the title from "[data-tables][DRAFT] Manual port of data-tables package to corev2" to "[data-tables] Manual port of data-tables package to corev2" Jan 26, 2021
@xirzec xirzec marked this pull request as ready for review January 26, 2021 23:40
@xirzec xirzec requested a review from chradek as a code owner January 26, 2021 23:40
ghost pushed a commit that referenced this pull request Jan 29, 2021
…ck (#13474)

## Problem
As part of moving `data-tables` from `core-http` to `core-https` in #12548, @xirzec observed that the generated recordings for the node tests were invalid if the URL path has single quotes.
Upon investigating, I could see that the request URL is the same at both the core-http and core-https levels.
The problem appears because of the switch from `node-fetch` (core-http) to the native `https` module (core-https).

`node-fetch` encodes the request before sending, which turns the single quotes into `%27`, and nock was able to save the fixtures with the encoded request nicely.
After migrating to `https` in core-https, there is a difference in behavior: the URL path gets no default encoding, so the single quotes remain as-is.
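To make the encoding difference concrete (a sketch of standard JS URL behavior, not of either core package): single quotes are legal in a URL path and are not percent-encoded by the built-in helpers, so the `%27` form only appears when a client encodes explicitly, as `node-fetch` apparently did:

```typescript
// encodeURIComponent leaves ' ( ) unescaped per the URI spec:
console.log(encodeURIComponent("'")); // '

// The WHATWG URL parser also leaves single quotes alone in the path:
console.log(new URL("https://example.com/Tables('node')").pathname);
// /Tables('node')

// An explicit replacement is needed to get the percent-encoded form:
console.log("/Tables('node')".replace(/'/g, "%27")); // /Tables(%27node%27)
```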

## Bug(in `nock`)
Given the above problem, nock does capture the request with single quotes in the URL path, but fails to save the fixture properly, leading to invalid recordings on our end.
Reference -  https://github.com/nock/nock/blob/dc04694dcd3186a396e48567bdfabbdad2761f6a/lib/recorder.js#L134

This line:
> lines.push(\`  .${methodName}('${path}', ${JSON.stringify(requestBody)})\`)

Single quotes can be present in the URL path, though that is unusual. When the URL path contains single quotes, the fixture generated by nock is invalid because this line doesn't account for them.
#### Examples below
- `.delete('/Tables('node')')` 
- `.get('/Tables('node')')` 
- `.post('/Tables('node')', {"TableName":"testTablenode"})`

## Fix
To avoid this problem, we replace the single quotes surrounding the URL path in the recording with backticks (`) whenever the URL path contains single quotes. This fixes the invalid recordings.
This is backward compatible as well.
This works for both core-http and core-https.
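A minimal sketch of that replacement (the function name and regex are illustrative, not the recorder's actual code):

```typescript
// Rewrites a bodiless nock fixture line so the path argument is wrapped in
// backticks instead of single quotes; single quotes inside the path survive.
function fixFixtureLine(line: string): string {
  // Greedily capture everything between the opening quote after "(" and the
  // last closing quote before ")", so inner quotes stay inside the capture.
  return line.replace(/\.(\w+)\('(.*)'\)/, ".$1(`$2`)");
}

console.log(fixFixtureLine(".delete('/Tables('node')')"));
// .delete(`/Tables('node')`)
```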

## To Do
- [x] Add workaround
- [x] Tests for the workaround
- [x] Create an issue in the "nock" repo 
    - [x] nock/nock#2136
- [x] Changelog
ghost pushed a commit that referenced this pull request Feb 1, 2021
…rted "workaround" (#13494)

#13474 added a workaround in the recorder but was soon reverted in #13488 because single quotes in the request body caused the workaround to corrupt the updated fixture.

For the case with request bodies, since the body may contain literally anything, it is not trivial to come up with a regex that splits the arguments. Any regex would bake in assumptions about the line saved by nock: the number of spaces between the arguments, the characters the payload starts with, and what the URL path may contain.

Given the problems, it might be much better if we get the fix from nock before the string/fixture is saved.

And since we have only seen the single quotes with DELETE requests, which carry no request body, I'm inclined to apply the workaround only to that case (requests without bodies).

I have left a TODO for the case with request bodies for now. 

Have tested this fix with the data-tables SDK, and I was able to record and playback all the `data-tables` tests. This would unblock #12548.
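To see why a general regex is fragile once a request body is involved (a hypothetical illustration, not recorder code): a naive "first quoted argument" extraction truncates the path at its own inner quote:

```typescript
// A recorded line with a body has two arguments, and both the path and the
// body may themselves contain quotes and parentheses:
const line = `.post('/Tables('node')', {"TableName":"testTablenode"})`;

// Naively taking the text between the first pair of single quotes stops at
// the quote that belongs to the path itself:
const naivePath = line.match(/\('([^']*)'/)?.[1];
console.log(naivePath); // /Tables(
```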
@xirzec
Member Author

xirzec commented Feb 1, 2021

The recorded tests should all be fixed now. Please review @deyaaeldeen

Member

@deyaaeldeen deyaaeldeen left a comment


Looks great!

sendRequest(request: WebResourceLike): Promise<HttpOperationResponse>;
signRequest(request: WebResourceLike): WebResource;
}
export function tablesSharedKeyCredentialPolicy(credential: TablesSharedKeyCredentialLike): PipelinePolicy;
Member


Should this have a `create` prefix? The current name is a bit vague.

Member Author


Hmm, we don't use `create` for any of the built-in policies today. I'm not opposed to changing this everywhere if we wanted to, but my hesitation is that it makes it feel like it's returning a class instance instead of just an object that satisfies `PipelinePolicy`.

@bterlson do you have any naming feelings about policy functions?
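For reference, the factory-function shape under discussion looks roughly like this (types heavily simplified stand-ins for the pipeline interfaces, and the credential shape is invented for the sketch):

```typescript
// Simplified stand-ins for the pipeline types.
interface PipelineRequest { url: string; }
interface PipelineResponse { status: number; }
type SendRequest = (req: PipelineRequest) => Promise<PipelineResponse>;
interface PipelinePolicy {
  name: string;
  sendRequest(req: PipelineRequest, next: SendRequest): Promise<PipelineResponse>;
}

// Hypothetical credential shape, for illustration only.
interface TablesSharedKeyCredentialLike { accountName: string; }

// The factory returns a plain object satisfying PipelinePolicy -- no class.
function tablesSharedKeyCredentialPolicy(
  credential: TablesSharedKeyCredentialLike
): PipelinePolicy {
  return {
    name: "tablesSharedKeyCredentialPolicy",
    async sendRequest(req, next) {
      // A real policy would sign `req` using `credential` before forwarding.
      return next(req);
    },
  };
}
```

A `create` prefix would arguably read more like a constructor; the plain-noun name matches the "just an object" framing above.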

{ table, options },
getAccessPolicyOperationSpec
) as Promise<TableGetAccessPolicyResponse>;
) as unknown) as Promise<TableGetAccessPolicyResponse>;
Member


I wonder why we cast to unknown first? Are the source and target types not compatible?

Member Author


oh this isn't needed anymore! I'll remove it. Thanks for pointing it out! 👍
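For context on the pattern itself: a double cast through `unknown` is TypeScript's escape hatch for conversions the checker considers incompatible; it has no runtime effect. A minimal sketch:

```typescript
const s: string = "42";
// const n = s as number;          // error TS2352: types insufficiently overlap
const n = s as unknown as number;  // compiles; casts are erased at runtime
console.log(typeof n); // string -- the value itself is untouched
```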

@xirzec xirzec merged commit 959a8c9 into Azure:master Feb 1, 2021
@xirzec xirzec deleted the tablesCoreV2 branch February 1, 2021 23:06