
Create ChargingModuleTokenCachePlugin to cache Charging Module JWT token #91

Merged
StuAA78 merged 14 commits into main from cache-charging-module-jwt-token on Jan 20, 2023

Conversation


@StuAA78 StuAA78 commented Jan 18, 2023

https://eaflood.atlassian.net/browse/WATER-3867

[We previously built our first link to the SROC Charging Module API](https://eaflood.atlassian.net/browse/WATER-3833). That included authenticating with the Module's AWS Cognito service to get a JWT, which then needs to be sent with all subsequent requests.

These tokens expire after an hour by default, at which point another one needs to be requested. Because we'll need to send thousands of requests to the Charging Module during a bill run, it's important we can reuse a token as much as possible and avoid the overhead of requesting new ones.

We do this by creating a new plugin, `ChargingModuleTokenCachePlugin`, which registers a server method to get the token. This takes advantage of the fact that hapi has built-in caching for server methods.

Our new `server.methods.getChargingModuleToken()` will return the cached token object if one exists, or retrieve a fresh one if it doesn't (either because one hasn't been fetched yet, or because the cached token has expired).

Although we don't expect the expiry time of the tokens to change from the current value of an hour, we still use the returned `expiresIn` value to determine when the cache should expire. However, we set it to be a minute less than the returned value, just to avoid the (admittedly unlikely!) case of a cached token being retrieved and then expiring before it can be used.
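To make that concrete, here's a minimal sketch of what the plugin could look like. The file path, the `ChargingModuleTokenService.go()` call and the token's `accessToken` property are assumptions for illustration, not the exact implementation; the key parts are that the server method is registered with `cache` options, and that it sets `flags.ttl` from the returned `expiresIn` (minus a minute), or to `0` so an unsuccessful response isn't cached.

```javascript
// app/plugins/charging-module-token-cache.plugin.js (illustrative path only)
const ChargingModuleTokenService = require('../services/charging-module-token.service.js')

const ONE_DAY_IN_MS = 24 * 60 * 60 * 1000
const TWO_SECONDS_IN_MS = 2 * 1000

const ChargingModuleTokenCachePlugin = {
  name: 'ChargingModuleTokenCache',
  register: (server, _options) => {
    server.method('getChargingModuleToken', async (flags) => {
      const token = await ChargingModuleTokenService.go()

      // Cache a good token for a minute less than its expiry (`expiresIn` is in
      // seconds, `ttl` is in milliseconds). A ttl of 0 means an unsuccessful
      // response isn't cached, so the next call will request a fresh token
      flags.ttl = token.accessToken ? (token.expiresIn - 60) * 1000 : 0

      return token
    }, {
      cache: {
        expiresIn: ONE_DAY_IN_MS, // fallback expiry; flags.ttl overrides it per result
        generateTimeout: TWO_SECONDS_IN_MS // hapi requires a generateTimeout when caching server methods
      }
    })
  }
}

module.exports = ChargingModuleTokenCachePlugin
```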

Note that we haven't yet updated `CreateBillRunService` to use this new server method; this will be done in another PR.
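For illustration, a future consumer would only need access to the hapi server instance; within a route handler, for example, a call might look like this hypothetical sketch (the handler and the `accessToken` property name are assumed):

```javascript
// Hypothetical route handler: the server (and therefore the cached method) is
// reachable via the request object
async function handler (request, _h) {
  const token = await request.server.methods.getChargingModuleToken()

  // `token.accessToken` (property name assumed) would be sent as the bearer
  // token on subsequent requests to the Charging Module API
  return { authenticated: Boolean(token.accessToken) }
}

module.exports = { handler }
```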

We may also want to consider changing the cache options to use Redis instead of in-memory caching, but again this can be done in a future PR.
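If we do go that way the change should be fairly contained: hapi lets us provision a named cache backed by `@hapi/catbox-redis` when the server is created, and the server method's cache options can then reference it by name. A rough sketch, with the cache name and connection details assumed:

```javascript
const Hapi = require('@hapi/hapi')
const CatboxRedis = require('@hapi/catbox-redis')

const server = Hapi.server({
  cache: [{
    name: 'chargingModuleTokenCache', // cache name assumed
    provider: {
      constructor: CatboxRedis,
      options: { host: '127.0.0.1', port: 6379 } // connection details assumed
    }
  }]
})

// The server method registration would then point at the named cache, e.g.
// { cache: { cache: 'chargingModuleTokenCache', expiresIn: ..., generateTimeout: ... } }
```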


In short, this change gives `water-abstraction-system` the ability to cache tokens and reuse them when communicating with the Charging Module API.
@StuAA78 StuAA78 added the enhancement (New feature or request) label on Jan 18, 2023
@StuAA78 StuAA78 self-assigned this Jan 18, 2023
We'll be using Hapi's inbuilt caching to store our JWT token. To make the cache easily accessible to controllers and services, we create a plugin that adds the cache to the request.
Hapi comes bundled with catbox for caching, which seems ideal, but at present we aren't fully au fait with what it can do and how to use it. So that we can start poking around with it, we create `/token/set` and `/token/get` routes and endpoints which we can hit manually to check we're on the right lines.
After further reading we come to the conclusion that the simplest way to cache a token is to create a server method. By providing cache config options when we define this method, it will automatically cache the result when called and serve the cached result until the expiry time has been reached.

We therefore refactor our code so that our `CachePlugin` is now the `ChargingModuleTokenCachePlugin`, which registers `server.methods.getChargingModuleToken()`. Currently this simply returns the current time, cached for 10 seconds. But it's enough to be sure that this approach to caching works!
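A sketch of that intermediate step (exact wiring assumed): the method body is a throwaway stand-in, but because it's registered with `cache` options hapi serves the same result for 10 seconds before regenerating it.

```javascript
const ChargingModuleTokenCachePlugin = {
  name: 'ChargingModuleTokenCache',
  register: (server, _options) => {
    server.method('getChargingModuleToken', () => {
      // Throwaway body: repeated calls within 10 seconds return the same
      // timestamp, proving the result really is being cached
      return new Date().toISOString()
    }, {
      cache: {
        expiresIn: 10 * 1000,
        generateTimeout: 2 * 1000 // required by hapi whenever a server method is cached
      }
    })
  }
}

module.exports = ChargingModuleTokenCachePlugin
```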
Currently our requested tokens expire in an hour. We don't expect this to change, but we don't want to bake that assumption in just in case! We therefore change the cache expiry time to default to 1 day, but add an override in the main method to manually set it to 10 seconds.
Now that we understand how the caching works, we swap out the previous dummy delay for an actual call to `ChargingModuleTokenService`.
We check the response from `ChargingModuleTokenService` and either set the cache expiry time based on the `expiresIn` value, or set the expiry time to 0 to avoid caching an unsuccessful response.

Note that we set the expiry time to be 1 minute less than the returned `expiresIn` value, to avoid cases where the token is retrieved from the cache but expires before the subsequent request is made.
We had added the `ttl` value to the plugin's response to help us figure out the caching. We remove it now that it's no longer needed.
@StuAA78 StuAA78 changed the title from "Cache and reuse Charging Module JWT token" to "Create ChargingModuleTokenCachePlugin to cache Charging Module JWT token" on Jan 19, 2023
@StuAA78 StuAA78 force-pushed the cache-charging-module-jwt-token branch from e0b9f6c to 539046e on January 19, 2023 at 15:53
@StuAA78 StuAA78 marked this pull request as ready for review on January 19, 2023 at 16:04
@Cruikshanks Cruikshanks (Member) left a comment

@Jozzey Jozzey (Contributor) left a comment
@StuAA78 StuAA78 merged commit 28e5959 into main Jan 20, 2023
@StuAA78 StuAA78 deleted the cache-charging-module-jwt-token branch January 20, 2023 10:33