
configurable serializer #11

Closed
whiskeyriver opened this issue Feb 21, 2020 · 5 comments · Fixed by #63
@whiskeyriver

I would like to be able to choose my serialization strategy when configuring caching. This would benefit both performance and cache sharing.

northernSage added the feature request label May 24, 2021
davidism changed the title from "Request: Configurable serializers" to "configurable serializer" Jun 25, 2021
@northernSage (Member)

Thanks for the suggestion 👍🏻

I'll get the ball rolling here by suggesting a decorator, something like cachelib.serializer, that could be used to mark a serialization function implemented by the user, defaulting to our current strategy when the decorator is not provided. At first glance this seems like a flexible way to address the request; a rough sketch follows below.
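
As a rough illustration only — nothing below is existing cachelib API, and the registry, decorator, and JsonSerializer are all made up — such a decorator could look something like:

```python
import json

_active = None  # with nothing registered, cachelib would keep its current pickle strategy


def serializer(cls):
    """Hypothetical cachelib.serializer: mark a user-implemented serde as the one to use."""
    global _active
    _active = cls()
    return cls


@serializer
class JsonSerializer:
    def dumps(self, value):
        return json.dumps(value).encode("utf-8")

    def loads(self, data):
        return json.loads(data.decode("utf-8"))
```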

@whiskeyriver Could you perhaps give an example of an alternate serialization strategy not currently supported that you would like to use? Just so the feature request is better contextualized.

@whiskeyriver (Author)

I think that's a good solution. A registration decorator for serde would be great for exotic strategies like encryption at rest, and equally useful for anything requiring upfront configuration.

The caveat, if it's global, is that it couldn't support multiple strategies. Say, for example, you wanted to store memoized results from a function using pickle, but also wanted to serve a request as JSON directly from the cache without having to unpickle the value just to dump it back to JSON for transmission (illustrated below). In that case a user would probably want multiple cachelib instances, each configured with the serde they choose.
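
A toy example of the round trip being described, using plain json/pickle with no cachelib involved:

```python
import json
import pickle

value = {"spam": [1, 2, 3]}

# Memoization wants pickle (arbitrary Python objects), but pickled bytes can't
# be sent to a client: they have to be loaded and re-dumped as JSON first.
pickled = pickle.dumps(value)
as_json = json.dumps(pickle.loads(pickled))

# A cache instance configured with a JSON serde could store bytes that are
# already transmittable, with no unpickling on the read path.
json_bytes = json.dumps(value).encode("utf-8")
assert json.loads(json_bytes) == value
```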

@northernSage (Member)

Thanks for elaborating. That's definitely an interesting consideration.

So far I have two approaches in mind:

  1. Making the decorator cachelib.BaseCache.serializer so that all cache types (base, file, memcached, redis, simple and uwsgi) have a class-scoped serializer registered by the user, since BaseCache is the common interface for all types. We would have to watch for different RedisCache instances altering values under the same key simultaneously, though (probably avoidable by using a different RedisCache.key_prefix for each instance).

  2. Allowing the user to bind a name to each registered serializer using the decorator @cachelib.serializer(<my_serializer_name>) and passing that name into cachelib methods where serialization/deserialization happens, for example cache.set("bacon", "spam", timeout=5, strategy=<my_serializer_name>). A rough sketch of this follows below.
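
A rough sketch of option 2, where the decorator, the registry, and the strategy= argument are all hypothetical, not existing cachelib API:

```python
import json
import pickle

_serializers = {"pickle": pickle}  # the current default strategy


def serializer(name):
    """Hypothetical cachelib.serializer(name): register a serde class under a name."""
    def register(cls):
        _serializers[name] = cls()
        return cls
    return register


@serializer("json")
class JsonSerde:
    def dumps(self, value):
        return json.dumps(value).encode("utf-8")

    def loads(self, data):
        return json.loads(data.decode("utf-8"))


# Call sites would then pick a strategy by name, roughly:
#   cache.set("bacon", "spam", timeout=5, strategy="json")
```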

I'm open to any other ideas people might have. Also, if anyone feels like giving this a go, just let me know. If nobody picks it up I'll implement it in the near future :bowtie:

@ThiefMaster

I think option 2 greatly overcomplicates things. I'd do something similar to Flask sessions: set the serializer as a class attribute, and let people subclass if they want to change it.
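
A minimal sketch of that pattern, assuming a serializer class attribute that cachelib does not yet have (JsonSerializer and JsonCache are made up for illustration):

```python
import json

from cachelib import SimpleCache


class JsonSerializer:
    """User-provided serde exposing the same dumps/loads interface as pickle."""

    @staticmethod
    def dumps(value):
        return json.dumps(value).encode("utf-8")

    @staticmethod
    def loads(data):
        return json.loads(data.decode("utf-8"))


class JsonCache(SimpleCache):
    # Overriding a single class attribute swaps the strategy, the same way
    # Flask lets you swap the session interface by subclassing.
    serializer = JsonSerializer()
```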

northernSage self-assigned this Aug 19, 2021
@northernSage (Member)

I like the idea; it sounds much simpler indeed. I'll go ahead and give this a go. Thanks for the tip 😉 @ThiefMaster

northernSage linked a pull request Nov 4, 2021 that will close this issue
github-actions bot locked as resolved and limited conversation to collaborators Nov 23, 2021