configurable serializer #11
Thanks for the suggestion 👍🏻 I'll get the ball rolling here by suggesting a registration decorator. @whiskeyriver Could you perhaps give an example of an alternate serialization strategy, not currently supported, that you would like to use? Just so the feature request is better contextualized.
I think that's a good solution. A registration decorator for serde would be great for exotic strategies like encryption at rest, and equally useful for anything requiring upfront configuration. The caveat, if it's global, is that it couldn't support multiple strategies. Say, for example, you wanted to store memoized results from a function using pickle, but you also wanted to serve a request as JSON directly from cache without having to unpickle it just to dump it back to JSON for transmission. In that case a user would probably want multiple cachelib instances, each configured with the serde they choose.
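A minimal sketch of that per-instance idea, assuming a hypothetical `serializer=` constructor parameter (not part of cachelib's actual API) and a toy in-memory backend:

```python
import json
import pickle


class PickleSerializer:
    dumps = staticmethod(pickle.dumps)
    loads = staticmethod(pickle.loads)


class JSONSerializer:
    @staticmethod
    def dumps(value):
        return json.dumps(value).encode("utf-8")

    @staticmethod
    def loads(data):
        return json.loads(data.decode("utf-8"))


class Cache:
    """Toy stand-in for a cachelib backend with per-instance serde."""

    def __init__(self, serializer):
        self._serializer = serializer
        self._store = {}

    def set(self, key, value):
        self._store[key] = self._serializer.dumps(value)

    def get(self, key):
        data = self._store.get(key)
        return None if data is None else self._serializer.loads(data)

    def get_raw(self, key):
        # Serve the stored bytes as-is, e.g. JSON straight to the wire
        # without a decode/re-encode round trip.
        return self._store.get(key)


memo_cache = Cache(PickleSerializer)    # memoized function results
response_cache = Cache(JSONSerializer)  # bytes served to clients directly
```

The `get_raw` path is what avoids the unpickle-then-re-dump step described above: the JSON-configured cache already holds wire-ready bytes.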
Thanks for elaborating. That's definitely an interesting consideration. So far I have two approaches in mind:
I'm open to any other ideas people might have. Also, if anyone feels like giving this a go, just let me know. If nobody picks it up I'll implement it in the near future.
I think 2 is greatly overcomplicating things. I'd do something similar to Flask sessions: set the serializer as a class attribute, and let people subclass if they want to change it.
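The class-attribute approach could be sketched as follows. The names here (`BaseCache`, `serializer`) are assumptions for illustration, not cachelib's actual classes; the pattern mirrors how Flask lets you swap its session machinery by overriding an attribute on the app:

```python
import json
import pickle


class BaseCache:
    """Sketch of the class-attribute approach discussed above."""

    # Default serde: any object exposing dumps()/loads(), so the
    # pickle module itself works out of the box.
    serializer = pickle

    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = self.serializer.dumps(value)

    def get(self, key):
        data = self._store.get(key)
        return None if data is None else self.serializer.loads(data)


class _JSONBytes:
    @staticmethod
    def dumps(value):
        return json.dumps(value).encode("utf-8")

    @staticmethod
    def loads(data):
        return json.loads(data.decode("utf-8"))


class JSONCache(BaseCache):
    # Subclassing and overriding the attribute is all it takes
    # to change strategy -- no registry or config plumbing needed.
    serializer = _JSONBytes
```

This also sidesteps the global-registry caveat raised earlier: each subclass (and hence each instance) carries its own strategy.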
I like the idea; it sounds much simpler indeed. I'll go ahead and give this a go. Thanks for the tip 😉 @ThiefMaster
I would like to be able to choose my serialization strategy when configuring caching. This would have benefits both in performance and in cache sharing.