Reactive types support for @Cacheable methods [SPR-14235] #17920
Comments
Hi, what is the status of this? If I understand well, there was a discussion about a
Nothing has been evolving in that area, I'm afraid, and given the lack of proprietary implementations, I think cache vendors themselves aren't keen to explore that route either. This leaves us with the blocking model, which isn't a great fit.

There is a cache operator in Reactor though, and I remember some initial work trying to harmonize this here. Perhaps @simonbasle can refresh my memory?
I made an attempt at an opinionated API towards caching in
Are there any updates on this?
No, the latest update is the one here. Most cache libraries (including JCache) are still on a blocking model, so my comment above still stands.
Redis has a reactive driver, so this could be implemented for Redis.
I made a new project for reactive caching with proper annotation usage, Spring Boot auto-configuration, and tests. It looks like it's working well and will soon be deployed to our production.
Folks, is there a technical reason? We can do something similar with a custom operator for the flow, but I really like the AOP interceptor approach - very clean, and it gives me access to the method's attributes in bulk.
Developers, myself included, have been waiting for Spring to support reactive types in the cache annotation implementation, also known as the Cache Abstraction. What is interesting in this story is that Micronaut, a relatively new framework compared to Spring, has supported reactive types in its cache annotation implementation since its 1.0.0 GA release back in October 2018. You can check it for yourself.

Not related to this topic, but another area where Micronaut has been more successful than Spring is supporting reactive types through its declarative HTTP client. Spring Cloud OpenFeign doesn't support reactive types, and the documentation explicitly says it won't until OpenFeign supports Project Reactor. Interestingly, Micronaut's declarative HTTP client supports any type that implements the

BTW, I'm a huge Spring Framework fan. I've been coding Spring-based applications since 2005 and for the last 5 years doing Spring Cloud microservices, but my dear Spring folks, in my honest opinion it is time to catch up!
@howardem yeah, Micronaut does have an

One thing to consider is that Micronaut's Cacheable interceptor needs to work with the common denominator of the cache providers it claims to support. As a result, caching the result of a reactive-returning method boils down to these steps:

One glaring limitation is that it is not "atomic". It is comparable to calling a

The thing is, caching atomically is more important in the case of an asynchronous method:

Note that Micronaut's own Cacheable interceptor doesn't even use that

So yeah, it has an async cache abstraction. It is best-effort and might hide subtle issues like cache stampedes, so I would take it with a grain of salt.
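The atomicity concern can be sketched with plain JDK types. This is a hedged illustration (the class and method names here are made up, not any framework's API): publishing the in-flight CompletableFuture atomically via computeIfAbsent means concurrent callers for the same key share a single load, whereas a non-atomic "get, then load, then put" would let both callers miss and both trigger the expensive load.

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative only: an async cache that maps keys to in-flight futures.
public class AtomicAsyncCacheDemo {
    static final AtomicInteger loads = new AtomicInteger();
    static final Map<String, CompletableFuture<String>> cache = new ConcurrentHashMap<>();

    // Simulates an expensive asynchronous load.
    static CompletableFuture<String> load(String key) {
        loads.incrementAndGet();
        return CompletableFuture.supplyAsync(() -> "value-" + key);
    }

    // Atomic: computeIfAbsent installs the future before the value exists,
    // so a second caller observes the same pending future instead of
    // starting a duplicate load (the "cache stampede" mentioned above).
    public static CompletableFuture<String> getAsync(String key) {
        return cache.computeIfAbsent(key, AtomicAsyncCacheDemo::load);
    }

    public static void main(String[] args) {
        CompletableFuture<String> f1 = getAsync("k");
        CompletableFuture<String> f2 = getAsync("k");
        System.out.println(f1 == f2);   // same shared future instance
        System.out.println(f1.join());
    }
}
```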
FWIW, I believe Quarkus implements its caches by requiring that they are asynchronous. Their interceptor checks if the return type is reactive and, if not, simply blocks indefinitely (unless a timeout is specified). Internally they use a Caffeine
Yeah, that's interesting. They initially (very early) thought about supporting multiple caching providers, but the reduction in scope (annotations only at first) and the focus on the 80% case led them to only use Caffeine as the underlying implementation 👍 They don't seem to use your

But yeah, they focused on an async API returning
I think that I suggested the putIfAbsent approach so that they could reuse the caller thread. There were many PRs as they iterated, and I advised from a point of ignorance (as I use Guice at work, no annotated caching, and it's all a weekend hobby). I think going async was an evolution of their internal APIs, but my knowledge is very fuzzy. I hope that other caching providers become async friendly to enable similar integrations, but I don't know the state of that support.
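The putIfAbsent idea mentioned here can be sketched as follows. This is an illustrative simplification (not Quarkus's or Caffeine's actual code): the winning caller installs an incomplete future and computes the value on its own thread, while concurrent callers for the same key just receive the shared future.

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Illustrative only: caller-thread loading via putIfAbsent of a pending future.
public class PutIfAbsentAsyncCache<K, V> {
    private final Map<K, CompletableFuture<V>> cache = new ConcurrentHashMap<>();

    public CompletableFuture<V> get(K key, Function<K, V> loader) {
        CompletableFuture<V> future = new CompletableFuture<>();
        CompletableFuture<V> existing = cache.putIfAbsent(key, future);
        if (existing != null) {
            return existing;   // someone else is (or was) loading this key
        }
        try {
            // We won the race: compute on the caller's thread and complete.
            future.complete(loader.apply(key));
        } catch (Throwable t) {
            cache.remove(key, future);   // don't keep a failed entry cached
            future.completeExceptionally(t);
        }
        return future;
    }
}
```

Callers waiting on the shared future never recompute, which is the property that makes the interceptor safe against duplicate loads.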
Hey @Bryksin, how has it been working for you? We also currently use a Redis-backed cache (via Redisson), but we have to implement it at the code level, along the lines of

which I don't like too much, as caching should be treated as a cross-cutting concern, easily disableable if needed. I like the API of your library (https://github.com/Bryksin/redis-reactive-cache), but my coworkers are hesitant given the apparent lack of community adoption :( It would also be interesting to learn @simonbasle's take on this approach 🙏
Hi @62mkv, unfortunately usage of the lib is very low, and I have no time to focus on it properly.

Time goes by, but such an important aspect as caching continues to be ignored by Spring for reactive stuff for some reason...
Hi @jaesuk-kim0808, I'm integrating your code, but the AOP advice is not invoked...

Can I put the method to be cached in a service implementation as a private method? For example:

The parameters aren't used in the code because I need to cache all elements. I need to change this code, but the pointcut is not triggered. Thanks for your great work!
Hi @kidhack83, the cached method has to be public and invoked from another bean - Spring AOP advice is proxy-based, so a private or self-invoked method is not intercepted. For example:

```java
@Component
public class ServiceA {
    // ...

    @AsyncCacheable(name = "getCurrencies")
    public Mono<List<RateDto>> getCurrencies(String source, String target) {
        return currencyRatesApi.getCurrencyRates()
            .collectList();
    }

    // ...
}

@Component
@RequiredArgsConstructor
public class ServiceB {
    private final ServiceA serviceA;

    // ...

    private Mono<List<RateDto>> someMethod(String source, String target) {
        return serviceA.getCurrencies(source, target);
    }

    // ...
}
```

I hope this helps.
I took the solutions in this thread and made a more fleshed out version that includes the annotation, the imports, and the dependencies: https://github.com/shaikezr/async-cacheable
While we still do not see

On the provider side, we got Caffeine's CompletableFuture-based

There are a few caveats:

In order to address the risk of cache stampede for expensive operations,

Thanks everyone for your comments in this thread! This served as very valuable inspiration.
One somewhat related area where the Flux reactive type is very useful is to support coalescing bulk loads. This allows for collecting multiple independent loads using Reactor's bufferTimeout(maxSize, maxTime) to perform a batch request. I don't believe the annotations support bulk operations yet, but it is a very nice mash-up. Here is a short (~30 LOC) example demonstrating this using Caffeine directly.

**CoalescingBulkLoader**

```java
import static java.util.Objects.requireNonNull;
import static java.util.stream.Collectors.toSet;

import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executor;
import java.util.function.Function;

import com.github.benmanes.caffeine.cache.AsyncCacheLoader;
import reactor.core.publisher.Sinks;
import reactor.core.scheduler.Schedulers;

public final class CoalescingBulkLoader<K, V> implements AsyncCacheLoader<K, V> {
  private final Function<Set<K>, Map<K, V>> mappingFunction;
  private final Sinks.Many<Request<K, V>> sink;

  public CoalescingBulkLoader(int maxSize, Duration maxTime,
      int parallelism, Function<Set<K>, Map<K, V>> mappingFunction) {
    this.mappingFunction = requireNonNull(mappingFunction);
    sink = Sinks.many().unicast().onBackpressureBuffer();
    sink.asFlux()
        .bufferTimeout(maxSize, maxTime)
        .parallel(parallelism)
        .runOn(Schedulers.boundedElastic())
        .subscribe(this::handle);
  }

  @Override public CompletableFuture<V> asyncLoad(K key, Executor executor) {
    var result = new CompletableFuture<V>();
    sink.tryEmitNext(new Request<>(key, result)).orThrow();
    return result;
  }

  private void handle(List<Request<K, V>> requests) {
    try {
      var results = mappingFunction.apply(requests.stream().map(Request::key).collect(toSet()));
      requests.forEach(request -> request.result.complete(results.get(request.key())));
    } catch (Throwable t) {
      requests.forEach(request -> request.result.completeExceptionally(t));
    }
  }

  private record Request<K, V>(K key, CompletableFuture<V> result) {}
}
```

**Sample test**

```java
@Test
public void coalesce() {
  AsyncLoadingCache<Integer, Integer> cache = Caffeine.newBuilder()
      .buildAsync(new CoalescingBulkLoader<>(
          /* maxSize */ 5, /* maxTime */ Duration.ofMillis(50), /* parallelism */ 5,
          keys -> keys.stream().collect(toMap(key -> key, key -> -key))));

  var results = new HashMap<Integer, CompletableFuture<Integer>>();
  for (int i = 0; i < 82; i++) {
    results.put(i, cache.get(i));
  }
  for (var entry : results.entrySet()) {
    assertThat(entry.getValue().join()).isEqualTo(-entry.getKey());
  }
}
```
Just to throw things in the mix... there are a couple of additional challenges that we sorta need to handle and that would still keep the annotation approach:
We are going to share code snippets when we're a bit more confident that it actually works 🙃

Oh, and one more thing: I see the implementations here casually using
The initial drop is in
I may be running into thread issues on

This is the function header and annotations:
@mschaeferjelli, you have to compile your code with the

This is documented in the 6.1 upgrade notes: https://github.com/spring-projects/spring-framework/wiki/Upgrading-to-Spring-Framework-6.x#parameter-name-retention
That worked, thanks! As a note, I had to upgrade the maven-compiler-plugin from 3.8.1 to 3.13.1, because it was not honoring the xml
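For reference, a minimal sketch of what such a Maven compiler configuration could look like. This is an assumption on my part (the linked upgrade notes cover parameter name retention, and `<parameters>` is the maven-compiler-plugin option that passes the corresponding flag to javac); the version shown is the one mentioned above:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.13.1</version>
  <configuration>
    <!-- retain method parameter names in the compiled bytecode -->
    <parameters>true</parameters>
  </configuration>
</plugin>
```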
Pablo Díaz-López opened SPR-14235 and commented
Currently, when using cache annotations on beans, the Observable is cached like any other type, so its value will not be cached.
I tried to use the following pattern to handle it:
In the happy path, as we cache the Observable values, it works pretty well, but if getById throws an exception, the Observable is cached with the exception, which isn't how it should work.

It would also be very nice to have support for Single.

If you give me some advice, I can try to do a PR to solve this.
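The failure mode the reporter describes can be reproduced and fixed with the JDK's CompletableFuture as well. A hedged sketch (names are illustrative, not Spring's API): naively caching the deferred result also caches a failure forever, so the entry is evicted when the future completes exceptionally, letting the next caller retry.

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Illustrative only: cache of async results that does not replay failures.
public class RetryOnErrorCache<K, V> {
    private final Map<K, CompletableFuture<V>> cache = new ConcurrentHashMap<>();

    public CompletableFuture<V> get(K key, Function<K, CompletableFuture<V>> loader) {
        CompletableFuture<V> future = cache.computeIfAbsent(key, loader);
        // On error, drop the cached entry (only if it is still this future),
        // so a later call re-invokes the loader instead of seeing the old error.
        future.whenComplete((value, error) -> {
            if (error != null) {
                cache.remove(key, future);
            }
        });
        return future;
    }
}
```

The same eviction-on-error idea applies to a cached Observable or Single: cache the materialized value, not a deferred computation that may replay its exception.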
Affects: 4.2.5
Sub-tasks:
Referenced from: pull request #1066
1 vote, 8 watchers