Parallel data queueing for OLIApi #1303
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@           Coverage Diff           @@
##             main    #1303      +/-   ##
==========================================
- Coverage   94.41%   93.85%   -0.56%
==========================================
  Files         370      370
  Lines       37753    37993     +240
==========================================
+ Hits        35645    35660      +15
- Misses       2108     2333     +225
View full report in Codecov by Sentry.
@adam-a-a @avdudchenko we need to discuss this and whether it is still a priority; if it is, we should identify when we want to finish it.
Closing this for now - should revisit once the OLI API data management issues are fixed.
Fixes/Resolves:
This adds the capability to run parallel requests to the OLI API via aiohttp.
Summary/Motivation:
The OLI API is adding burst capability to enable larger and faster data throughput. This PR enables users to leverage that capability through an aiohttp implementation that asynchronously queues requests to the OLI API.
To cleanly support this, the OLI API client is updated to process a list of requests in addition to single calls. Going forward, any survey call should generate a list of request dictionaries and send it to the OLI API client; this has been implemented in flash.
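For illustration, here is a minimal sketch of the asynchronous queueing pattern this relies on; the endpoint URL, payload format, and function names are assumptions for the example, not the actual OLIApi client interface:

import asyncio
import aiohttp

async def _post_request(session, url, payload):
    # Send one POST request and return the parsed JSON response.
    async with session.post(url, json=payload) as response:
        response.raise_for_status()
        return await response.json()

async def run_requests(url, payloads):
    # Queue all requests concurrently within a single aiohttp session.
    async with aiohttp.ClientSession() as session:
        tasks = [_post_request(session, url, payload) for payload in payloads]
        return await asyncio.gather(*tasks)

# Example: submit a list of request dictionaries in one burst.
# results = asyncio.run(run_requests("https://example.api/flash", request_list))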
Changes proposed in this PR:
- add aiohttp support
- add support for processing a list of requests by the client itself (see the sketch below)
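As a rough usage sketch of the dual single/list calling convention described above, building on the run_requests helper sketched earlier (the dispatcher name and client attributes are hypothetical, not the actual OLIApi client API):

import asyncio

def submit(client, requests):
    # Hypothetical dispatcher: a single request dict goes through the
    # existing synchronous path, while a list of request dicts is fanned
    # out concurrently via the aiohttp pattern sketched above.
    if isinstance(requests, dict):
        return client.call(requests)  # single call, unchanged behavior
    return asyncio.run(run_requests(client.url, requests))  # burst of parallel calls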
Legal Acknowledgement
By contributing to this software project, I agree to the following terms and conditions for my contribution: