
Parallel Processing Limits #3887

Closed
1 task
ninosamson opened this issue Nov 5, 2024 · 3 comments
Comments

Collaborator

ninosamson commented Nov 5, 2024

The code starts 32 parallel processes, each creating its own transaction. Once the TypeORM pool reaches its maximum (currently configured to 10), the remaining processes wait forever.

The next PROD release changed the default configuration and should already resolve this issue for the e-Cert.

  • Ensure the parallel processing no longer uses `os.cpus()` as the limit when no other limit is provided.
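Capping the worker count explicitly, rather than letting it default to `os.cpus().length`, is the kind of fix described above. A minimal sketch of the idea (this is a hypothetical helper, not the project's actual `processInParallel`; the names and signature here are assumptions):

```typescript
// Hypothetical worker-pool helper: at most `maxParallel` items are
// processed at once, regardless of how many CPUs the host reports.
async function limitedParallel<T, R>(
  items: T[],
  processor: (item: T) => Promise<R>,
  maxParallel: number,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  // Start only `maxParallel` workers; each pulls the next unprocessed
  // item as soon as it finishes its current one.
  const workers = Array.from(
    { length: Math.min(maxParallel, items.length) },
    async () => {
      while (next < items.length) {
        const index = next++;
        results[index] = await processor(items[index]);
      }
    },
  );
  await Promise.all(workers);
  return results;
}

// Demo: 32 "transactions" but never more than 2 in flight at once.
async function demo(): Promise<void> {
  let active = 0;
  let peak = 0;
  const doubled = await limitedParallel(
    Array.from({ length: 32 }, (_, i) => i),
    async (i) => {
      active += 1;
      peak = Math.max(peak, active);
      await new Promise((resolve) => setTimeout(resolve, 1));
      active -= 1;
      return i * 2;
    },
    2,
  );
  console.log(`peak concurrency: ${peak}, results: ${doubled.length}`);
  // prints "peak concurrency: 2, results: 32"
}
void demo();
```

With the cap set at or below the TypeORM pool size (10), no worker can end up blocked waiting for a connection that another sibling worker will never release.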
@Joshua-Lakusta
Collaborator

@astridSABC Testing note: this has a low potential to impact the bulk offering upload. Please test the bulk upload to confirm there is no impact.

@andrewsignori-aot andrewsignori-aot changed the title eCert Processing Parallel Processing Limits Nov 14, 2024
github-merge-queue bot pushed a commit that referenced this issue Nov 18, 2024
- Refactored the methods that relied on the number of OS CPUs as a parallel limit to use the existing `processInParallel` method.
- New options were added to `processInParallel` to preserve the existing features of the refactored methods.
- `currentRecord` allows progress reporting (added to satisfy the SFAS integration and now also used for federal restrictions).
- `partialResults` allows access to the last awaited batch of promises (added to satisfy the offering bulk upload).
- While reviewing, please try to distinguish existing code that was simply moved from new code.
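Based on the option names in the bullets above, the extended helper could look roughly like the following (a sketch under assumptions: the real signatures, batching strategy, and callback shapes in the repository may differ):

```typescript
// Hypothetical sketch of the extended helper; the option names come
// from the commit message, the signatures are assumptions.
interface ProcessInParallelOptions<R> {
  /** Max promises awaited at once (replaces the os.cpus() default). */
  maxParallelRequests?: number;
  /** Progress callback (e.g. for SFAS integration logging). */
  currentRecord?: (processed: number) => void;
  /** Receives the last awaited batch (e.g. for offering bulk upload). */
  partialResults?: (batch: R[]) => void;
}

async function processInParallel<T, R>(
  items: T[],
  processor: (item: T) => Promise<R>,
  options: ProcessInParallelOptions<R> = {},
): Promise<R[]> {
  const batchSize = options.maxParallelRequests ?? 1;
  const results: R[] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    // Await one bounded batch at a time so callers can observe
    // progress and partial results between batches.
    const batch = await Promise.all(
      items.slice(i, i + batchSize).map((item) => processor(item)),
    );
    results.push(...batch);
    options.currentRecord?.(results.length);
    options.partialResults?.(batch);
  }
  return results;
}
```

Awaiting one bounded batch at a time is what makes a `partialResults`-style callback possible: the caller sees each completed batch before the next one starts.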

### Federal Restriction Benchmark

The number of parallel promises allowed was reduced from 4 to 2. Even though a bulk size of 750 was faster in local tests, it was kept at 500 to avoid overloading the DB.

- 2 parallel promises, 500 bulk-inserted records, insert time: 2:24.584 (m:ss.mmm)
- 4 parallel promises, 500 bulk-inserted records, insert time: 2:25.388 (m:ss.mmm)
- 2 parallel promises, 250 bulk-inserted records, insert time: 4:24.722 (m:ss.mmm)
- 2 parallel promises, 750 bulk-inserted records, insert time: 1:51.329 (m:ss.mmm)

### Bulk Offering Upload
- Tested importing 1000 new offerings; they were imported almost instantaneously.
@andrewsignori-aot
Collaborator

@Joshua-Lakusta @astridSABC @CarlyCotton, below are the files impacted by this refactoring.

  • Offering Bulk Upload
  • Federal Restrictions File Import
  • SFAS to SIMS Bridge File Import

Collaborator

andrewsignori-aot commented Nov 18, 2024

Demo

Federal Restrictions

Logs now show the import progress, as shown below. The restrictions are inserted in batches of 500 records in parallel, usually resulting in 1000 records inserted per progress line.

[Screenshots: Federal Restrictions import progress logs]

SFAS Integration

[Screenshot: SFAS integration import progress logs]

Offerings Bulk Upload

Institution: SIMS_COLLE
Program Name: Bulk Offering Upload Parallel Refactor Demo
Program SABC: ZYZ9
Institution location code: SLSL

[Screenshot: offerings bulk upload demo result]
