
ci: Update op-e2e#5334

Merged
OptimismBot merged 2 commits into develop from jg/ci on Apr 5, 2023

Conversation

@trianglesphere
Contributor

@trianglesphere trianglesphere commented Apr 3, 2023

Description

This fixes Go-based CI in several ways:

  1. Set the test.parallel flag so it matches the number of CPUs actually requested.
  2. Add more calls to t.Parallel in op-e2e.
  3. Reduce the number of blocks in TestBigL2Txs to shorten the test time.

CI time for op-e2e drops from 16 min to 4:30 min.


@changeset-bot

changeset-bot bot commented Apr 3, 2023

⚠️ No Changeset found

Latest commit: fcd6c7f

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types

Click here to learn what changesets are, and how to add one.

Click here if you're a maintainer who wants to add a changeset to this PR

@netlify

netlify bot commented Apr 3, 2023

Deploy Preview for opstack-docs canceled.

Name Link
🔨 Latest commit fcd6c7f
🔍 Latest deploy log https://app.netlify.com/sites/opstack-docs/deploys/642de5efdfbe3f0008ee49d4

@trianglesphere trianglesphere force-pushed the jg/ci branch 4 times, most recently from 0ade21e to dad8c5a on April 4, 2023 17:28
This fixes Go-based CI in several ways
1. Set the test.parallel flag so it matches the number of CPUs actually requested.
2. Add more calls to t.Parallel in op-e2e.
3. Reduce the number of blocks in TestBigL2Txs to shorten the test time.
@codecov

codecov bot commented Apr 4, 2023

Codecov Report

Merging #5334 (fcd6c7f) into develop (18747f1) will increase coverage by 0.37%.
The diff coverage is n/a.

Additional details and impacted files

Impacted file tree graph

@@             Coverage Diff             @@
##           develop    #5334      +/-   ##
===========================================
+ Coverage    38.67%   39.04%   +0.37%     
===========================================
  Files          339      393      +54     
  Lines        25376    26206     +830     
  Branches       659      838     +179     
===========================================
+ Hits          9814    10233     +419     
- Misses       14818    15192     +374     
- Partials       744      781      +37     
Flag                      Coverage     Δ
bedrock-go-tests          35.45% <ø>   -0.03% ⬇️
common-ts-tests           26.82% <ø>   ø
contracts-bedrock-tests   51.20% <ø>   ?
contracts-tests           98.86% <ø>   ø
core-utils-tests          60.41% <ø>   ø
dtl-tests                 47.15% <ø>   ø
fault-detector-tests      33.88% <ø>   ø
sdk-tests                 38.86% <ø>   ø

Flags with carried forward coverage won't be shown. Click here to find out more.

see 57 files with indirect coverage changes

@trianglesphere trianglesphere marked this pull request as ready for review April 4, 2023 17:59
@trianglesphere trianglesphere requested review from a team as code owners April 4, 2023 17:59
@trianglesphere trianglesphere requested review from ajsutton and zhwrd April 4, 2023 17:59
Contributor

@protolambda protolambda left a comment


amazing

Contributor

@ajsutton ajsutton left a comment


LGTM. Nice work. I've been running all the e2e tests locally a fair bit recently and IntelliJ runs them in parallel which has been quite reliable. I'd be surprised if there isn't some intermittency that turns up so we'll need to keep an eye out and fix it when it does, but it shouldn't be frequent from what I've seen running locally.

@trianglesphere
Contributor Author

> LGTM. Nice work. I've been running all the e2e tests locally a fair bit recently and IntelliJ runs them in parallel which has been quite reliable. I'd be surprised if there isn't some intermittency that turns up so we'll need to keep an eye out and fix it when it does, but it shouldn't be frequent from what I've seen running locally.

So far, with the changes I've been making, it's been more reliable this time (though I will watch out for flakes). I think the big improvement is specifying the parallelism. We may have been hitting this CircleCI issue where the proc/cpu_count value doesn't appear to be set correctly: https://circleci.com/docs/configuration-reference/#resourceclass

@trianglesphere trianglesphere requested a review from mslipper April 5, 2023 15:42
@mergify
Contributor

mergify bot commented Apr 5, 2023

This PR has been added to the merge queue, and will be merged soon.

@mergify
Contributor

mergify bot commented Apr 5, 2023

This PR is next in line to be merged, and will be merged as soon as checks pass.
