
rpcclient: make sure batch requests are GCed #2105

Merged: 2 commits merged into btcsuite:master on Jan 23, 2024

Conversation

@yyforyongyu (Contributor) commented on Jan 19, 2024

This commit makes sure the batch requests are always GCed before sending back the responses for them. In particular,

  • removeRequest didn't remove the item from batchList, which is now fixed.
  • Send didn't remove the request from requestMap, which is now fixed by using removeRequest.

In addition, a Copy method is added to MsgBlock to fix another heap escape found in lnd's GetBlock.

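For reference, a minimal sketch of the shape of the fix: the names requestMap, batchList, removeRequest and Send come from the description above, while the types, fields and everything else here are simplified assumptions, not the merged btcd code.

```go
// Illustrative sketch only; see the lead-in above for which names are
// assumptions.
package rpcsketch

import "container/list"

// jsonRequest is a placeholder for the tracked request; the real type also
// carries the marshalled payload and a response channel.
type jsonRequest struct {
	id uint64
}

// Client holds the two structures mentioned in the PR description.
type Client struct {
	requestMap map[uint64]*list.Element
	batchList  *list.List
}

// removeRequest drops the tracked request from requestMap AND batchList, so
// once its response has been delivered nothing keeps it reachable and the
// garbage collector can reclaim it. Previously the element stayed in
// batchList, pinning every batched request in memory.
func (c *Client) removeRequest(id uint64) *jsonRequest {
	element, ok := c.requestMap[id]
	if !ok {
		return nil
	}
	delete(c.requestMap, id)

	return c.batchList.Remove(element).(*jsonRequest)
}
```

With removeRequest cleaning up both structures, the Send path described in the second bullet only needs to call it once per delivered response instead of deleting from requestMap directly.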
@coveralls commented on Jan 19, 2024

Pull Request Test Coverage Report for Build 7600694273

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage decreased (-0.03%) to 56.755%

Totals:
  • Change from base Build 7546934558: -0.03%
  • Covered lines: 29206
  • Relevant lines: 51460

💛 - Coveralls

@ellemouton (Contributor) left a comment


Really really great find!!! Basically LGTM - just one question

rpcclient/infrastructure.go (resolved comment)
Comment on lines +1765 to +1767
log.Errorf("Unable to marshal result: %v for req=%v",
err, request.id)

Contributor:

what's the rationale for changing the error handling here? do we expect to get an unmarshal error here frequently?

Contributor Author (@yyforyongyu):

Since it's a list of requests batched together, if one or more requests fail, it should not affect sending responses to the other requests. And nope, I think we should almost never get an unmarshal error.

Member:

Yeah, an unmarshal error here would mean that the full node sent over garbled JSON.
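For illustration, a minimal sketch of the per-request error handling being discussed: the batchedRequest type and the deliverBatch helper are made up for the example; only the log-and-continue behaviour reflects the change in the PR.

```go
// Illustrative sketch only; a marshal failure for one request is logged and
// skipped, so the remaining batched requests still get their responses.
package rpcsketch

import (
	"encoding/json"
	"log"
)

// batchedRequest is a stand-in for a tracked batch entry.
type batchedRequest struct {
	id       uint64
	result   interface{}
	response chan []byte
}

// deliverBatch marshals and delivers each response independently so that one
// bad result does not block the rest of the batch.
func deliverBatch(requests []*batchedRequest) {
	for _, req := range requests {
		payload, err := json.Marshal(req.result)
		if err != nil {
			log.Printf("Unable to marshal result: %v for req=%v",
				err, req.id)
			continue
		}
		req.response <- payload
	}
}
```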

rpcclient/infrastructure.go (resolved comment)
Comment on lines +1747 to +1749
// TODO(yy): need to double check to make sure there's no
// concurrent access to this batch list, otherwise we may miss
// some batched requests.
Contributor:

yeah - and I guess we also miss all the ones that were in the list that didn't necessarily cause the failure...

Contributor Author (@yyforyongyu):

yeah, this lib needs more love: more tests, etc.
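For illustration, a minimal sketch of one way to address the TODO above, assuming the batch list is shared between goroutines; the batchTracker type and its methods are hypothetical, not code from this PR.

```go
// Illustrative sketch only: every access to the batch list goes through one
// mutex, so a request queued concurrently with response delivery is either
// returned by drain or kept for the next round, never silently dropped.
package rpcsketch

import (
	"container/list"
	"sync"
)

type batchTracker struct {
	mtx       sync.Mutex
	batchList *list.List
}

func newBatchTracker() *batchTracker {
	return &batchTracker{batchList: list.New()}
}

// add queues a request under the lock.
func (b *batchTracker) add(req interface{}) {
	b.mtx.Lock()
	defer b.mtx.Unlock()
	b.batchList.PushBack(req)
}

// drain atomically removes and returns everything currently queued.
func (b *batchTracker) drain() []interface{} {
	b.mtx.Lock()
	defer b.mtx.Unlock()

	reqs := make([]interface{}, 0, b.batchList.Len())
	for e := b.batchList.Front(); e != nil; e = e.Next() {
		reqs = append(reqs, e.Value)
	}
	b.batchList.Init()
	return reqs
}
```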

rpcclient/infrastructure.go (resolved comment)
@ellemouton (Contributor) left a comment

lgtm!

@Roasbeef (Member) left a comment

LGTM ☘️

@Roasbeef merged commit 62e6af0 into btcsuite:master on Jan 23, 2024
3 checks passed
@yyforyongyu deleted the fix-batch-mem-leak branch on January 23, 2024 at 09:07