core, miner, trie: add metrics tracking state trie depth (#32388)
rjl493456442 merged 14 commits into ethereum:master
Conversation
Any particular reason for making this PR? We always construct a nodeset containing all the dirty nodes after each commit:

```go
// NodeSet contains a set of nodes collected during the commit operation.
// Each node is keyed by path. It's not thread-safe to use.
type NodeSet struct {
	Owner   common.Hash
	Leaves  []*leaf
	Nodes   map[string]*Node
	updates int // the count of updated and inserted nodes
	deletes int // the count of deleted nodes
}
```

If you need the statistics of accessed nodes, you can use the `NodeSet`.
The reason for making this PR is Guillaume's BloatNet project, which bloats state to observe the growth of the state trie. We needed to expose new metrics in Prometheus so that Pari could parse the data and hand it to a researcher.
```diff
 // Witness returns a set containing all trie nodes that have been accessed.
-func (t *StateTrie) Witness() map[string]struct{} {
+func (t *StateTrie) Witness() map[string][]byte {
```
I assume that Gary okay'd this. I wonder what the ram usage impact will be though. We'll have to check that... with metrics 😁
I believe Gary changed this so that everything is tracked in stats now, but I'd be curious to see how much it impacts memory.
Reduced memory for witness by only keeping statistics.
Co-authored-by: Guillaume Ballet <3272758+gballet@users.noreply.github.com>
Force-pushed from 584d1f0 to 94a0320
Filtering for leaf nodes was missing from #32388, which meant that even the root node was reported, which made little sense for the bloatnet data processing we want to do.