fix adaptive p sampler rewinding too far back #1359
Conversation
    ctx->record_samplers = false;
    ctx->rewind_samplers = false;
    // add stateful samplers here
Why did you remove the check for ctx->adapt_p_ctx being null?
It is now in llama_review_adaptive_p_impl(). I think of common_sampler_review() as a place to list stateful samplers that need reviewing, so the check felt a bit too preemptive. I can put the check back in here.
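To make the point concrete, here is a minimal sketch of moving the null check into the impl. The struct contents, the `bool` return type, and the function body are invented for illustration; only the names `llama_review_adaptive_p_impl`, `adapt_p_ctx`, `weighted_sum`, and `total_weight` come from this thread:

```cpp
#include <vector>

// Hypothetical stand-in for the adaptive-p sampler state; the real
// struct in the PR has more fields than shown here.
struct llama_adapt_p_context {
    std::vector<float> weighted_sum;
    std::vector<float> total_weight;
};

// Sketch: with the null check inside the impl, common_sampler_review()
// can call it for every stateful sampler without guarding each call site.
// Returns false when there is no adaptive-p state to review (return type
// is illustrative only).
bool llama_review_adaptive_p_impl(llama_adapt_p_context * adapt_p_ctx) {
    if (adapt_p_ctx == nullptr) {
        return false; // sampler not enabled, nothing to review
    }
    // ... actual review/rewind logic elided ...
    return true;
}
```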
    auto & weighted_sum = adapt_p_ctx->weighted_sum;
    auto & total_weight = adapt_p_ctx->total_weight;
    if (n_rewind < 0) {
        // clear history except most recent
What if `weighted_sum` or `total_weight` is empty (i.e., `size()` returns 0)?
My concept is that `erase` is kind of slow. I would write something like this (assuming `weighted_sum` and `total_weight` have the same size):
if (weighted_sum.size() > 1) {
    weighted_sum.front() = weighted_sum.back();
    total_weight.front() = total_weight.back();
    weighted_sum.resize(1);
    total_weight.resize(1);
}
> What if `weighted_sum` or `total_weight` is empty (i.e., `size()` returns 0)?
That really should not happen, but I added a check.
> My concept is that `erase` is kind of slow. I would write something like this (assuming `weighted_sum` and `total_weight` have the same size):
>
> if (weighted_sum.size() > 1) {
>     weighted_sum.front() = weighted_sum.back();
>     total_weight.front() = total_weight.back();
>     weighted_sum.resize(1);
>     total_weight.resize(1);
> }
Neat. Fixed.
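For readers following along, the combined result of both review points might look like this sketch. A free function over the two history vectors is assumed here purely for illustration; the real code operates on the sampler context:

```cpp
#include <cassert>
#include <vector>

// Sketch: clear the rewind history except the most recent entry.
// Handles the empty case raised above, and avoids erase()'s
// per-element shifting by moving the last entry to the front and
// truncating.
void clear_history_except_most_recent(std::vector<float> & weighted_sum,
                                      std::vector<float> & total_weight) {
    assert(weighted_sum.size() == total_weight.size());
    // size() == 0: nothing to keep; size() == 1: already in the
    // desired state. Only act when there is older history to drop.
    if (weighted_sum.size() > 1) {
        weighted_sum.front() = weighted_sum.back();
        total_weight.front() = total_weight.back();
        weighted_sum.resize(1);
        total_weight.resize(1);
    }
}
```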
…ry check in llama_review_adaptive_p_impl()
This reverts commit a903409.
#1287 does not account for partial rewinds. Fixing it with this PR.
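For context, a partial rewind could be handled along these lines. This is a sketch only, with an invented function name and signature; the diff above shows that a negative `n_rewind` means "clear history except most recent", and that convention is kept here:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch of a rewind that distinguishes full and partial
// cases: a negative n_rewind keeps only the newest history entry, while
// a non-negative n_rewind discards just the last n_rewind entries
// instead of resetting everything.
void rewind_history(std::vector<float> & weighted_sum,
                    std::vector<float> & total_weight,
                    int n_rewind) {
    if (n_rewind < 0) {
        // full rewind: clear history except most recent
        if (weighted_sum.size() > 1) {
            weighted_sum.front() = weighted_sum.back();
            total_weight.front() = total_weight.back();
            weighted_sum.resize(1);
            total_weight.resize(1);
        }
        return;
    }
    // partial rewind: drop only the last n_rewind entries
    const size_t keep = weighted_sum.size() > (size_t) n_rewind
                      ? weighted_sum.size() - (size_t) n_rewind
                      : 0;
    weighted_sum.resize(keep);
    total_weight.resize(keep);
}
```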