
Design suggestions of _IterationBuilder._best_loss from adanet.core.iteration #165

Open
linjing-lab opened this issue Dec 3, 2023 · 0 comments

Comments

@linjing-lab

I have designed some operators for model loss monitoring, like the contiguous fragment below:

https://github.com/linjing-lab/easy-pytorch/blob/9651774dcc4581104f914980baf2ebc05f96fd85/released_box/perming/_utils.py#L269-L281

This approach takes only a small proportion of CPU runtime; I don't want to burden the CPU with an extra pass after training just to detect the optimal model.
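For illustration only, here is a minimal sketch of the kind of monitoring I mean. The BestLossTracker name and its update hook are hypothetical (not part of perming or adanet); the idea is simply to keep a running minimum during training so that no separate scan over candidates is needed afterwards:

import math

class BestLossTracker:
    """Hypothetical running-minimum monitor, updated once per evaluation step."""

    def __init__(self):
        self.best_loss = math.inf
        self.best_index = None

    def update(self, candidate_index, loss):
        # Remember only the smallest loss seen so far and which candidate produced it.
        if loss < self.best_loss:
            self.best_loss = loss
            self.best_index = candidate_index

# During training, call tracker.update(i, current_loss) for each evaluated candidate;
# after training, tracker.best_index is available without re-scanning all candidates.

For comparison, the fragment below is taken from _IterationBuilder in adanet.core.iteration: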

best_candidate_index = self._best_candidate_index(
    candidates, best_ensemble_index_override)
best_predictions = self._best_predictions(candidates,
                                           best_candidate_index)
best_loss = self._best_loss(candidates, best_candidate_index, mode)

I don't think the underlying _best_loss function is quick-response code, because it selects the optimal model from all trained candidate ensembles in at least O(n) runtime, and only by loss evaluation. It is implemented on top of the tf API, but it doesn't take into account how users could reach the ideal combination, under whatever search strategy they are using, in as few trials as possible.

def _best_loss(self, candidates, best_candidate_index, mode):
  """Returns the best loss from a set of candidates.

  Args:
    candidates: List of `_Candidate` instances to compare.
    best_candidate_index: `Tensor` index of the best candidate in the list.
    mode: Defines whether this is training, evaluation or inference. Loss is
      always None during inference. See `ModeKeys`.

  Returns:
    Float `Tensor` of the best candidate's loss.
  """

  if mode == tf.estimator.ModeKeys.PREDICT:
    return None
  if len(candidates) == 1:
    return candidates[0].ensemble_spec.loss
  with tf_compat.v1.variable_scope("best_loss"):
    losses = [c.ensemble_spec.loss for c in candidates]
    loss = tf.slice(tf.stack(losses), [best_candidate_index], [1])
    return tf.reshape(loss, [])
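As a point of reference, the stack-and-slice pattern can be reproduced outside the graph. The sketch below uses eager TensorFlow with made-up candidate losses, and tf.argmin stands in for _best_candidate_index; it only illustrates the pattern and is not a proposed change to adanet:

import tensorflow as tf

# Hypothetical scalar losses for three trained candidate ensembles.
candidate_losses = [tf.constant(0.42), tf.constant(0.37), tf.constant(0.51)]

# Mirror _best_loss: stack the per-candidate losses, pick an index, and
# slice out a single scalar. output_type=tf.int32 keeps the slice
# arguments in a single dtype.
stacked = tf.stack(candidate_losses)
best_candidate_index = tf.argmin(stacked, output_type=tf.int32)
best_loss = tf.reshape(tf.slice(stacked, [best_candidate_index], [1]), [])

print(float(best_loss))  # 0.37

Because every candidate's loss is stacked before the slice, the work grows linearly with the number of candidate ensembles, which is the O(n) behaviour described above.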

@linjing-lab linjing-lab changed the title Design suggestions of _IterationBuilder._best_loss from adanet.core.iteration.py Design suggestions of _IterationBuilder._best_loss from adanet.core.iteration Dec 3, 2023