[Model] Use sigmoid for single-label classification #18313
vllm-bot merged 1 commit into vllm-project:main
Conversation
Could you explain why this is needed? I thought softmax is just a generalization of sigmoid, so they should be equivalent for the binary case.
@DarkLight1337 When the output has only a single class, applying softmax will always give you 1:

```python
from vllm import LLM

MODEL = "22quinn/Llama-3.2-1B-1Label-dummy"
PROMPTS = ["Hello my name is Robert", "ok I got it"]

model = LLM(MODEL, task="classify", enforce_eager=True, enable_prefix_caching=True)
outputs = model.classify(PROMPTS)
for output in outputs:
    print(output.outputs.probs)
```
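To see why, here is a minimal sketch in plain `torch` (illustrative only, not the vLLM code path): softmax normalizes over the class dimension, so for a single logit z it computes exp(z)/exp(z) = 1 regardless of z, while sigmoid(z) = 1/(1 + exp(-z)) still varies with z.

```python
# Illustrative sketch (plain torch, not vLLM internals): with a single
# logit per prompt, softmax collapses to 1.0 while sigmoid still gives
# a meaningful probability.
import torch

logits = torch.tensor([[2.5], [-1.0]])  # one logit per prompt (single label)

# softmax normalizes over the class dim; with one class, exp(z)/exp(z) == 1
print(torch.softmax(logits, dim=-1))  # tensor([[1.], [1.]])

# sigmoid maps each logit to an independent probability in (0, 1)
print(torch.sigmoid(logits))          # tensor([[0.9241], [0.2689]])
```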
DarkLight1337 left a comment:
Oh, I get it now. Thanks for the explanation!
Signed-off-by: 22quinn <33176974+22quinn@users.noreply.github.com>
Signed-off-by: Yuqi Zhang <yuqizhang@google.com>
In classification problems, the two labels of a binary task are often reduced to a single label because the parameterizations are equivalent (see the sketch below). For a single label, we should apply `sigmoid` instead of `softmax`.

Related to #18052
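The equivalence can be checked directly; this is an illustrative sketch in plain `torch` (not vLLM code): softmax over two logits [z0, z1] gives p(class 1) = exp(z1) / (exp(z0) + exp(z1)) = sigmoid(z1 - z0), so a two-logit binary head and a single-logit head describe the same distribution, and the single-logit form must go through sigmoid.

```python
# Illustrative check (plain torch): softmax over [z0, z1] and
# sigmoid(z1 - z0) give the same p(class 1).
import torch

z0, z1 = torch.tensor(0.3), torch.tensor(1.7)

p1_softmax = torch.softmax(torch.stack([z0, z1]), dim=0)[1]
p1_sigmoid = torch.sigmoid(z1 - z0)

assert torch.allclose(p1_softmax, p1_sigmoid)
print(p1_softmax.item(), p1_sigmoid.item())  # both ~0.8022
```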
cc @maxdebayser @DarkLight1337 @WoosukKwon @houseroad