Inconsistent ONNX Export with Differentiable Head #418
Comments
The output is the raw logits per class for each sample, and all you need to do then is apply a softmax/argmax to get the class labels. Is this intended?
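As a reference for anyone hitting this, here is a minimal sketch of that post-processing step, assuming the ONNX session returns a `(n_samples, n_classes)` logits array (the function name and example values are illustrative, not from the SetFit API):

```python
import numpy as np

def logits_to_labels(logits: np.ndarray):
    """Convert raw per-class logits into probabilities and predicted labels.

    `logits` is assumed to have shape (n_samples, n_classes), e.g. the
    array returned by an ONNX Runtime session run on the exported model.
    """
    # Numerically stable softmax: subtract the row-wise max before exp.
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=1, keepdims=True)
    # The predicted class label is the index of the largest probability.
    labels = probs.argmax(axis=1)
    return probs, labels

# Example: 2 samples, 3 classes.
probs, labels = logits_to_labels(np.array([[2.0, 0.5, -1.0],
                                           [0.1, 0.2, 3.0]]))
```

Each row of `probs` sums to 1, and `labels` holds the per-sample class indices.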
Hello! I'm aware that there are some issues with exporting to ONNX, sadly. Hopefully I'll have some time in the future to refactor the exporting into a more consistent approach. Thank you for raising this issue and for providing a Google Colab! It'll certainly be helpful.
Hey @tomaarsen, can I work on this?
After a long debugging session, I found the issue: the transformers library assumes that the third argument represents … I will open a PR to fix this.
Here is the Colab notebook I used to debug: https://colab.research.google.com/drive/19xE4WdxqGLLZOSanycYfUzbcAxFgpzuR?usp=sharing
Hello,
I am trying to fine-tune SetFit for a multi-class classification problem.
Everything is smooth until exporting to ONNX. The head is not exported correctly, so when loading the model with ONNX, the predictions are the outputs of some previous layer.
For example, I showcased this on a dummy dataset with 3 classes.
See https://colab.research.google.com/drive/19EESqbIDwD5FOI2Ufx22txFZ8qADq2xj?authuser=1#scrollTo=vl6AvQKqtU-9
This differs from the behaviour in the example notebook, where the LogisticRegression head is used.
Any directions would be appreciated; otherwise, I would be happy to contribute a PR if anyone can spot the issue.