BEATs model produces NaN when using mixed precision with pytorch lightning #1569
Open
Description
I do not know whether this is expected behavior, but I was unable to train BEATs with PyTorch Lightning in mixed precision: the extract_features method produced NaN values.
Is there a fix for this, or am I constrained to full float32 training?
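A common workaround for NaNs under automatic mixed precision is to run the numerically sensitive forward pass in full float32 by disabling autocast around that one call, while the rest of training stays in reduced precision. The sketch below is an assumption-laden illustration, not a confirmed fix: `DummyEncoder` is a hypothetical stand-in for the BEATs model (the real one comes from the unilm repository and exposes `extract_features`), and CPU autocast is used so the example runs without a GPU.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the BEATs encoder; the real model exposes
# an extract_features() method, which is all this sketch relies on.
class DummyEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(16, 8)

    def extract_features(self, x):
        return self.proj(x)

model = DummyEncoder()
x = torch.randn(2, 16)

# Outer context mimics the autocast region that Lightning's
# precision="16-mixed" (or "bf16-mixed") wraps around training_step.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    # Locally disable autocast and cast inputs to float32 so the
    # sensitive call runs entirely in full precision.
    with torch.autocast(device_type="cpu", enabled=False):
        feats = model.extract_features(x.float())

print(feats.dtype)  # the features come back in float32
```

Alternatively, switching Lightning's `precision` from `"16-mixed"` to `"bf16-mixed"` (on hardware that supports bfloat16) often avoids fp16 overflow-induced NaNs without code changes, since bfloat16 keeps float32's exponent range.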