easynlp.utils.losses¶
CrossEntropy Loss¶
Cross Entropy loss
| param input: | input tensor of predicted logits |
|---|---|
| param target: | target (ground-truth label) tensor |
| param weight: | sample-level weights for a weighted cross-entropy loss |
| param size_average: | size average |
| param ignore_index: | target value to ignore when computing the loss |
| param reduction: | reduction mode, 'mean' by default |
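A minimal usage sketch with hypothetical shapes, following standard PyTorch cross-entropy semantics; the exact signature of the EasyNLP wrapper may differ. Sample-level weighting is emulated explicitly here, since `torch.nn.functional.cross_entropy` natively supports only per-class weights.

```python
import torch
import torch.nn.functional as F

# Hypothetical batch: 4 samples, 3 classes.
logits = torch.randn(4, 3)               # input tensor (unnormalized scores)
labels = torch.tensor([0, 2, 1, -100])   # target tensor; -100 is the ignored index
sample_weights = torch.tensor([1.0, 0.5, 2.0, 1.0])

# Per-sample losses, then sample-level weighting and a 'mean'-style reduction.
per_sample = F.cross_entropy(logits, labels, ignore_index=-100, reduction="none")
mask = (labels != -100).float()
loss = (per_sample * sample_weights * mask).sum() / mask.sum()
print(loss.item())
```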
Vanilla KD Loss¶
Vanilla KD loss
| param s_logits: | student logits |
|---|---|
| param t_logits: | teacher logits |
| param alpha: | KD loss weight |
| param temperature: | distillation temperature |
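A sketch of the usual Hinton-style distillation term, assuming the loss is a temperature-scaled KL divergence between softened student and teacher distributions, weighted by alpha; the actual EasyNLP implementation may combine it with the task loss differently.

```python
import torch
import torch.nn.functional as F

def vanilla_kd_loss_sketch(s_logits, t_logits, alpha=0.5, temperature=2.0):
    # Soften both distributions, take the KL divergence, rescale by T^2,
    # and weight the distillation term by alpha.
    kd = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=-1),
        F.softmax(t_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * kd

student, teacher = torch.randn(4, 3), torch.randn(4, 3)
print(vanilla_kd_loss_sketch(student, teacher).item())
```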
MultiLabel Sigmoid CrossEntropy Loss¶
MultiLabel Sigmoid Cross Entropy loss
| param input: | input tensor of predicted logits |
|---|---|
| param target: | multi-hot target tensor |
| param weight: | sample-level weights for a weighted cross-entropy loss |
| param size_average: | size average |
| param ignore_index: | ignore index |
| param reduction: | reduction mode, 'mean' by default |
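A minimal sketch assuming the loss applies a per-label sigmoid followed by binary cross-entropy over multi-hot targets, which is the standard multi-label formulation; shapes and values are hypothetical.

```python
import torch
import torch.nn.functional as F

# Hypothetical multi-label batch: 4 samples, 3 labels, multi-hot targets.
logits = torch.randn(4, 3)                 # input tensor
targets = torch.tensor([[1., 0., 1.],
                        [0., 1., 0.],
                        [1., 1., 0.],
                        [0., 0., 1.]])     # multi-hot target tensor

# Sigmoid + binary cross-entropy per label, 'mean' reduction over all entries.
loss = F.binary_cross_entropy_with_logits(logits, targets, reduction="mean")
print(loss.item())
```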
Soft Input CrossEntropy Loss¶
Soft Input Cross Entropy loss
| param input: | input tensor of predicted logits |
|---|---|
| param target: | target tensor of soft labels (e.g., predicted probabilities from another model) |
| param weight: | sample-level weights for a weighted cross-entropy loss |
| param size_average: | size average |
| param ignore_index: | ignore index |
| param reduction: | reduction mode, 'mean' by default |
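A sketch of the common soft-label cross-entropy formulation, assuming the target is a probability distribution over classes rather than hard indices; the EasyNLP function may differ in signature.

```python
import torch
import torch.nn.functional as F

def soft_cross_entropy_sketch(logits, soft_targets):
    # -(p_target * log_softmax(logits)) summed over classes, averaged over the batch.
    log_probs = F.log_softmax(logits, dim=-1)
    return -(soft_targets * log_probs).sum(dim=-1).mean()

logits = torch.randn(4, 3)                            # input tensor
soft_targets = F.softmax(torch.randn(4, 3), dim=-1)   # soft target distribution
print(soft_cross_entropy_sketch(logits, soft_targets).item())
```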
Hinge Loss for Embeddings¶
Hinge loss for embeddings
| param emb1: | first embedding tensor |
|---|---|
| param emb2: | second embedding tensor |
| param margin: | margin (default 0.3) |
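One common margin formulation for paired embeddings, sketched under the assumption that row i of emb1 and row i of emb2 form a matched pair and the remaining rows act as in-batch negatives; this is an illustration, not EasyNLP's verified implementation.

```python
import torch
import torch.nn.functional as F

def embedding_hinge_loss_sketch(emb1, emb2, margin=0.3):
    # Cosine similarities between every emb1/emb2 pair in the batch.
    emb1 = F.normalize(emb1, dim=-1)
    emb2 = F.normalize(emb2, dim=-1)
    sims = emb1 @ emb2.t()
    pos = sims.diag().unsqueeze(1)        # matched-pair similarity per row
    hinge = F.relu(margin - pos + sims)   # margin violations against negatives
    hinge.fill_diagonal_(0.0)             # zero out the matched-pair positions
    return hinge.mean()

emb1 = torch.randn(4, 8)   # embedding tensor
emb2 = torch.randn(4, 8)   # embedding tensor
print(embedding_hinge_loss_sketch(emb1, emb2, margin=0.3).item())
```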