Cross Entropy Loss

class liqfit.losses.CrossEntropyLoss(multi_target=False, weight=None, ignore_index=-100, reduction='mean', label_smoothing=0.0)

Parameters:

  • multi_target (bool, optional): Whether the targets contain multiple labels per example (e.g. one label per token in a sequence), so that inputs and targets do not need to be reshaped manually before the loss is computed. (Defaults to False).

  • weight (torch.Tensor, optional): Manual rescaling weight given to each class. If given, it has to be a tensor of size C, the number of classes (see the sketch below). (Defaults to None).

  • ignore_index (int, optional): Target value that is ignored and does not contribute to the loss. (Defaults to -100).

  • reduction (str, optional): Reduction method to apply to the loss: "none", "mean", or "sum". (Defaults to "mean").

  • label_smoothing (float, optional): A float in [0.0, 1.0] specifying the amount of smoothing when computing the loss, where 0.0 means no smoothing. The targets become a mixture of the original ground truth and a uniform distribution, as described in Rethinking the Inception Architecture for Computer Vision. (Defaults to 0.0).
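
A minimal construction sketch using the optional arguments, assuming they are forwarded to PyTorch's cross_entropy as in the standard API (the class weights here are hypothetical values for a 3-class problem):

from liqfit.losses import CrossEntropyLoss
import torch

# Hypothetical per-class weights for a 3-class problem.
class_weights = torch.tensor([1.0, 2.0, 0.5])

ce_loss = CrossEntropyLoss(
    weight=class_weights,   # rescales each class's contribution to the loss
    ignore_index=-100,      # targets equal to -100 are skipped
    label_smoothing=0.1,    # mixes targets with a uniform distribution
)

logits = torch.randn(4, 3)              # (batch, num_classes)
labels = torch.tensor([0, 2, -100, 1])  # the third target is ignored
loss = ce_loss(logits, labels)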

Using CrossEntropyLoss

A simple wrapper over PyTorch's cross_entropy loss function that adds support for multi-target inputs.

from liqfit.losses import CrossEntropyLoss
import torch

x = torch.randn((1, 10, 20))      # logits: (batch, sequence, num_classes)
y = torch.randint(0, 2, (1, 10))  # targets: (batch, sequence)
ce_loss = CrossEntropyLoss(multi_target=True)
loss = ce_loss(x, y)  # no need to reshape inputs or targets manually
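
For comparison, the same computation in plain PyTorch requires flattening the batch and sequence dimensions yourself; this sketch illustrates the reshaping that, per the example above, the multi_target flag handles for you:

import torch.nn.functional as F

# Equivalent manual call: flatten logits to (batch * sequence, num_classes)
# and targets to (batch * sequence,) before computing the loss.
loss_manual = F.cross_entropy(x.view(-1, 20), y.view(-1))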
