Knowledgator Docs

Binary Cross Entropy

class liqfit.losses.BinaryCrossEntropyLoss

(multi_target=False, weight=None, reduction='mean')

Parameters:

  • multi_target (bool, optional): Whether the labels are multi-target or not.

  • weight (torch.Tensor, optional): Manual rescaling weight given to the loss of each batch element.

  • reduction (str, optional): Reduction method to apply to the loss: "mean", "sum", or "none". (Defaults to "mean").
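Since this class wraps PyTorch's binary_cross_entropy_with_logits, the weight and reduction parameters behave as they do there. The sketch below (plain PyTorch, not liqfit) illustrates that behavior: reduction="none" keeps per-element losses, "mean" averages them, and weight rescales each element's loss before reduction.

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 3)                      # raw model outputs
targets = torch.randint(0, 2, (4, 3)).float()   # binary targets

# reduction="none" returns one loss value per element;
# reduction="mean" averages all of them into a scalar.
per_elem = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
mean_loss = F.binary_cross_entropy_with_logits(logits, targets, reduction="mean")
assert per_elem.shape == (4, 3)
assert torch.isclose(per_elem.mean(), mean_loss)

# weight rescales each element's loss before the reduction is applied.
w = torch.tensor([1.0, 2.0, 0.5])
weighted = F.binary_cross_entropy_with_logits(
    logits, targets, weight=w, reduction="none"
)
assert torch.allclose(weighted, per_elem * w)
```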

Using BinaryCrossEntropyLoss

A simple wrapper over PyTorch's binary_cross_entropy_with_logits loss function that adds support for multi-target inputs.

from liqfit.losses import BinaryCrossEntropyLoss
import torch

x = torch.randn((1, 10, 20))      # logits: (batch, targets, classes)
y = torch.randint(0, 2, (1, 10))  # binary labels for each target
binary_loss = BinaryCrossEntropyLoss(multi_target=True)
loss = binary_loss(x, y)  # No need for reshaping.

