🧮 Comprehend-it

Overview

Comprehend-it is a powerful NLP model based on DeBERTaV3-base, trained extensively on natural language inference (NLI) and a variety of text classification datasets. It stands out for its zero-shot and few-shot text classification performance, surpassing Bart-large-mnli in quality while being significantly smaller.
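As a minimal usage sketch, the model can be loaded through the Hugging Face `transformers` zero-shot classification pipeline. The checkpoint id `knowledgator/comprehend_it-base` and the example text and labels below are assumptions for illustration:

```python
# Zero-shot text classification with the Hugging Face pipeline.
# Assumption: the checkpoint is published as "knowledgator/comprehend_it-base".
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="knowledgator/comprehend_it-base",
)

text = "The battery life on this phone is outstanding, but the camera is mediocre."
labels = ["electronics", "sports", "politics"]

result = classifier(text, candidate_labels=labels)
# result["labels"] is sorted by descending score; with the default
# single-label mode, result["scores"] sums to ~1 across the labels.
print(result["labels"][0])
```

No fine-tuning is needed here: the candidate labels can be any strings, and the model scores each one as an NLI hypothesis against the input text.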

Supported information extraction (IE) tasks

  • Text classification

  • Reranking of search results

  • Named-entity recognition

  • Relation extraction

  • Entity linking

  • Question-answering

Models

| Model | Base model | Size | Input capacity | Languages | Access |
| --- | --- | --- | --- | --- | --- |

Benchmarking

Below are F1 scores on several text classification datasets. None of the tested models were fine-tuned on these datasets; all were evaluated in a zero-shot setting.

| Model | IMDB | AG_NEWS | Emotions |
| --- | --- | --- | --- |
|  | 0.89 | 0.6887 | 0.3765 |
|  | 0.85 | 0.6455 | 0.5095 |
|  | 0.90 | 0.7982 | 0.5660 |
|  | 0.86 | 0.5636 | 0.5754 |

Usage instructions

For installation guidelines, please refer to the model's detailed page.

Examples

Besides text classification, the model can be used for many other information extraction tasks.

Question-answering

The model can be used for open question answering as well as reading-comprehension tasks, provided the task can be recast as multiple-choice Q&A.
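
A multiple-choice question can be posed by passing the answer options as candidate labels. This is a hedged sketch: the checkpoint id `knowledgator/comprehend_it-base`, the question, and the options are assumptions for illustration:

```python
# Multiple-choice question answering framed as zero-shot classification:
# each answer option becomes a candidate label scored against the question.
# Assumption: the checkpoint id "knowledgator/comprehend_it-base".
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="knowledgator/comprehend_it-base",
)

question = "Which planet in the Solar System is known as the Red Planet?"
options = ["Venus", "Mars", "Jupiter"]

result = classifier(question, candidate_labels=options)
print(result["labels"][0])  # the option the model scores as most entailed
```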

Named-entity classification and disambiguation

The model can be used to classify named entities or to disambiguate similar ones. It can also serve as a reranking component in an entity-linking system.
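
One way to classify an ambiguous entity mention is to phrase each candidate type as an NLI hypothesis via the pipeline's `hypothesis_template` parameter. The checkpoint id, sentence, and type set below are illustrative assumptions:

```python
# Classifying the type of an ambiguous named entity ("Lincoln") in context.
# Assumption: the checkpoint id "knowledgator/comprehend_it-base".
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="knowledgator/comprehend_it-base",
)

text = "Lincoln unveiled a new electric SUV at the Detroit auto show."
entity_types = ["person", "car brand", "city"]

# hypothesis_template turns each label into an NLI hypothesis
# about the entity mention, e.g. "In this text, Lincoln is a car brand."
result = classifier(
    text,
    candidate_labels=entity_types,
    hypothesis_template="In this text, Lincoln is a {}.",
)
print(result["labels"][0])
```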

Relation classification

By the same principle, the model can be used to classify relations expressed in a text.
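
Candidate relations between two entities can likewise be scored as NLI hypotheses. The checkpoint id, sentence, and relation set below are illustrative assumptions:

```python
# Relation classification: each candidate relation between two entities
# is rendered as an NLI hypothesis and scored against the text.
# Assumption: the checkpoint id "knowledgator/comprehend_it-base".
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="knowledgator/comprehend_it-base",
)

text = "Marie Curie was born in Warsaw and later moved to Paris."
relations = ["was born in", "died in", "is the capital of"]

result = classifier(
    text,
    candidate_labels=relations,
    hypothesis_template="Marie Curie {} Warsaw.",
)
print(result["labels"][0])
```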

Fine-tuning

We recommend fine-tuning models with our LiqFit framework, which lets you fine-tune efficiently with as few as about 8 training examples per label.
