Squeezebert
Rank: -
Average Model Cost: $0.0000
Number of Runs: 22,032
Models by this creator
squeezebert-mnli
The SqueezeBERT-MNLI model is a language model fine-tuned on the MNLI (Multi-Genre Natural Language Inference) task. It uses SqueezeBERT, an efficiency-oriented variant of the BERT architecture. The model performs sentence-pair classification, determining whether one sentence entails, contradicts, or is neutral with respect to another; a short usage sketch follows this entry.
$-/run
11.1K
Huggingface
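A minimal sketch of sentence-pair classification with this model, assuming the Hugging Face Transformers library and the hub ID squeezebert/squeezebert-mnli (adjust the ID if the checkpoint is hosted under a different name):

```python
# Sentence-pair (premise/hypothesis) classification with SqueezeBERT-MNLI.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "squeezebert/squeezebert-mnli"  # assumed Hugging Face hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Encode the sentence pair and score the three MNLI relations.
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

probs = logits.softmax(dim=-1).squeeze()
# Label names come from the checkpoint's own config, so print that mapping.
for idx, prob in enumerate(probs):
    print(model.config.id2label[idx], f"{prob:.3f}")
```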
squeezebert-uncased
squeezebert-uncased is a language model pretrained on a large corpus of text with a masked language modeling objective. It is based on the SqueezeBERT architecture and is "uncased", meaning that it does not differentiate between uppercase and lowercase letters. The model can be fine-tuned for specific tasks such as text classification or question answering, and the pretrained checkpoint can predict masked tokens in an input prompt; a short usage sketch follows this entry.
$-/run
10.5K
Huggingface
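A minimal sketch of masked-token prediction with the pretrained checkpoint, assuming the Hugging Face Transformers library and the hub ID squeezebert/squeezebert-uncased:

```python
# Fill in a masked token with the pretrained squeezebert-uncased checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="squeezebert/squeezebert-uncased")

# The tokenizer is uncased, so "Paris" and "paris" map to the same token.
for prediction in fill_mask("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```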
squeezebert-mnli-headless
$-/run
364
Huggingface