Optimum
Rank: -
Average Model Cost: $0.0000
Number of Runs: 140,477
Models by this creator
t5-small
T5-small is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, with each task cast into a text-to-text format. Trained with transfer learning, it can be applied to a variety of translation tasks; a minimal usage sketch follows this entry.
$-/run
70.1K
Huggingface
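Since this creator publishes ONNX exports, one natural way to run the model is through Optimum's ONNX Runtime integration. The sketch below is illustrative only: the "optimum/t5-small" repo id, the translation prompt, and the generation settings are assumptions, not details confirmed by this listing.

```python
# Minimal sketch, assuming the ONNX export lives at "optimum/t5-small"
# (hypothetical id; substitute the actual repo id before running).
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("optimum/t5-small")
model = ORTModelForSeq2SeqLM.from_pretrained("optimum/t5-small")

# T5 casts every task as text-to-text, so the task is named in the prompt prefix.
inputs = tokenizer("translate English to German: The house is wonderful.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```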
sbert-all-MiniLM-L6-with-pooler
$-/run
25.7K
Huggingface
distilbert-base-uncased-finetuned-sst-2-english
The distilbert-base-uncased-finetuned-sst-2-english model is a fine-tuned version of DistilBERT-base-uncased, trained on the SST-2 sentiment-analysis task, where it reaches 91.3% accuracy on the dev set. Fine-tuning used a learning rate of 1e-5, a batch size of 32, 600 warmup steps, a maximum sequence length of 128, and 3 epochs. Note that the model may produce biased predictions, particularly for underrepresented populations, so evaluate its performance and biases thoroughly for your specific use case. A minimal usage sketch follows this entry.
$-/run
22.2K
Huggingface
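A hedged example of running SST-2 sentiment classification through Optimum's ONNX Runtime classes, wired into a standard transformers pipeline. The "optimum/..." repo id is an assumption; use the actual id from the listing.

```python
# Minimal sketch of binary sentiment classification with the ONNX export;
# the repo id below is an assumed placeholder.
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification

model_id = "optimum/distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = ORTModelForSequenceClassification.from_pretrained(model_id)

# Optimum's ORT models plug directly into the transformers pipeline API.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("A thoughtful, beautifully shot film."))
# -> [{'label': 'POSITIVE', 'score': ...}]
```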
gpt2
GPT-2 is a language model pretrained on a large corpus of English text with a self-supervised objective: predicting the next word in a sentence. Given a prompt, it generates a continuation, and it can also be fine-tuned for specific downstream tasks. This release is an ONNX export that can be used with the transformers library; a minimal generation sketch follows this entry.
$-/run
21.0K
Huggingface
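Since the description says this is an ONNX release usable with the transformers library, a plausible loading path is Optimum's ORTModelForCausalLM. The "optimum/gpt2" repo id and the sampling settings below are assumptions for illustration.

```python
# Minimal sketch of prompt-based generation with the ONNX GPT-2 export;
# "optimum/gpt2" is an assumed repo id.
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("optimum/gpt2")
model = ORTModelForCausalLM.from_pretrained("optimum/gpt2")

# The model continues the prompt by repeatedly predicting the next token.
inputs = tokenizer("Once upon a time,", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```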
tiny_random_bert_neuron
$-/run
530
Huggingface
roberta-base-squad2
$-/run
431
Huggingface
bert-base-NER
$-/run
265
Huggingface
all-MiniLM-L6-v2
$-/run
132
Huggingface
distilbert-base-uncased-mnli
$-/run
93
Huggingface
segformer-b0-finetuned-ade-512-512
Platform did not provide a description for this model.
$-/run
90
Huggingface