Average Model Cost: $0.0000
Number of Runs: 12,166,769
Models by this creator
electra-base-discriminator
electra-base-discriminator is the discriminator component of the ELECTRA pre-training setup, trained with a self-supervised replaced-token-detection objective. Given a sequence of text, it predicts whether each token is an original token or one that was substituted by a generator. The pretrained discriminator can be fine-tuned and used as part of a larger system for tasks such as text classification and other natural language understanding tasks; a minimal usage sketch follows this entry.
$-/run
3.3M
Huggingface
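A minimal sketch of running the discriminator over a corrupted sentence, following the usual transformers API for ELECTRA; the "google/electra-base-discriminator" Hub id and the example sentence are assumptions here, not taken from the listing above.

```python
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

# Assumed Hugging Face Hub id for this creator's checkpoint.
model_id = "google/electra-base-discriminator"
discriminator = ElectraForPreTraining.from_pretrained(model_id)
tokenizer = ElectraTokenizerFast.from_pretrained(model_id)

# A sentence with one token deliberately replaced ("fake").
corrupted = "The quick brown fox fake over the lazy dog"
inputs = tokenizer(corrupted, return_tensors="pt")

with torch.no_grad():
    logits = discriminator(**inputs).logits

# Positive logits mean the discriminator thinks the token was replaced.
flags = (logits > 0).long().squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
print(list(zip(tokens, flags)))
```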
vit-base-patch16-224
$-/run
2.7M
Huggingface
flan-t5-large
FLAN-T5-Large is an instruction-tuned version of T5, fine-tuned on more than 1,000 additional tasks covering multiple languages, and it offers better performance than T5 at the same number of parameters. It can be used for a wide range of NLP tasks and is available through the transformers library; a minimal usage sketch follows this entry. According to the model card, it was trained on TPU v3 or TPU v4 pods using the T5X codebase with JAX, evaluation results across tasks and languages are reported in the research paper, and no specific out-of-scope uses or environmental-impact figures are given.
$-/run
1.1M
Huggingface
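A minimal generation sketch, assuming the "google/flan-t5-large" Hub id and a translation-style instruction prompt:

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_id = "google/flan-t5-large"  # assumed Hub id for this checkpoint
tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# FLAN-T5 is instruction-tuned, so tasks are phrased as natural-language prompts.
input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids

outputs = model.generate(input_ids, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```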
bert_uncased_L-2_H-128_A-2
$-/run
1.1M
Huggingface
flan-t5-xxl
flan-t5-xxl is a text-to-text generation model: given a prompt or input text, it generates a corresponding output text. It is the largest of the FLAN-T5 checkpoints and is intended to handle long inputs and produce high-quality responses; a minimal usage sketch follows this entry.
$-/run
804.0K
Huggingface
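A minimal text-to-text sketch in the same vein; the "google/flan-t5-xxl" Hub id and the question prompt are assumptions, and the roughly 11B-parameter checkpoint needs substantial memory to load.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "google/flan-t5-xxl"  # assumed Hub id; ~11B parameters, large download
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

prompt = "Answer the following question: who was the first person to walk on the Moon?"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

outputs = model.generate(input_ids, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```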
flan-t5-base
flan-t5-base is the base-sized FLAN-T5 checkpoint: an improved version of T5, fine-tuned on over 1,000 additional tasks with support for multiple languages, and evaluated on a wide range of tasks and languages. It is licensed under Apache 2.0 and available through the transformers library; a minimal usage sketch follows this entry. Training details, evaluation results, and environmental impact are documented in the model card.
$-/run
785.5K
Huggingface
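A minimal sketch using the high-level pipeline API rather than loading the model and tokenizer separately; the "google/flan-t5-base" Hub id and the summarization prompt are assumptions:

```python
from transformers import pipeline

# text2text-generation wraps tokenization, generation, and decoding in one call.
generator = pipeline("text2text-generation", model="google/flan-t5-base")

result = generator(
    "Summarize: The Eiffel Tower is 324 metres tall, about the same height "
    "as an 81-storey building, and was the tallest man-made structure in the "
    "world for 41 years.",
    max_new_tokens=40,
)
print(result[0]["generated_text"])
```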
vit-base-patch16-224-in21k
$-/run
548.5K
Huggingface
electra-small-discriminator
$-/run
482.5K
Huggingface
vit-base-patch16-384
$-/run
346.9K
Huggingface