Google

Rank:

Average Model Cost: $0.0000

Number of Runs: 12,166,769

Models by this creator

electra-base-discriminator

The electra-base-discriminator model is a Transformer encoder pretrained with the self-supervised ELECTRA objective: rather than predicting masked tokens, it learns to discriminate between original tokens and tokens substituted by a small generator network. The pretrained discriminator can be fine-tuned as part of a larger system for tasks such as text classification, token classification, and natural language understanding.
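A minimal sketch of querying the discriminator head, assuming the Hugging Face transformers and PyTorch packages and the google/electra-base-discriminator checkpoint identifier on the Hub; the example sentence and the substituted word "fake" are purely illustrative:

import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

# Load the pretrained discriminator and its tokenizer.
tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-base-discriminator")
model = ElectraForPreTraining.from_pretrained("google/electra-base-discriminator")

# "fake" stands in for a token a generator might have substituted.
sentence = "The quick brown fox fake over the lazy dog"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # one score per token

# Positive logits mean the discriminator judges the token to be replaced.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, score in zip(tokens, logits[0].tolist()):
    print(f"{token}\t{'replaced' if score > 0 else 'original'}")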

$-/run · 3.3M runs · Huggingface

flan-t5-large

FLAN-T5-Large is an instruction-tuned version of the T5 model: with the same number of parameters, it was fine-tuned on more than 1,000 additional tasks covering multiple languages and offers better performance than the original T5. It can be used for a wide range of NLP tasks and is available through the transformers library. The model card lists no specific out-of-scope uses; it was trained on TPU v3 or TPU v4 pods using the T5X codebase with JAX, evaluation results across tasks and languages are reported in the research paper, and the environmental impact is not specified.
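A minimal sketch of text-to-text generation with this checkpoint, assuming the transformers, PyTorch, and sentencepiece packages and the google/flan-t5-large checkpoint identifier; the prompt is only an example:

from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-large")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-large")

# FLAN-T5 follows instructions phrased as plain text prompts.
prompt = "Translate English to German: How old are you?"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Generate and decode the model's answer.
outputs = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))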

$-/run · 1.1M runs · Huggingface

flan-t5-base

The flan-t5-base model is the smaller FLAN-T5 checkpoint: an improved version of T5, fine-tuned on over 1,000 additional tasks with support for multiple languages. It can be used for a wide range of NLP tasks and has been evaluated across many tasks and languages. It is licensed under Apache 2.0 and is available through the transformers library; training details, evaluation results, and environmental impact are documented in the model card.
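The same text-to-text interface applies to this smaller checkpoint; a sketch using the transformers pipeline API, assuming the google/flan-t5-base identifier and an illustrative prompt:

from transformers import pipeline

# A text2text-generation pipeline bundles tokenizer and model loading.
generator = pipeline("text2text-generation", model="google/flan-t5-base")

result = generator("Summarize: FLAN-T5 is a T5 model fine-tuned on more than 1,000 instruction-style tasks.")
print(result[0]["generated_text"])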

$-/run · 785.5K runs · Huggingface
