It5
Rank:
Average Model Cost: $0.0000
Number of Runs: 27,057
Models by this creator
it5-base-news-summarization
The it5-base-news-summarization model is an Italian text summarization model released by Gabriele Sarti and Malvina Nissim as part of the IT5 project. It takes an Italian news article as input and generates a concise summary. The model is based on the IT5 Base (Italian T5) encoder-decoder architecture and has been fine-tuned on a large dataset of Italian news articles. It can be used to automate the summarization of news articles, saving time and effort for users who need a quick overview of a news story; a hedged usage sketch follows this model card.
$-/run
25.7K
Huggingface
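The sketch below illustrates the summarization workflow described above using the standard transformers summarization pipeline. The model id "gsarti/it5-base-news-summarization", the placeholder article text, and the generation limits are assumptions made for illustration, not details confirmed by this listing; check the model card on Hugging Face for the exact usage.

```python
# A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub
# as "gsarti/it5-base-news-summarization" (an assumption based on this listing) and
# that it works with the standard summarization pipeline.
from transformers import pipeline

summarizer = pipeline("summarization", model="gsarti/it5-base-news-summarization")

# Any Italian news article text; the string here is just a placeholder.
article = "Testo di un articolo di cronaca in italiano da riassumere ..."

# Generation limits are illustrative, not values recommended by the model authors.
result = summarizer(article, max_length=64, min_length=16, do_sample=False)
print(result[0]["summary_text"])
```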
it5-large-news-summarization
$-/run
377
Huggingface
it5-efficient-small-el32-news-summarization
Platform did not provide a description for this model.
$-/run
274
Huggingface
it5-base-question-answering
IT5 Base for Question Answering ⁉️ 🇮🇹. This repository contains the checkpoint for the IT5 Base model fine-tuned on extractive question answering on the SQuAD-IT corpus, as part of the experiments for the paper "IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation" by Gabriele Sarti and Malvina Nissim. A comprehensive overview of the other released materials is provided in the gsarti/it5 repository; refer to the paper for details on the reported scores and the evaluation approach. Model checkpoints are available for TensorFlow, PyTorch, and JAX, and can be used either directly with pipelines or loaded via autoclasses (a hedged sketch of both paths follows this model card). If you use this model in your research, please cite the paper.
$-/run
227
Huggingface
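The card above mentions two loading paths, pipelines and autoclasses, but the original code snippets were not reproduced in this listing. The sketch below shows both paths under stated assumptions: the model id "gsarti/it5-base-question-answering" and the input format (Italian context followed by the question in one string) are assumptions, so consult the model card for the exact expected format.

```python
# A minimal sketch of the two loading paths mentioned in the card: a text-to-text
# pipeline and explicit autoclasses. The model id and the query format are assumptions.
from transformers import pipeline, AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "gsarti/it5-base-question-answering"
query = "Contesto in italiano con l'informazione rilevante ... Qual è la risposta alla domanda?"

# Path 1: text2text-generation pipeline (the answer span is generated as text).
qa = pipeline("text2text-generation", model=model_id)
print(qa(query)[0]["generated_text"])

# Path 2: load tokenizer and model explicitly via autoclasses.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
inputs = tokenizer(query, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same pattern applies to the large checkpoint listed further below, swapping in its model id.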
it5-large-question-generation
$-/run
104
Huggingface
it5-large-question-answering
IT5 Large for Question Answering ⁉️ 🇮🇹. This repository contains the checkpoint for the IT5 Large model fine-tuned on extractive question answering on the SQuAD-IT corpus, as part of the experiments for the paper "IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation" by Gabriele Sarti and Malvina Nissim. A comprehensive overview of the other released materials is provided in the gsarti/it5 repository; refer to the paper for details on the reported scores and the evaluation approach. Model checkpoints are available for TensorFlow, PyTorch, and JAX, and can be used either directly with pipelines or loaded via autoclasses, in the same way as the base checkpoint shown above. If you use this model in your research, please cite the paper.
$-/run
91
Huggingface
it5-small-wiki-summarization
$-/run
72
Huggingface
mt5-small-news-summarization
$-/run
67
Huggingface
it5-large-wiki-summarization
$-/run
65
Huggingface
it5-base-wiki-summarization
$-/run
56
Huggingface