Adasnew
Rank:
Average Model Cost: $0.0000
Number of Runs: 8,560
Models by this creator
t5-small-xsum
t5-small-xsum is a fine-tuned version of the t5-small model on the XSum dataset. It achieves a loss of 2.3953 on the evaluation set. The model's description, intended uses and limitations, training and evaluation data, and training procedure are not fully documented. The training hyperparameters were: learning rate 5e-05, train batch size 1, eval batch size 1, seed 42, gradient accumulation steps 16 (for an effective train batch size of 16), the Adam optimizer with betas=(0.9, 0.999) and epsilon=1e-08, a linear learning-rate scheduler with 500 warmup steps, and one epoch of training. The model was trained with Transformers 4.18.0, PyTorch 1.10.0+cu111, Datasets 2.0.0, and Tokenizers 0.11.6.
$-/run · 8.6K runs · Huggingface
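The hyperparameters above imply an effective train batch size of 16 (per-device batch size 1 × 16 gradient accumulation steps) and a linear learning-rate schedule that warms up to 5e-05 over 500 steps and then decays linearly. A minimal sketch of that schedule in plain Python, with `total_steps` as a hypothetical value not stated in the card:

```python
def effective_batch_size(per_device: int, accumulation_steps: int) -> int:
    """Gradient accumulation multiplies the per-device batch size."""
    return per_device * accumulation_steps


def linear_warmup_lr(step: int, base_lr: float = 5e-05,
                     warmup_steps: int = 500,
                     total_steps: int = 10_000) -> float:
    """Linear warmup to base_lr, then linear decay toward zero.

    base_lr and warmup_steps match the card; total_steps is a
    hypothetical placeholder, since the card only states one epoch.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))


# Effective batch size from the card's settings: 1 * 16 = 16.
print(effective_batch_size(1, 16))
```

In the Hugging Face Trainer, the same behavior would come from `gradient_accumulation_steps=16` and `lr_scheduler_type="linear"` with `warmup_steps=500` in the training arguments.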
t5-small-finetuned-xsum · $-/run · 0 runs · Huggingface
pegasus-xsum-finetuned-xsum · $-/run · 0 runs · Huggingface
pegasus-xsum-abhijit · $-/run · 0 runs · Huggingface
pegasus-fine-tune · $-/run · 0 runs · Huggingface
t5-small-samsum · $-/run · 0 runs · Huggingface