Wvangils
Rank:
Average Model Cost: $0.0000
Number of Runs: 4,671
Models by this creator
BLOOM-350m-Beatles-Lyrics-finetuned-newlyrics
Platform did not provide a description for this model.
$-/run
4.6K
Huggingface
GPT-Medium-Beatles-Lyrics-finetuned-newlyrics
This model is a fine-tuned version of gpt2-medium on the Cmotions - Beatles lyrics dataset. It will complete an input prompt with Beatles-like text. The model card marks its description, intended uses & limitations, and training and evaluation data as "More information needed"; its training hyperparameters are not captured in this listing.
$-/run
31
Huggingface
NL_BERT_michelin_finetuned
$-/run
13
Huggingface
GPT2-Beatles-Lyrics-finetuned-newlyrics
Platform did not provide a description for this model.
$-/run
10
Huggingface
CTRL-Beatles-Lyrics-finetuned-newlyrics
This model is a fine-tuned version of sshleifer/tiny-ctrl on the Cmotions - Beatles lyrics dataset. It will complete an input prompt with Beatles-like text. The model card marks its description, intended uses & limitations, and training and evaluation data as "More information needed"; its training hyperparameters are not captured in this listing.
$-/run
10
Huggingface
GPT-Neo-125m-Beatles-Lyrics-finetuned-newlyrics
This model is a fine-tuned version of EleutherAI/gpt-neo-125M on the Cmotions - Beatles lyrics dataset. It will complete an input prompt with Beatles-like text. The model card marks its description, intended uses & limitations, and training and evaluation data as "More information needed"; its training hyperparameters are not captured in this listing.
$-/run
8
Huggingface
BLOOM-560m-Beatles-Lyrics-finetuned
This model is a fine-tuned version of bigscience/bloom-560m (the training dataset is not specified in the card). It achieves a loss of 3.3387 on the evaluation set. The model description, intended uses & limitations, and training and evaluation data sections are marked "More information needed".
The following hyperparameters were used during training:
learning_rate: 5e-05
train_batch_size: 1
eval_batch_size: 1
seed: 42
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: linear
lr_scheduler_warmup_steps: 100
num_epochs: 5
Framework versions: Transformers 4.21.1, Pytorch 1.12.1+cu113, Datasets 2.4.0, Tokenizers 0.12.1
$-/run
4
Huggingface
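The hyperparameter list from the BLOOM-560m-Beatles-Lyrics-finetuned card can be collected into a single Python mapping for quick reference. This is a sketch only: the key names mirror Hugging Face TrainingArguments conventions, which is an assumption about how the card's values map onto that API, not something stated in the card.

```python
# Hyperparameters from the BLOOM-560m-Beatles-Lyrics-finetuned card,
# collected into one dict. Key names follow Hugging Face
# TrainingArguments conventions (an assumption, not from the card).
training_config = {
    "learning_rate": 5e-05,
    "per_device_train_batch_size": 1,
    "per_device_eval_batch_size": 1,
    "seed": 42,
    "adam_beta1": 0.9,        # Adam betas=(0.9, 0.999)
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "warmup_steps": 100,      # lr_scheduler_warmup_steps: 100
    "num_train_epochs": 5,
}

print(training_config["learning_rate"])  # 5e-05
```

A dict like this could be splatted into `transformers.TrainingArguments(**training_config, output_dir=...)` if reproducing the run, though the exact Trainer setup used by the author is not recorded in the listing.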
DistilGPT2-Beatles-Lyrics-finetuned-newlyrics
Platform did not provide a description for this model.
$-/run
2
Huggingface
DistilGPT2-Beatles-Lyrics-finetuned
$-/run
2
Huggingface