Valurank
Average Model Cost: $0.0000
Number of Runs: 2,298
Models by this creator
MiniLM-L6-Keyword-Extraction
Cost: $-/run | Runs: 305 | Platform: Hugging Face
distilroberta-hatespeech
Cost: $-/run | Runs: 242 | Platform: Hugging Face
distilroberta-offensive
This model is a fine-tuned version of distilroberta-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4526
- Acc: 0.8975

Model description, intended uses & limitations, and training and evaluation data: more information needed.

Training hyperparameters:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 12345
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 16
- num_epochs: 20
- mixed_precision_training: Native AMP

Framework versions: Transformers 4.11.3, Pytorch 1.10.1, Datasets 1.17.0, Tokenizers 0.10.3
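The "linear" scheduler with 16 warmup steps listed in the card can be sketched in plain Python. Note that `total_steps` is a hypothetical value here, since the card does not report the training-set size or steps per epoch:

```python
def linear_schedule_lr(step, base_lr=5e-05, warmup_steps=16, total_steps=1000):
    """Learning rate for a linear warmup + linear decay schedule:
    ramp from 0 to base_lr over warmup_steps, then decay linearly
    to 0 at total_steps. total_steps=1000 is an assumed placeholder."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_schedule_lr(0))     # 0.0 (start of warmup)
print(linear_schedule_lr(16))    # 5e-05 (peak, end of warmup)
print(linear_schedule_lr(1000))  # 0.0 (fully decayed)
```

With only 16 warmup steps against a 20-epoch run, the schedule reaches its peak learning rate almost immediately and spends nearly all of training in the decay phase.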
Cost: $-/run | Runs: 240 | Platform: Hugging Face
finetuned-distilbert-adult-content-detection
Platform did not provide a description for this model.
Cost: $-/run | Runs: 212 | Platform: Hugging Face
distilroberta-clickbait
This model is a fine-tuned version of distilroberta-base on a dataset of headlines. It achieves the following results on the evaluation set:
- Loss: 0.0268
- Acc: 0.9963

Training and evaluation data:
- 32k headlines classified as clickbait/not-clickbait, from Kaggle
- A dataset of headlines from https://github.com/MotiBaadror/Clickbait-Detection

Training hyperparameters:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 12345
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 16
- num_epochs: 20
- mixed_precision_training: Native AMP

Framework versions: Transformers 4.11.3, Pytorch 1.10.1, Datasets 1.17.0, Tokenizers 0.10.3
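A quick sanity check on the card's warmup setting: with a batch size of 32 and the ~32k-headline dataset (assuming the full set is used for training, which the card does not state), each epoch is roughly 1,000 optimizer steps, so the 16 warmup steps cover well under 1% of the 20-epoch run:

```python
import math

dataset_size = 32_000   # "32k headlines" per the card; exact train split is an assumption
train_batch_size = 32
num_epochs = 20
warmup_steps = 16

steps_per_epoch = math.ceil(dataset_size / train_batch_size)
total_steps = steps_per_epoch * num_epochs
warmup_fraction = warmup_steps / total_steps

print(steps_per_epoch)            # 1000
print(total_steps)                # 20000
print(f"{warmup_fraction:.4%}")   # 0.0800%
```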
Cost: $-/run | Runs: 206 | Platform: Hugging Face
finetuned-distilbert-news-article-categorization
Platform did not provide a description for this model.
Cost: $-/run | Runs: 196 | Platform: Hugging Face
distilroberta-news-small
Cost: $-/run | Runs: 184 | Platform: Hugging Face
distilroberta-proppy
Cost: $-/run | Runs: 169 | Platform: Hugging Face
distilroberta-propaganda-2class
Cost: $-/run | Runs: 169 | Platform: Hugging Face