Has-abi

Rank:

Average Model Cost: $0.0000

Number of Runs: 3,974

Models by this creator

distilBERT-finetuned-resumes-sections

This model is a fine-tuned version of Geotrend/distilbert-base-en-fr-cased on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0369
- F1: 0.9652
- ROC AUC: 0.9808
- Accuracy: 0.9621

The model description, intended uses & limitations, and training and evaluation data are not documented ("More information needed").

Training hyperparameters:

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

Framework versions: Transformers 4.21.1, PyTorch 1.12.1+cu113, Datasets 2.4.0, Tokenizers 0.12.1.

$-/run · 3.6K runs · Hugging Face
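For reference, here is a minimal sketch of how a model like this could be loaded for inference with the transformers library. The Hub ID has-abi/distilBERT-finetuned-resumes-sections, the sample resume text, and the multi-label (sigmoid) decoding are assumptions inferred from the card above, not documented behavior.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed Hub ID built from the creator/model names in the listing; verify before use.
MODEL_ID = "has-abi/distilBERT-finetuned-resumes-sections"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# Hypothetical resume snippet to classify into resume sections.
text = "EXPERIENCE\nSoftware Engineer, Acme Corp (2019-2022): built data pipelines."
inputs = tokenizer(text, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# The reported ROC AUC metric suggests a multi-label setup, so a sigmoid
# threshold is used here (assumption); a single-label model would instead
# take softmax + argmax over the logits.
probs = torch.sigmoid(logits)[0]
labels = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(labels)
```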

extended_distilBERT-finetuned-resumes-sections

This model is a fine-tuned version of Geotrend/distilbert-base-en-fr-cased on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0321
- F1: 0.9735
- ROC AUC: 0.9850
- Accuracy: 0.9715

The model description, intended uses & limitations, and training and evaluation data are not documented ("More information needed").

Training hyperparameters:

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

Framework versions: Transformers 4.21.3, PyTorch 1.12.1+cu113, Datasets 2.4.0, Tokenizers 0.12.1.

$-/run · 359 runs · Hugging Face
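The hyperparameters reported for both cards map directly onto transformers TrainingArguments. The sketch below reproduces only the listed values; the dataset, preprocessing, and Trainer setup are undocumented ("unknown dataset"), so they are left out, and the output_dir is a placeholder.

```python
from transformers import TrainingArguments

# Reported hyperparameters only; the data pipeline is unknown,
# so no Trainer or datasets are constructed here.
training_args = TrainingArguments(
    output_dir="distilbert-finetuned-resumes-sections",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=20,
    seed=42,
    lr_scheduler_type="linear",  # linear decay, as listed
    adam_beta1=0.9,              # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,           # epsilon=1e-08
)
```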
