Gagan3012

Rank:

Average Model Cost: $0.0000

Number of Runs: 4,913

Models by this creator

k2t-base


Platform did not provide a description for this model.


Cost: $-/run

Runs: 2.9K

Platform: Huggingface

ArOCR

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

Loss: 0.0407
Cer: 0.0200

Model description: More information needed

Intended uses & limitations: More information needed

Training and evaluation data: More information needed

Training procedure

The following hyperparameters were used during training (restated as Trainer arguments in the sketch after this listing):

learning_rate: 5e-05
train_batch_size: 8
eval_batch_size: 8
seed: 42
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: linear
num_epochs: 5
mixed_precision_training: Native AMP

Training results: none listed.

Framework versions: Transformers 4.18.0, Pytorch 1.9.1, Datasets 2.1.0, Tokenizers 0.11.6


Cost: $-/run

Runs: 61

Platform: Huggingface
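The hyperparameters in the ArOCR card correspond directly to Hugging Face Trainer settings. Below is a minimal sketch of how they map onto transformers.TrainingArguments, assuming a standard Trainer-based fine-tuning run; the output directory is a placeholder, since the card does not specify one, and the base model and dataset remain unnamed.

```python
# Minimal sketch: the ArOCR card's hyperparameters expressed as Hugging Face
# TrainingArguments. Only values listed in the card are set; "arocr-output"
# is a placeholder, since the card does not name an output directory.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arocr-output",        # placeholder, not from the card
    learning_rate=5e-05,
    per_device_train_batch_size=8,    # train_batch_size: 8
    per_device_eval_batch_size=8,     # eval_batch_size: 8
    seed=42,
    adam_beta1=0.9,                   # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,               # and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=5,
    fp16=True,                        # mixed_precision_training: Native AMP
)
```

These arguments would then be passed to a Trainer together with the unnamed model and dataset, which the card does not identify.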
