Polejowska

Average Model Cost: $0.0000

Number of Runs: 17,320

Models by this creator

cdetr-r50-cd45rb-all-4ah

Platform did not provide a description for this model.

$-/run · 4.3K runs · Hugging Face

detr-r101-cd45rb-all-4ah

This model is a fine-tuned version of facebook/detr-resnet-101 on the cd45rb dataset. It achieves the following results on the evaluation set:

Loss: 1.5962

Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed

Training hyperparameters:
learning_rate: 1e-05
train_batch_size: 8
eval_batch_size: 8
seed: 42
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: linear
num_epochs: 20
mixed_precision_training: Native AMP

Framework versions: Transformers 4.28.0, PyTorch 2.0.1, Datasets 2.12.0, Tokenizers 0.13.3
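
For reference, a minimal inference sketch with the transformers library is shown below. The repository id polejowska/detr-r101-cd45rb-all-4ah is an assumption inferred from the creator and model names on this page, and the input image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, DetrForObjectDetection

# Assumed Hugging Face repo id, inferred from the creator/model names above.
repo_id = "polejowska/detr-r101-cd45rb-all-4ah"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = DetrForObjectDetection.from_pretrained(repo_id)

image = Image.open("slide_patch.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Turn raw logits/boxes into thresholded (x0, y0, x1, y1) detections.
target_sizes = torch.tensor([image.size[::-1]])
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(
    detections["scores"], detections["labels"], detections["boxes"]
):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```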

$-/run · 4.3K runs · Hugging Face

detr-r50-cd45rb-all-4ah-learned

This model is a fine-tuned version of facebook/detr-resnet-50 on the cd45rb dataset. It achieves the following results on the evaluation set:

Loss: 1.6174

Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed

Training hyperparameters:
learning_rate: 1e-05
train_batch_size: 8
eval_batch_size: 8
seed: 42
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: linear
num_epochs: 20
mixed_precision_training: Native AMP

Framework versions: Transformers 4.28.0, PyTorch 2.0.1, Datasets 2.12.0, Tokenizers 0.13.3
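
The hyperparameters listed above translate directly into a transformers TrainingArguments object. The sketch below mirrors the reported settings only; the cd45rb dataset and the Trainer/model wiring are not shown on this page, and Adam with betas=(0.9, 0.999) and epsilon=1e-08 is already the Trainer default, so it needs no explicit argument.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters reported in this card; dataset loading and
# the Trainer/model setup are omitted.
training_args = TrainingArguments(
    output_dir="detr-r50-cd45rb-all-4ah-learned",
    learning_rate=1e-5,              # learning_rate: 1e-05
    per_device_train_batch_size=8,   # train_batch_size: 8
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    seed=42,                         # seed: 42
    lr_scheduler_type="linear",      # lr_scheduler_type: linear
    num_train_epochs=20,             # num_epochs: 20
    fp16=True,                       # mixed_precision_training: Native AMP
)
```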

$-/run · 2.2K runs · Hugging Face

detr-r50-cd45rb-all-16ah

This model is a fine-tuned version of facebook/detr-resnet-50 on the cd45rb dataset. It achieves the following results on the evaluation set:

Loss: 1.6073

Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed

Training hyperparameters:
learning_rate: 1e-05
train_batch_size: 8
eval_batch_size: 8
seed: 42
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: linear
num_epochs: 20
mixed_precision_training: Native AMP

Framework versions: Transformers 4.28.0, PyTorch 2.0.1, Datasets 2.12.0, Tokenizers 0.13.3

$-/run · 2.1K runs · Hugging Face

detr-r50-cd45rb-all-1ah

This model is a fine-tuned version of facebook/detr-resnet-50 on the cd45rb dataset. It achieves the following results on the evaluation set:

Loss: 1.6472

Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed

Training hyperparameters:
learning_rate: 1e-05
train_batch_size: 8
eval_batch_size: 8
seed: 42
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: linear
num_epochs: 10
mixed_precision_training: Native AMP

Framework versions: Transformers 4.28.0, PyTorch 2.0.1, Datasets 2.12.0, Tokenizers 0.13.3

$-/run · 2.1K runs · Hugging Face

detr-r50-cd45rb-2ah-6l

This model is a fine-tuned version of facebook/detr-resnet-50 on the cd45rb dataset. It achieves the following results on the evaluation set:

Loss: 2.0706

Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed

Training hyperparameters:
learning_rate: 1e-05
train_batch_size: 4
eval_batch_size: 8
seed: 42
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: linear
num_epochs: 5
mixed_precision_training: Native AMP

Framework versions: Transformers 4.28.0, PyTorch 2.0.1, Datasets 2.12.0, Tokenizers 0.13.3
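
The -2ah-6l suffix presumably encodes the number of attention heads (2) and transformer layers (6); that reading is an assumption from the naming, not documented on this page. If it holds, such a variant could be instantiated at the config level roughly as follows:

```python
from transformers import DetrConfig, DetrForObjectDetection

# Hypothetical reading of "-2ah-6l": 2 attention heads, 6 encoder/decoder
# layers. These overrides are an assumption based on the model name alone.
config = DetrConfig.from_pretrained(
    "facebook/detr-resnet-50",
    encoder_attention_heads=2,
    decoder_attention_heads=2,
    encoder_layers=6,
    decoder_layers=6,
)
model = DetrForObjectDetection(config)  # randomly initialized with this shape
```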

$-/run · 16 runs · Hugging Face

vit-base-xray-pneumonia-lcbsi

This model is a fine-tuned version of nickmuchi/vit-base-xray-pneumonia on an unspecified dataset (listed in the card as "None"). It achieves the following results on the evaluation set:

Loss: 0.3775
Accuracy: 0.9773

Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed

Training hyperparameters:
learning_rate: 1e-05
train_batch_size: 32
eval_batch_size: 32
seed: 42
gradient_accumulation_steps: 4
total_train_batch_size: 128
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: linear
lr_scheduler_warmup_ratio: 0.1
num_epochs: 10

Framework versions: Transformers 4.25.1, PyTorch 1.13.0+cu116, Datasets 2.7.1, Tokenizers 0.13.2
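
A minimal sketch of querying this classifier through the transformers pipeline API follows. The repo id polejowska/vit-base-xray-pneumonia-lcbsi is assumed from the names listed on this page, and the X-ray image path is a placeholder.

```python
from transformers import pipeline

# Assumed Hugging Face repo id, inferred from the creator/model names above.
classifier = pipeline(
    "image-classification",
    model="polejowska/vit-base-xray-pneumonia-lcbsi",
)
predictions = classifier("chest_xray.png")  # placeholder image path
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```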

$-/run · 16 runs · Hugging Face

detr-r50-cd45rb-1ah-6l

This model is a fine-tuned version of facebook/detr-resnet-50 on the cd45rb dataset. It achieves the following results on the evaluation set:

Loss: 2.4488

Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed

Training hyperparameters:
learning_rate: 1e-05
train_batch_size: 4
eval_batch_size: 8
seed: 42
optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
lr_scheduler_type: linear
num_epochs: 5
mixed_precision_training: Native AMP

Framework versions: Transformers 4.28.0, PyTorch 2.0.1, Datasets 2.12.0, Tokenizers 0.13.3

$-/run · 13 runs · Hugging Face
