Bigwiz83
Rank:
Average Model Cost: $0.0000
Number of Runs: 15,898
Models by this creator
sapbert-from-pubmedbert-squad2
The sapbert-from-pubmedbert-squad2 model is a fine-tuned version of cambridgeltl/SapBERT-from-PubMedBERT-fulltext on the squad_v2 dataset. The model card does not provide further details on the model's description, intended uses and limitations, training and evaluation data, or training results. The model was trained with Transformers 4.7.0, PyTorch 1.8.0, Datasets 1.4.1, and Tokenizers 0.10.2. The training hyperparameters were a learning rate of 2e-05, a training batch size of 16, an evaluation batch size of 16, a seed of 42, and the Adam optimizer with betas=(0.9, 0.999) and epsilon=1e-08. Training ran for 5 epochs.
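Since the model is fine-tuned on squad_v2, it can be used for extractive question answering. The following is a minimal sketch, assuming the checkpoint is published on the Hugging Face Hub as "bigwiz83/sapbert-from-pubmedbert-squad2" (model ID inferred from the creator and model names on this page) and that the sample question and context are purely illustrative.

```python
# Sketch: extractive question answering with the fine-tuned checkpoint.
# The model ID below is an assumption based on the creator/model names.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bigwiz83/sapbert-from-pubmedbert-squad2",
)

result = qa(
    question="What role does the BRCA1 protein play?",  # illustrative example
    context=(
        "The BRCA1 gene provides instructions for making a protein that "
        "acts as a tumor suppressor in humans."
    ),
)
print(result)  # dict with 'score', 'start', 'end', and 'answer'
```

The hyperparameters listed above can also be expressed as a Transformers TrainingArguments configuration; this is only a sketch of the settings named on this page, with dataset loading and Trainer wiring omitted.

```python
# Sketch of the reported training hyperparameters (output_dir is arbitrary).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="sapbert-from-pubmedbert-squad2",
    learning_rate=2e-5,              # reported learning rate
    per_device_train_batch_size=16,  # reported train batch size
    per_device_eval_batch_size=16,   # reported eval batch size
    seed=42,                         # reported seed
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # reported epsilon
    num_train_epochs=5,              # reported number of epochs
)
```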
Cost: $-/run
Runs: 15.9K
Platform: Hugging Face