ProtGPT2 is a language model based on GPT-2 that has been fine-tuned on protein sequences. Given a starting sequence as a prompt, it generates new protein sequences that follow the patterns it learned during training. This can help researchers in protein structure prediction and protein engineering by proposing novel sequences that may have desirable properties.
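As a rough sketch of how prompting the model with a starting sequence might look, the snippet below wraps the Hugging Face `transformers` text-generation pipeline. The model id `nferruz/ProtGPT2` and the sampling parameters are assumptions based on the publicly available ProtGPT2 model card, not details taken from this document.

```python
def generate_proteins(seed: str = "<|endoftext|>", n: int = 3):
    """Return `n` candidate protein sequences continued from `seed`.

    `seed` can be the start of a protein sequence, or the plain
    end-of-text token to sample sequences from scratch.
    """
    # Imported lazily so the module loads even without transformers installed.
    from transformers import pipeline

    protgpt2 = pipeline("text-generation", model="nferruz/ProtGPT2")
    return protgpt2(
        seed,
        max_length=100,
        do_sample=True,
        top_k=950,               # broad sampling, as suggested in the ProtGPT2 card
        repetition_penalty=1.2,  # discourage repeated residues
        num_return_sequences=n,
        eos_token_id=0,
    )

# Example usage (downloads the model weights on first call):
#   for out in generate_proteins("<|endoftext|>", n=3):
#       print(out["generated_text"])
```

Each returned item is a dict whose `generated_text` field holds one candidate sequence; downstream filtering (e.g. by perplexity or predicted structure) is typically applied before any are taken seriously.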
This model is a fine-tuned version of /home/woody/b114cb/b114cb10/zymCTRL/train/output/ on the None dataset. It achieves the following results on the evaluation set:
- Loss: 0.1872

## Model description
More information needed

## Intended uses & limitations
More information needed

## Training and evaluation data
More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 8e-05
- train_batch_size: 1
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0

### Training results

### Framework versions
- Transformers 4.26.1
- Pytorch 1.12.1+cu116
- Datasets 2.10.0
- Tokenizers 0.12.1
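For reference, the training hyperparameters listed above can be collected into a plain config mapping; the keys mirror the card and the comments note the corresponding `transformers.TrainingArguments` field names (the mapping is an assumption about how the run was configured, not stated in the card).

```python
# Hyperparameters copied from the model card above.
# The betas/epsilon values for Adam match the transformers defaults.
hyperparameters = {
    "learning_rate": 8e-05,        # TrainingArguments.learning_rate
    "train_batch_size": 1,         # TrainingArguments.per_device_train_batch_size
    "eval_batch_size": 4,          # TrainingArguments.per_device_eval_batch_size
    "seed": 42,                    # TrainingArguments.seed
    "adam_betas": (0.9, 0.999),    # TrainingArguments.adam_beta1 / adam_beta2
    "adam_epsilon": 1e-08,         # TrainingArguments.adam_epsilon
    "lr_scheduler_type": "linear", # TrainingArguments.lr_scheduler_type
    "num_epochs": 5.0,             # TrainingArguments.num_train_epochs
}
```

Keeping the run configuration in one structure like this makes it straightforward to reproduce or sweep the fine-tuning setup.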