Average Model Cost: $0.0000

Number of Runs: 8,677

The bert-malayalam-pos-tagger model is a fine-tuned version of bert-base-multilingual-uncased, trained on a dataset of part-of-speech-tagged Malayalam sentences. On the evaluation set it achieves a loss of 0.4383, precision of 0.7380, recall of 0.7767, an F1 score of 0.7569, and accuracy of 0.8552.

The model card does not yet describe the model in detail, its intended uses and limitations, or the training and evaluation data.

Training used a learning rate of 2e-05, a training batch size of 8, an evaluation batch size of 8, and a seed of 42, with the Adam optimizer (betas=(0.9, 0.999), epsilon=1e-08) and a linear learning rate scheduler, for 3 epochs. Framework versions: Transformers 4.18.0, PyTorch 1.11.0, Datasets 2.1.0, and Tokenizers 0.12.1.
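As a quick sanity check on the reported numbers (this snippet is not part of the model card itself), the F1 score should be the harmonic mean of the reported precision and recall:

```python
# Reported evaluation metrics from the model card.
precision = 0.7380
recall = 0.7767

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

print(round(f1, 4))  # 0.7569 — matches the reported F1 score
```

Rounded to four decimal places this reproduces the reported F1 of 0.7569, so the three metrics are internally consistent.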
