Akhisreelibra

Rank:

Average Model Cost: $0.0000

Number of Runs: 8,677

Models by this creator

bert-malayalam-pos-tagger

The bert-malayalam-pos-tagger model is a fine-tuned version of bert-base-multilingual-uncased, trained on a dataset of POS-tagged Malayalam sentences. On the evaluation set it achieves a loss of 0.4383, precision of 0.7380, recall of 0.7767, an F1 score of 0.7569, and accuracy of 0.8552. The model card does not document the model's intended uses and limitations, or the training and evaluation data. Training used a learning rate of 2e-05, a training batch size of 8, an evaluation batch size of 8, and a seed of 42, with the Adam optimizer (betas=(0.9, 0.999), epsilon=1e-08) and a linear learning-rate scheduler, over 3 epochs. Framework versions: Transformers 4.18.0, PyTorch 1.11.0, Datasets 2.1.0, and Tokenizers 0.12.1.
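As a quick sanity check on the reported metrics, F1 is the harmonic mean of precision and recall, so the listed F1 of 0.7569 should follow directly from the listed precision (0.7380) and recall (0.7767):

```python
# Reported evaluation metrics from the model card.
precision = 0.7380
recall = 0.7767

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

print(round(f1, 4))  # → 0.7569, matching the reported F1 score
```

The computed value agrees with the reported 0.7569, so the three figures are internally consistent.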

$-/run

8.7K

Huggingface

xlmR-finetuned-pos

Platform did not provide a description for this model.

$-/run

3

Huggingface

mt5-small-finetuned-one-india

Platform did not provide a description for this model.

$-/run

0

Huggingface

malayalam-summariser

Platform did not provide a description for this model.

$-/run

0

Huggingface

bert-finetuned-ner

Platform did not provide a description for this model.

$-/run

0

Huggingface

mt5-small-finetuned-amazon-en-es

Platform did not provide a description for this model.

$-/run

0

Huggingface

t5-small-finetuned-xsum

Platform did not provide a description for this model.

$-/run

0

Huggingface
