
Wangchanberta Base Att Spm Uncased

airesearch


wangchanberta-base-att-spm-uncased is a RoBERTa BASE model pretrained on 78.5 GB of assorted Thai texts. It can be used for natural language processing tasks such as masked language modeling, multiclass/multilabel text classification, and token classification. Its vocabulary was built with the SentencePiece unigram model, and it accepts input sequences of up to 416 subword tokens. The model was trained for 500,000 steps with a batch size of 4,096, using the Adam optimizer with a learning rate of 3e-4.
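As a quick illustration of what loading this checkpoint looks like, here is a minimal sketch using the Hugging Face transformers library. The model ID is inferred from the creator and model name on this page, and exact usage details (e.g., any Thai-specific preprocessing) should be verified against the model card.

```python
# Minimal loading sketch; assumes the transformers library is installed
# and that the checkpoint ID matches the HuggingFace link on this page.
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "airesearch/wangchanberta-base-att-spm-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_id)     # SentencePiece unigram vocabulary
model = AutoModelForMaskedLM.from_pretrained(model_id)  # RoBERTa BASE with a masked-LM head

# Respect the model's 416-subword-token maximum sequence length.
inputs = tokenizer("ภาษาไทยเป็นภาษาที่สวยงาม", truncation=True, max_length=416, return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence length, vocabulary size)
```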

Use cases

wangchanberta-base-att-spm-uncased supports several common natural language processing use cases:

- Masked language modeling: the model predicts a masked token in the input text, which is useful for filling in missing words or completing sentences. A chatbot, for example, could use this capability to suggest missing words or phrases in a response (see the sketch below).
- Multiclass/multilabel text classification: texts can be assigned to one class or given multiple labels. A news organization could use this to automatically categorize articles by topic or genre.
- Token classification: individual tokens or spans in a text can be labeled. An e-commerce platform could use this to extract brand names or product attributes from product descriptions.

More generally, the model can serve as a Thai-language backbone for any product or application that requires language understanding and analysis.
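As a concrete example of the fill-mask use case, here is a minimal sketch using the transformers pipeline API. The Thai prompt is purely illustrative, and prediction quality depends on the model's training data.

```python
# Fill-mask sketch; assumes the transformers library and the checkpoint ID
# inferred from this page.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="airesearch/wangchanberta-base-att-spm-uncased")

# Predict the masked token in a Thai sentence ("I like to eat <mask>").
prompt = f"ผมชอบกิน{fill_mask.tokenizer.mask_token}"
for candidate in fill_mask(prompt):
    print(candidate["token_str"], round(candidate["score"], 4))
```

For the classification use cases, the same checkpoint can be loaded as a backbone with AutoModelForSequenceClassification or AutoModelForTokenClassification and fine-tuned on labeled Thai data.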


Pricing

Cost per run: $- USD
Avg run time: - seconds
Hardware: -

Creator Models

Model                                               Cost    Runs
Bert Base Multilingual Cased Finetune Qa            $?      39
Wangchanberta Base Wiki Newmm                       $?      105
Wangchanberta Base Wiki Spm                         $?      97
Bert Base Multilingual Cased Finetuned              $?      17
Wangchanberta Base Wiki 20210520 Spm Finetune Qa    $?      113


Overview

Summary of this model and related resources.

Creator: airesearch
Model Name: Wangchanberta Base Att Spm Uncased
Description: Pretrained RoBERTa BASE model on assorted Thai texts (78.5 GB). The script ...
Tags: fill-mask
Model Link: View on HuggingFace
API Spec: View on HuggingFace
Github Link: No Github link provided
Paper Link: No paper link provided

Popularity

How popular is this model, by number of runs? How popular is the creator, by the sum of all their runs?

Runs: 20,824
Model Rank: -
Creator Rank: -

Cost

How much does it cost to run this model? How long, on average, does it take to complete a run?

Cost per Run: $-
Prediction Hardware: -
Average Completion Time: -