dccuchile


Average Model Cost: $0.0000

Number of Runs: 107,568

Models by this creator

bert-base-spanish-wwm-cased

The bert-base-spanish-wwm-cased model, also known as BETO, is a Spanish BERT model trained on a large Spanish corpus. It uses the Whole Word Masking technique and is similar in size to BERT-Base. BETO has been evaluated on various Spanish benchmarks and compared with Multilingual BERT and other non-BERT models. It is available as TensorFlow and PyTorch checkpoints. The model can be accessed using the Huggingface Transformers library with the model name 'dccuchile/bert-base-spanish-wwm-cased'. Note that the licenses of the original text resources used to train BETO may not be compatible with commercial use.
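As the description notes, the model can be loaded through the Huggingface Transformers library under the name 'dccuchile/bert-base-spanish-wwm-cased'. A minimal sketch of doing so (assuming `transformers` and `torch` are installed; the example sentence is illustrative):

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Load BETO (cased) from the Huggingface Hub; weights download on first use.
model_name = "dccuchile/bert-base-spanish-wwm-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Tokenize a Spanish sentence; BETO uses the standard BERT [CLS]/[SEP] format.
inputs = tokenizer("BETO es un modelo BERT entrenado en español.", return_tensors="pt")
outputs = model(**inputs)

# Logits over the vocabulary for each token position.
print(outputs.logits.shape)  # (batch, sequence_length, vocab_size)
```

Since it is a masked-language-model checkpoint, the same weights can also back downstream fine-tuning on Spanish classification or tagging tasks.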


$-/run · 56.4K runs · Huggingface

bert-base-spanish-wwm-uncased

The bert-base-spanish-wwm-uncased model is the uncased variant of BETO, a BERT (Bidirectional Encoder Representations from Transformers) model pre-trained on a large corpus of Spanish text. It is a masked language model: given a sentence with a masked token, it predicts the missing word from the surrounding context. It can serve as a base for Spanish natural language processing tasks such as text classification, named entity recognition, and sentiment analysis.
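The masked-word prediction described above can be tried directly with the Transformers `fill-mask` pipeline; a minimal sketch, assuming `transformers` and `torch` are installed (the Spanish sentence is an illustrative example, not from the model card):

```python
from transformers import pipeline

# Fill-mask pipeline backed by the uncased BETO checkpoint.
unmasker = pipeline("fill-mask", model="dccuchile/bert-base-spanish-wwm-uncased")

# BERT-style models mark the missing word with the [MASK] token.
predictions = unmasker("santiago es la [MASK] de chile.")

# Each prediction carries the proposed token and its probability.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))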


$-/run · 47.0K runs · Huggingface
