dccuchile
Rank:
Average Model Cost: $0.0000
Number of Runs: 107,568
Models by this creator
bert-base-spanish-wwm-cased
The bert-base-spanish-wwm-cased model, also known as BETO, is a Spanish BERT model trained on a large Spanish corpus using the Whole Word Masking technique; it is similar in size to BERT-Base. BETO has been evaluated on several Spanish benchmarks and compared against Multilingual BERT and other non-BERT models, and it is available as both TensorFlow and PyTorch checkpoints. The model can be loaded through the Hugging Face Transformers library under the name 'dccuchile/bert-base-spanish-wwm-cased' (see the sketch after this entry). Note that the licenses of the original text resources used to train BETO may not be compatible with commercial use.
$-/run
56.4K
Huggingface
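As a minimal sketch of the loading step mentioned above: the AutoTokenizer/AutoModel classes are standard Transformers usage rather than anything specified by the model card, and the Spanish sentence is purely illustrative.

```python
# Minimal sketch: load BETO (cased) with Hugging Face Transformers.
# Assumes `transformers` and `torch` are installed.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("dccuchile/bert-base-spanish-wwm-cased")
model = AutoModel.from_pretrained("dccuchile/bert-base-spanish-wwm-cased")

# Encode an illustrative Spanish sentence and run a forward pass.
inputs = tokenizer("BETO es un modelo BERT entrenado en español.", return_tensors="pt")
outputs = model(**inputs)

# A BERT-Base-sized model produces 768-dimensional hidden states.
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])
```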
bert-base-spanish-wwm-uncased
The bert-base-spanish-wwm-uncased model is a version of the BERT (Bidirectional Encoder Representations from Transformers) model pre-trained on a large corpus of Spanish text. It is a masked language model, meaning it predicts missing words in a sentence from the surrounding context (see the fill-mask sketch after this entry). It can be used for natural language processing tasks in Spanish such as text classification, named entity recognition, and sentiment analysis.
$-/run
47.0K
Huggingface
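Since the description above centers on masked-word prediction, here is a minimal fill-mask sketch; the pipeline API is standard Transformers usage, and the masked Spanish sentence is an illustrative assumption, not taken from the model card.

```python
# Minimal sketch: masked-word prediction with the uncased BETO model.
# Assumes `transformers` and `torch` are installed.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="dccuchile/bert-base-spanish-wwm-uncased")

# The model ranks candidate tokens for the [MASK] position by context.
for pred in fill_mask("quiero una taza de [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```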
albert-tiny-spanish-finetuned-ner
Platform did not provide a description for this model.
$-/run
1.5K
Huggingface
albert-base-10-spanish-finetuned-mldoc
Platform did not provide a description for this model.
$-/run
1.1K
Huggingface
distilbert-base-spanish-uncased
Platform did not provide a description for this model.
$-/run
768
Huggingface
albert-tiny-spanish
ALBERT Tiny Spanish is an ALBERT model trained on a large Spanish corpus. It was trained on a single TPU v3-8 with the following hyperparameters and steps/time: LR 0.00125, batch size 2048, warmup ratio 0.0125, warmup steps 125000, goal steps 10000000, total steps 8300000, total training time approx. 58.2 days. (Training loss plot omitted.)
$-/run
412
Huggingface
albert-base-spanish
ALBERT Base Spanish is an ALBERT model trained on a large Spanish corpus. It was trained on a single TPU v3-8 with the following hyperparameters and steps/time: LR 0.0008838834765, batch size 960, warmup ratio 0.00625, warmup steps 53333.33333, goal steps 8533333.333, total steps 3650000, total training time approx. 70.4 days. (Training loss plot omitted.)
$-/run
211
Huggingface
distilbert-base-spanish-uncased-finetuned-ner
Platform did not provide a description for this model.
$-/run
55
Huggingface
bert-base-spanish-wwm-uncased-finetuned-mldoc
Platform did not provide a description for this model.
$-/run
37
Huggingface
bert-base-spanish-wwm-cased-finetuned-xnli
Platform did not provide a description for this model.
$-/run
35
Huggingface