Huawei-noah

Average Model Cost: $0.0000

Number of Runs: 77,330

Models by this creator

TinyBERT_4L_zh

TinyBERT_4L_zh is a compact Chinese language model built on the BERT architecture, distilled down to 4 transformer layers instead of the usual 12 or 24. Despite its reduced size, it performs well on a range of Chinese natural language processing tasks such as text classification, named entity recognition, and question answering, and it can be used directly as a pre-trained encoder or fine-tuned for a specific task.
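Since the model can be used as a pre-trained encoder or fine-tuned, a minimal loading sketch with the Hugging Face transformers library may help. The repo id huawei-noah/TinyBERT_4L_zh is an assumption based on the creator and model names shown on this page, so verify the exact name on the Hub:

```python
from transformers import AutoModel, AutoTokenizer

# Assumed Hugging Face Hub repo id -- verify the exact name on the Hub
repo_id = "huawei-noah/TinyBERT_4L_zh"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

# Encode a Chinese sentence and extract contextual token embeddings
inputs = tokenizer("今天天气很好。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

From here, the encoder output can feed a task head, or the checkpoint can be loaded via `AutoModelForSequenceClassification` for fine-tuning on a labeled Chinese dataset.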

Cost: $-/run

Runs: 41.6K

Platform: Huggingface

TinyBERT_General_4L_312D

TinyBERT_General_4L_312D is a general-purpose language representation model pre-trained on a large text corpus. It is a compact variant of BERT with 4 transformer layers and a hidden size of 312, giving it far fewer parameters than BERT-base while retaining good language understanding. It can be fine-tuned for downstream natural language processing tasks such as text classification, sentiment analysis, and question answering.

Cost: $-/run

Runs: 20.0K

Platform: Huggingface

TinyBERT_General_6L_768D

TinyBERT_General_6L_768D is a pre-trained model designed for natural language processing tasks such as text classification, sentiment analysis, and question answering. It is a distilled, lighter version of BERT (Bidirectional Encoder Representations from Transformers), making it more efficient to deploy in resource-constrained environments. The architecture consists of 6 transformer layers producing contextualized word representations of size 768.
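The size reduction of these TinyBERT variants can be sanity-checked with back-of-the-envelope parameter arithmetic for a BERT-style encoder. A rough sketch, assuming the standard 30,522-token WordPiece vocabulary and TinyBERT's published feed-forward (intermediate) sizes of 1,200 for the 4L_312D variant and 3,072 for the 6L_768D variant:

```python
def bert_param_count(layers, hidden, intermediate,
                     vocab=30522, max_pos=512, type_vocab=2):
    """Approximate parameter count for a BERT-style encoder."""
    # Embeddings: word, position, and token-type tables + one LayerNorm
    emb = (vocab + max_pos + type_vocab) * hidden + 2 * hidden
    # Self-attention: Q, K, V, and output projections (weights + biases)
    attn = 4 * (hidden * hidden + hidden)
    # Feed-forward: hidden -> intermediate -> hidden (weights + biases)
    ffn = 2 * hidden * intermediate + intermediate + hidden
    # Two LayerNorms per encoder layer (gamma + beta each)
    norms = 2 * 2 * hidden
    # Pooler: one dense layer applied to the [CLS] token
    pooler = hidden * hidden + hidden
    return emb + layers * (attn + ffn + norms) + pooler

bert_base = bert_param_count(12, 768, 3072)   # ~109.5M
tiny_4l   = bert_param_count(4, 312, 1200)    # ~14.4M
tiny_6l   = bert_param_count(6, 768, 3072)    # ~67.0M
print(f"{bert_base:,} vs {tiny_4l:,} and {tiny_6l:,}")
```

The resulting estimates, roughly 14.4M and 67M parameters against about 110M for BERT-base, line up with the sizes reported for these TinyBERT checkpoints and make the "lightweight" claim concrete.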

Cost: $-/run

Runs: 15.6K

Platform: Huggingface
