Ckiplab

Average Model Cost: $0.0000

Number of Runs: 284,808

Models by this creator

bert-base-chinese-ws

The bert-base-chinese-ws model is a BERT-based token classification model trained on Chinese text to perform word segmentation: dividing a sentence into individual words by tagging each character and identifying the boundaries between words. Since Chinese is written without spaces, this is often a necessary first step before further processing of Chinese text.
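The per-character output of a word-segmentation model still has to be merged back into words. A minimal sketch of that decoding step, assuming the common B/I tagging scheme (B marks the first character of a word, I a continuation); the example sentence and tags are illustrative, not actual model output:

```python
def tags_to_words(chars, tags):
    """Merge per-character B/I tags into words: 'B' starts a new word,
    'I' appends to the current one."""
    words = []
    for ch, tag in zip(chars, tags):
        if tag == "B" or not words:
            words.append(ch)
        else:
            words[-1] += ch
    return words

# Example: 我/B 喜/B 歡/I 貓/B segments "我喜歡貓" into 我 | 喜歡 | 貓
print(tags_to_words(list("我喜歡貓"), ["B", "B", "I", "B"]))  # → ['我', '喜歡', '貓']
```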

$-/run

102.8K

Huggingface

bert-base-chinese-ner

The bert-base-chinese-ner model is a BERT-based token classification model trained to perform named entity recognition (NER) on Chinese text. NER involves identifying and classifying named entities such as person names, organizations, locations, and other entities within text, which makes the model useful for extracting structured information from Chinese documents.
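Turning per-token NER tags back into entity spans is a small decoding step. A sketch assuming a standard BIO tag scheme (the exact label set this model emits may differ); the tokens and tags below are made up for illustration:

```python
def decode_bio(tokens, tags):
    """Collect (entity_text, entity_type) spans from BIO tags such as
    B-PERSON / I-PERSON / O."""
    entities, current, etype = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append(("".join(current), etype))
            current, etype = [tok], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == etype:
            current.append(tok)
        else:  # 'O' or an inconsistent tag closes any open entity
            if current:
                entities.append(("".join(current), etype))
            current, etype = [], None
    if current:
        entities.append(("".join(current), etype))
    return entities

# "李小明在台北" → a PERSON span and a GPE span
print(decode_bio(list("李小明在台北"),
                 ["B-PERSON", "I-PERSON", "I-PERSON", "O", "B-GPE", "I-GPE"]))
```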

$-/run

47.8K

Huggingface

albert-tiny-chinese-ner

The albert-tiny-chinese-ner model is a Chinese named entity recognition (NER) model based on the lightweight ALBERT architecture. It identifies and classifies named entities in Chinese text, such as names of people, organizations, locations, and other proper nouns. Its tiny ALBERT backbone typically trades some accuracy for much faster inference and a smaller memory footprint, making it suited to resource-constrained entity extraction.

$-/run

24.1K

Huggingface

bert-base-chinese

The bert-base-chinese model is a Chinese version of BERT (Bidirectional Encoder Representations from Transformers), pre-trained on a large corpus of Chinese text. Like other BERT models, it can fill in missing or "masked" tokens in a sentence, predicting contextually appropriate words for each [MASK] position, and it serves as a base model for fine-tuning on downstream Chinese NLP tasks.
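Conceptually, the fill-mask head scores every vocabulary entry for the [MASK] position, and the prediction is the highest-probability candidate after a softmax. A toy sketch of that final step; `top_fill` and the scores are illustrative, not part of the model's API:

```python
import math

def top_fill(candidates):
    """Given raw scores a model assigns to vocabulary candidates for a
    [MASK] position, softmax-normalize them and return the best fill."""
    m = max(candidates.values())  # subtract the max for numerical stability
    exp = {w: math.exp(s - m) for w, s in candidates.items()}
    z = sum(exp.values())
    probs = {w: e / z for w, e in exp.items()}
    return max(probs, key=probs.get), probs

# Hypothetical scores for "今天天氣很[MASK]"
word, probs = top_fill({"好": 7.1, "冷": 6.3, "貓": 0.2})
print(word)  # → 好
```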

$-/run

5.4K

Huggingface

bert-base-chinese-qa

CKIP BERT Base Chinese

This project provides traditional Chinese transformers models (including ALBERT, BERT, and GPT2) and NLP tools (including word segmentation, part-of-speech tagging, and named entity recognition).

Homepage: https://github.com/ckiplab/ckip-transformers

Contributors: Mu Yang at CKIP (Author & Maintainer)

Usage: please use BertTokenizerFast as the tokenizer instead of AutoTokenizer. For full usage details and more information, please refer to https://github.com/ckiplab/ckip-transformers.
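The README's one concrete instruction, use BertTokenizerFast rather than AutoTokenizer, looks roughly like this in code. A sketch only: it needs the transformers library and a network connection to download weights, and everything beyond the tokenizer advice (the default model name, the AutoModel class choice) is an assumption based on common Hugging Face usage:

```python
def load_ckip_model(model_name="ckiplab/bert-base-chinese-ws"):
    """Load a CKIP model the way the project README advises:
    BertTokenizerFast instead of AutoTokenizer. The import is deferred
    so this sketch can be defined without transformers installed."""
    from transformers import BertTokenizerFast, AutoModel  # assumed installed
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
    model = AutoModel.from_pretrained(model_name)
    return tokenizer, model
```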

$-/run

1.5K

Huggingface

Similar creators