
Line-corporation

Average Model Cost: $0.0000

Number of Runs: 3,452

Models by this creator

line-distilbert-base-japanese

LINE DistilBERT Japanese

This is a DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE, and the model was trained by LINE Corporation. Documentation in Japanese is available at https://github.com/line/LINE-DistilBERT-Japanese/blob/main/README_ja.md.

Model architecture

The model architecture is the DistilBERT base model: 6 layers, 768-dimensional hidden states, 12 attention heads, and 66M parameters.

Evaluation

The model is evaluated on the JGLUE benchmark; the results table is available in the repository.

Tokenization

Texts are first tokenized by MeCab with the Unidic dictionary and then split into subwords by the SentencePiece algorithm. The vocabulary size is 32,768.

Licenses

The pretrained models are distributed under the terms of the Apache License, Version 2.0.

To cite this work

No paper has been published on this work; please cite the GitHub repository.
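Below is a minimal usage sketch in Python. It assumes the checkpoint is published on the Hugging Face Hub as line-corporation/line-distilbert-base-japanese, and that the custom MeCab + SentencePiece tokenizer requires trust_remote_code=True plus the fugashi, unidic-lite, and sentencepiece packages; check the repository for the exact requirements.

```python
# Minimal sketch: load the model and extract contextual embeddings.
# Assumed Hub id and tokenizer dependencies (fugashi, unidic-lite,
# sentencepiece) are taken from the repository, not verified here.
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "line-corporation/line-distilbert-base-japanese"

# trust_remote_code=True lets the Hub-provided tokenizer class
# (MeCab pre-tokenization + SentencePiece subwords) be loaded.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID)

text = "LINEの研究開発はすごい。"  # any Japanese sentence
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Last hidden states have shape (batch, sequence_length, 768),
# matching the 768-dimensional hidden states described above.
print(outputs.last_hidden_state.shape)
```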

$-/run · 3.5K runs · Huggingface
