bert-base-japanese-whole-word-masking is a BERT model pretrained on Japanese text. It first segments text into words using the IPA dictionary, then applies WordPiece subword tokenization. The model is trained with whole word masking, where all subwords corresponding to a single word are masked at once during the masked language modeling (MLM) objective. Similar models include the BERT large model (uncased) with whole word masking fine-tuned on SQuAD, Chinese BERT models with whole word masking, and the multilingual BERT base model. These models leverage whole word masking and multilingual training to improve performance on language understanding tasks.

Model inputs and outputs

Inputs: Japanese text as a sequence of tokens.

Outputs: Contextualized token representations that can be used for downstream natural language processing tasks.

Capabilities

The bert-base-japanese-whole-word-masking model can be used for a variety of Japanese language understanding tasks, such as text classification, named entity recognition, and question answering. Its use of whole word masking during pretraining allows the model to better capture word-level semantics in Japanese.

What can I use it for?

You can use this model as a starting point for fine-tuning on your own Japanese language task. For example, you could fine-tune it on a Japanese text classification dataset to build a product categorization system, or on a Japanese question answering dataset to create a customer support chatbot.

Things to try

One interesting experiment is to compare this model's performance on Japanese tasks against models that use character-level or subword-level tokenization, to see whether whole word masking provides a significant boost in accuracy. You could also use the model's contextualized token representations as input features for other Japanese NLP models, to see whether they improve those models' performance.
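To make the whole-word-masking idea concrete, here is a minimal, self-contained sketch of how subwords belonging to one word are masked together. The token strings, helper names, and the "##" continuation convention are illustrative (borrowed from BERT's WordPiece scheme), not the model's actual preprocessing code:

```python
# Hypothetical WordPiece output for a Japanese sentence: word-level segmentation
# (e.g. via the IPA dictionary) happens first, then each word may split into
# several subwords, with continuation pieces prefixed by "##".
tokens = ["東京", "##タワー", "に", "行き", "##ました"]

def whole_word_spans(tokens):
    """Group subword indices so each span covers one whole word."""
    spans = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and spans:
            spans[-1].append(i)   # continuation piece: extend the current word
        else:
            spans.append([i])     # a new word starts here
    return spans

def whole_word_mask(tokens, word_index, mask_token="[MASK]"):
    """Mask every subword of the chosen word at once (whole word masking)."""
    spans = whole_word_spans(tokens)
    masked = list(tokens)
    for i in spans[word_index]:
        masked[i] = mask_token
    return masked

print(whole_word_spans(tokens))    # [[0, 1], [2], [3, 4]]
print(whole_word_mask(tokens, 0))  # ['[MASK]', '[MASK]', 'に', '行き', '##ました']
```

The contrast with plain token-level MLM is that masking index 0 alone would leave "##タワー" visible, letting the model guess the masked piece from the rest of the same word; masking the whole span forces it to use the surrounding context instead.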


Updated 5/15/2024