Bert Base Uncased

huggingface

bert-base-uncased

BERT-base-uncased is a pretrained language model trained on a large corpus of English text. "Uncased" means it does not distinguish between, for example, "english" and "English". It uses a masked language modeling (MLM) objective, together with next sentence prediction, to learn an internal representation of the English language, which can then be fine-tuned for tasks such as sequence classification, token classification, and question answering. The model was pretrained on the BookCorpus dataset and English Wikipedia with a vocabulary size of 30,000, on 4 cloud TPUs, with a batch size of 256 and the Adam optimizer. When fine-tuned on downstream tasks, BERT-base-uncased achieves good performance on tasks like sentiment analysis and text classification. Note, however, that the model can make biased predictions, and this bias carries over to all fine-tuned versions of the model.
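To make the masked language modeling objective concrete, here is a toy sketch of BERT's masking scheme (from the original BERT paper: roughly 15% of token positions are selected; of those, 80% are replaced with [MASK], 10% with a random vocabulary token, and 10% left unchanged). The tiny vocabulary and sentence are invented for illustration; a real run operates on WordPiece subword tokens, not whole words.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """Toy BERT-style masking: select ~mask_prob of positions; replace
    80% of those with [MASK], 10% with a random vocab token, and keep
    10% unchanged. Returns (masked_tokens, labels), where labels[i] is
    the original token at selected positions and None elsewhere."""
    rng = rng or random.Random()
    masked = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # the model is trained to predict this token
            roll = rng.random()
            if roll < 0.8:
                masked[i] = "[MASK]"
            elif roll < 0.9:
                masked[i] = rng.choice(vocab)
            # else: leave the original token in place (10% of the time)
    return masked, labels

# Hypothetical mini-vocabulary and sentence, purely for demonstration.
vocab = ["the", "cat", "sat", "on", "mat", "dog", "ran"]
tokens = "the cat sat on the mat".split()
masked, labels = mask_tokens(tokens, vocab, rng=random.Random(1))
print(masked)
print(labels)
```

During pretraining, the loss is computed only at the selected positions (those with a non-None label), which is why the labels list keeps the original tokens there.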

Use cases

BERT-base-uncased can be used for a variety of natural language processing tasks, and is particularly useful where understanding the overall meaning and context of a sentence matters. Possible use cases include sentiment analysis, where the model is fine-tuned to classify the sentiment of a given text; text classification, where it assigns documents to categories; question answering, where it is trained to answer questions from a given context; token classification tasks such as named entity recognition; and masked language modeling, where it predicts a missing word in a sentence. Overall, BERT-base-uncased is a powerful building block that can improve the accuracy and performance of many applications and products in this domain.
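Assuming the Hugging Face transformers library (and a backend such as PyTorch) is installed, the masked language modeling use case can be tried directly with the pipeline API; this mirrors the usage snippet on the model's Hugging Face card. The example sentence is illustrative, and the first call downloads the model weights.

```python
from transformers import pipeline

# Load bert-base-uncased behind the fill-mask task.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# The model ranks the most likely tokens for the [MASK] slot.
results = unmasker("Hello I'm a [MASK] model.")
for r in results:
    # each result carries 'sequence', 'score', 'token', and 'token_str'
    print(f"{r['token_str']:>12}  {r['score']:.3f}")
```

The same pipeline object can be reused across inputs; for the other use cases above (classification, question answering), the model would first need to be fine-tuned with a task-specific head.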

fill-mask

Pricing

Property | Value
Cost per Run | $- USD
Avg Run Time | - seconds
Hardware | -

Creator Models

Model | Cost | Runs
Time Series Transformer Tourism Monthly | $? | 1,123
Bert Large Uncased Whole Word Masking Finetuned Squad | $? | 294,128
Bert Large Cased Whole Word Masking | $? | 4,430
Gpt2 Large | $? | 263,455
Xlm Roberta Large Finetuned Conll02 Dutch | $? | 378

Similar Models

Try it!

You can use this area to play around with demo applications that incorporate the Bert Base Uncased model. These demos are maintained and hosted externally by third-party creators. If you see an error, message me on Twitter.

Overview

Summary of this model and related resources.

Property | Value
Creator | huggingface
Model Name | Bert Base Uncased
Description | Pretrained model on English language using a masked language modeling (MLM)...
Tags | fill-mask
Model Link | View on HuggingFace
API Spec | View on HuggingFace
Github Link | No Github link provided
Paper Link | No paper link provided

Popularity

How popular is this model, by number of runs? How popular is the creator, by the sum of all their runs?

Property | Value
Runs | 49,596,540
Model Rank | -
Creator Rank | -

Cost

How much does it cost to run this model? How long, on average, does it take to complete a run?

Property | Value
Cost per Run | $-
Prediction Hardware | -
Average Completion Time | -