Bionlp

Rank:

Average Model Cost: $0.0000

Number of Runs: 15,243

Models by this creator

bluebert_pubmed_mimic_uncased_L-12_H-768_A-12

The bluebert_pubmed_mimic_uncased_L-12_H-768_A-12 model is a language model pre-trained on PubMed abstracts and MIMIC-III clinical notes. It uses the BERT (Bidirectional Encoder Representations from Transformers) architecture, with 12 layers, a hidden size of 768, and 12 attention heads. The model can be used for natural language processing tasks such as text classification, named entity recognition, and sentence similarity scoring.
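
As a quick illustration (not part of the original listing), the checkpoint can be loaded with the Hugging Face transformers library. The model id below is assumed from the creator and model names shown on this page:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed Hugging Face model id, inferred from this listing.
MODEL_ID = "bionlp/bluebert_pubmed_mimic_uncased_L-12_H-768_A-12"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# BlueBERT's training text was lowercased, so lowercase input is a safe choice.
text = "the patient was admitted with chest pain."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings: (batch, tokens, hidden_size=768).
print(outputs.last_hidden_state.shape)
```

These embeddings are the usual starting point for fine-tuning on downstream tasks such as classification or named entity recognition.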

Cost: $-/run

Runs: 8.3K

Platform: Huggingface

bluebert_pubmed_mimic_uncased_L-24_H-1024_A-16

BlueBERT-Large, Uncased, PubMed and MIMIC-III

Model description

A BERT model pre-trained on PubMed abstracts and clinical notes (MIMIC-III).

Intended uses & limitations

Please see https://github.com/ncbi-nlp/bluebert

Training data

We provide preprocessed PubMed texts that were used to pre-train the BlueBERT models. The corpus contains ~4,000M words extracted from the PubMed ASCII code version. Pre-trained model: https://huggingface.co/bert-large-uncased

Training procedure

- lowercasing the text
- removing special chars \x00-\x7F
- tokenizing the text using the NLTK Treebank tokenizer

A sketch of these steps is shown after this card.

Acknowledgments

This work was supported by the Intramural Research Programs of the National Institutes of Health, National Library of Medicine and Clinical Center, and by the National Library of Medicine of the National Institutes of Health under award number 4R00LM013001-01. We are also grateful to the authors of BERT and ELMo for making their data and code publicly available. We would like to thank Dr. Sun Kim for processing the PubMed texts.

Disclaimer

This tool shows the results of research conducted in the Computational Biology Branch, NCBI. The information produced on this website is not intended for direct diagnostic use or medical decision-making without review and oversight by a clinical professional. Individuals should not change their health behavior solely on the basis of information produced on this website. NIH does not independently verify the validity or utility of the information produced by this tool. If you have questions about the information produced on this website, please see a health care professional. More information about NCBI's disclaimer policy is available.
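
The code snippet referenced in the card is not reproduced on this page. A minimal sketch of the three listed preprocessing steps, assuming "removing special chars \x00-\x7F" means dropping characters outside that ASCII range (the card's phrasing is ambiguous), might look like:

```python
import re
from nltk.tokenize import TreebankWordTokenizer

def preprocess(text: str) -> str:
    # 1. lowercasing the text
    text = text.lower()
    # 2. removing special chars: assumed to mean dropping anything outside \x00-\x7F
    text = re.sub(r"[^\x00-\x7F]", " ", text)
    # 3. tokenizing the text using the NLTK Treebank tokenizer
    tokens = TreebankWordTokenizer().tokenize(text)
    return " ".join(tokens)

print(preprocess("Temperature was 38.5 °C; patient stable."))
```

This is a sketch under the stated assumptions, not the exact pipeline from the BlueBERT repository; see https://github.com/ncbi-nlp/bluebert for the authoritative scripts.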

Cost: $-/run

Runs: 4.8K

Platform: Huggingface

bluebert_pubmed_uncased_L-12_H-768_A-12

BlueBERT-Base, Uncased, PubMed

Model description

A BERT model pre-trained on PubMed abstracts.

Intended uses & limitations

Please see https://github.com/ncbi-nlp/bluebert

Training data

We provide preprocessed PubMed texts that were used to pre-train the BlueBERT models. The corpus contains ~4,000M words extracted from the PubMed ASCII code version. Pre-trained model: https://huggingface.co/bert-base-uncased

Training procedure

- lowercasing the text
- removing special chars \x00-\x7F
- tokenizing the text using the NLTK Treebank tokenizer

An example usage snippet follows this card.
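
The card again points to a code snippet that is not included here. As one illustrative use (sentence similarity scoring, mentioned for the sibling model above, and not a recipe from the card itself), sentences can be compared via mean-pooled embeddings; the model id is assumed from this listing:

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

# Assumed Hugging Face model id, inferred from this listing.
MODEL_ID = "bionlp/bluebert_pubmed_uncased_L-12_H-768_A-12"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

def embed(text: str) -> torch.Tensor:
    """Mean-pool the final hidden states over non-padding tokens."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, tokens, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)    # (1, tokens, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

a = embed("the patient denies chest pain.")
b = embed("no chest pain was reported by the patient.")
print(F.cosine_similarity(a, b).item())  # cosine score in [-1, 1]
```

Mean pooling is one common choice here; CLS pooling or a fine-tuned similarity head are equally valid alternatives.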

Cost: $-/run

Runs: 968

Platform: Huggingface
