microsoft-biomednlp-pubmedbert-base-uncased-abstract-fulltext
Version: 12

MSR BiomedBERT (abstracts + full text)

  • This model was previously named "PubMedBERT (abstracts + full text)".
  • You can either adopt the new model name "microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext" or update your transformers library to version 4.22+ if you need to keep referring to the old name; see the loading sketch below.
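As a minimal sketch (assuming transformers 4.22+ and the renamed checkpoint on the Hub), loading the model under the new name looks like this:

from transformers import AutoModelForMaskedLM, AutoTokenizer

# New Hub id; per the note above, transformers 4.22+ also resolves the old
# "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext" id to this checkpoint.
model_id = "microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)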
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general-domain corpora, such as newswire and the Web. A prevailing assumption is that even domain-specific pretraining can benefit from starting with general-domain language models. Recent work shows that for domains with abundant unlabeled text, such as biomedicine, pretraining language models from scratch yields substantial gains over continued pretraining of general-domain language models. BiomedBERT is pretrained from scratch on abstracts from PubMed and full-text articles from PubMed Central. The model achieves state-of-the-art performance on many biomedical NLP tasks and currently holds the top score on the Biomedical Language Understanding and Reasoning Benchmark (BLURB).

Citation

If you find BiomedBERT useful in your research, please cite the following paper:
@misc{pubmedbert,
  author = {Yu Gu and Robert Tinn and Hao Cheng and Michael Lucas and Naoto Usuyama and Xiaodong Liu and Tristan Naumann and Jianfeng Gao and Hoifung Poon},
  title = {Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing},
  year = {2020},
  eprint = {arXiv:2007.15779},
}

microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext is a pretrained language model available on the Hugging Face Hub, intended for the fill-mask task in the transformers library. Details on the model's architecture, hyperparameters, limitations, and biases can be found on its dedicated Model Card on the Hugging Face Hub. Here is an example API request payload you can use to obtain predictions from the model:
{
  "inputs": "[MASK] is a tumor suppressor gene."
}
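As a rough sketch (assuming the hosted Inference API endpoint under api-inference.huggingface.co and a personal access token; the requests library and the <HF_API_TOKEN> placeholder are illustrative assumptions), sending that payload from Python could look like:

import requests

API_URL = "https://api-inference.huggingface.co/models/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"
# Substitute your own Hugging Face access token for the placeholder.
headers = {"Authorization": "Bearer <HF_API_TOKEN>"}

payload = {"inputs": "[MASK] is a tumor suppressor gene."}
response = requests.post(API_URL, headers=headers, json=payload)
# The fill-mask task returns a list of candidate tokens with scores.
print(response.json())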
Model Specifications

License: MIT
Last Updated: December 2025
Provider: Hugging Face