pritamdeka-biobert-pubmed200krct
Version: 3
HuggingFace · Last updated May 2023
pritamdeka/BioBert-PubMed200kRCT is a pre-trained language model available on the Hugging Face Hub. It is designed for the text-classification task in the transformers library. Details on the model's architecture, hyperparameters, limitations, and biases can be found on its dedicated Model Card on the Hugging Face Hub. Here's an example API request payload that you can use to obtain predictions from the model:
{
  "inputs": "SAMPLE 32,441 archived appendix samples fixed in formalin and embedded in paraffin and tested for the presence of abnormal prion protein (PrP)."
}
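As a minimal sketch, the payload above can be sent to the model via the Hugging Face hosted Inference API using Python's requests library. The endpoint follows the standard https://api-inference.huggingface.co/models/&lt;model-id&gt; pattern, and YOUR_HF_TOKEN is a placeholder for your own access token:

```python
import requests

# Standard hosted Inference API endpoint for this model ID
API_URL = "https://api-inference.huggingface.co/models/pritamdeka/BioBert-PubMed200kRCT"

# YOUR_HF_TOKEN is a placeholder: substitute your own Hugging Face access token
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}

payload = {
    "inputs": "SAMPLE 32,441 archived appendix samples fixed in formalin and embedded in paraffin and tested for the presence of abnormal prion protein (PrP)."
}

response = requests.post(API_URL, headers=headers, json=payload)
# For a text-classification model, the response is a list of label/score pairs
print(response.json())
```

Since the model targets the text-classification task in transformers, it can also be run locally with `pipeline("text-classification", model="pritamdeka/BioBert-PubMed200kRCT")` instead of calling the hosted API.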
Model Specifications
Last Updated: May 2023
Publisher: HuggingFace