distilbert-base-uncased
Version: 21
distilbert-base-uncased is a pre-trained language model available on the Hugging Face Hub, offered here for the fill-mask task in the transformers library. Details on the model's architecture, hyperparameters, limitations, and biases can be found on its dedicated Model Card on the Hugging Face Hub. Here is an example API request payload you can use to obtain predictions from the model:
{
  "inputs": "Paris is the [MASK] of France."
}
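For reference, the following is a minimal sketch of sending that payload from Python. It assumes the model is served behind the Hugging Face hosted Inference API and that an access token is available in an HF_API_TOKEN environment variable; both the endpoint URL and the variable name are assumptions, not part of this page.

import os
import requests

# Assumed hosted Inference API endpoint for the model; the exact serving URL
# depends on where the model is deployed.
API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased"
headers = {"Authorization": f"Bearer {os.environ['HF_API_TOKEN']}"}

payload = {"inputs": "Paris is the [MASK] of France."}

response = requests.post(API_URL, headers=headers, json=payload)
response.raise_for_status()

# The fill-mask task returns candidate tokens for [MASK] with scores.
for candidate in response.json():
    print(candidate["token_str"], candidate["score"])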
Model Specifications
License: Apache-2.0
Last Updated: July 2023
Publisher: HuggingFace