tscholak/1wnr382e
tscholak/1wnr382e is a pre-trained language model available on the Hugging Face Hub. It is designed for the text2text-generation task in the transformers library. Details about the model's architecture, hyperparameters, limitations, and biases are documented on its dedicated Model Card on the Hugging Face Hub.
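For local experimentation, a minimal sketch using the transformers text2text-generation pipeline could look like the following. This is illustrative only: it assumes transformers and a PyTorch backend are installed, the max_length value is an assumed setting, and the input string follows the question-plus-schema serialization shown in the API payload below.

# Minimal sketch: local inference with the transformers pipeline.
# Assumes `transformers` and `torch` are installed; the first call downloads
# the model weights from the Hugging Face Hub.
from transformers import pipeline

generator = pipeline("text2text-generation", model="tscholak/1wnr382e")

# Input format: natural-language question, then the database name and a
# serialization of its tables and columns, separated by "|" characters.
question_and_schema = (
    "How many singers do we have? | concert_singer "
    "| stadium : stadium_id, location, name, capacity, highest, lowest, average "
    "| singer : singer_id, name, country, song_name, song_release_year, age, is_male "
    "| concert : concert_id, concert_name, theme, stadium_id, year "
    "| singer_in_concert : concert_id, singer_id"
)

prediction = generator(question_and_schema, max_length=512)  # max_length is an assumed setting
print(prediction[0]["generated_text"])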
Here's an example API request payload that you can use to obtain predictions from the model:
{
"inputs": "How many singers do we have? | concert_singer | stadium : stadium_id, location, name, capacity, highest, lowest, average | singer : singer_id, name, country, song_name, song_release_year, age, is_male | concert : concert_id, concert_name, theme, stadium_id, year | singer_in_concert : concert_id, singer_id"
}
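As a sketch, this payload can be sent to the hosted Inference API with any HTTP client. The endpoint below follows the standard api-inference.huggingface.co URL pattern, and HF_API_TOKEN is an assumed environment variable holding a Hugging Face access token.

# Sketch: POST the payload above to the hosted Inference API.
# Assumes `requests` is installed and HF_API_TOKEN holds a valid access token.
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/tscholak/1wnr382e"
headers = {"Authorization": f"Bearer {os.environ['HF_API_TOKEN']}"}

payload = {
    "inputs": (
        "How many singers do we have? | concert_singer "
        "| stadium : stadium_id, location, name, capacity, highest, lowest, average "
        "| singer : singer_id, name, country, song_name, song_release_year, age, is_male "
        "| concert : concert_id, concert_name, theme, stadium_id, year "
        "| singer_in_concert : concert_id, singer_id"
    )
}

response = requests.post(API_URL, headers=headers, json=payload, timeout=60)
response.raise_for_status()
print(response.json())  # typically a list like [{"generated_text": "..."}]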
Model Specifications
License: Apache-2.0
Last Updated: May 2023
Publisher: HuggingFace