nbroad-esg-bert
Version: 15
nbroad/ESG-BERT is a pre-trained language model available on the Hugging Face Hub. It is specifically designed for the text-classification task in the transformers library. Details about the model's architecture, hyperparameters, limitations, and biases can be found on the model's dedicated Model Card on the Hugging Face Hub.
Here's an example API request payload that you can use to obtain predictions from the model:
{
  "inputs": "In fiscal year 2019, we reduced our comprehensive carbon footprint for the fourth consecutive year\u2014down 35 percent compared to 2015, when Apple\u2019s carbon emissions peaked, even as net revenue increased by 11 percent over that same period. In the past year, we avoided over 10 million metric tons from our emissions reduction initiatives\u2014like our Supplier Clean Energy Program, which lowered our footprint by 4.4 million metric tons. "
}
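To send this payload to the hosted Inference API, a request like the one sketched below can be used. This is an illustrative sketch rather than an official client: the URL follows the standard https://api-inference.huggingface.co/models/{repo} pattern, and the token value is a placeholder you would replace with your own Hugging Face access token.

# Sketch of posting the payload above to the Hugging Face Inference API.
# The token is a placeholder; substitute your own access token.
import requests

API_URL = "https://api-inference.huggingface.co/models/nbroad/ESG-BERT"
headers = {"Authorization": "Bearer hf_xxx"}  # placeholder token

payload = {
    "inputs": (
        "In fiscal year 2019, we reduced our comprehensive carbon footprint "
        "for the fourth consecutive year\u2014down 35 percent compared to 2015, "
        "when Apple\u2019s carbon emissions peaked, even as net revenue increased "
        "by 11 percent over that same period."
    )
}

response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())  # e.g. a list of {"label": ..., "score": ...} predictions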
Model Specifications
Last Updated: May 2023
Publisher: Hugging Face