Template model for huggingface-inference-toolkit-gpu-small-bitnet pipelines
Template model for huggingface-inference-toolkit-cpu-small-bitnet pipelines
jinaai/ReaderLM-v2 powered by Text Generation Inference (TGI). Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the Azure ML endpoint.
BSC-LT/ALIA-40b powered by Text Generation Inference (TGI). Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the Azure ML endpoint.
BSC-LT/salamandra-7b-base-fp8 powered by Text Generation Inference (TGI). Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the Azure ML endpoint.
speakleash/Bielik-11B-v2.3-Instruct powered by Text Generation Inference (TGI). Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the Azure ML endpoint.
VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct powered by Text Generation Inference. Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the Azure ML endpoint.
FreedomIntelligence/AceGPT-v1.5-13B powered by Text Generation Inference. Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the Azure ML endpoint.
Bin12345/AutoCoder powered by Text Generation Inference. Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request. Just add your API key.
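For the TGI-powered entries above, a request is a plain JSON POST against TGI's `/generate` route. A minimal sketch in Python — the endpoint URL and key are placeholders for your own deployment, and the parameter values are illustrative, not taken from these model cards:

```python
import json

# Placeholder scoring URL; substitute the one from your own deployment.
ENDPOINT = "https://<your-endpoint>/generate"

def build_tgi_payload(prompt: str, max_new_tokens: int = 64) -> str:
    """Build the JSON body that TGI's /generate route expects."""
    return json.dumps({
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    })

payload = build_tgi_payload("Write a haiku about the sea.")
print(payload)

# To actually send it (needs the `requests` package and a live endpoint):
# import requests
# r = requests.post(ENDPOINT, data=payload,
#                   headers={"Content-Type": "application/json",
#                            "Authorization": "Bearer <your-key>"})
# print(r.json()["generated_text"])
```

The same body works for every TGI entry in this list; only the endpoint URL and credentials change per deployment.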
Alibaba-NLP/gte-large-en-v1.5 powered by Text Embeddings Inference (TEI). Text Embeddings Inference is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE, and E5.
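A TEI deployment is queried the same way, via its `/embed` route, which accepts a batch of texts and returns one embedding vector per input. A sketch with a placeholder endpoint:

```python
import json

# Placeholder URL; substitute your own TEI deployment's address.
TEI_ENDPOINT = "https://<your-endpoint>/embed"

def build_tei_payload(texts: list[str]) -> str:
    """TEI's /embed route accepts a list of inputs in a single request."""
    return json.dumps({"inputs": texts})

payload = build_tei_payload(["What is deep learning?", "What is TEI?"])
print(payload)

# POST `payload` to TEI_ENDPOINT with Content-Type: application/json;
# the response is a JSON array with one embedding vector per input text.
```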
hyunwoongko/asian-bart-ecjk is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
ToddGoldfarb/Cadet-Tiny is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the conversational task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
persiannlp/mt5-large-parsinlu-opus-translation_fa_en is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
Sonnenblume/bert-base-uncased-ancient-greek-v3 is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
oliverguhr/fullstop-punctuation-multilingual-base is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the token-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
monologg/kobigbird-bert-base is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
activebus/BERTReview is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
cointegrated/rut5-base is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
facebook/blenderbot-1B-distill is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the conversational task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
hf-internal-testing/tiny-random-AlbertForQuestionAnswering is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the question-answering task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
AidenH20/DNABERT500down is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
allenai/tk-instruct-base-def-pos is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
microsoft/tapex-base is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the table-question-answering task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
hf-internal-testing/tiny-random-RemBertForMaskedLM is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
yoshitomo-matsubara/bert-base-uncased-mnli is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
andreaskoepf/pythia-1.4b-gpt4all-pretrain is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
MoritzLaurer/DeBERTa-v3-base-mnli-fever-docnli-ling-2c is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
google/pegasus-newsroom is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the summarization task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
microsoft/DialogRPT-width is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
togethercomputer/GPT-JT-Moderation-6B is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
aubmindlab/bert-base-arabertv02-twitter is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
flax-community/t5-recipe-generation is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
MonoHime/rubert-base-cased-sentiment-new is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
EleutherAI/pythia-160m-deduped-v0 is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
ingen51/DialoGPT-medium-GPT4 is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the conversational task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
mariagrandury/roberta-base-finetuned-sms-spam-detection is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
morit/XLM-T-full-xnli is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the zero-shot-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
bhadresh-savani/distilbert-base-uncased-go-emotion is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
smanjil/German-MedBERT is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
IDEA-CCNL/Randeng-T5-77M-MultiTask-Chinese is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
yiyanghkust/finbert-fls is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
hf-internal-testing/tiny-random-AlbertForMaskedLM is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
manishiitg/distilbert-resume-parts-classify is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
facebook/opt-iml-max-1.3b is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
Shitao/RetroMAE_MSMARCO is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
nlpodyssey/bert-multilingual-uncased-geo-countries-headlines is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
Davlan/xlm-roberta-large-ner-hrl is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the token-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
KoboldAI/GPT-Neo-2.7B-Shinen is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
smilegate-ai/kor_unsmile is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
etalab-ia/camembert-base-squadFR-fquad-piaf is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the question-answering task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
paulagarciaserrano/roberta-depression-detection is a pre-trained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the original model card.
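All of the transformers entries above load through the same `pipeline` API: pass the task string named in the entry plus the model ID. A minimal sketch using the fill-mask task — the checkpoint shown is one of the tiny hf-internal-testing entries above, chosen only to keep the download small; any fill-mask model from this list can be substituted (requires `pip install transformers torch` and network access for the first download):

```python
from transformers import pipeline

# Tiny test checkpoint from the catalog above; any fill-mask model works here.
fill_mask = pipeline(
    "fill-mask",
    model="hf-internal-testing/tiny-random-AlbertForMaskedLM",
)

# The pipeline returns one candidate dict per predicted token, best first.
predictions = fill_mask("Paris is the [MASK] of France.")
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

For the other tasks, only the task string and input shape change (e.g. `pipeline("text-classification", model=...)` takes a plain string and returns label/score pairs).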