Hugging Face
Hosts a vast repository of open-source models and tools for natural language processing tasks.
Total Models: 10860
huggingface-inference-toolkit-gpu-small-bitnet

Template model for huggingface-inference-toolkit-gpu-small-bitnet pipelines

huggingface-inference-toolkit-cpu-small-bitnet

Template model for huggingface-inference-toolkit-cpu-small-bitnet pipelines

jinaai-readerlm-v2

jinaai/ReaderLM-v2, powered by Text Generation Inference (TGI). Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the deployed Azure endpoint.

text-generation
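Each TGI-backed entry above accepts generation requests over REST. Below is a minimal sketch, assuming a deployed endpoint exposing TGI's standard /generate route; the endpoint URL and API key are placeholders you must replace with your own deployment's values:

```python
import json
import urllib.request

# Placeholder values -- substitute your deployment's endpoint URL and key.
ENDPOINT_URL = "https://<your-endpoint>/generate"
API_KEY = "<your-api-key>"

def build_request(prompt: str, max_new_tokens: int = 64) -> urllib.request.Request:
    """Build a POST request for a TGI /generate route."""
    body = json.dumps({
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Send the request and return the generated text."""
    req = build_request(prompt, max_new_tokens)
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["generated_text"]
```

The same JSON payload shape ({"inputs": ..., "parameters": {...}}) applies to the cURL examples these entries reference.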
bsc-lt-alia-40b

BSC-LT/ALIA-40b, powered by Text Generation Inference (TGI). Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the deployed Azure endpoint.

text-generation
bsc-lt-salamandra-7b-base-fp8

BSC-LT/salamandra-7b-base-fp8, powered by Text Generation Inference (TGI). Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the deployed Azure endpoint.

text-generation
speakleash-bielik-11b-v2-3-instruct

speakleash/Bielik-11B-v2.3-Instruct, powered by Text Generation Inference (TGI). Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the deployed Azure endpoint.

text-generation
vagosolutions-llama-3.1-sauerkrautlm-70b-instruct

VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct, powered by Text Generation Inference. Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the deployed Azure endpoint.

text-generation
freedomintelligence-acegpt-v1.5-13b

FreedomIntelligence/AceGPT-v1.5-13B, powered by Text Generation Inference. Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the deployed Azure endpoint.

text-generation
bin12345-autocoder

Bin12345/AutoCoder, powered by Text Generation Inference. Example Notebook · Original Model Card. Send Request: you can use cURL or any REST client to send a request to the deployed Azure endpoint.

text-generation
alibaba-nlp-gte-large-en-v1.5

Alibaba-NLP/gte-large-en-v1.5, powered by Text Embeddings Inference. Text Embeddings Inference (TEI) is a toolkit for deploying and serving open-source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE, and E5.

sentence-similarity
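The sentence-similarity entry above is served by TEI, which exposes an /embed route that returns one embedding vector per input text. A minimal sketch follows, assuming a deployed endpoint at a placeholder URL; cosine similarity over the returned vectors gives the similarity score:

```python
import json
import math
import urllib.request

# Placeholder value -- substitute your TEI deployment's URL.
TEI_URL = "https://<your-endpoint>/embed"

def embed(texts):
    """Request embeddings for a batch of texts from a TEI /embed route."""
    req = urllib.request.Request(
        TEI_URL,
        data=json.dumps({"inputs": texts}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())  # list of embedding vectors

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

For example, cosine_similarity(*embed(["first sentence", "second sentence"])) would score how semantically close the two sentences are.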
hyunwoongko-asian-bart-ecjk

hyunwoongko/asian-bart-ecjk is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text2text-generation
toddgoldfarb-cadet-tiny

ToddGoldfarb/Cadet-Tiny is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the conversational task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

conversational
persiannlp-mt5-large-parsinlu-opus-translation-fa-en

persiannlp/mt5-large-parsinlu-opus-translation-fa-en is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text2text-generation
sonnenblume-bert-base-uncased-ancient-greek-v3

Sonnenblume/bert-base-uncased-ancient-greek-v3 is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

fill-mask
oliverguhr-fullstop-punctuation-multilingual-base

oliverguhr/fullstop-punctuation-multilingual-base is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the token-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

token-classification
monologg-kobigbird-bert-base

monologg/kobigbird-bert-base is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

fill-mask
activebus-bert-review

activebus/BERT-Review is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

fill-mask
cointegrated-rut5-base

cointegrated/rut5-base is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text2text-generation
facebook-blenderbot-1b-distill

facebook/blenderbot-1B-distill is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the conversational task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

conversational
hf-internal-testing-tiny-random-albertforquestionanswering

hf-internal-testing/tiny-random-AlbertForQuestionAnswering is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the question-answering task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

question-answering
aidenh20-dnabert-500down

AidenH20/DNABERT-500down is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
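The text-classification entries in this catalog return a label plus a score; the score comes from applying a softmax to the model's raw output logits. A minimal pure-Python sketch of that step (the logit values below are illustrative only):

```python
import math

def softmax(logits):
    """Numerically stable softmax: maps raw classifier logits to probabilities."""
    m = max(logits)  # subtract the max so exp() cannot overflow
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits for a three-class classifier; the largest logit
# yields the highest probability, and the probabilities sum to 1.
probs = softmax([2.0, 1.0, 0.1])
```

Libraries like transformers perform this conversion internally when a pipeline reports a score for the predicted label.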
allenai-tk-instruct-base-def-pos

allenai/tk-instruct-base-def-pos is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text2text-generation
microsoft-tapex-base

microsoft/tapex-base is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the table-question-answering task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

table-question-answering
hf-internal-testing-tiny-random-rembertformaskedlm

hf-internal-testing/tiny-random-RemBertForMaskedLM is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

fill-mask
yoshitomo-matsubara-bert-base-uncased-mnli

yoshitomo-matsubara/bert-base-uncased-mnli is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
andreaskoepf-pythia-1.4b-gpt4all-pretrain

andreaskoepf/pythia-1.4b-gpt4all-pretrain is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-generation
moritzlaurer-deberta-v3-base-mnli-fever-docnli-ling-2c

MoritzLaurer/DeBERTa-v3-base-mnli-fever-docnli-ling-2c is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
google-pegasus-newsroom

google/pegasus-newsroom is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the summarization task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

summarization
microsoft-dialogrpt-width

microsoft/DialogRPT-width is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
togethercomputer-gpt-jt-moderation-6b

togethercomputer/GPT-JT-Moderation-6B is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-generation
aubmindlab-bert-base-arabertv02-twitter

aubmindlab/bert-base-arabertv02-twitter is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

fill-mask
flax-community-t5-recipe-generation

flax-community/t5-recipe-generation is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text2text-generation
monohime-rubert-base-cased-sentiment-new

MonoHime/rubert-base-cased-sentiment-new is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
eleutherai-pythia-160m-deduped-v0

EleutherAI/pythia-160m-deduped-v0 is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-generation
ingen51-dialogpt-medium-gpt4

ingen51/DialoGPT-medium-GPT4 is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the conversational task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

conversational
mariagrandury-roberta-base-finetuned-sms-spam-detection

mariagrandury/roberta-base-finetuned-sms-spam-detection is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
morit-xlm-t-full-xnli

morit/XLM-T-full-xnli is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the zero-shot-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

zero-shot-classification
bhadresh-savani-distilbert-base-uncased-go-emotion

bhadresh-savani/distilbert-base-uncased-go-emotion is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
smanjil-german-medbert

smanjil/German-MedBERT is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

fill-mask
idea-ccnl-randeng-t5-77m-multitask-chinese

IDEA-CCNL/Randeng-T5-77M-MultiTask-Chinese is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text2text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text2text-generation
yiyanghkust-finbert-fls

yiyanghkust/finbert-fls is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
hf-internal-testing-tiny-random-albertformaskedlm

hf-internal-testing/tiny-random-AlbertForMaskedLM is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

fill-mask
manishiitg-distilbert-resume-parts-classify

manishiitg/distilbert-resume-parts-classify is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
facebook-opt-iml-max-1.3b

facebook/opt-iml-max-1.3b is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-generation
shitao-retromae-msmarco

Shitao/RetroMAE-MSMARCO is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the fill-mask task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

fill-mask
nlpodyssey-bert-multilingual-uncased-geo-countries-headlines

nlpodyssey/bert-multilingual-uncased-geo-countries-headlines is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
davlan-xlm-roberta-large-ner-hrl

Davlan/xlm-roberta-large-ner-hrl is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the token-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

token-classification
koboldai-gpt-neo-2.7b-shinen

KoboldAI/GPT-Neo-2.7B-Shinen is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-generation task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-generation
smilegate-ai-kor-unsmile

smilegate-ai/kor-unsmile is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification
etalab-ia-camembert-base-squadfr-fquad-piaf

etalab-ia/camembert-base-squadFR-fquad-piaf is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the question-answering task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

question-answering
paulagarciaserrano-roberta-depression-detection

paulagarciaserrano/roberta-depression-detection is a pretrained language model available on the Hugging Face Hub. It's specifically designed for the text-classification task in the transformers library. If you want to learn more about the model's architecture, hyperparameters, limitations, and biases, you can find this information in the model card.

text-classification