Azure AI Foundry / Catalog / Publishers / Deci AI
Deci AI
Deci AI specializes in optimizing deep learning models for faster inference and deployment efficiency.
Total Models: 4
Deci-DeciCoder-1b

The Model Card for DeciCoder 1B provides details about a 1-billion-parameter, decoder-only code completion model developed by Deci. The model was trained on the Python, Java, and JavaScript subsets of the StarCoder training dataset and uses Grouped Query Attention with a context window of 2048 tokens.

text-generation
Deci-DeciLM-7B

DeciLM-7B is a decoder-only text generation model with 7.04 billion parameters, released by Deci under the Apache 2.0 license. It is the top-performing 7B base language model on the Open LLM Leaderboard and uses variable Grouped-Query Attention (GQA) to achieve a superior balance between accuracy and computational efficiency.

text-generation
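Both DeciCoder and DeciLM rely on Grouped-Query Attention, in which several query heads share a single key/value head to cut the size of the KV cache. The following is a minimal NumPy sketch of the idea, not Deci's implementation; all function and variable names are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def grouped_query_attention(q, k, v, n_kv_heads):
    """Grouped-Query Attention sketch.

    q: (n_q_heads, seq, d) query heads.
    k, v: (n_kv_heads, seq, d) shared key/value heads.
    Each group of n_q_heads // n_kv_heads query heads attends over the
    same K/V head. n_kv_heads == 1 gives multi-query attention;
    n_kv_heads == n_q_heads recovers standard multi-head attention.
    """
    n_q_heads, seq, d = q.shape
    group = n_q_heads // n_kv_heads
    # Broadcast each K/V head across its query group.
    k = np.repeat(k, group, axis=0)
    v = np.repeat(v, group, axis=0)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    return softmax(scores) @ v
```

Because only `n_kv_heads` key/value heads are cached per layer, the KV cache shrinks by a factor of `n_q_heads / n_kv_heads` relative to standard multi-head attention.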
deci-decidiffusion-v1-0

DeciDiffusion 1.0 is an 820-million-parameter latent diffusion model designed for text-to-image generation. Initially trained on the LAION-v2 dataset and fine-tuned on the LAION-ART dataset, the model's training involved advanced techniques to improve speed, training performance, and output quality.

text-to-image
Deci-DeciLM-7B-instruct

DeciLM-7B-instruct is a model for short-form instruction following, built by LoRA fine-tuning on the SlimOrca dataset. It is a derivative of the recently released DeciLM-7B language model, a pretrained, high-efficiency generative text model with 7 billion parameters.

text-generation
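LoRA fine-tuning, as used to produce DeciLM-7B-instruct, freezes the pretrained weight matrices and learns only a low-rank update. A minimal NumPy sketch of the forward pass through one LoRA-adapted linear layer follows; the names and the `alpha`/`r` scaling convention are illustrative, not Deci's training code.

```python
import numpy as np

def lora_forward(x, W, A, B, alpha):
    """Forward pass through a LoRA-adapted linear layer.

    W: frozen pretrained weight, shape (d_out, d_in).
    A: trainable down-projection, shape (r, d_in).
    B: trainable up-projection, shape (d_out, r).
    Only A and B are updated during fine-tuning; the effective
    weight is W + (alpha / r) * B @ A.
    """
    r = A.shape[0]
    delta = (alpha / r) * (B @ A)   # low-rank update, rank <= r
    return x @ (W + delta).T
```

With `B` initialized to zero (the usual convention), the adapted layer starts out identical to the frozen base layer, so fine-tuning begins from the pretrained model's behavior.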
© 2025 Microsoft Corporation. All rights reserved.