mistral-community/Mixtral-8x22B-v0.1
Version: 6
The Mixtral-8x22B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts.
Mixtral-8x22B-v0.1 is a pretrained base model and therefore does not have any moderation mechanisms.
Evaluation Results
Open LLM Leaderboard Evaluation Results. Detailed results can be found here.

Metric | Value |
---|---|
Avg. | 74.46 |
AI2 Reasoning Challenge (25-shot) | 70.48 |
HellaSwag (10-shot) | 88.73 |
MMLU (5-shot) | 77.81 |
TruthfulQA (0-shot) | 51.08 |
Winogrande (5-shot) | 84.53 |
GSM8k (5-shot) | 74.15 |
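The reported average is the arithmetic mean of the six benchmark scores; a quick check in Python:

```python
# Benchmark scores from the leaderboard table above.
scores = {
    "ARC (25-shot)": 70.48,
    "HellaSwag (10-shot)": 88.73,
    "MMLU (5-shot)": 77.81,
    "TruthfulQA (0-shot)": 51.08,
    "Winogrande (5-shot)": 84.53,
    "GSM8k (5-shot)": 74.15,
}

# Mean over all six benchmarks, rounded to two decimals.
avg = round(sum(scores.values()) / len(scores), 2)
print(avg)  # 74.46, matching the "Avg." row
```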
Inference samples
Inference type | Python sample (Notebook) | CLI with YAML |
---|---|---|
Real time | text-generation-online-endpoint.ipynb | text-generation-online-endpoint.sh |
Batch | text-generation-batch-endpoint.ipynb | coming soon |
Sample inputs and outputs
Sample input
```json
{
  "input_data": {
    "input_string": [
      "What is your favourite condiment?",
      "Do you have mayonnaise recipes?"
    ],
    "parameters": {
      "max_new_tokens": 100,
      "do_sample": true,
      "return_full_text": false
    }
  }
}
```
Sample output
```json
[
  {
    "0": "\n\nDoes Hellmann's Mayonnaise Mallows really exist?\n\nThis is a difficult one because I want to pick Orkney ice cream which is unbelievable but I am also drawn to Hellmann's Mayonnaise Mallows (yeah, they really do exist) which I recently tried for the first time in California.\n\nThey were exactly how I expected them to taste – like marshmallows made from mayonnaise. I can'",
    "1": " I would imagine that the ingredients consist, at least in large part, of oil and cream [suggest edit]. However, I'm interested in baking mayonnaise into food, which means I'm worried that 50% of mayonnaise is just going to turn into oil and get absorbed by whatever it's cooked with [suggest edit] [suggest edit]. I thought that perhaps there might be a different recipe for mayonnaise which could be used specifically to with"
  }
]
```
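A real-time request can be posted to a deployed online endpoint using the input schema shown above. The sketch below builds the request body and sends it with Python's standard library; the endpoint URL and API key are placeholders (assumptions), to be replaced with the values from your own deployment:

```python
import json
import urllib.request

# Hypothetical deployment values -- replace with your endpoint's URL and key.
ENDPOINT_URL = "https://<your-endpoint>.example.com/score"
API_KEY = "<your-api-key>"


def build_payload(prompts, max_new_tokens=100):
    """Build a request body matching the sample input schema above."""
    return {
        "input_data": {
            "input_string": prompts,
            "parameters": {
                "max_new_tokens": max_new_tokens,
                "do_sample": True,
                "return_full_text": False,
            },
        }
    }


def score(prompts):
    """POST the payload to the endpoint and return the parsed JSON response."""
    body = json.dumps(build_payload(prompts)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The response is a list of objects keyed by prompt index, as in the sample output above.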
Model Specifications
Quality Index: 0.12
License: Apache-2.0
Last Updated: December 2024
Publisher: Mistral AI
Languages: 6 Languages