mistralai-Mixtral-8x22B-v0-1
Version: 5
The Mixtral-8x22B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts (SMoE) model.
Mixtral-8x22B-v0.1 is a pretrained base model and therefore does not have any moderation mechanisms.
Evaluation Results
Open LLM Leaderboard Evaluation Results. Detailed results can be found here.

Metric | Value |
---|---|
Avg. | 74.46 |
AI2 Reasoning Challenge (25-Shot) | 70.48 |
HellaSwag (10-Shot) | 88.73 |
MMLU (5-Shot) | 77.81 |
TruthfulQA (0-shot) | 51.08 |
Winogrande (5-shot) | 84.53 |
GSM8k (5-shot) | 74.15 |
Inference samples
Inference type | Python sample (Notebook) | CLI with YAML |
---|---|---|
Real time | text-generation-online-endpoint.ipynb | text-generation-online-endpoint.sh |
Batch | text-generation-batch-endpoint.ipynb | coming soon |
Sample inputs and outputs
Sample input
{
    "input_data": {
        "input_string": [
            "What is your favourite condiment?",
            "Do you have mayonnaise recipes?"
        ],
        "parameters": {
            "max_new_tokens": 100,
            "do_sample": true,
            "return_full_text": false
        }
    }
}
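The request above can be built and serialized in Python before sending it to a deployed endpoint. A minimal sketch, assuming an online scoring endpoint of the kind the notebooks above target; the `scoring_uri` and `api_key` names are hypothetical placeholders, not values from this card:

```python
import json

# Build the request payload matching the sample input above.
payload = {
    "input_data": {
        "input_string": [
            "What is your favourite condiment?",
            "Do you have mayonnaise recipes?",
        ],
        "parameters": {
            "max_new_tokens": 100,      # cap on tokens generated per prompt
            "do_sample": True,          # sample instead of greedy decoding
            "return_full_text": False,  # return only the completion, not the prompt
        },
    }
}

body = json.dumps(payload)

# The serialized body would then be POSTed to the endpoint, e.g. (hypothetical):
# requests.post(scoring_uri, data=body,
#               headers={"Authorization": f"Bearer {api_key}",
#                        "Content-Type": "application/json"})
print(body[:50])
```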
Sample output
[
    {
        "0": "\n\nDoes Hellmann's Mayonnaise Mallows really exist?\n\nThis Punctuation Day Sign Maker was a playful way to celebrate Punctuation Day and to attempt to get more people involved in enjoyable punctuation practice that was creative. In this activity, learners were presented with a list of over 20 common punctuation or word processing symbols. They chose their favourite and then created a punctuation sign around it.\n\n## Preparation for this Pun",
        "1": " I would imagine that the ingredients consist, at least in large part, of oil and creamy things. Then there are eggs to hold it all together, salt and pepper for taste, and perhaps other flavourings depending on the tradition of any mayonnaise makers who attract your custom. A suitable theory of mayonnaise should surely tell us the function and contribution of these ingredients in a good mayonnaise.\n\nThe distinction between theories and recipes has been a theme in economics and in"
    }
]
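As the sample shows, the response is a list containing a dict that maps each prompt's index (as a string key) back to its generated text. A small sketch of pairing completions with their prompts under that assumption; the placeholder completion strings below are illustrative, not real model output:

```python
# Response shape per the sample output above: a list holding one dict
# keyed by prompt index as a string.
response = [
    {
        "0": "First generated completion...",
        "1": "Second generated completion...",
    }
]

prompts = [
    "What is your favourite condiment?",
    "Do you have mayonnaise recipes?",
]

# Sort keys numerically so completions line up with the prompt order.
completions = response[0]
paired = [
    (prompts[int(k)], text)
    for k, text in sorted(completions.items(), key=lambda kv: int(kv[0]))
]
for prompt, text in paired:
    print(f"{prompt!r} -> {text}")
```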
Model Specifications
License: Apache-2.0
Last Updated: December 2024
Publisher: Mistral AI
Languages: 6 Languages