🟢 : Pretrained models: new base models that have been pretrained on a given corpus.
🔶 : Domain-specific fine-tuned models: pretrained models further fine-tuned on domain-specific datasets to improve performance.
💬 : Chat models: chat-style fine-tunes produced with IFT (instruction fine-tuning on datasets of task instructions), RLHF (reinforcement learning from human feedback), DPO (direct preference optimization, which slightly changes the model loss with an added policy term; see the objective written out below), or similar methods.
🤝 : Base merges and MoErges: models built by merging or MoErging multiple models, with no additional fine-tuning. If you find a model without an icon, feel free to open an issue so its information can be added.
❓ : Unknown
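For reference, the DPO objective mentioned in the 💬 category is the following pairwise loss (the standard form from Rafailov et al., 2023; the notation here is ours, not part of the leaderboard):

$$
\mathcal{L}_{\mathrm{DPO}}(\pi_\theta; \pi_{\mathrm{ref}}) = -\,\mathbb{E}_{(x, y_w, y_l) \sim \mathcal{D}} \left[ \log \sigma\!\left( \beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)} \right) \right]
$$

where $y_w$ and $y_l$ are the preferred and rejected responses, $\pi_{\mathrm{ref}}$ is the frozen reference policy, and $\beta$ controls how far the fine-tuned policy may drift from the reference. This added policy term is what distinguishes DPO from plain instruction fine-tuning.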
| Model | Type | Params (×10⁸) | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Architecture |
|---|---|---|---|---|---|---|---|---|---|---|
| CCK_Asura_v2 📑 | 🔶 | 689.8 | 73.62 | 70.82 | 88.09 | 74.72 | 56.97 | 85.24 | 65.88 | LlamaForCausalLM |
| Qwen-72B ✅ 📑 | 🟢 | 722.9 | 73.6 | 65.19 | 85.94 | 77.37 | 60.19 | 82.48 | 70.43 | QWenLMHeadModel |
| Experiment7-7B 📑 | 🔶 | 89.9 | 73.55 | 71.84 | 88.04 | 65.25 | 70.59 | 80.82 | 64.75 | MistralForCausalLM |
| SOLAR-10.7b-Instruct-dpo 📑 | 🔶 | 107.3 | 73.54 | 71.76 | 88.08 | 66.06 | 71.98 | 82.32 | 61.03 | LlamaForCausalLM |
| Experiment8-7B 📑 | 🔶 | 89.9 | 73.47 | 72.1 | 88.13 | 65.25 | 70.25 | 80.66 | 64.44 | MistralForCausalLM |
| Pearl-7B-0210-dare 📑 | 🤝 | 72.4 | 73.46 | 70.9 | 88.8 | 61.69 | 71.46 | 84.53 | 63.38 | MistralForCausalLM |
| Mixtral-8x7B-Instruct-v0.1-DPO 📑 | 💬 | 467 | 73.44 | 69.8 | 87.83 | 71.05 | 69.18 | 81.37 | 61.41 | MixtralForCausalLM |
| neuronovo-9B-v0.4 📑 | 💬 | 89.9 | 73.42 | 72.44 | 88.33 | 65.24 | 71.07 | 80.66 | 62.77 | MistralForCausalLM |
| Mixtral_7Bx5_MoE_30B 📑 | 🔶 | 297.9 | 73.39 | 69.97 | 86.82 | 64.42 | 65.97 | 80.98 | 72.18 | MixtralForCausalLM |
| Experiment9-7B 📑 | 🔶 | 89.9 | 73.39 | 72.01 | 88.06 | 65.32 | 70.42 | 80.74 | 63.76 | MistralForCausalLM |
| Experiment1-7B 📑 | 🔶 | 89.9 | 73.39 | 72.53 | 88.17 | 65.28 | 69.98 | 80.82 | 63.53 | MistralForCausalLM |
| Experiment2-7B 📑 | 🔶 | 89.9 | 73.38 | 72.18 | 88.15 | 65.1 | 69.97 | 81.22 | 63.68 | MistralForCausalLM |
| Experiment4-7B 📑 | 🔶 | 89.9 | 73.38 | 72.18 | 88.09 | 65.03 | 70.39 | 81.14 | 63.46 | MistralForCausalLM |
| Daredevil-7B 📑 | 🔶 | 72.4 | 73.36 | 69.37 | 87.17 | 65.3 | 64.09 | 81.29 | 72.93 | MistralForCausalLM |
| Nous-Hermes-2-Mixtral-8x7B-DPO 📑 | 🔶 | 467 | 73.35 | 71.08 | 87.29 | 72.17 | 54.83 | 83.11 | 71.65 | MixtralForCausalLM |
| multimaster-7b-v2 📑 | 🔶 | 354.3 | 73.33 | 70.48 | 87.59 | 65.09 | 60.63 | 84.29 | 71.87 | MixtralForCausalLM |
| Nous-Hermes-2-MoE-2x34B 📑 | 🔶 | 608.1 | 73.3 | 66.64 | 85.73 | 76.49 | 58.08 | 83.35 | 69.52 | MixtralForCausalLM |
| v-alpha-tross 📑 | 💬 | 689.8 | 73.28 | 71.93 | 86.82 | 70.38 | 65.21 | 83.58 | 61.79 | LlamaForCausalLM |
| SUS-Chat-34B 📑 | 💬 | 340 | 73.22 | 66.3 | 83.91 | 76.41 | 57.04 | 83.5 | 72.18 | LlamaForCausalLM |
| SOLAR-10.7B-NahIdWin 📑 | 🔶 | 107.3 | 73.21 | 64.51 | 85.67 | 64.17 | 76.73 | 80.51 | 67.7 | LlamaForCausalLM |
| 7Bx4_DPO 📑 | 💬 | 241.5 | 73.2 | 69.37 | 86.89 | 64.73 | 65.66 | 80.58 | 71.95 | MixtralForCausalLM |
| notus-8x7b-experiment 📑 | 💬 | 467 | 73.18 | 70.99 | 87.73 | 71.33 | 65.79 | 81.61 | 61.64 | Unknown |
| v-alpha-tross 📑 | 🔶 | 689.8 | 73.16 | 71.84 | 86.84 | 70.44 | 65.22 | 83.11 | 61.49 | LlamaForCausalLM |
| Nous-Hermes-2-Mixtral-8x7B-DPO 📑 | 💬 | 467 | 73.12 | 71.42 | 87.21 | 72.28 | 54.53 | 82.64 | 70.66 | MixtralForCausalLM |
| multimaster-7b-v3 📑 | 🔶 | 354.3 | 73.07 | 70.39 | 87.65 | 65.07 | 59.7 | 84.06 | 71.57 | MixtralForCausalLM |
| notux-8x7b-v1-epoch-2 📑 | 💬 | 0 | 73.05 | 70.65 | 87.8 | 71.43 | 65.97 | 82.08 | 60.35 | Unknown |
| 34b-beta 📑 | 🔶 | 343.9 | 73.04 | 70.56 | 84.2 | 85.6 | 58.38 | 81.29 | 58.23 | LlamaForCausalLM |
| Marcoro14-7B-ties 📑 | 🔶 | 72.4 | 73.01 | 69.8 | 87.13 | 65.11 | 63.54 | 81.61 | 70.89 | Unknown |
| 7Bx4_DPO_2e 📑 | 💬 | 241.5 | 72.99 | 68.94 | 86.8 | 64.5 | 65.6 | 80.74 | 71.34 | MixtralForCausalLM |
| notux-8x7b-v1 📑 | 🔶 | 467 | 72.97 | 70.65 | 87.72 | 71.39 | 66.21 | 80.74 | 61.11 | MixtralForCausalLM |
| HuginnV5.5-12.6B 📑 | 🔶 | 129.1 | 72.93 | 72.01 | 86.7 | 64.5 | 70.45 | 81.29 | 62.62 | MistralForCausalLM |
| SauerkrautLM-Mixtral-8x7B-Instruct 📑 | 🔶 | 467 | 72.89 | 70.48 | 87.75 | 71.37 | 65.71 | 81.22 | 60.8 | MixtralForCausalLM |
| Severus-7B-DPO 📑 | 🔶 | 72.4 | 72.81 | 70.22 | 87.09 | 64.93 | 64.41 | 80.66 | 69.52 | MistralForCausalLM |
| MPOMixtral-8x7B-Instruct-v0.1 📑 | 🔶 | 467 | 72.8 | 70.99 | 87.95 | 70.26 | 66.52 | 82.56 | 58.53 | MixtralForCausalLM |
| 19B_TRUTH_DPO 📑 | 🔶 | 191.9 | 72.8 | 71.67 | 88.63 | 65.78 | 72.23 | 82.16 | 56.33 | MixtralForCausalLM |
| CCK_Gony_v3.3 📑 | 🔶 | 467 | 72.76 | 70.39 | 87.88 | 71.43 | 67.41 | 81.22 | 58.23 | MixtralForCausalLM |
| Pearl-7B-slerp 📑 | 🤝 | 72.4 | 72.75 | 68.0 | 87.16 | 64.04 | 62.35 | 81.29 | 73.62 | MistralForCausalLM |
| SauerkrautLM-Mixtral-8x7B-Instruct 📑 | 🔶 | 467 | 72.73 | 70.56 | 87.74 | 71.08 | 65.72 | 81.45 | 59.82 | MixtralForCausalLM |
| TenyxChat-8x7B-v1 📑 | 🔶 | 467 | 72.72 | 69.71 | 87.76 | 71.12 | 65.42 | 81.22 | 61.11 | MixtralForCausalLM |
| Mixtral-8x7B-Instruct-v0.1 📑 | 🔶 | 467 | 72.7 | 70.14 | 87.55 | 71.4 | 64.98 | 81.06 | 61.11 | MixtralForCausalLM |
| SJ-SOLAR-10.7b-DPO 📑 | 💬 | 108.6 | 72.67 | 68.26 | 86.95 | 66.73 | 67.74 | 84.21 | 62.09 | LlamaForCausalLM |
| garten2-7b 📑 | 🔶 | 72.4 | 72.65 | 69.37 | 87.54 | 65.44 | 59.5 | 84.69 | 69.37 | MistralForCausalLM |
| Fimbulvetr-11B-v2-Test-14 📑 | 🔶 | 107.3 | 72.64 | 70.05 | 87.79 | 66.78 | 63.43 | 82.95 | 64.82 | LlamaForCausalLM |
| Mixtral-8x7B-Instruct-v0.1 📑 | 💬 | 467 | 72.62 | 70.22 | 87.63 | 71.16 | 64.58 | 81.37 | 60.73 | MixtralForCausalLM |
| Severus-7B 📑 | 🔶 | 72.4 | 72.58 | 68.43 | 86.89 | 65.2 | 61.36 | 80.9 | 72.71 | MistralForCausalLM |
| KuroMitsu-11B 📑 | 🔶 | 110 | 72.58 | 70.31 | 88.07 | 66.66 | 61.36 | 84.69 | 64.37 | LlamaForCausalLM |
| PiVoT-SUS-RP 📑 | 💬 | 343.9 | 72.57 | 66.55 | 84.23 | 76.23 | 54.57 | 83.35 | 70.51 | LlamaForCausalLM |
| Marcoroni-7B-v3 📑 | 🔶 | 70 | 72.53 | 69.45 | 86.78 | 65.0 | 60.4 | 81.45 | 72.1 | Unknown |
| Marcoroni-v3-neural-chat-v3-3-Slerp 📑 | 🔶 | 72.4 | 72.51 | 68.77 | 86.55 | 64.51 | 62.7 | 80.74 | 71.8 | Unknown |
| bagel-dpo-8x7b-v0.2 📑 | 🔶 | 467 | 72.49 | 72.1 | 86.41 | 70.27 | 72.83 | 83.27 | 50.04 | MixtralForCausalLM |
| Mistral-T5-7B-v1 📑 | 🔶 | 70 | 72.47 | 68.6 | 86.3 | 64.62 | 61.86 | 80.27 | 73.16 | MistralForCausalLM |
| Kunoichi-DPO-v2-7B 📑 | 🔶 | 72.4 | 72.46 | 69.62 | 87.44 | 64.94 | 66.06 | 80.82 | 65.88 | MistralForCausalLM |
| Instruct_Mixtral-8x7B-v0.1_Dolly15K 📑 | 🔶 | 467 | 72.44 | 69.28 | 87.59 | 70.96 | 64.83 | 82.56 | 59.44 | MixtralForCausalLM |
| Kunoichi-DPO-v2-7B 📑 | 🔶 | 72.4 | 72.4 | 69.37 | 87.42 | 64.83 | 66.0 | 80.74 | 66.03 | MistralForCausalLM |
| laserxtral 📑 | 🔶 | 241.5 | 72.34 | 69.03 | 86.76 | 64.68 | 63.8 | 80.03 | 69.75 | MixtralForCausalLM |
| mindy-7b 📑 | 🔶 | 72.4 | 72.34 | 69.11 | 86.57 | 64.69 | 60.89 | 81.06 | 71.72 | Unknown |
| supermario-v2 📑 | 🔶 | 72.4 | 72.34 | 68.52 | 86.51 | 64.88 | 60.58 | 81.37 | 72.18 | Unknown |
| openbuddy-deepseek-67b-v15.2 📑 | 🔶 | 674.2 | 72.33 | 68.6 | 86.37 | 71.5 | 56.2 | 84.45 | 66.87 | LlamaForCausalLM |
| supermario-slerp 📑 | 🔶 | 72.4 | 72.32 | 68.94 | 86.58 | 64.93 | 60.11 | 81.29 | 72.1 | Unknown |
| piccolo-math-2x7b 📑 | 🤝 | 128.8 | 72.32 | 69.11 | 87.27 | 63.69 | 63.86 | 79.87 | 70.13 | MixtralForCausalLM |
| CCK_Gony_v0.1 📑 | 🔶 | 467 | 72.32 | 70.05 | 87.27 | 71.21 | 63.23 | 80.35 | 61.79 | MixtralForCausalLM |
| Solar-10.7B-SLERP 📑 | 🔶 | 107.3 | 72.31 | 70.73 | 87.87 | 65.77 | 65.72 | 82.48 | 61.26 | LlamaForCausalLM |
| yi-34B-v3 📑 | 💬 | 343.9 | 72.26 | 67.06 | 85.11 | 75.8 | 57.54 | 83.5 | 64.52 | LlamaForCausalLM |
| Fimbulvetr-10.7B-v1 📑 | 🔶 | 107.3 | 72.25 | 68.94 | 87.27 | 66.59 | 60.54 | 83.5 | 66.64 | LlamaForCausalLM |
| Kunoichi-DPO-7B 📑 | 🔶 | 72.4 | 72.24 | 69.62 | 87.14 | 64.79 | 67.31 | 80.58 | 63.99 | MistralForCausalLM |
| supermario-slerp-v3 📑 | 🤝 | 72.4 | 72.22 | 69.28 | 86.71 | 65.11 | 61.77 | 80.51 | 69.98 | MistralForCausalLM |
| LeoScorpius-7B 📑 | 🔶 | 72.4 | 72.21 | 69.28 | 87.01 | 65.04 | 63.95 | 81.53 | 66.41 | MistralForCausalLM |
| CCK_Gony_v3.1 📑 | 🔶 | 467 | 72.2 | 69.62 | 87.45 | 71.2 | 64.17 | 81.14 | 59.59 | MixtralForCausalLM |
| grindin 📑 | 💬 | 0 | 72.18 | 69.88 | 87.02 | 64.98 | 59.34 | 80.9 | 70.96 | Unknown |
| Mistral_7B_SFT_DPO_v0 📑 | 💬 | 72.4 | 72.17 | 66.3 | 84.9 | 64.53 | 69.72 | 81.77 | 65.81 | MistralForCausalLM |
| yi-34B-v2 📑 | 💬 | 343.9 | 72.12 | 66.13 | 85.0 | 75.64 | 57.34 | 83.66 | 64.97 | LlamaForCausalLM |
| 72B-preview 📑 | 🔶 | 720 | 72.12 | 65.19 | 83.23 | 77.14 | 52.58 | 82.48 | 72.1 | Unknown |
| Nous-Hermes-2-Mixtral-8x7B-SFT 📑 | 💬 | 467 | 72.07 | 69.71 | 86.74 | 72.21 | 51.22 | 82.95 | 69.6 | MixtralForCausalLM |
| 72B-preview 📑 | 🔶 | 720 | 72.06 | 64.85 | 83.28 | 77.21 | 52.51 | 82.48 | 72.02 | Unknown |
| BigWeave-v16-103b 📑 | 🤝 | 1032 | 72.02 | 65.87 | 87.61 | 73.22 | 63.81 | 80.43 | 61.18 | LlamaForCausalLM |
| 72B-preview-llamafied-qwen-llamafy 📑 | 🔶 | 720 | 72.0 | 65.19 | 83.24 | 77.04 | 52.55 | 82.4 | 71.57 | LlamaForCausalLM |
| mistral-ft-optimized-1218 📑 | 🔶 | 72.4 | 71.94 | 67.92 | 86.26 | 64.99 | 59.48 | 80.74 | 72.25 | MistralForCausalLM |
| Pluto_24B_DPO_200 📑 | 🔶 | 241.5 | 71.88 | 65.61 | 86.38 | 64.59 | 69.86 | 78.93 | 65.88 | MixtralForCausalLM |
| Nous-Hermes-2-SOLAR-10.7B-MISALIGNED 📑 | 💬 | 107.3 | 71.83 | 68.26 | 86.11 | 66.26 | 57.79 | 83.43 | 69.14 | LlamaForCausalLM |
| deepseek-llm-67b-chat ✅ 📑 | 💬 | 670 | 71.79 | 67.75 | 86.82 | 72.42 | 55.85 | 84.21 | 63.68 | LlamaForCausalLM |
| NeuralDarewin-7B 📑 | 💬 | 72.4 | 71.79 | 70.14 | 86.4 | 64.85 | 62.92 | 79.72 | 66.72 | MistralForCausalLM |
| openbuddy-deepseek-67b-v15.1 📑 | 🔶 | 674.2 | 71.76 | 67.66 | 86.49 | 70.3 | 54.42 | 84.77 | 66.94 | LlamaForCausalLM |
| Tess-M-Creative-v1.0 📑 | 🔶 | 343.9 | 71.73 | 66.81 | 85.14 | 75.54 | 57.68 | 83.11 | 62.09 | LlamaForCausalLM |
| Evangelion-7B 📑 | 💬 | 72.4 | 71.71 | 68.94 | 86.45 | 63.97 | 64.01 | 79.95 | 66.94 | MistralForCausalLM |
| platypus-yi-34b 📑 | 💬 | 343.9 | 71.69 | 68.43 | 85.21 | 78.13 | 54.48 | 84.06 | 59.82 | LlamaForCausalLM |
| SOLAR-tail-10.7B-Merge-v1.0 📑 | 🔶 | 107.3 | 71.68 | 66.13 | 86.54 | 66.52 | 60.57 | 84.77 | 65.58 | LlamaForCausalLM |
| BigWeave-v15-103b 📑 | 🤝 | 1032 | 71.67 | 69.71 | 86.41 | 71.25 | 66.1 | 80.35 | 56.18 | LlamaForCausalLM |
| A0106 📑 | 🔶 | 343.9 | 71.53 | 66.38 | 85.05 | 74.0 | 57.88 | 82.87 | 63.0 | Unknown |
| deepseek-llm-67b-chat ✅ 📑 | 🔶 | 670 | 71.52 | 67.75 | 86.8 | 72.19 | 55.83 | 84.21 | 62.32 | LlamaForCausalLM |
| supermario-slerp-v2 📑 | 🤝 | 72.4 | 71.45 | 69.71 | 86.54 | 64.82 | 63.06 | 80.74 | 63.84 | MistralForCausalLM |
| A0106 📑 | 🔶 | 343.9 | 71.44 | 66.47 | 85.05 | 74.03 | 57.82 | 82.72 | 62.55 | Unknown |
| openbuddy-deepseek-67b-v15.3-4k 📑 | 💬 | 674.2 | 71.42 | 67.58 | 85.15 | 70.38 | 54.88 | 83.35 | 67.17 | LlamaForCausalLM |
| amadeus-v0.1 📑 | 🔶 | 241.5 | 71.42 | 68.94 | 86.98 | 64.69 | 63.82 | 79.95 | 64.14 | MixtralForCausalLM |
| Deita-20b 📑 | 🔶 | 198.6 | 71.4 | 63.91 | 83.11 | 67.4 | 57.29 | 84.61 | 72.1 | LlamaForCausalLM |
| LDCC-SOLAR-10.7B 📑 | 🔶 | 108.6 | 71.4 | 67.58 | 88.11 | 66.63 | 68.87 | 83.66 | 53.53 | LlamaForCausalLM |
| LDCC-SOLAR-10.7B 📑 | 🔶 | 108.6 | 71.4 | 67.32 | 88.11 | 66.83 | 68.85 | 83.66 | 53.6 | LlamaForCausalLM |
| OpenHermes-2.5-neural-chat-v3-3-Slerp 📑 | 🔶 | 72.4 | 71.38 | 68.09 | 86.2 | 64.26 | 62.78 | 79.16 | 67.78 | Unknown |
| DiscoLM-70b 📑 | 🔶 | 689.8 | 71.37 | 68.77 | 86.1 | 68.58 | 57.64 | 83.58 | 63.53 | LlamaForCausalLM |
| MisterUkrainianDPO 📑 | 🔶 | 72.4 | 71.37 | 68.34 | 86.78 | 62.92 | 70.18 | 80.74 | 59.29 | MistralForCausalLM |
| MoMo-70B-LoRA-V1.2_1 📑 | 💬 | 700 | 71.36 | 70.65 | 86.4 | 69.9 | 61.41 | 83.19 | 56.63 | Unknown |
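Note that the parameter column is expressed in units of 10⁸ parameters (the Chinese unit 亿), so Qwen-72B's 722.9 corresponds to roughly 72.3B parameters. As a minimal sketch of working with the table programmatically (this is our illustration, not leaderboard tooling; `TABLE_MD` below holds just the header and two sample rows for brevity):

```python
import io
import pandas as pd

# Hypothetical: paste the full markdown table above into TABLE_MD.
TABLE_MD = """\
| Model | Type | Params (×10⁸) | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Architecture |
|---|---|---|---|---|---|---|---|---|---|---|
| CCK_Asura_v2 📑 | 🔶 | 689.8 | 73.62 | 70.82 | 88.09 | 74.72 | 56.97 | 85.24 | 65.88 | LlamaForCausalLM |
| Qwen-72B ✅ 📑 | 🟢 | 722.9 | 73.6 | 65.19 | 85.94 | 77.37 | 60.19 | 82.48 | 70.43 | QWenLMHeadModel |
"""

# Parse the markdown table: split on "|", drop the empty edge columns
# created by the leading/trailing pipes, and remove the "---" separator row.
df = pd.read_csv(io.StringIO(TABLE_MD), sep="|", skipinitialspace=True)
df = df.dropna(axis=1, how="all")
df.columns = [c.strip() for c in df.columns]
df = df[~df["Model"].str.contains("---", na=False)]

# Convert the ×10⁸ (亿) unit to billions: 722.9 -> 72.29B for Qwen-72B.
df["Params (B)"] = pd.to_numeric(df["Params (×10⁸)"], errors="coerce") / 10
df["Average"] = pd.to_numeric(df["Average"], errors="coerce")

# Rank by the leaderboard's headline metric.
print(df.sort_values("Average", ascending=False)[["Model", "Params (B)", "Average"]])
```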