🟢: Pretrained model: a new base model, pretrained on a given corpus.
🔶: Domain-specific fine-tuned model: a pretrained model further fine-tuned on a domain-specific dataset for better performance.
💬: Chat model: a model fine-tuned for chat, using methods such as IFT (instruction fine-tuning on task-instruction datasets), RLHF (reinforcement learning from human feedback), or DPO (Direct Preference Optimization, which slightly changes the model's loss with an added preference policy).
🤝: Base merge or MoErge: a model built by merging multiple models or via MoErges (mixture-of-experts merging), with no additional fine-tuning.
❓: Unknown. If you find a model without an icon, feel free to open an issue so its information can be added.
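For reference on the DPO parenthetical above: the standard DPO objective (from the original DPO paper) trains the policy $\pi_\theta$ directly on preference pairs against a frozen reference model $\pi_{\text{ref}}$, with no separate reward model:

$$
\mathcal{L}_{\text{DPO}}(\pi_\theta;\pi_{\text{ref}}) = -\,\mathbb{E}_{(x,\,y_w,\,y_l)\sim\mathcal{D}}\left[\log \sigma\left(\beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\text{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\text{ref}}(y_l \mid x)}\right)\right]
$$

where $x$ is a prompt, $y_w$ and $y_l$ are the preferred and rejected responses, $\sigma$ is the logistic function, and $\beta$ controls how far the policy may drift from the reference.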
| Model Name | Type | Parameters (100M) | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Architecture |
|---|---|---|---|---|---|---|---|---|---|---|
| MoMo-72B-lora-1.8.7-DPO 📑 | 💬 | 722.9 | 78.55 | 70.82 | 85.96 | 77.13 | 74.71 | 84.06 | 78.62 | LlamaForCausalLM |
| Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B 📑 | 💬 | 128.8 | 77.44 | 74.91 | 89.3 | 64.67 | 78.02 | 88.24 | 69.52 | MixtralForCausalLM |
| MoMo-72B-lora-1.8.6-DPO 📑 | 💬 | 722.9 | 77.29 | 70.14 | 86.03 | 77.4 | 69.0 | 84.37 | 76.8 | LlamaForCausalLM |
| MixTAO-7Bx2-MoE-Instruct-v7.0 📑 | 💬 | 128.8 | 76.55 | 74.23 | 89.37 | 64.54 | 74.26 | 87.77 | 69.14 | MixtralForCausalLM |
| MoMo-72B-lora-1.8.4-DPO 📑 | 💬 | 722.9 | 76.23 | 69.62 | 85.35 | 77.33 | 64.64 | 84.14 | 76.27 | LlamaForCausalLM |
| CarbonBeagle-11B-truthy 📑 | 💬 | 107.3 | 76.1 | 72.27 | 89.31 | 66.55 | 78.55 | 83.82 | 66.11 | MistralForCausalLM |
| test3_sft_16bit_dpo2 📑 | 💬 | 72.4 | 74.98 | 73.63 | 89.03 | 64.63 | 70.71 | 84.37 | 67.48 | MistralForCausalLM |
| MetaMath-Bagel-DPO-34B 📑 | 💬 | 343.9 | 74.8 | 68.17 | 84.23 | 76.54 | 65.44 | 82.24 | 72.18 | LlamaForCausalLM |
| raccoon-small 📑 | 💬 | 191.9 | 74.78 | 74.4 | 88.73 | 64.55 | 76.74 | 87.37 | 56.86 | MixtralForCausalLM |
| bagel-dpo-34b-v0.2 📑 | 💬 | 343.9 | 74.69 | 71.93 | 85.25 | 76.58 | 70.05 | 83.35 | 60.96 | LlamaForCausalLM |
| llamaRAGdrama 📑 | 💬 | 72.4 | 74.65 | 72.01 | 88.83 | 64.5 | 70.24 | 86.66 | 65.66 | MistralForCausalLM |
| Mixtral-7Bx2-truthy 📑 | 💬 | 128.8 | 74.64 | 72.18 | 87.88 | 65.2 | 74.68 | 80.66 | 67.25 | MixtralForCausalLM |
| MM-OV-bagel-DPO-34b-c1000-250 📑 | 💬 | 343.9 | 74.47 | 68.17 | 83.97 | 76.33 | 63.67 | 82.4 | 72.25 | LlamaForCausalLM |
| Truthful_DPO_MOE_19B 📑 | 💬 | 191.9 | 74.3 | 71.08 | 88.46 | 66.13 | 72.29 | 83.35 | 64.52 | MixtralForCausalLM |
| CarbonVillain-en-13B-v1 📑 | 💬 | 107.3 | 74.28 | 71.25 | 88.46 | 66.42 | 71.98 | 83.27 | 64.29 | Unknown |
| SOLAR-10B-OrcaDPO-Jawade 📑 | 💬 | 107.3 | 74.27 | 71.16 | 88.27 | 66.12 | 71.57 | 83.66 | 64.82 | LlamaForCausalLM |
| SOLAR-10.7B-Instruct-v1.0 📑 | 💬 | 107.3 | 74.2 | 71.08 | 88.16 | 66.21 | 71.43 | 83.58 | 64.75 | LlamaForCausalLM |
| SOLAR-10B-Nector-DPO-Jawade 📑 | 💬 | 107.3 | 74.19 | 71.33 | 88.62 | 66.22 | 70.92 | 83.43 | 64.59 | LlamaForCausalLM |
| SOLAR-Instruct-ko-Adapter-Attach 📑 | 💬 | 107.3 | 74.11 | 71.08 | 88.2 | 66.09 | 71.51 | 83.5 | 64.29 | LlamaForCausalLM |
| BrokenKeyboard 📑 | 💬 | 107.3 | 74.08 | 71.25 | 88.34 | 66.04 | 71.36 | 83.19 | 64.29 | LlamaForCausalLM |
| UNA-SOLAR-10.7B-Instruct-v1.0 📑 | 💬 | 107.3 | 74.07 | 70.73 | 88.32 | 66.1 | 72.52 | 83.35 | 63.38 | LlamaForCausalLM |
| UNA-TheBeagle-7b-v1 📑 | 💬 | 72.4 | 73.87 | 73.04 | 88.0 | 63.48 | 69.85 | 82.16 | 66.72 | MistralForCausalLM |
| tulu-2-dpo-70b 📑 | 💬 | 689.8 | 73.77 | 72.1 | 88.99 | 69.84 | 65.78 | 83.27 | 62.62 | LlamaForCausalLM |
| kellemar-DPO-Orca-Distilled-7B-SLERP 📑 | 💬 | 72.4 | 73.71 | 70.48 | 87.56 | 65.33 | 64.97 | 81.93 | 72.02 | MistralForCausalLM |
| Mixtral-8x7B-Instruct-v0.1-DPO 📑 | 💬 | 467 | 73.44 | 69.8 | 87.83 | 71.05 | 69.18 | 81.37 | 61.41 | MixtralForCausalLM |
| neuronovo-9B-v0.4 📑 | 💬 | 89.9 | 73.42 | 72.44 | 88.33 | 65.24 | 71.07 | 80.66 | 62.77 | MistralForCausalLM |
| v-alpha-tross 📑 | 💬 | 689.8 | 73.28 | 71.93 | 86.82 | 70.38 | 65.21 | 83.58 | 61.79 | LlamaForCausalLM |
| SUS-Chat-34B 📑 | 💬 | 340 | 73.22 | 66.3 | 83.91 | 76.41 | 57.04 | 83.5 | 72.18 | LlamaForCausalLM |
| 7Bx4_DPO 📑 | 💬 | 241.5 | 73.2 | 69.37 | 86.89 | 64.73 | 65.66 | 80.58 | 71.95 | MixtralForCausalLM |
| notus-8x7b-experiment 📑 | 💬 | 467 | 73.18 | 70.99 | 87.73 | 71.33 | 65.79 | 81.61 | 61.64 | Unknown |
| Nous-Hermes-2-Mixtral-8x7B-DPO 📑 | 💬 | 467 | 73.12 | 71.42 | 87.21 | 72.28 | 54.53 | 82.64 | 70.66 | MixtralForCausalLM |
| notux-8x7b-v1-epoch-2 📑 | 💬 | 0 | 73.05 | 70.65 | 87.8 | 71.43 | 65.97 | 82.08 | 60.35 | Unknown |
| 7Bx4_DPO_2e 📑 | 💬 | 241.5 | 72.99 | 68.94 | 86.8 | 64.5 | 65.6 | 80.74 | 71.34 | MixtralForCausalLM |
| SJ-SOLAR-10.7b-DPO 📑 | 💬 | 108.6 | 72.67 | 68.26 | 86.95 | 66.73 | 67.74 | 84.21 | 62.09 | LlamaForCausalLM |
| Mixtral-8x7B-Instruct-v0.1 📑 | 💬 | 467 | 72.62 | 70.22 | 87.63 | 71.16 | 64.58 | 81.37 | 60.73 | MixtralForCausalLM |
| PiVoT-SUS-RP 📑 | 💬 | 343.9 | 72.57 | 66.55 | 84.23 | 76.23 | 54.57 | 83.35 | 70.51 | LlamaForCausalLM |
| yi-34B-v3 📑 | 💬 | 343.9 | 72.26 | 67.06 | 85.11 | 75.8 | 57.54 | 83.5 | 64.52 | LlamaForCausalLM |
| grindin 📑 | 💬 | 0 | 72.18 | 69.88 | 87.02 | 64.98 | 59.34 | 80.9 | 70.96 | Unknown |
| Mistral_7B_SFT_DPO_v0 📑 | 💬 | 72.4 | 72.17 | 66.3 | 84.9 | 64.53 | 69.72 | 81.77 | 65.81 | MistralForCausalLM |
| yi-34B-v2 📑 | 💬 | 343.9 | 72.12 | 66.13 | 85.0 | 75.64 | 57.34 | 83.66 | 64.97 | LlamaForCausalLM |
| Nous-Hermes-2-Mixtral-8x7B-SFT 📑 | 💬 | 467 | 72.07 | 69.71 | 86.74 | 72.21 | 51.22 | 82.95 | 69.6 | MixtralForCausalLM |
| Nous-Hermes-2-SOLAR-10.7B-MISALIGNED 📑 | 💬 | 107.3 | 71.83 | 68.26 | 86.11 | 66.26 | 57.79 | 83.43 | 69.14 | LlamaForCausalLM |
| deepseek-llm-67b-chat ✅ 📑 | 💬 | 670 | 71.79 | 67.75 | 86.82 | 72.42 | 55.85 | 84.21 | 63.68 | LlamaForCausalLM |
| NeuralDarewin-7B 📑 | 💬 | 72.4 | 71.79 | 70.14 | 86.4 | 64.85 | 62.92 | 79.72 | 66.72 | MistralForCausalLM |
| Evangelion-7B 📑 | 💬 | 72.4 | 71.71 | 68.94 | 86.45 | 63.97 | 64.01 | 79.95 | 66.94 | MistralForCausalLM |
| platypus-yi-34b 📑 | 💬 | 343.9 | 71.69 | 68.43 | 85.21 | 78.13 | 54.48 | 84.06 | 59.82 | LlamaForCausalLM |
| openbuddy-deepseek-67b-v15.3-4k 📑 | 💬 | 674.2 | 71.42 | 67.58 | 85.15 | 70.38 | 54.88 | 83.35 | 67.17 | LlamaForCausalLM |
| MoMo-70B-LoRA-V1.2_1 📑 | 💬 | 700 | 71.36 | 70.65 | 86.4 | 69.9 | 61.41 | 83.19 | 56.63 | Unknown |
| Mixtral-8x7b-DPO-v0.2 📑 | 💬 | 467 | 71.32 | 70.39 | 87.73 | 71.03 | 58.69 | 82.56 | 57.54 | MixtralForCausalLM |
| ipo-test 📑 | 💬 | 0 | 71.29 | 67.92 | 85.99 | 65.05 | 55.87 | 80.9 | 72.02 | Unknown |
| sheep-duck-llama-2-70b-v1.1 📑 | 💬 | 700 | 71.22 | 73.12 | 87.77 | 70.77 | 64.55 | 83.11 | 47.99 | LlamaForCausalLM |
| PlatYi-34B-Llama-Q 📑 | 💬 | 343.9 | 71.13 | 65.7 | 85.22 | 78.78 | 53.64 | 83.03 | 60.42 | LlamaForCausalLM |
| Yi-34B-200K-AEZAKMI-v2 📑 | 💬 | 343.9 | 71.0 | 67.92 | 85.61 | 75.22 | 56.74 | 81.61 | 58.91 | LlamaForCausalLM |
| yi-34b-200k-rawrr-dpo-1 📑 | 💬 | 343.9 | 70.97 | 65.44 | 85.69 | 76.09 | 54.0 | 82.79 | 61.79 | LlamaForCausalLM |
| openbuddy-mixtral-7bx8-v18.1-32k 📑 | 💬 | 467.4 | 70.95 | 67.66 | 84.3 | 70.94 | 56.72 | 80.98 | 65.13 | MixtralForCausalLM |
| dolphin-2.2-70b 📑 | 💬 | 700 | 70.6 | 70.05 | 85.97 | 69.18 | 60.14 | 81.45 | 56.79 | Unknown |
| Pallas-0.2 📑 | 💬 | 343.9 | 70.51 | 64.59 | 83.44 | 75.53 | 55.29 | 81.61 | 62.62 | LlamaForCausalLM |
| Pallas-0.2 📑 | 💬 | 343.9 | 70.49 | 64.51 | 83.47 | 75.64 | 55.27 | 81.37 | 62.7 | LlamaForCausalLM |
| Mixtral-8x7b-DPO-v0.1 📑 | 💬 | 467 | 70.45 | 70.9 | 87.61 | 70.66 | 57.38 | 82.4 | 53.75 | MixtralForCausalLM |
| OpenAGI-7B-v0.1 📑 | 💬 | 72.4 | 70.34 | 66.72 | 86.13 | 63.53 | 69.55 | 79.48 | 56.63 | MistralForCausalLM |
| SauerkrautLM-7b-LaserChat 📑 | 💬 | 72.4 | 70.32 | 67.58 | 83.58 | 64.93 | 56.08 | 80.9 | 68.84 | MistralForCausalLM |
| Pallas-0.5 📑 | 💬 | 343.9 | 70.22 | 64.76 | 83.46 | 75.01 | 56.88 | 81.29 | 59.89 | LlamaForCausalLM |
| Yi-34B-200K-AEZAKMI-RAW-2301 📑 | 💬 | 343.9 | 70.12 | 66.04 | 84.7 | 74.89 | 56.89 | 81.14 | 57.09 | LlamaForCausalLM |
| Pallas-0.4 📑 | 💬 | 343.9 | 70.08 | 63.65 | 83.3 | 74.93 | 57.26 | 80.43 | 60.88 | LlamaForCausalLM |
| Pallas-0.3 📑 | 💬 | 343.9 | 70.06 | 63.74 | 83.3 | 75.08 | 57.31 | 80.66 | 60.27 | LlamaForCausalLM |
| Pallas-0.4 📑 | 💬 | 343.9 | 70.04 | 63.65 | 83.3 | 75.11 | 57.29 | 80.58 | 60.27 | LlamaForCausalLM |
| Pallas-0.3 📑 | 💬 | 343.9 | 69.88 | 63.57 | 83.36 | 75.09 | 57.32 | 80.19 | 59.74 | LlamaForCausalLM |
| PlatYi-34B-Q 📑 | 💬 | 343.9 | 69.86 | 66.89 | 85.14 | 77.66 | 53.03 | 82.48 | 53.98 | LlamaForCausalLM |
| Rabbit-7B-DPO-Chat 📑 | 💬 | 70 | 69.69 | 70.31 | 87.43 | 60.5 | 62.18 | 79.16 | 58.53 | MistralForCausalLM |
| Yi-34B-200K-AEZAKMI-RAW-2901 📑 | 💬 | 343.9 | 69.59 | 64.93 | 84.98 | 73.7 | 55.09 | 79.32 | 59.51 | LlamaForCausalLM |
| DPOpenHermes-7B-v2 📑 | 💬 | 72.4 | 69.58 | 66.64 | 85.22 | 63.64 | 59.22 | 79.16 | 63.61 | MistralForCausalLM |
| Rabbit-7B-v2-DPO-Chat 📑 | 💬 | 72.4 | 69.36 | 66.13 | 85.18 | 62.92 | 67.06 | 79.24 | 55.65 | MistralForCausalLM |
| MetaModel_moe_multilingualv1 📑 | 💬 | 467 | 69.33 | 67.58 | 84.72 | 63.77 | 61.21 | 77.35 | 61.33 | MixtralForCausalLM |
| openchat-3.5-0106 📑 | 💬 | 72.4 | 69.3 | 66.04 | 82.93 | 65.04 | 51.9 | 81.77 | 68.16 | MistralForCausalLM |
| airoboros-l2-70b-2.2.1 📑 | 💬 | 700 | 69.13 | 69.71 | 87.95 | 69.79 | 59.49 | 82.95 | 44.88 | LlamaForCausalLM |
| loyal-piano-m7-cdpo 📑 | 💬 | 72.4 | 69.08 | 67.15 | 85.39 | 64.52 | 61.53 | 79.4 | 56.48 | MistralForCausalLM |
| servile-harpsichord-cdpo 📑 | 💬 | 72.4 | 68.98 | 67.32 | 85.18 | 64.54 | 60.61 | 79.16 | 57.09 | MistralForCausalLM |
| Mixtral-8x7B-peft-v0.1 📑 | 💬 | 70 | 68.87 | 67.24 | 86.03 | 68.59 | 59.54 | 80.43 | 51.4 | Unknown |
| LHK 📑 | 💬 | 107.3 | 68.74 | 66.38 | 84.49 | 65.13 | 59.12 | 80.98 | 56.33 | LlamaForCausalLM |
| SOLAR-10.7B-dpo-instruct-tuned-v0.1 📑 | 💬 | 107.3 | 68.68 | 65.19 | 86.09 | 66.25 | 51.81 | 83.98 | 58.76 | LlamaForCausalLM |
| Yi-34B-AEZAKMI-v1 📑 | 💬 | 343.9 | 68.67 | 64.33 | 84.31 | 73.91 | 55.73 | 80.82 | 52.92 | LlamaForCausalLM |
| loyal-piano-m7 📑 | 💬 | 72.4 | 68.67 | 66.72 | 85.03 | 64.43 | 60.03 | 79.08 | 56.71 | MistralForCausalLM |
| MixtralRPChat-ZLoss 📑 | 💬 | 467 | 68.59 | 68.6 | 86.1 | 70.44 | 53.85 | 82.0 | 50.57 | MixtralForCausalLM |
| ds_diasum_md_mixtral 📑 | 💬 | 0 | 68.42 | 66.3 | 85.45 | 69.51 | 55.72 | 80.35 | 53.22 | Unknown |
| agiin-13.6B-v0.1 📑 | 💬 | 137.8 | 68.4 | 69.45 | 86.64 | 61.15 | 67.97 | 78.69 | 46.47 | MistralForCausalLM |
| PlatYi-34B-Llama 📑 | 💬 | 343.9 | 68.37 | 67.83 | 85.35 | 78.26 | 53.46 | 82.87 | 42.46 | Unknown |
| PlatYi-34B-Llama-Q-FastChat 📑 | 💬 | 343.9 | 68.31 | 66.13 | 85.25 | 78.37 | 53.62 | 82.16 | 44.35 | Unknown |
| CapybaraHermes-2.5-Mistral-7B 📑 | 💬 | 72.4 | 68.14 | 65.78 | 85.45 | 63.13 | 56.91 | 78.3 | 59.29 | MistralForCausalLM |
| PlatYi-34B-LoRA 📑 | 💬 | 343.9 | 68.1 | 67.15 | 85.37 | 78.46 | 53.32 | 83.66 | 40.64 | LlamaForCausalLM |
| Merged-DPO-7B 📑 | 💬 | 70 | 68.06 | 68.94 | 87.75 | 55.35 | 72.76 | 78.37 | 45.19 | Unknown |
| lil-c3po 📑 | 💬 | 72.4 | 68.03 | 65.02 | 84.45 | 62.36 | 68.73 | 79.16 | 48.45 | Unknown |
| PlatYi-34B-Llama-Q-v2 📑 | 💬 | 343.9 | 67.88 | 61.09 | 85.09 | 76.59 | 52.65 | 82.79 | 49.05 | LlamaForCausalLM |
| OpenAGI-7B-v0.1 📑 | 💬 | 72.4 | 67.87 | 68.26 | 85.06 | 61.6 | 59.4 | 79.79 | 53.07 | MistralForCausalLM |
| PlatYi-34B-200k-Q-FastChat 📑 | 💬 | 340 | 67.85 | 64.93 | 84.46 | 77.13 | 48.38 | 80.74 | 51.48 | LlamaForCausalLM |
| Mixtral-Orca-v0.1 📑 | 💬 | 467 | 67.82 | 69.71 | 88.88 | 66.06 | 63.85 | 81.14 | 37.3 | MixtralForCausalLM |
| DPOpenHermes-7B 📑 | 💬 | 72.4 | 67.58 | 65.7 | 85.96 | 63.89 | 56.95 | 78.61 | 54.36 | MistralForCausalLM |
| SeaLLM-7B-v2 📑 | 💬 | 70 | 67.57 | 62.03 | 82.32 | 61.89 | 51.11 | 79.08 | 68.99 | MistralForCausalLM |
| MoMo-70B-LoRA-V1.1 📑 | 💬 | 700 | 67.53 | 66.64 | 87.16 | 66.76 | 54.98 | 83.35 | 46.32 | Unknown |
| Samantha-1.1-70b 📑 | 💬 | 687.2 | 67.43 | 68.77 | 87.46 | 68.6 | 64.85 | 83.27 | 31.61 | Unknown |
| Samantha-1.11-70b 📑 | 💬 | 687.2 | 67.28 | 70.05 | 87.55 | 67.82 | 65.02 | 83.27 | 29.95 | Unknown |
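Every named architecture in the last column (LlamaForCausalLM, MistralForCausalLM, MixtralForCausalLM) is a standard Hugging Face transformers class, so any model in the table can be loaded the same way through the Auto classes. A minimal sketch follows; the repo ID is shown only as an illustration, so substitute the actual Hub path of the model you want:

```python
# Minimal sketch: loading one of the listed models with Hugging Face transformers.
# The repo ID below is illustrative; replace it with the Hub path of your chosen model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "upstage/SOLAR-10.7B-Instruct-v1.0"  # example; resolves to LlamaForCausalLM

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # half precision to roughly halve memory use
    device_map="auto",          # spread layers across available devices (needs accelerate)
)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```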