🟢 : Pretrained models: new base models, pretrained on a given corpus.
🔶 : Domain-specific fine-tuned models: pretrained models further fine-tuned on domain-specific datasets for better performance.
💬 : Chat models: chat-style fine-tunes trained with methods such as IFT (instruction fine-tuning on task-instruction datasets), RLHF (reinforcement learning from human feedback), or DPO (direct preference optimization, which slightly alters the model's loss with an added policy).
🤝 : Base merges and moerges: models built by merging multiple models or via MoErges (mixture-of-experts merging), without additional fine-tuning. If you spot a model with no icon, feel free to open an issue so its metadata can be filled in.
❓ : Unknown
Model Name | Model Type | Parameters (×100M) | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Architecture |
---|---|---|---|---|---|---|---|---|---|---|
airoboros-m-7b-3.1.2 📑 | 🔶 | 72.4 | 58.75 | 61.86 | 83.51 | 61.91 | 53.75 | 77.58 | 13.87 | MistralForCausalLM |
dromedary-65b-lora-HF 📑 | 🔶 | 650 | 58.73 | 61.6 | 82.53 | 63.08 | 38.82 | 78.93 | 27.45 | LlamaForCausalLM |
CollectiveCognition-v1.1-Mistral-7B-dare-0.85 📑 | 🔶 | 70 | 58.72 | 61.01 | 84.31 | 64.34 | 44.87 | 78.85 | 18.95 | MistralForCausalLM |
chinese-mixtral 📑 | 🔶 | 467 | 58.69 | 67.58 | 85.34 | 70.38 | 46.86 | 82.0 | 0.0 | MixtralForCausalLM |
Nous-Hermes-2-SOLAR-10.7B-v1.1 📑 | 🔶 | 107.3 | 58.69 | 63.99 | 82.72 | 65.85 | 56.97 | 81.22 | 1.36 | Unknown |
Dolphin-Nebula-7B 📑 | 💬 | 72.4 | 58.69 | 55.2 | 78.57 | 53.44 | 57.97 | 73.88 | 33.06 | MistralForCausalLM |
alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont2 📑 | 🔶 | 72.4 | 58.67 | 60.32 | 82.88 | 59.79 | 42.36 | 76.56 | 30.1 | MistralForCausalLM |
72B-preview-canary-llamafied-qwen-llamafy-unbias-qkv 📑 | 🔶 | 720 | 58.67 | 53.07 | 63.13 | 67.39 | 57.62 | 75.14 | 35.63 | LlamaForCausalLM |
Mistral-v0.1-PeanutButter-v0.0.2-7B 📑 | 💬 | 72.4 | 58.66 | 61.77 | 84.11 | 64.38 | 45.92 | 78.37 | 17.44 | Unknown |
Orca-2-13b ✅ 📑 | 🔶 | 130 | 58.64 | 60.67 | 79.81 | 60.37 | 56.41 | 76.64 | 17.97 | LlamaForCausalLM |
falcon-40b-openassistant-peft 📑 | 🔶 | 400 | 58.63 | 62.63 | 85.59 | 57.77 | 51.02 | 81.45 | 13.34 | Unknown |
SOLAR-Platypus-10.7B-v1 📑 | 💬 | 107.3 | 58.62 | 61.69 | 84.23 | 60.37 | 51.58 | 82.79 | 11.07 | LlamaForCausalLM |
QuantumLM-70B-hf 📑 | 🔶 | 689.8 | 58.61 | 59.47 | 83.02 | 62.25 | 53.39 | 78.77 | 14.78 | LlamaForCausalLM |
mamba-gpt-7b-v1 📑 | 🔶 | 70 | 58.61 | 61.26 | 84.1 | 63.46 | 46.34 | 79.16 | 17.36 | MistralForCausalLM |
koOpenChat-sft 📑 | 💬 | 0 | 58.61 | 59.81 | 78.73 | 61.32 | 51.24 | 76.4 | 24.18 | MistralForCausalLM |
alignment-handbook-zephyr-7b_ppo_5e7step_51 📑 | 🔶 | 72.4 | 58.59 | 59.73 | 82.52 | 59.76 | 41.46 | 77.19 | 30.86 | MistralForCausalLM |
dolphin-2.0-mistral-7b 📑 | 🔶 | 71.1 | 58.58 | 59.22 | 80.26 | 56.9 | 61.09 | 75.37 | 18.65 | Unknown |
zephyr-neural-chat-frankenmerge11b 📑 | 🔶 | 113.9 | 58.57 | 61.52 | 84.09 | 61.51 | 60.63 | 76.24 | 7.43 | MistralForCausalLM |
chinese-mixtral 📑 | 🟢 | 467 | 58.57 | 67.49 | 85.25 | 70.31 | 46.75 | 81.61 | 0.0 | MixtralForCausalLM |
claude2-alpaca-13B 📑 | 🔶 | 130 | 58.57 | 61.18 | 84.21 | 55.93 | 45.02 | 76.8 | 28.28 | Unknown |
72B-preview-canary-llamafied-qwen-llamafy-unbias-qkv 📑 | 🔶 | 720 | 58.54 | 52.56 | 62.99 | 67.45 | 57.61 | 75.14 | 35.48 | LlamaForCausalLM |
MegaMix-A1-13B 📑 | 🔶 | 130.2 | 58.52 | 61.6 | 83.49 | 58.26 | 47.48 | 76.16 | 24.11 | Unknown |
Noromaid-13b-v0.2 📑 | 🔶 | 130.2 | 58.51 | 60.92 | 84.04 | 57.67 | 52.58 | 74.11 | 21.76 | LlamaForCausalLM |
MLewd-ReMM-L2-Chat-20B 📑 | 🔶 | 199.9 | 58.49 | 62.46 | 85.62 | 59.13 | 55.63 | 77.19 | 10.92 | LlamaForCausalLM |
open-aditi-hi-v1 📑 | 🔶 | 72.4 | 58.49 | 58.79 | 81.38 | 58.51 | 42.34 | 76.48 | 33.43 | MistralForCausalLM |
Wizard-Vicuna-30B-Uncensored-GPTQ 📑 | ❓ | 355.8 | 58.47 | 61.09 | 82.4 | 56.46 | 49.9 | 77.66 | 23.28 | LlamaForCausalLM |
neural-chat-7b-v3 📑 | 🔶 | 70 | 58.46 | 67.15 | 83.29 | 62.26 | 58.77 | 78.06 | 1.21 | MistralForCausalLM |
Llama2-chat-AYB-13B 📑 | 🔶 | 130.2 | 58.45 | 63.4 | 84.79 | 59.34 | 55.62 | 76.24 | 11.3 | LlamaForCausalLM |
X-MythoChronos-13B 📑 | 🔶 | 130.2 | 58.43 | 59.73 | 83.39 | 56.5 | 53.55 | 74.43 | 22.97 | LlamaForCausalLM |
CodeMate-v0.1 📑 | 🔶 | 337.4 | 58.39 | 55.55 | 78.03 | 55.31 | 48.64 | 72.61 | 40.18 | LlamaForCausalLM |
SynthIA-7B-v1.3-dare-0.85 📑 | 🔶 | 70 | 58.38 | 61.01 | 83.5 | 64.49 | 43.77 | 78.93 | 18.57 | MistralForCausalLM |
Uncensored-Frank-33B 📑 | 🔶 | 330 | 58.38 | 62.12 | 83.3 | 57.57 | 54.03 | 76.56 | 16.68 | LlamaForCausalLM |
phi-2-OpenHermes-2.5 📑 | 🔶 | 27.8 | 58.38 | 59.81 | 74.85 | 55.51 | 43.86 | 75.06 | 41.17 | PhiForCausalLM |
alignment-handbook-zephyr-7b_ppo_5e7step_102 📑 | 🔶 | 72.4 | 58.37 | 59.22 | 82.45 | 59.62 | 41.56 | 77.03 | 30.33 | MistralForCausalLM |
SynthIA-v1.3-Nebula-v2-7B 📑 | 🔶 | 72.4 | 58.33 | 59.39 | 82.77 | 57.57 | 50.62 | 74.74 | 24.87 | MistralForCausalLM |
DPO_mistral_v01_7b_ultra_0131_1k_1epoch 📑 | 🔶 | 72.4 | 58.32 | 55.97 | 76.78 | 55.97 | 57.94 | 73.4 | 29.87 | MistralForCausalLM |
mamba-gpt-7b-v2 📑 | 🔶 | 70 | 58.31 | 61.95 | 83.83 | 61.74 | 46.63 | 78.45 | 17.29 | MistralForCausalLM |
alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1 📑 | 🔶 | 72.4 | 58.29 | 60.24 | 82.28 | 60.61 | 40.55 | 77.11 | 28.96 | MistralForCausalLM |
Mistral-v0.1-PeanutButter-v0.0.5-SFT-7B-QLoRA 📑 | 💬 | 70 | 58.24 | 60.75 | 84.24 | 63.66 | 44.94 | 78.69 | 17.13 | Unknown |
deepmoney-34b-200k-base 📑 | 🔶 | 340 | 58.21 | 63.99 | 83.87 | 74.04 | 45.93 | 81.45 | 0.0 | LlamaForCausalLM |
airoboros-33b-gpt4-1.4 📑 | 🔶 | 330 | 58.2 | 64.42 | 85.13 | 59.53 | 50.47 | 77.9 | 11.75 | LlamaForCausalLM |
mistral-7b-platypus1k 📑 | 💬 | 72.4 | 58.19 | 61.6 | 82.93 | 63.16 | 46.96 | 78.14 | 16.38 | MistralForCausalLM |
llama-33B-instructed 📑 | 💬 | 330 | 58.18 | 64.59 | 86.17 | 60.5 | 44.12 | 79.32 | 14.4 | LlamaForCausalLM |
sitebunny-13b 📑 | 🔶 | 128.5 | 58.17 | 63.14 | 83.64 | 59.91 | 56.21 | 76.72 | 9.4 | Unknown |
Dans-TotSirocco-7b 📑 | 🔶 | 72.4 | 58.16 | 62.2 | 84.28 | 63.8 | 46.04 | 79.48 | 13.19 | MistralForCausalLM |
llama-megamerge-dare-13b 📑 | 💬 | 130.2 | 58.15 | 60.58 | 83.0 | 54.91 | 45.76 | 76.16 | 28.51 | Unknown |
Dans-TotSirocco-7b 📑 | 🔶 | 72.4 | 58.15 | 62.03 | 84.23 | 64.19 | 46.49 | 78.69 | 13.27 | MistralForCausalLM |
mistral-guanaco1k-ep2 📑 | 🔶 | 71.1 | 58.13 | 60.07 | 82.76 | 61.5 | 54.4 | 78.06 | 11.98 | Unknown |
Mistral-7B-guanaco1k-ep2 📑 | 🔶 | 71.1 | 58.13 | 60.07 | 82.76 | 61.5 | 54.4 | 78.06 | 11.98 | Unknown |
Stheno-1.8-L2-13B 📑 | 🔶 | 130 | 58.12 | 63.48 | 84.12 | 58.57 | 52.86 | 76.4 | 13.27 | LlamaForCausalLM |
2x-LoRA-Assemble-13B 📑 | 🔶 | 130.2 | 58.1 | 63.65 | 83.47 | 59.82 | 55.94 | 76.48 | 9.25 | Unknown |
vigogne-33b-instruct 📑 | 🔶 | 330 | 58.08 | 63.05 | 85.0 | 58.32 | 52.1 | 78.85 | 11.14 | LlamaForCausalLM |
falcon-40b ✅ 📑 | 🟢 | 400 | 58.07 | 61.86 | 85.28 | 56.89 | 41.65 | 81.29 | 21.46 | FalconForCausalLM |
Mistral-7B-openplatypus-1k 📑 | 💬 | 70 | 58.07 | 60.15 | 84.25 | 59.84 | 49.86 | 76.87 | 17.44 | LlamaForCausalLM |
LLaMA-Pro-8B-Instruct 📑 | 🔶 | 83.6 | 58.06 | 52.99 | 76.98 | 52.58 | 49.43 | 72.22 | 44.2 | LlamaForCausalLM |
typhoon-7b 📑 | 🟢 | 70 | 58.05 | 58.53 | 81.55 | 59.54 | 40.52 | 76.56 | 31.61 | MistralForCausalLM |
Mistral-11B-v0.1 📑 | 🔶 | 107.3 | 58.05 | 59.56 | 81.17 | 63.56 | 40.67 | 76.64 | 26.69 | MistralForCausalLM |
airoboros-m-7b-3.1.2-dare-0.85 📑 | 🔶 | 70 | 58.03 | 61.09 | 83.57 | 64.05 | 43.64 | 78.37 | 17.44 | MistralForCausalLM |
CodeLlama-70b-Python-hf 📑 | 🟢 | 689.8 | 58.0 | 55.12 | 78.48 | 56.17 | 41.78 | 73.01 | 43.44 | LlamaForCausalLM |
Luban-Marcoroni-13B 📑 | 💬 | 130.2 | 57.98 | 63.65 | 82.92 | 58.7 | 55.55 | 77.03 | 10.01 | Unknown |
samantha-mistral-7b 📑 | 💬 | 71.1 | 57.96 | 63.4 | 84.1 | 61.36 | 46.08 | 76.8 | 16.0 | Unknown |
Luban-Marcoroni-13B-v3 📑 | 💬 | 130.2 | 57.94 | 63.74 | 82.88 | 58.64 | 55.56 | 76.87 | 9.93 | LlamaForCausalLM |
Llamix2-Xwin-MoE-4x13B 📑 | 🔶 | 385 | 57.93 | 60.41 | 82.96 | 56.24 | 39.63 | 75.14 | 33.21 | Unknown |
ASTS-PFAF 📑 | 🔶 | 130.2 | 57.93 | 61.26 | 82.94 | 58.96 | 43.74 | 76.87 | 23.81 | LlamaForCausalLM |
Luban-Marcoroni-13B-v2 📑 | 💬 | 130.2 | 57.92 | 63.48 | 82.89 | 58.72 | 55.56 | 76.95 | 9.93 | LlamaForCausalLM |
Mistral-7B-OpenOrca-Guanaco-accu16 📑 | 🔶 | 70 | 57.91 | 59.73 | 83.08 | 61.29 | 50.81 | 76.56 | 16.0 | LlamaForCausalLM |
Wizard-Vicuna-30B-Uncensored-fp16 📑 | 🔶 | 300 | 57.89 | 62.12 | 83.45 | 58.24 | 50.81 | 78.45 | 14.25 | LlamaForCausalLM |
Wizard-Vicuna-30B-Uncensored 📑 | 🔶 | 323.2 | 57.89 | 62.12 | 83.45 | 58.24 | 50.81 | 78.45 | 14.25 | Unknown |
Llama2-chat-AYT-13B 📑 | 🔶 | 130.2 | 57.88 | 63.31 | 83.53 | 59.67 | 55.8 | 76.09 | 8.87 | LlamaForCausalLM |
bubo-bubo-13b 📑 | 🔶 | 130.2 | 57.86 | 61.43 | 83.14 | 58.18 | 47.62 | 76.16 | 20.62 | LlamaForCausalLM |
Chat-AYB-Nova-13B 📑 | 💬 | 130.2 | 57.84 | 62.97 | 84.28 | 58.58 | 51.28 | 77.58 | 12.36 | Unknown |
mistral_7b_2epoch_norobots 📑 | 🔶 | 70 | 57.84 | 61.01 | 83.37 | 63.96 | 42.62 | 79.08 | 16.98 | Unknown |
DPO_mistral_v01_7b_ultra_0130_1k 📑 | 🔶 | 72.4 | 57.83 | 57.17 | 79.16 | 55.85 | 55.62 | 72.85 | 26.31 | MistralForCausalLM |
Stheno-V2-Delta-fp16 📑 | 🔶 | 130.2 | 57.81 | 62.46 | 83.45 | 59.04 | 55.25 | 73.88 | 12.81 | Unknown |
OpenOrcaxOpenChat-Preview2-13B 📑 | 🔶 | 130 | 57.76 | 62.37 | 82.96 | 58.68 | 51.23 | 77.19 | 14.1 | LlamaForCausalLM |
MLewd-L2-Chat-13B 📑 | 🔶 | 130 | 57.75 | 62.03 | 84.19 | 58.75 | 52.84 | 77.43 | 11.3 | LlamaForCausalLM |
Pwen-14B-Chat-20_30 📑 | 🔶 | 140 | 57.74 | 56.14 | 79.78 | 60.01 | 47.02 | 76.48 | 26.99 | Unknown |
Luban-13B 📑 | 🔶 | 128.5 | 57.73 | 63.05 | 82.8 | 58.73 | 55.53 | 76.56 | 9.7 | Unknown |
airoboros-33b-gpt4-1.2 📑 | 🔶 | 330 | 57.69 | 64.42 | 84.93 | 60.35 | 49.18 | 77.51 | 9.78 | LlamaForCausalLM |
Alpacino30b 📑 | 🔶 | 300 | 57.67 | 62.71 | 85.04 | 58.48 | 44.23 | 79.79 | 15.77 | LlamaForCausalLM |
mistral_7b_DolphinCoder 📑 | 🔶 | 70 | 57.67 | 59.73 | 81.64 | 59.87 | 43.95 | 74.59 | 26.23 | Unknown |
mistral_7b_DolphinCoder 📑 | 🔶 | 70 | 57.67 | 59.73 | 81.64 | 59.87 | 43.95 | 74.59 | 26.23 | Unknown |
magpie-13b 📑 | 🔶 | 128.5 | 57.64 | 63.31 | 84.25 | 58.15 | 49.15 | 76.48 | 14.48 | Unknown |
mistral-7b-v0.1-layla-v2 📑 | 🔶 | 72.4 | 57.6 | 56.31 | 79.76 | 50.81 | 51.57 | 75.77 | 31.39 | MistralForCausalLM |
Orca-2-13B-GPTQ 📑 | 🔶 | 162.4 | 57.6 | 59.81 | 79.12 | 59.35 | 55.14 | 76.64 | 15.54 | LlamaForCausalLM |
MelloGPT 📑 | 🔶 | 0 | 57.59 | 53.84 | 76.12 | 55.99 | 55.61 | 73.88 | 30.1 | MistralForCausalLM |
mistral-7b-v0.1-layla-v1 📑 | ❓ | 70 | 57.56 | 60.15 | 83.25 | 60.31 | 48.9 | 75.93 | 16.83 | MistralForCausalLM |
Michel-13B 📑 | 💬 | 130.2 | 57.56 | 61.26 | 83.21 | 55.05 | 50.43 | 75.22 | 20.17 | LlamaForCausalLM |
zephyr-7b-sft-full 📑 | 🔶 | 72.4 | 57.56 | 57.68 | 80.82 | 60.31 | 41.71 | 76.09 | 28.73 | MistralForCausalLM |
tulu-30B-fp16 📑 | ❓ | 300 | 57.53 | 59.98 | 83.4 | 56.1 | 45.14 | 80.82 | 19.71 | LlamaForCausalLM |
Mistral-7B-v0.1-flashback-v2 📑 | 🔶 | 72.4 | 57.53 | 57.17 | 80.74 | 59.98 | 40.66 | 77.19 | 29.42 | MistralForCausalLM |
zephyr-7b-sft-full 📑 | 🔶 | 72.4 | 57.52 | 58.11 | 80.83 | 60.2 | 41.74 | 76.24 | 27.98 | MistralForCausalLM |
SOLAR-10.7B-tutored 📑 | 🔶 | 107.3 | 57.49 | 62.29 | 82.24 | 65.09 | 55.13 | 80.19 | 0.0 | Unknown |
airoboros-33b-gpt4-1.3 📑 | 🔶 | 330 | 57.49 | 63.82 | 85.09 | 58.94 | 45.33 | 79.01 | 12.74 | LlamaForCausalLM |
Unholy-v1-12L-13B 📑 | 🔶 | 130.2 | 57.47 | 63.57 | 83.75 | 58.08 | 51.09 | 77.27 | 11.07 | LlamaForCausalLM |
7B_ppo_phiRM_2GPU_3e-7step_4000 📑 | 🔶 | 72.4 | 57.46 | 57.25 | 80.24 | 60.06 | 41.48 | 76.32 | 29.42 | MistralForCausalLM |
Dans-AdventurousWinds-7b 📑 | 🔶 | 72.4 | 57.46 | 61.01 | 83.47 | 63.69 | 42.65 | 78.22 | 15.69 | MistralForCausalLM |
airoboros-33b-gpt4-1.3 📑 | 💬 | 330 | 57.43 | 63.91 | 85.04 | 58.53 | 45.36 | 78.69 | 13.04 | LlamaForCausalLM |
MXLewd-L2-20B 📑 | 🔶 | 199.9 | 57.43 | 63.23 | 85.33 | 57.36 | 51.65 | 76.09 | 10.92 | LlamaForCausalLM |
chinese-alpaca-2-13b 📑 | 💬 | 130 | 57.41 | 58.7 | 79.76 | 55.12 | 50.22 | 75.61 | 25.02 | LlamaForCausalLM |
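
For readers who want to sanity-check the numbers: the Average column appears to be the plain arithmetic mean of the six benchmark scores, and parameter sizes are listed in units of 100 million (so 72.4 corresponds to a 7.24B-parameter, i.e. 7B-class, model). A minimal Python sketch, using the first row of the table as the worked example:

```python
# Sanity check: the Average column is the arithmetic mean of the six
# benchmark scores. Values are for airoboros-m-7b-3.1.2, copied from
# the table above.
scores = {
    "ARC": 61.86,
    "HellaSwag": 83.51,
    "MMLU": 61.91,
    "TruthfulQA": 53.75,
    "Winogrande": 77.58,
    "GSM8K": 13.87,
}

average = sum(scores.values()) / len(scores)
print(round(average, 2))  # 58.75, matching the Average column

# Parameter sizes are given in units of 100 million (亿),
# so dividing by 10 yields billions of parameters:
params_100m = 72.4
print(f"{params_100m / 10:.2f}B parameters")  # 7.24B
```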