🟢 Pretrained model: a new base model, pretrained on a given dataset.
🔶 Domain-specific fine-tuned model: a pretrained model further fine-tuned on a domain-specific dataset for better performance.
💬 Chat model: a chat-style fine-tune, produced with methods such as IFT (instruction fine-tuning on datasets of task instructions), RLHF (reinforcement learning from human feedback), or DPO (direct preference optimization, which slightly changes the model's loss with an added policy term); a sketch of the DPO objective follows this list.
🤝 Base merge / MoErge model: a model built by merging several models or via MoErges (model fusion), with no additional fine-tuning.
❓ Unknown model type.
If you find a model without an icon, feel free to open an issue so its information can be added.
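For reference, the "added policy term" mentioned for DPO is the published objective of Rafailov et al. (2023): given a prompt $x$ with a chosen response $y_w$ and a rejected response $y_l$, the policy $\pi_\theta$ is optimized against a frozen reference model $\pi_{\mathrm{ref}}$ via

$$
\mathcal{L}_{\mathrm{DPO}}(\pi_\theta; \pi_{\mathrm{ref}}) = -\,\mathbb{E}_{(x,\, y_w,\, y_l) \sim \mathcal{D}} \left[ \log \sigma\!\left( \beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)} \right) \right]
$$

where $\sigma$ is the logistic sigmoid and $\beta$ controls how closely the policy is kept to the reference, so preference data reshapes the loss directly, without an explicit reward model.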
Model | Type | Params (×100M) | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Architecture |
---|---|---|---|---|---|---|---|---|---|---|
Stheno-1.2-L2-13B 📑 | 🔶 | 130 | 56.15 | 60.75 | 83.67 | 56.27 | 50.32 | 74.98 | 10.92 | LlamaForCausalLM |
Orca-2-13b-SFT-v6 📑 | 💬 | 130.2 | 56.15 | 60.41 | 80.46 | 59.51 | 54.01 | 77.43 | 5.08 | LlamaForCausalLM |
ELYZA-japanese-Llama-2-13b 📑 | 🔶 | 130 | 56.14 | 57.0 | 80.89 | 54.38 | 40.43 | 76.87 | 27.29 | LlamaForCausalLM |
SpeechlessV1-Nova-13B 📑 | 💬 | 130.2 | 56.14 | 61.77 | 82.68 | 57.75 | 51.44 | 77.43 | 5.76 | Unknown |
Instruct_Yi-6B_Dolly_CodeAlpaca 📑 | 🔶 | 60.6 | 56.11 | 53.16 | 75.3 | 63.06 | 41.42 | 75.37 | 28.35 | LlamaForCausalLM |
NewHope_HF_not_official 📑 | 🔶 | 0 | 56.11 | 61.09 | 84.03 | 55.73 | 44.96 | 74.98 | 15.85 | LlamaForCausalLM |
chronos-hermes-13b-v2 📑 | 🔶 | 130 | 56.1 | 60.32 | 83.21 | 55.05 | 50.91 | 75.37 | 11.75 | LlamaForCausalLM |
Nebula-7B 📑 | 💬 | 72.4 | 56.1 | 59.3 | 83.46 | 57.0 | 45.56 | 76.4 | 14.86 | Unknown |
prometheus-13b-v1.0 📑 | 🔶 | 130 | 56.09 | 53.24 | 80.75 | 51.49 | 45.66 | 73.72 | 31.69 | LlamaForCausalLM |
qCammel-13 📑 | 🔶 | 0 | 56.05 | 60.84 | 83.66 | 56.73 | 47.54 | 76.16 | 11.37 | LlamaForCausalLM |
ReMM-SLERP-L2-13B 📑 | 🔶 | 130 | 56.03 | 60.92 | 83.56 | 55.33 | 51.97 | 75.22 | 9.17 | LlamaForCausalLM |
carl-33b 📑 | 🔶 | 330 | 56.03 | 64.59 | 85.27 | 58.38 | 45.32 | 76.24 | 6.37 | LlamaForCausalLM |
synapsellm-7b-mistral-v0.5-preview 📑 | 🔶 | 72.4 | 56.03 | 52.73 | 76.51 | 54.67 | 55.16 | 74.35 | 22.74 | MistralForCausalLM |
neural-chat-7b-v3-1-Nebula-v2-7B 📑 | 🔶 | 72.4 | 56.01 | 61.77 | 80.21 | 59.07 | 58.56 | 71.82 | 4.62 | MistralForCausalLM |
MythoMax-L2-13b 📑 | 🔶 | 130 | 56.0 | 60.92 | 83.56 | 55.33 | 51.97 | 75.22 | 9.02 | LlamaForCausalLM |
huginnv1.2 📑 | 💬 | 128.5 | 55.98 | 62.37 | 84.28 | 57.02 | 47.81 | 75.22 | 9.17 | Unknown |
Nous-Hermes-Llama2-13b 📑 | 🔶 | 130 | 55.97 | 61.52 | 83.29 | 55.11 | 50.38 | 75.45 | 10.08 | LlamaForCausalLM |
Samantha-1.11-13b 📑 | 🔶 | 128.5 | 55.97 | 60.84 | 82.99 | 55.96 | 47.72 | 76.01 | 12.28 | Unknown |
LongQLoRA-Vicuna-13b-8k 📑 | 🔶 | 130 | 55.96 | 56.4 | 81.05 | 53.68 | 47.07 | 74.51 | 23.05 | LlamaForCausalLM |
Walter-SOLAR-11B 📑 | 💬 | 107.3 | 55.95 | 60.41 | 84.86 | 64.99 | 44.88 | 79.56 | 0.99 | LlamaForCausalLM |
Nous-Hermes-13B-Code 📑 | 🔶 | 130 | 55.93 | 61.18 | 83.21 | 55.13 | 50.56 | 75.14 | 10.39 | LlamaForCausalLM |
Chat-AYB-Platypus2-13B 📑 | 💬 | 130.2 | 55.93 | 60.49 | 84.03 | 57.83 | 54.52 | 75.77 | 2.96 | Unknown |
synapsellm-7b-mistral-v0.4-preview2 📑 | 🔶 | 72.4 | 55.93 | 52.99 | 74.54 | 54.6 | 53.79 | 73.95 | 25.7 | MistralForCausalLM |
synapsellm-7b-mistral-v0.5-preview2 📑 | 🔶 | 72.4 | 55.93 | 52.22 | 75.54 | 51.64 | 55.47 | 73.09 | 27.6 | MistralForCausalLM |
AppleSauce-L2-13b 📑 | 🔶 | 130.2 | 55.91 | 61.01 | 83.61 | 57.07 | 47.81 | 75.93 | 10.01 | LlamaForCausalLM |
Synthia-13B-v1.2 📑 | 🔶 | 130 | 55.9 | 61.26 | 82.93 | 56.47 | 47.27 | 76.48 | 10.99 | LlamaForCausalLM |
openbuddy-llama2-34b-v11.1-bf16 📑 | 🔶 | 335.3 | 55.88 | 50.0 | 71.19 | 55.71 | 53.01 | 70.8 | 34.57 | Unknown |
vicuna-class-tutor-13b-ep3 📑 | 🔶 | 130 | 55.88 | 57.34 | 81.51 | 57.02 | 52.99 | 74.35 | 12.05 | LlamaForCausalLM |
Synatra-V0.1-7B-Instruct 📑 | 💬 | 70 | 55.86 | 55.29 | 76.63 | 55.29 | 55.76 | 72.77 | 19.41 | MistralForCausalLM |
Synatra-V0.1-7B 📑 | 🔶 | 71.1 | 55.86 | 55.29 | 76.63 | 55.29 | 55.76 | 72.77 | 19.41 | Unknown |
Newton-7B 📑 | 🔶 | 72.4 | 55.85 | 63.99 | 81.72 | 62.78 | 44.36 | 78.85 | 3.41 | MistralForCausalLM |
Mistral-7B-Instruct-v0.2-DARE 📑 | 🔶 | 72.4 | 55.84 | 61.95 | 75.62 | 49.99 | 54.36 | 74.98 | 18.12 | Unknown |
Metamath-reproduce-7b 📑 | 🔶 | 70 | 55.81 | 47.18 | 73.65 | 42.94 | 41.58 | 71.35 | 58.15 | LlamaForCausalLM |
llama-2-13b-OpenOrca_5w 📑 | 🔶 | 130 | 55.8 | 61.01 | 82.82 | 56.09 | 44.87 | 77.74 | 12.28 | LlamaForCausalLM |
Nous-Hermes-Llama2-13b 📑 | 🔶 | 130 | 55.75 | 61.26 | 83.26 | 55.04 | 50.41 | 75.37 | 9.17 | LlamaForCausalLM |
Stable-Platypus2-13B 📑 | 🔶 | 130.2 | 55.75 | 62.71 | 82.29 | 58.3 | 52.52 | 76.87 | 1.82 | LlamaForCausalLM |
CollectiveCognition-v1.1-Nebula-7B 📑 | 🔶 | 72.4 | 55.72 | 58.11 | 82.39 | 57.03 | 53.53 | 73.72 | 9.55 | Unknown |
openchat_v3.1 📑 | 🔶 | 0 | 55.71 | 60.15 | 82.84 | 56.84 | 44.38 | 76.24 | 13.8 | LlamaForCausalLM |
Stheno-1.1-L2-13B 📑 | 🔶 | 130 | 55.71 | 60.75 | 83.64 | 56.39 | 50.3 | 75.22 | 7.96 | LlamaForCausalLM |
openchat_v3.2 📑 | 🔶 | 0 | 55.68 | 59.64 | 82.68 | 56.68 | 44.49 | 76.95 | 13.65 | LlamaForCausalLM |
ELYZA-japanese-Llama-2-13b-fast 📑 | 🔶 | 130 | 55.67 | 55.89 | 80.73 | 54.4 | 40.31 | 77.19 | 25.47 | LlamaForCausalLM |
speechless-hermes-coig-lite-13b 📑 | 🔶 | 130.2 | 55.65 | 59.47 | 82.28 | 55.18 | 47.6 | 78.61 | 10.77 | LlamaForCausalLM |
U-Amethyst-20B 📑 | 🔶 | 199.9 | 55.65 | 62.2 | 83.11 | 55.88 | 53.2 | 74.19 | 5.31 | LlamaForCausalLM |
Uncensored-Frank-13B 📑 | 🔶 | 130 | 55.64 | 61.6 | 82.62 | 54.55 | 48.34 | 74.74 | 11.98 | LlamaForCausalLM |
Nova-13B-50-step 📑 | 🔶 | 130.2 | 55.61 | 61.6 | 82.31 | 57.27 | 51.53 | 76.56 | 4.4 | Unknown |
ANIMA-Phi-Neptune-Mistral-7B-v4 📑 | 🔶 | 71.1 | 55.61 | 55.46 | 77.63 | 53.12 | 59.01 | 73.48 | 14.94 | Unknown |
sqlcoder-34b-alpha 📑 | 🔶 | 340 | 55.59 | 54.18 | 75.93 | 54.42 | 40.63 | 73.48 | 34.87 | LlamaForCausalLM |
Stable-Platypus2-13B-QLoRA-0.80-epoch 📑 | 💬 | 130.2 | 55.56 | 62.29 | 82.46 | 57.09 | 51.41 | 76.56 | 3.56 | Unknown |
ANIMA-Phi-Neptune-Mistral-7B 📑 | 🔶 | 70 | 55.54 | 55.97 | 76.22 | 52.89 | 59.76 | 73.48 | 14.94 | MistralForCausalLM |
internlm-20b-chat 📑 | 🔶 | 200 | 55.53 | 55.38 | 78.58 | 58.53 | 43.22 | 78.77 | 18.73 | Unknown |
llama-2-13b-dolphin_5w 📑 | 🔶 | 130 | 55.53 | 60.67 | 82.69 | 56.23 | 44.41 | 77.35 | 11.83 | LlamaForCausalLM |
speechless-hermes-coig-lite-13b 📑 | 🔶 | 130.2 | 55.51 | 59.56 | 82.26 | 55.3 | 47.56 | 78.53 | 9.86 | LlamaForCausalLM |
shisa-gamma-7b-v1 📑 | 🔶 | 72.4 | 55.5 | 53.16 | 77.3 | 55.23 | 50.73 | 73.88 | 22.74 | MistralForCausalLM |
Stheno-Inverted-1.2-L2-13B 📑 | 🔶 | 130 | 55.5 | 59.39 | 83.01 | 55.77 | 51.22 | 74.66 | 8.95 | LlamaForCausalLM |
UndiMix-v1-13b 📑 | 🔶 | 130.2 | 55.5 | 59.47 | 82.45 | 55.83 | 49.78 | 75.45 | 10.01 | LlamaForCausalLM |
chronolima-airo-grad-l2-13B 📑 | 🔶 | 130 | 55.5 | 59.56 | 83.47 | 55.8 | 44.58 | 75.61 | 13.95 | LlamaForCausalLM |
openchat_v3.2 📑 | 🔶 | 0 | 55.49 | 59.47 | 82.6 | 56.82 | 44.51 | 76.09 | 13.42 | LlamaForCausalLM |
Zhongjing-LLaMA-base 📑 | 🔶 | 0 | 55.47 | 55.12 | 79.72 | 48.23 | 48.88 | 74.82 | 26.08 | LlamaForCausalLM |
vicuna-13b-v1.5 📑 | 🔶 | 130 | 55.41 | 57.08 | 81.24 | 56.67 | 51.51 | 74.66 | 11.3 | LlamaForCausalLM |
model_007_13b_v2 📑 | 🔶 | 128.5 | 55.41 | 61.95 | 82.48 | 57.32 | 53.5 | 75.85 | 1.36 | Unknown |
llama2_13b_instructed_version2 📑 | 🔶 | 130 | 55.41 | 60.07 | 84.05 | 55.61 | 46.12 | 75.61 | 10.99 | LlamaForCausalLM |
Synthia-13B 📑 | 🔶 | 130 | 55.41 | 59.98 | 81.86 | 56.11 | 47.41 | 76.09 | 10.99 | LlamaForCausalLM |
nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple 📑 | 🔶 | 130 | 55.4 | 59.13 | 80.64 | 56.12 | 51.29 | 74.66 | 10.54 | LlamaForCausalLM |
speechless-orca-platypus-coig-lite-2k-0.6e-13b 📑 | 🔶 | 130 | 55.4 | 59.9 | 80.76 | 58.34 | 47.97 | 77.9 | 7.51 | LlamaForCausalLM |
yuren-13b-chatml 📑 | 🔶 | 130 | 55.39 | 53.07 | 78.03 | 56.34 | 42.32 | 74.43 | 28.13 | LlamaForCausalLM |
tora-13b-v1.0 📑 | 🔶 | 130 | 55.37 | 58.96 | 82.31 | 54.59 | 40.22 | 75.37 | 20.77 | LlamaForCausalLM |
minotaur-llama2-13b-qlora 📑 | 💬 | 130 | 55.37 | 60.07 | 82.42 | 55.87 | 45.57 | 76.24 | 12.05 | Unknown |
Tinybra_13B 📑 | 🔶 | 130.2 | 55.36 | 55.72 | 80.99 | 54.37 | 49.14 | 73.8 | 18.12 | LlamaForCausalLM |
Luban-Platypus2-13B-QLora-0.80-epoch 📑 | 💬 | 130.2 | 55.34 | 60.24 | 82.22 | 58.03 | 55.26 | 75.37 | 0.91 | Unknown |
SthenoWriter-L2-13B 📑 | 🔶 | 130.2 | 55.33 | 62.29 | 83.28 | 56.14 | 44.72 | 74.35 | 11.22 | LlamaForCausalLM |
2x-LoRA-Assemble-Platypus2-13B 📑 | 💬 | 130.2 | 55.33 | 60.58 | 82.56 | 58.25 | 54.77 | 74.9 | 0.91 | Unknown |
mistral-se-inst-ppo 📑 | 🔶 | 72.4 | 55.3 | 56.31 | 79.49 | 60.91 | 51.34 | 78.14 | 5.61 | Unknown |
Xwin-LM-13B-V0.1 📑 | 🔶 | 130 | 55.29 | 62.54 | 82.8 | 56.53 | 45.96 | 74.27 | 9.63 | LlamaForCausalLM |
CodeLlama-34b-hf 📑 | 🟢 | 337.4 | 55.28 | 54.18 | 75.82 | 54.92 | 39.11 | 73.32 | 34.34 | LlamaForCausalLM |
llama-2-13b-OpenOrca_20w 📑 | 🔶 | 130 | 55.28 | 59.9 | 82.51 | 56.3 | 43.14 | 77.19 | 12.66 | LlamaForCausalLM |
chronos-13b-v2 📑 | 🔶 | 130 | 55.25 | 58.7 | 82.52 | 53.39 | 50.55 | 75.06 | 11.3 | LlamaForCausalLM |
SOLAR-Platypus-10.7B-v2 📑 | 💬 | 107.3 | 55.25 | 59.39 | 83.57 | 59.93 | 43.15 | 81.45 | 4.02 | LlamaForCausalLM |
CreativityEngine 📑 | 🔶 | 0 | 55.25 | 59.3 | 82.42 | 53.55 | 52.46 | 74.19 | 9.55 | LlamaForCausalLM |
OpenHermes-13B 📑 | 🔶 | 130 | 55.24 | 59.81 | 82.24 | 56.35 | 46.01 | 75.45 | 11.6 | LlamaForCausalLM |
llama2-13b-Chinese-chat 📑 | 🔶 | 130 | 55.22 | 60.58 | 82.19 | 55.45 | 45.11 | 76.64 | 11.37 | Unknown |
OrcaMini-Platypus2-13B-QLoRA-0.80-epoch 📑 | 💬 | 130.2 | 55.22 | 60.84 | 82.56 | 56.42 | 53.32 | 75.93 | 2.27 | Unknown |
airoboros-l2-13b-3.0 📑 | 🔶 | 130.2 | 55.21 | 59.81 | 83.71 | 54.86 | 47.79 | 76.16 | 8.95 | LlamaForCausalLM |
Mythical-Destroyer-V2-L2-13B 📑 | 🔶 | 130.2 | 55.2 | 59.3 | 82.66 | 57.39 | 57.09 | 74.74 | 0.0 | LlamaForCausalLM |
minotaur-13b-fixed 📑 | 🔶 | 130 | 55.19 | 59.04 | 81.66 | 50.1 | 50.36 | 76.87 | 13.12 | LlamaForCausalLM |
Dionysus-Mistral-n1-v1 📑 | 🔶 | 72.4 | 55.18 | 60.24 | 81.6 | 59.32 | 47.94 | 71.35 | 10.61 | Unknown |
zephyr_7b_norobots 📑 | 💬 | 70 | 55.16 | 56.48 | 79.64 | 55.52 | 44.6 | 74.11 | 20.62 | Unknown |
airoboros-c34b-2.2.1 📑 | 💬 | 340 | 55.15 | 54.69 | 76.84 | 55.43 | 51.36 | 72.53 | 20.02 | LlamaForCausalLM |
Llama-2-13B-Instruct-v0.2 📑 | 💬 | 130 | 55.14 | 60.58 | 81.96 | 55.46 | 45.71 | 77.82 | 9.33 | Unknown |
WizardLM-1.0-Uncensored-Llama2-13b 📑 | 🔶 | 128.5 | 55.14 | 55.72 | 80.34 | 55.4 | 51.44 | 74.66 | 13.27 | Unknown |
athene-noctua-13b 📑 | 🔶 | 130.2 | 55.13 | 57.17 | 81.52 | 55.91 | 47.49 | 73.4 | 15.31 | LlamaForCausalLM |
13B-Legerdemain-L2 📑 | 🔶 | 130 | 55.13 | 61.26 | 83.26 | 56.0 | 41.99 | 75.22 | 13.04 | LlamaForCausalLM |
pygmalion-2-13b 📑 | 🔶 | 130.2 | 55.12 | 60.32 | 82.37 | 56.02 | 42.22 | 78.06 | 11.75 | LlamaForCausalLM |
PuddleJumper-13b 📑 | 🔶 | 130 | 55.11 | 58.7 | 81.18 | 58.25 | 56.44 | 72.77 | 3.34 | LlamaForCausalLM |
WizardLM-1.0-Uncensored-Llama2-13b 📑 | 🔶 | 128.5 | 55.1 | 55.8 | 80.41 | 55.59 | 51.42 | 74.11 | 13.27 | Unknown |
llama2-13b-orca-8k-3319 📑 | 🔶 | 130 | 55.09 | 60.75 | 81.91 | 57.06 | 42.64 | 77.19 | 10.99 | LlamaForCausalLM |
Llama2-Chinese-13b-Chat 📑 | 🔶 | 130 | 55.07 | 55.97 | 82.05 | 54.74 | 48.9 | 76.16 | 12.59 | LlamaForCausalLM |
llama-2-13b-dolphin_20w 📑 | 🔶 | 130 | 55.06 | 59.56 | 82.55 | 55.89 | 42.67 | 77.27 | 12.43 | LlamaForCausalLM |
Python-Code-33B 📑 | 🔶 | 330 | 55.06 | 56.31 | 81.01 | 54.22 | 44.39 | 75.22 | 19.18 | LlamaForCausalLM |
shisa-7b-v1 📑 | 🔶 | 79.6 | 55.01 | 56.14 | 78.63 | 23.12 | 52.49 | 78.06 | 41.62 | MistralForCausalLM |
dulia-13b-8k-alpha 📑 | 🔶 | 130.2 | 55.0 | 60.67 | 82.0 | 56.87 | 42.59 | 77.19 | 10.69 | LlamaForCausalLM |
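The Average column is the unweighted arithmetic mean of the six benchmark scores (ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K). A minimal Python sketch that reproduces it from the first row of the table, assuming the usual rounding to two decimal places:

```python
# Reproduce the Average column as the unweighted mean of the six benchmark
# scores, rounded to two decimals. The values below are the Stheno-1.2-L2-13B
# row from the table above (ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K).
scores = [60.75, 83.67, 56.27, 50.32, 74.98, 10.92]

average = round(sum(scores) / len(scores), 2)
print(average)  # 56.15, matching the Average column for that row
```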