🟢 Pretrained models: new base models that have been pretrained on a given corpus.
🔶 Domain-specific fine-tuned models: pretrained models further fine-tuned on domain-specific datasets for better performance.
💬 Chat models: chat-style fine-tunes trained with IFT (instruction fine-tuning on task-instruction datasets), RLHF (reinforcement learning from human feedback), or DPO (direct preference optimization, which slightly changes the model's loss with an added policy).
🤝 Base merges and MoErges: models that combine several existing models through merging or MoErges, with no additional fine-tuning. If you find a model without an icon, feel free to open an issue so its information can be added.
❓: unknown model type
Model Name | Type | Params (×100M) | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Architecture |
---|---|---|---|---|---|---|---|---|---|---|
zarafusionex-1.2-l2-7b 📑 | 🔶 | 70 | 53.73 | 56.66 | 79.16 | 51.94 | 51.29 | 74.74 | 8.57 | LlamaForCausalLM |
mistral-7b-sft-open-orca-flan-50k 📑 | 💬 | 72.4 | 53.7 | 58.79 | 81.92 | 55.72 | 37.49 | 77.98 | 10.31 | MistralForCausalLM |
llama-2-13b-huangyt_FINETUNE2_3w 📑 | 🔶 | 128.5 | 53.69 | 58.62 | 82.32 | 54.25 | 38.17 | 76.8 | 11.98 | Unknown |
openbuddy-mixtral-8x7b-v16.2-32k 📑 | 🔶 | 467.4 | 53.69 | 34.39 | 81.72 | 71.33 | 56.65 | 77.82 | 0.23 | MixtralForCausalLM |
Llama-2-13b-chat-dutch 📑 | 💬 | 130.2 | 53.69 | 59.3 | 81.45 | 55.82 | 38.23 | 76.64 | 10.69 | LlamaForCausalLM |
airoboros-13b-gpt4-1.1 📑 | 🔶 | 130 | 53.68 | 59.04 | 83.05 | 49.41 | 46.62 | 75.77 | 8.19 | LlamaForCausalLM |
bimoGPT-llama2-13b 📑 | ❓ | 130 | 53.68 | 58.79 | 82.08 | 55.6 | 37.82 | 76.48 | 11.3 | Unknown |
airoboros-13b-gpt4 📑 | 🔶 | 130 | 53.64 | 59.39 | 83.29 | 47.89 | 47.65 | 75.77 | 7.88 | LlamaForCausalLM |
platypus2-22b-relora 📑 | 💬 | 218.3 | 53.64 | 57.51 | 82.36 | 54.94 | 43.62 | 77.11 | 6.29 | Unknown |
PuffedLIMA13bQLORA 📑 | 🔶 | 130 | 53.63 | 59.9 | 84.39 | 53.68 | 39.9 | 75.22 | 8.72 | Unknown |
deacon-13b 📑 | 💬 | 130.2 | 53.63 | 57.85 | 82.63 | 55.25 | 39.33 | 76.32 | 10.39 | LlamaForCausalLM |
llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o 📑 | 🔶 | 130 | 53.62 | 59.04 | 81.15 | 53.0 | 40.16 | 76.48 | 11.9 | LlamaForCausalLM |
tora-13b-v1.0 📑 | 💬 | 130 | 53.62 | 58.96 | 82.31 | 54.73 | 40.25 | 75.61 | 9.86 | LlamaForCausalLM |
MistralInstructLongish 📑 | 💬 | 72.4 | 53.62 | 60.75 | 81.86 | 60.49 | 40.55 | 76.56 | 1.52 | MistralForCausalLM |
internlm2-base-7b-llama 📑 | 🟢 | 77.4 | 53.62 | 54.35 | 79.47 | 54.05 | 43.23 | 71.43 | 19.18 | LlamaForCausalLM |
Python-Code-13B 📑 | 🔶 | 130 | 53.61 | 58.79 | 81.66 | 54.78 | 42.83 | 74.03 | 9.55 | LlamaForCausalLM |
llama2-13b-ft-mc4_nl_cleaned_tiny 📑 | 🔶 | 130.2 | 53.6 | 59.3 | 82.04 | 54.67 | 38.03 | 77.27 | 10.31 | LlamaForCausalLM |
WizardLM-1.0-Uncensored-CodeLlama-34b 📑 | 💬 | 334.8 | 53.59 | 56.4 | 75.45 | 54.51 | 43.06 | 72.45 | 19.64 | Unknown |
tulu-13B-fp16 📑 | 🔶 | 130 | 53.58 | 53.92 | 80.66 | 53.19 | 43.84 | 75.61 | 14.25 | LlamaForCausalLM |
TacoBeLLM 📑 | 🔶 | 130.2 | 53.56 | 58.53 | 81.9 | 56.97 | 46.06 | 76.64 | 1.29 | LlamaForCausalLM |
dolphin-llama-13b 📑 | 🔶 | 128.5 | 53.56 | 55.55 | 77.11 | 52.16 | 52.23 | 69.93 | 14.4 | Unknown |
guanaco-13B-HF 📑 | 🔶 | 130 | 53.54 | 57.85 | 83.84 | 48.28 | 46.73 | 75.85 | 8.72 | LlamaForCausalLM |
llama-2-13b-Open_Platypus_and_ccp_2.6w 📑 | 🔶 | 130 | 53.52 | 58.96 | 82.51 | 56.12 | 40.07 | 76.64 | 6.82 | LlamaForCausalLM |
code-millenials-34b 📑 | 💬 | 337.4 | 53.51 | 49.83 | 75.09 | 49.28 | 45.37 | 69.06 | 32.45 | LlamaForCausalLM |
MoECPM-Untrained-4x2b 📑 | 🤝 | 77.9 | 53.51 | 46.76 | 72.58 | 53.21 | 38.41 | 65.51 | 44.58 | MixtralForCausalLM |
vigogne-13b-chat 📑 | 🔶 | 130 | 53.5 | 58.62 | 80.85 | 47.76 | 48.73 | 76.72 | 8.34 | LlamaForCausalLM |
openbuddy-mistral-7b-v13 📑 | 🔶 | 70 | 53.5 | 52.3 | 75.09 | 56.34 | 50.81 | 71.74 | 14.71 | MistralForCausalLM |
orca_mini_v3_7b 📑 | 💬 | 70 | 53.47 | 56.91 | 79.64 | 52.37 | 50.51 | 74.27 | 7.13 | LlamaForCausalLM |
orca_mini_v3_7b 📑 | 🔶 | 66.1 | 53.47 | 56.91 | 79.64 | 52.37 | 50.51 | 74.27 | 7.13 | Unknown |
tigerbot-13b-base 📑 | 🟢 | 130 | 53.42 | 53.84 | 77.05 | 53.57 | 44.06 | 74.98 | 17.06 | Unknown |
zarafusionex-1.1-l2-7b 📑 | 🔶 | 70 | 53.41 | 56.14 | 79.34 | 52.1 | 50.66 | 74.43 | 7.81 | LlamaForCausalLM |
QuantumLM 📑 | ❓ | 0 | 53.41 | 55.8 | 79.74 | 54.17 | 46.71 | 74.19 | 9.86 | LlamaForCausalLM |
samantha-mistral-instruct-7b 📑 | 💬 | 71.1 | 53.4 | 53.5 | 75.14 | 51.72 | 58.81 | 70.4 | 10.84 | Unknown |
GiftedConvo13bLoraNoEcons 📑 | 💬 | 130 | 53.35 | 59.39 | 83.19 | 55.15 | 40.56 | 74.03 | 7.81 | Unknown |
airoboros-l2-13b-2.1 📑 | 🔶 | 130 | 53.34 | 59.47 | 82.47 | 54.83 | 44.65 | 75.06 | 3.56 | LlamaForCausalLM |
llama-2-13b-FINETUNE5_4w-r4-q_k_v_o 📑 | 🔶 | 130 | 53.32 | 58.36 | 81.1 | 54.53 | 37.02 | 76.64 | 12.28 | LlamaForCausalLM |
vicuna-13b-v1.3.0-GPTQ 📑 | ❓ | 162.2 | 53.29 | 54.35 | 79.47 | 51.97 | 50.88 | 74.66 | 8.42 | LlamaForCausalLM |
vicuna-13B-1.1-HF 📑 | 🔶 | 128.5 | 53.29 | 52.73 | 80.13 | 51.94 | 52.08 | 74.19 | 8.64 | Unknown |
vicuna-13b-1.1 📑 | 🔶 | 130 | 53.29 | 52.73 | 80.13 | 51.94 | 52.08 | 74.19 | 8.64 | LlamaForCausalLM |
Vicuna-13B-CoT-fp16 📑 | 🔶 | 130 | 53.28 | 52.73 | 80.14 | 51.9 | 52.08 | 74.19 | 8.64 | LlamaForCausalLM |
Llama-2-13B-GPTQ 📑 | 🔶 | 162.3 | 53.26 | 59.13 | 81.48 | 54.45 | 37.07 | 76.16 | 11.3 | LlamaForCausalLM |
finance-chat 📑 | 🔶 | 0 | 53.26 | 53.75 | 76.6 | 50.16 | 44.54 | 75.69 | 18.8 | LlamaForCausalLM |
BeingWell_llama2_7b 📑 | 🔶 | 70 | 53.22 | 54.95 | 78.27 | 47.46 | 45.93 | 74.19 | 18.5 | LlamaForCausalLM |
llama-2-13b-FINETUNE2_TEST_2.2w 📑 | 🔶 | 130 | 53.2 | 56.23 | 82.7 | 55.35 | 39.55 | 76.72 | 8.64 | LlamaForCausalLM |
MetaMath-Llemma-7B 📑 | 🔶 | 70 | 53.19 | 46.5 | 61.69 | 47.66 | 39.61 | 62.75 | 60.96 | LlamaForCausalLM |
llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o 📑 | 🔶 | 130 | 53.18 | 54.78 | 81.4 | 54.73 | 41.02 | 76.64 | 10.54 | LlamaForCausalLM |
fusedyi 📑 | 🔶 | 109.1 | 53.18 | 55.03 | 76.6 | 63.43 | 49.29 | 72.69 | 2.05 | LlamaForCausalLM |
zarafusionix-l2-7b 📑 | 🔶 | 70 | 53.18 | 55.55 | 79.4 | 51.21 | 51.05 | 74.66 | 7.2 | LlamaForCausalLM |
WizardLM-13B-V1-1-SuperHOT-8K-fp16 📑 | 🔶 | 130 | 53.16 | 58.62 | 81.07 | 48.32 | 54.19 | 76.01 | 0.76 | LlamaForCausalLM |
Athena-Platypus2-13B-QLora-0.80-epoch 📑 | 💬 | 130.2 | 53.16 | 56.66 | 80.56 | 55.43 | 53.62 | 72.61 | 0.08 | Unknown |
Airboros2.1-Platypus2-13B-QLora-0.80-epoch 📑 | 💬 | 130.2 | 53.15 | 58.96 | 82.46 | 54.62 | 47.71 | 75.14 | 0.0 | Unknown |
Airoboros-L2-13B-2.1-GPTQ 📑 | 🔶 | 162.3 | 53.14 | 58.96 | 81.72 | 53.16 | 44.68 | 74.35 | 5.99 | LlamaForCausalLM |
Llama-2-13b-hf-ds_wiki_1024_full_r_64_alpha_16 📑 | 💬 | 130 | 53.14 | 59.04 | 82.33 | 55.36 | 35.75 | 76.32 | 10.01 | Unknown |
MLewd-L2-13B 📑 | 🔶 | 130 | 53.12 | 58.28 | 82.32 | 54.67 | 48.66 | 73.48 | 1.29 | LlamaForCausalLM |
manticore-13b-chat-pyg-GPTQ 📑 | ❓ | 162.2 | 53.11 | 57.85 | 81.07 | 47.56 | 47.77 | 75.93 | 8.49 | LlamaForCausalLM |
llama2_7b_mmlu 📑 | 🔶 | 70 | 53.1 | 56.14 | 79.13 | 60.04 | 40.95 | 74.43 | 7.88 | LlamaForCausalLM |
LlongOrca-7B-16k 📑 | 💬 | 70 | 53.02 | 57.51 | 79.44 | 49.35 | 49.84 | 74.51 | 7.51 | LlamaForCausalLM |
airoboros-l2-13b-gpt4-1.4.1 📑 | 🔶 | 130 | 53.02 | 59.13 | 82.78 | 55.62 | 40.27 | 73.32 | 6.97 | LlamaForCausalLM |
Walter-Mistral-7B 📑 | 💬 | 72.4 | 53.0 | 58.87 | 83.43 | 58.65 | 39.93 | 77.03 | 0.08 | MistralForCausalLM |
medicine-chat 📑 | 🔶 | 0 | 52.99 | 53.75 | 76.11 | 49.98 | 43.46 | 75.69 | 18.95 | LlamaForCausalLM |
Code-290k-13B 📑 | 🔶 | 130 | 52.96 | 56.06 | 81.55 | 51.99 | 37.65 | 72.69 | 17.82 | LlamaForCausalLM |
digital-socrates-7b 📑 | 🔶 | 70 | 52.95 | 54.44 | 75.99 | 51.41 | 44.88 | 73.09 | 17.89 | LlamaForCausalLM |
zaraxe-l2-7b 📑 | 🔶 | 70 | 52.95 | 57.17 | 79.34 | 51.0 | 49.11 | 73.48 | 7.58 | LlamaForCausalLM |
baize-v2-13b 📑 | 🔶 | 130 | 52.94 | 56.91 | 79.29 | 49.72 | 47.88 | 74.9 | 8.95 | LlamaForCausalLM |
Llama-2-7b-chat-hf-afr-100step-flan-v2 📑 | 💬 | 70 | 52.92 | 53.24 | 78.43 | 48.43 | 45.66 | 72.3 | 19.48 | LlamaForCausalLM |
llama2-22b-chat-wizard-uncensored 📑 | 🔶 | 218.3 | 52.9 | 56.23 | 80.39 | 53.62 | 45.76 | 70.24 | 11.14 | LlamaForCausalLM |
Llama-2-13b-ft-instruct-es 📑 | 💬 | 130 | 52.89 | 59.39 | 81.51 | 54.31 | 37.81 | 75.77 | 8.57 | LlamaForCausalLM |
law-chat 📑 | 🔶 | 0 | 52.88 | 53.41 | 76.16 | 50.24 | 43.53 | 75.45 | 18.5 | LlamaForCausalLM |
Llama-2-7b-chat-hf-afr-100step-flan 📑 | 💬 | 70 | 52.88 | 52.9 | 78.44 | 48.4 | 45.67 | 72.38 | 19.48 | LlamaForCausalLM |
archangel_sft-kto_llama13b 📑 | 💬 | 130.2 | 52.87 | 56.14 | 80.8 | 47.84 | 39.42 | 76.16 | 16.83 | LlamaForCausalLM |
chimera-inst-chat-13b-hf 📑 | ❓ | 130 | 52.86 | 55.38 | 78.93 | 50.6 | 50.12 | 73.95 | 8.19 | LlamaForCausalLM |
Telugu-Llama2-7B-v0-Instruct 📑 | 💬 | 70 | 52.86 | 53.58 | 78.33 | 47.63 | 43.26 | 73.95 | 20.39 | LlamaForCausalLM |
japanese-stablelm-instruct-gamma-7b 📑 | 💬 | 72.4 | 52.82 | 50.68 | 78.68 | 54.82 | 39.77 | 73.72 | 19.26 | MistralForCausalLM |
Llama2-13B-no_robots-alpaca-lora 📑 | 🔶 | 130 | 52.77 | 58.87 | 82.43 | 53.11 | 40.46 | 75.3 | 6.44 | LlamaForCausalLM |
ypotryll-22b-epoch2-qlora 📑 | 💬 | 220 | 52.75 | 59.22 | 80.66 | 54.52 | 40.42 | 76.32 | 5.38 | Unknown |
orca_mini_v2_13b 📑 | 🔶 | 128.5 | 52.75 | 55.12 | 79.69 | 50.07 | 52.56 | 72.69 | 6.37 | Unknown |
Llama-2-7b-chat-hf-afr-200step-flan-v2 📑 | 💬 | 70 | 52.75 | 52.65 | 78.04 | 48.51 | 45.42 | 72.93 | 18.95 | LlamaForCausalLM |
EverythingLM-13b-V2-16k 📑 | 🔶 | 130 | 52.75 | 58.7 | 80.88 | 49.69 | 47.37 | 73.01 | 6.82 | LlamaForCausalLM |
ELYZA-japanese-Llama-2-13b-fast-instruct 📑 | 🔶 | 130 | 52.72 | 57.51 | 81.82 | 54.52 | 43.82 | 75.93 | 2.73 | LlamaForCausalLM |
Synthia-7B-v1.2 📑 | 🔶 | 70 | 52.71 | 54.35 | 79.29 | 49.33 | 48.92 | 73.56 | 10.84 | LlamaForCausalLM |
MetaMath-13B-V1.0 📑 | 🔶 | 130 | 52.71 | 49.49 | 76.48 | 47.74 | 41.58 | 72.45 | 28.51 | LlamaForCausalLM |
yehoon_llama2 📑 | 💬 | 0 | 52.71 | 54.78 | 78.98 | 51.29 | 49.17 | 74.74 | 7.28 | Unknown |
Nous-Capybara-7B 📑 | 🔶 | 66.1 | 52.7 | 55.29 | 80.73 | 48.72 | 51.13 | 73.32 | 6.97 | Unknown |
Tulpar-7b-v0 📑 | 🔶 | 70 | 52.69 | 56.31 | 79.01 | 52.55 | 51.68 | 73.88 | 2.73 | LlamaForCausalLM |
Capybara-7B 📑 | 🔶 | 66.1 | 52.69 | 55.2 | 80.76 | 48.8 | 51.07 | 73.4 | 6.9 | Unknown |
CodeEngine 📑 | 🔶 | 0 | 52.68 | 58.36 | 82.27 | 54.18 | 45.18 | 74.59 | 1.52 | LlamaForCausalLM |
openbuddy-mixtral-8x7b-v16.1-32k 📑 | 🔶 | 467.4 | 52.68 | 29.1 | 82.27 | 71.37 | 55.97 | 77.35 | 0.0 | MixtralForCausalLM |
Mistral-Trismegistus-7B 📑 | 💬 | 70 | 52.66 | 54.1 | 77.91 | 54.49 | 49.36 | 70.17 | 9.93 | MistralForCausalLM |
airoboros-l2-13b-gpt4-m2.0 📑 | 🔶 | 130 | 52.66 | 59.22 | 81.02 | 53.73 | 39.7 | 73.64 | 8.64 | LlamaForCausalLM |
Llama-2-7b-chat-hf-afr-200step-flan 📑 | 💬 | 70 | 52.62 | 52.47 | 78.02 | 48.42 | 45.47 | 72.69 | 18.65 | LlamaForCausalLM |
openbuddy-mistral-7b-v13.1 📑 | 🔶 | 70 | 52.62 | 52.56 | 75.73 | 56.68 | 50.44 | 71.59 | 8.72 | MistralForCausalLM |
LIMA-13b-hf 📑 | 🔶 | 130 | 52.61 | 57.42 | 81.68 | 48.72 | 41.76 | 77.19 | 8.87 | LlamaForCausalLM |
Chinese-Llama-2-7b 📑 | 🔶 | 70 | 52.59 | 52.99 | 75.64 | 50.74 | 48.94 | 72.77 | 14.48 | LlamaForCausalLM |
japanese-stablelm-base-gamma-7b 📑 | 🔶 | 72.4 | 52.59 | 50.34 | 77.47 | 54.75 | 41.2 | 73.95 | 17.82 | MistralForCausalLM |
Mistral-7B-SFT 📑 | 🔶 | 72.4 | 52.58 | 46.5 | 75.69 | 51.04 | 52.02 | 72.77 | 17.44 | MistralForCausalLM |
speechless-codellama-dolphin-orca-platypus-34b 📑 | 🔶 | 340 | 52.53 | 52.47 | 74.13 | 53.47 | 47.14 | 73.24 | 14.71 | LlamaForCausalLM |
speechless-codellama-34b-v1.0 📑 | 🔶 | 340 | 52.53 | 52.47 | 74.13 | 53.47 | 47.14 | 73.24 | 14.71 | LlamaForCausalLM |
Llama-2-7b-chat-hf-10-attention-sparsity 📑 | 💬 | 67.4 | 52.52 | 52.9 | 78.18 | 48.1 | 45.4 | 71.43 | 19.11 | LlamaForCausalLM |
speechless-codellama-34b-v2.0 📑 | 🔶 | 340 | 52.51 | 54.35 | 75.65 | 54.67 | 45.21 | 73.56 | 11.6 | LlamaForCausalLM |
airoboros-l2-13b-gpt4-2.0 📑 | 🔶 | 130 | 52.49 | 59.04 | 82.82 | 54.71 | 36.47 | 74.19 | 7.73 | LlamaForCausalLM |
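
For readers reproducing the numbers: the Average column appears to be the unweighted mean of the six benchmark scores (ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K), and the parameter column is expressed in units of 100 million. The short Python sketch below is illustrative only; the averaging convention is an assumption, and the sample values are copied from the first table row (zarafusionex-1.2-l2-7b) to check it.

```python
# Minimal sketch: assume the "Average" column is the unweighted mean of the
# six benchmark scores. Values below are from the first table row
# (zarafusionex-1.2-l2-7b); the result should match its listed Average.
scores = {
    "ARC": 56.66,
    "HellaSwag": 79.16,
    "MMLU": 51.94,
    "TruthfulQA": 51.29,
    "Winogrande": 74.74,
    "GSM8K": 8.57,
}

average = sum(scores.values()) / len(scores)
print(f"Computed average: {average:.2f}")  # 53.73, matching the table

# The "Params (×100M)" column lists parameters in units of 100 million,
# so 70 -> 7.0B and 130 -> 13.0B parameters.
params_in_100m = 70
print(f"{params_in_100m / 10:.1f}B parameters")
```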