🟢 : Pretrained model: a new base model, pretrained on a given corpus.
🔶 : Domain-specific fine-tuned model: a pretrained model further fine-tuned on a domain-specific dataset to improve performance.
💬 : Chat model: chat-style fine-tunes, whether via IFT (instruction fine-tuning on task-instruction datasets), RLHF (reinforcement learning from human feedback), or DPO (direct preference optimization, which slightly alters the model's loss with an added preference policy; see the reference objective after this legend).
🤝 : Base merges and moerges: models built by merging or fusing (MoErges) multiple models, with no additional fine-tuning. If you come across a model with no icon, feel free to open an issue so its metadata can be added.
❓ : Unknown
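For reference, the DPO objective mentioned in the legend. This is the standard formulation from Rafailov et al. (2023), not something specific to any model in this table: given a preference dataset $\mathcal{D}$ of prompts $x$ with chosen and rejected responses $y_w, y_l$, DPO minimizes

$$
\mathcal{L}_{\text{DPO}}(\pi_\theta;\pi_{\text{ref}}) = -\,\mathbb{E}_{(x,\,y_w,\,y_l)\sim\mathcal{D}}\left[\log\sigma\!\left(\beta\log\frac{\pi_\theta(y_w\mid x)}{\pi_{\text{ref}}(y_w\mid x)} - \beta\log\frac{\pi_\theta(y_l\mid x)}{\pi_{\text{ref}}(y_l\mid x)}\right)\right]
$$

where $\pi_{\text{ref}}$ is the frozen reference model and $\beta$ controls how far the policy may drift from it — hence "slightly altering the model's loss".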
Model | Type | Params (×100M) | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Architecture |
---|---|---|---|---|---|---|---|---|---|---|
Mistral-7B-golden 📑 | 💬 | 70 | 52.49 | 60.75 | 44.42 | 59.29 | 53.51 | 76.64 | 20.32 | MistralForCausalLM |
Llama-2-7b-chat-hf-10-sparsity 📑 | 💬 | 67.4 | 52.48 | 53.16 | 78.26 | 48.18 | 45.29 | 71.59 | 18.42 | LlamaForCausalLM |
Llama2-7b-openorca-mc-v2 📑 | 🔶 | 66.1 | 52.47 | 55.55 | 81.26 | 48.3 | 51.49 | 72.85 | 5.38 | Unknown |
llama_mirror_13b_v1.0 📑 | 🔶 | 130 | 52.46 | 57.59 | 80.53 | 48.0 | 44.54 | 76.64 | 7.43 | Unknown |
vigogne-2-7b-chat 📑 | 🔶 | 70 | 52.45 | 55.63 | 78.71 | 50.98 | 47.21 | 74.43 | 7.73 | LlamaForCausalLM |
llama-13b-supercot 📑 | 🔶 | 130 | 52.44 | 56.06 | 81.71 | 45.36 | 48.55 | 75.77 | 7.2 | LlamaForCausalLM |
Llama-2-7b-chat-hf-afr-300step-flan-v2 📑 | 💬 | 70 | 52.41 | 52.56 | 77.76 | 48.51 | 45.14 | 72.53 | 17.97 | LlamaForCausalLM |
PuddleJumper-Platypus2-13B-QLoRA-0.80-epoch 📑 | 💬 | 130.2 | 52.41 | 54.52 | 79.36 | 55.15 | 54.32 | 71.11 | 0.0 | Unknown |
zararp-l2-7b 📑 | 🔶 | 70 | 52.39 | 56.31 | 79.19 | 51.36 | 51.26 | 74.51 | 1.74 | LlamaForCausalLM |
TowerInstruct-7B-v0.1 📑 | 💬 | 67.4 | 52.39 | 55.46 | 79.0 | 46.88 | 42.59 | 73.95 | 16.45 | LlamaForCausalLM |
Alpacino13b 📑 | 🔶 | 130 | 52.39 | 58.53 | 81.31 | 47.92 | 41.66 | 76.95 | 7.96 | LlamaForCausalLM |
EverythingLM-13b-16k 📑 | 🔶 | 130 | 52.33 | 56.57 | 80.58 | 50.18 | 47.46 | 72.77 | 6.44 | LlamaForCausalLM |
openbuddy-atom-13b-v9-bf16 📑 | 🔶 | 130 | 52.31 | 51.19 | 75.99 | 49.33 | 48.66 | 73.32 | 15.39 | LlamaForCausalLM |
airoboros-13b-gpt4-1.2 📑 | 🔶 | 130 | 52.31 | 58.36 | 81.61 | 48.84 | 47.54 | 73.64 | 3.87 | LlamaForCausalLM |
vicuna-13b 📑 | 🔶 | 128.5 | 52.3 | 51.71 | 79.94 | 50.84 | 52.68 | 71.03 | 7.58 | Unknown |
Asimov-7B-v2 📑 | 🔶 | 70 | 52.29 | 54.27 | 78.72 | 52.59 | 45.44 | 71.82 | 10.92 | MistralForCausalLM |
Palworld-SME-13b 📑 | 🔶 | 130.2 | 52.28 | 55.55 | 80.81 | 53.64 | 46.67 | 74.82 | 2.2 | LlamaForCausalLM |
Llama-2-7b-chat-hf-afr-441step-flan-v2 📑 | 💬 | 70 | 52.28 | 52.13 | 77.63 | 48.52 | 45.02 | 72.53 | 17.82 | LlamaForCausalLM |
Platypus2-13B-QLoRA-0.80-epoch 📑 | 💬 | 130 | 52.27 | 57.76 | 81.63 | 55.63 | 39.7 | 75.93 | 2.96 | Unknown |
Llama-2-7b-chat-hf-afr-200step-merged 📑 | 💬 | 70 | 52.26 | 52.05 | 77.38 | 48.65 | 44.6 | 71.9 | 18.95 | LlamaForCausalLM |
zararp-1.1-l2-7b 📑 | 🔶 | 70 | 52.22 | 56.48 | 78.85 | 51.49 | 51.99 | 73.4 | 1.14 | LlamaForCausalLM |
L2-7b-Hermes-Synthia 📑 | ❓ | 70 | 52.21 | 51.02 | 79.12 | 47.88 | 46.77 | 74.51 | 13.95 | LlamaForCausalLM |
Llama-2-7b-chat-hf-20-attention-sparsity 📑 | 💬 | 67.4 | 52.19 | 53.41 | 77.91 | 47.49 | 45.84 | 70.72 | 17.74 | LlamaForCausalLM |
Nous-Hermes-13B-SuperHOT-8K-fp16 📑 | 🔶 | 130 | 52.18 | 55.29 | 81.87 | 48.23 | 51.19 | 75.3 | 1.21 | LlamaForCausalLM |
WizardLM-13B-V1-1-SuperHOT-8K-GPTQ 📑 | 🔶 | 162.2 | 52.15 | 57.0 | 80.32 | 47.08 | 53.46 | 74.35 | 0.68 | LlamaForCausalLM |
Stheno-1.3-L2-13B 📑 | 🔶 | 130 | 52.15 | 56.83 | 81.7 | 52.79 | 50.23 | 71.11 | 0.23 | LlamaForCausalLM |
mc_data_30k_from_platpus_orca_7b_10k_v1_lora_qkvo_rank14_v2 📑 | 💬 | 70 | 52.13 | 57.17 | 79.57 | 50.24 | 52.51 | 72.93 | 0.38 | Unknown |
Alpagasus-2-13B-QLoRA-pipeline 📑 | 🔶 | 128.5 | 52.13 | 58.28 | 80.98 | 54.14 | 34.21 | 75.93 | 9.25 | Unknown |
openbuddy-mixtral-7bx8-v16.3-32k 📑 | 🔶 | 467.4 | 52.13 | 26.45 | 80.83 | 71.99 | 56.39 | 77.11 | 0.0 | MixtralForCausalLM |
ANIMA-Nectar-v2 📑 | 🔶 | 72.4 | 52.13 | 53.24 | 76.63 | 54.21 | 49.04 | 74.11 | 5.53 | MistralForCausalLM |
Xwin-LM-7B-V0.1 📑 | 🔶 | 70 | 52.08 | 56.57 | 79.4 | 49.98 | 47.89 | 73.32 | 5.31 | LlamaForCausalLM |
vicuna-7b-v1.5 📑 | 🔶 | 70 | 52.06 | 53.24 | 77.39 | 51.04 | 50.34 | 72.14 | 8.19 | LlamaForCausalLM |
llama2-7b-layla 📑 | 🔶 | 70 | 52.05 | 54.18 | 79.34 | 49.7 | 46.5 | 74.11 | 8.49 | LlamaForCausalLM |
llama2-7b-hf-chat-lora 📑 | 🔶 | 66.1 | 52.03 | 55.72 | 78.75 | 47.99 | 43.11 | 75.85 | 10.77 | Unknown |
Llama-2-7b-chat-hf-20-sparsity 📑 | 💬 | 70 | 52.01 | 52.47 | 77.91 | 47.27 | 45.88 | 70.72 | 17.82 | LlamaForCausalLM |
bactrian-x-llama-13b-merged 📑 | 🔶 | 128.5 | 52.0 | 56.4 | 79.33 | 48.4 | 48.38 | 73.95 | 5.53 | Unknown |
vicuna-7b-v1.5 📑 | 🔶 | 70 | 51.99 | 53.24 | 77.39 | 50.82 | 50.33 | 72.06 | 8.11 | LlamaForCausalLM |
Qwen-LLaMAfied-7B-Chat 📑 | 🔶 | 70 | 51.99 | 50.94 | 83.47 | 53.52 | 46.09 | 73.16 | 4.78 | LlamaForCausalLM |
openbuddy-mistral-7b-v13-base 📑 | 🔶 | 70 | 51.99 | 52.9 | 76.12 | 57.54 | 52.82 | 71.35 | 1.21 | MistralForCausalLM |
spicyboros-7b-2.2 📑 | 🔶 | 70 | 51.95 | 56.57 | 80.09 | 48.47 | 47.22 | 74.51 | 4.85 | LlamaForCausalLM |
10k_v1_lora_qkvo_rank28_v2 📑 | 🔶 | 0 | 51.95 | 55.38 | 79.21 | 50.5 | 52.75 | 73.24 | 0.61 | Unknown |
Yi-6b-200k-dpo 📑 | 💬 | 60.6 | 51.93 | 43.09 | 74.53 | 64.0 | 45.51 | 73.09 | 11.37 | LlamaForCausalLM |
Yi-7b-dpo 📑 | 💬 | 60.6 | 51.93 | 43.09 | 74.53 | 64.0 | 45.51 | 73.09 | 11.37 | Unknown |
Nous-Hermes-llama-2-7b 📑 | 🔶 | 67.4 | 51.87 | 55.12 | 78.94 | 48.34 | 49.01 | 74.03 | 5.76 | LlamaForCausalLM |
openbuddy-zephyr-7b-v14.1 📑 | 🔶 | 70 | 51.86 | 52.13 | 75.02 | 56.21 | 49.84 | 73.24 | 4.7 | MistralForCausalLM |
Synthia-7B 📑 | 🔶 | 70 | 51.83 | 56.14 | 78.6 | 50.35 | 45.03 | 74.27 | 6.6 | LlamaForCausalLM |
Medusa-1.1-L2-7B 📑 | 🔶 | 67.4 | 51.8 | 56.48 | 78.57 | 51.56 | 47.7 | 75.06 | 1.44 | LlamaForCausalLM |
Llama-2-7b-chat-hf-30-attention-sparsity 📑 | 💬 | 67.4 | 51.8 | 53.41 | 76.87 | 47.04 | 45.02 | 71.03 | 17.44 | LlamaForCausalLM |
Stheno-Mix-L2-20B 📑 | 🔶 | 206.3 | 51.79 | 57.76 | 79.63 | 52.51 | 51.8 | 68.98 | 0.08 | LlamaForCausalLM |
recycled-wizardlm-7b-v2.0 📑 | 🔶 | 70 | 51.79 | 54.95 | 77.85 | 45.79 | 48.29 | 71.51 | 12.36 | Unknown |
airoboros-13b-gpt4-1.3 📑 | 🔶 | 130 | 51.76 | 58.53 | 81.6 | 46.96 | 45.29 | 75.85 | 2.35 | LlamaForCausalLM |
llama-2-7b-miniguanaco 📑 | 🔶 | 67.4 | 51.74 | 50.0 | 76.96 | 48.05 | 42.84 | 73.48 | 19.11 | LlamaForCausalLM |
L2-7b-Orca-WVG-Test 📑 | 🔶 | 66.1 | 51.72 | 54.86 | 78.25 | 51.13 | 43.68 | 74.35 | 8.04 | Unknown |
blossom-v2-llama2-7b 📑 | 💬 | 70 | 51.71 | 54.1 | 78.57 | 51.66 | 46.84 | 74.35 | 4.78 | LlamaForCausalLM |
em_german_leo_mistral 📑 | 🔶 | 72.4 | 51.69 | 52.82 | 78.03 | 50.03 | 50.19 | 73.48 | 5.61 | MistralForCausalLM |
ALMA-13B-Pretrain 📑 | 🔶 | 130 | 51.68 | 56.91 | 80.15 | 50.31 | 37.44 | 76.4 | 8.87 | LlamaForCausalLM |
LLaMA-Pro-8B ✅ 📑 | 🔶 | 83.6 | 51.67 | 53.75 | 77.91 | 47.49 | 38.86 | 74.19 | 17.82 | LlamaForCausalLM |
llama_ppo_1e6_new_tokenizerstep_8000 📑 | 🔶 | 67.4 | 51.67 | 54.78 | 78.64 | 46.63 | 41.06 | 74.03 | 14.86 | LlamaForCausalLM |
L2-7b-Base-test-WVG 📑 | 🔶 | 66.1 | 51.66 | 54.27 | 77.81 | 51.07 | 46.28 | 73.56 | 6.97 | Unknown |
LosslessMegaCoder-llama2-7b-mini 📑 | 💬 | 70 | 51.66 | 53.5 | 77.38 | 49.72 | 45.77 | 74.03 | 9.55 | LlamaForCausalLM |
llama_sft_longer 📑 | 🔶 | 67.4 | 51.64 | 54.78 | 78.58 | 46.87 | 40.82 | 73.88 | 14.94 | LlamaForCausalLM |
shisa-base-7b-v1 📑 | 🟢 | 79.6 | 51.64 | 52.3 | 77.63 | 23.12 | 42.4 | 78.53 | 35.86 | MistralForCausalLM |
stable-vicuna-13B-HF 📑 | 💬 | 130 | 51.64 | 53.33 | 78.5 | 50.29 | 48.38 | 75.22 | 4.09 | LlamaForCausalLM |
llama_ppo_1e6step_4000 📑 | 🔶 | 67.4 | 51.61 | 54.44 | 78.66 | 46.74 | 41.24 | 74.19 | 14.4 | LlamaForCausalLM |
llama-7b-ludwig-alpaca 📑 | 🔶 | 70 | 51.6 | 54.01 | 78.73 | 45.8 | 41.91 | 74.27 | 14.86 | Unknown |
tamil-llama-13b-instruct-v0.1 📑 | 💬 | 130 | 51.59 | 54.52 | 79.35 | 50.37 | 41.22 | 76.56 | 7.51 | LlamaForCausalLM |
vicuna-7b-v1.5-16k ✅ 📑 | 🔶 | 70 | 51.58 | 54.69 | 77.32 | 49.51 | 50.41 | 71.11 | 6.44 | LlamaForCausalLM |
airoboros-c34b-2.1 📑 | 💬 | 340 | 51.52 | 54.69 | 76.45 | 55.08 | 46.15 | 68.43 | 8.34 | LlamaForCausalLM |
MiniChat-2-3B 📑 | 🔶 | 30 | 51.49 | 44.88 | 67.69 | 47.59 | 49.64 | 66.46 | 32.68 | LlamaForCausalLM |
openbuddy-deepseek-10b-v17.1-4k 📑 | 🔶 | 105.5 | 51.48 | 54.35 | 76.93 | 53.17 | 45.96 | 74.03 | 4.47 | LlamaForCausalLM |
vicuna-class-tutor-7b-ep3 📑 | 🔶 | 70 | 51.45 | 52.13 | 78.07 | 51.32 | 52.3 | 71.19 | 3.71 | LlamaForCausalLM |
MistralLite 📑 | ❓ | 0 | 51.45 | 59.56 | 81.84 | 50.93 | 37.87 | 77.43 | 1.06 | MistralForCausalLM |
llama2_7b_zh 📑 | 🔶 | 70 | 51.44 | 52.05 | 74.88 | 60.69 | 42.86 | 71.74 | 6.44 | LlamaForCausalLM |
zoyllm-7b-slimorca 📑 | 🔶 | 72.4 | 51.44 | 50.6 | 72.12 | 48.78 | 49.13 | 67.32 | 20.7 | MistralForCausalLM |
vicuna-7b-v1.5-16k ✅ 📑 | 🔶 | 70 | 51.42 | 54.18 | 77.31 | 49.3 | 50.35 | 71.03 | 6.37 | LlamaForCausalLM |
Baichuan2-7B-Chat-LLaMAfied 📑 | 🔶 | 70 | 51.42 | 52.47 | 74.04 | 53.88 | 48.04 | 69.14 | 10.92 | LlamaForCausalLM |
llama2-22b-daydreamer-v3 📑 | 🔶 | 220 | 51.39 | 56.06 | 80.07 | 52.49 | 42.43 | 73.48 | 3.79 | LlamaForCausalLM |
Colossal-LLaMA-2-7b-base 📑 | 🔶 | 70 | 51.39 | 53.5 | 70.5 | 54.4 | 50.19 | 70.01 | 9.7 | LlamaForCausalLM |
OpenOrca-Preview1-13B 📑 | 🔶 | 130 | 51.38 | 54.95 | 78.19 | 50.12 | 49.05 | 71.03 | 4.93 | LlamaForCausalLM |
kuchiki-1.1-l2-7b 📑 | 🔶 | 70 | 51.36 | 54.18 | 78.0 | 48.14 | 49.96 | 73.16 | 4.7 | LlamaForCausalLM |
llama-13b 📑 | 🟢 | 130.2 | 51.33 | 56.14 | 80.92 | 47.61 | 39.48 | 76.24 | 7.58 | LlamaForCausalLM |
kuchiki-l2-7b 📑 | 🔶 | 70 | 51.33 | 54.35 | 78.44 | 47.74 | 49.88 | 73.09 | 4.47 | LlamaForCausalLM |
Luna-AI-Llama2-Uncensored 📑 | 🔶 | 0 | 51.29 | 54.35 | 78.6 | 46.7 | 45.5 | 72.77 | 9.86 | LlamaForCausalLM |
zarablend-l2-7b 📑 | 🔶 | 70 | 51.29 | 54.44 | 78.62 | 47.61 | 49.38 | 73.32 | 4.4 | LlamaForCausalLM |
finetuned-llama2-chat-5000-v2.0 📑 | 🔶 | 0 | 51.28 | 52.05 | 76.13 | 46.33 | 45.18 | 72.3 | 15.69 | LlamaForCausalLM |
Rhino-Mistral-7B 📑 | 🔶 | 72.4 | 51.27 | 48.12 | 71.42 | 48.95 | 45.9 | 71.11 | 22.14 | MistralForCausalLM |
OpenHermes-7B 📑 | 💬 | 70 | 51.26 | 56.14 | 78.32 | 48.62 | 45.0 | 74.51 | 5.0 | LlamaForCausalLM |
zarablend-1.1-l2-7b 📑 | 🔶 | 70 | 51.25 | 54.86 | 78.58 | 47.89 | 49.0 | 72.61 | 4.55 | LlamaForCausalLM |
airoboros-l2-7b-2.2.1 📑 | 💬 | 70 | 51.22 | 55.03 | 80.06 | 47.64 | 44.65 | 73.8 | 6.14 | LlamaForCausalLM |
MentaLLaMA-chat-7B 📑 | 🔶 | 70 | 51.17 | 52.82 | 76.1 | 47.51 | 44.02 | 70.4 | 16.15 | LlamaForCausalLM |
koala-13B-HF 📑 | 🔶 | 130 | 51.16 | 52.99 | 77.59 | 45.32 | 50.23 | 74.03 | 6.82 | LlamaForCausalLM |
LexPodLM-13B 📑 | 🔶 | 640 | 51.14 | 57.76 | 81.04 | 48.38 | 43.48 | 76.16 | 0.0 | LlamaForCausalLM |
Llama2-Chinese-7b-Chat 📑 | 🔶 | 70 | 51.13 | 52.39 | 77.52 | 47.72 | 46.87 | 74.27 | 8.04 | LlamaForCausalLM |
finetuned-llama2-2048-v3.0 📑 | 🔶 | 67.4 | 51.13 | 49.83 | 77.09 | 46.69 | 46.21 | 72.06 | 14.94 | LlamaForCausalLM |
vicuna-7b-v1.3-attention-sparsity-10 📑 | 💬 | 67.4 | 51.13 | 52.22 | 77.05 | 47.93 | 46.87 | 69.53 | 13.19 | LlamaForCausalLM |
doctorLLM 📑 | 🔶 | 67.4 | 51.12 | 52.9 | 79.76 | 46.47 | 42.52 | 71.59 | 13.5 | LlamaForCausalLM |
EverythingLM-13b-V3-16k 📑 | 🔶 | 130 | 51.11 | 58.19 | 80.12 | 50.48 | 45.18 | 70.72 | 1.97 | LlamaForCausalLM |
pygmalion-2-7b 📑 | 🔶 | 67.4 | 51.11 | 54.01 | 78.23 | 49.11 | 43.78 | 75.14 | 6.37 | LlamaForCausalLM |
finetuned-llama2-chat-5000-v1.0-squad 📑 | 🔶 | 0 | 51.09 | 50.94 | 76.61 | 46.43 | 44.45 | 71.98 | 16.15 | LlamaForCausalLM |
recycled-alpaca-7b-v2.0 📑 | 🔶 | 70 | 51.09 | 54.18 | 77.98 | 46.79 | 45.4 | 71.35 | 10.84 | Unknown |
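The Average column is consistent with the arithmetic mean of the six benchmark scores (ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K). A minimal sketch for checking this against any row; the helper function is hypothetical and not part of the leaderboard code:

```python
# Hypothetical helper: check that the "Average" column equals the mean of
# the six benchmark scores, rounded to two decimals.
def average_score(arc, hellaswag, mmlu, truthfulqa, winogrande, gsm8k):
    """Arithmetic mean of the six benchmark scores."""
    scores = [arc, hellaswag, mmlu, truthfulqa, winogrande, gsm8k]
    return round(sum(scores) / len(scores), 2)

# First row above (Mistral-7B-golden): listed Average is 52.49.
assert average_score(60.75, 44.42, 59.29, 53.51, 76.64, 20.32) == 52.49
```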