🟢 : Pretrained model: a new base (foundation) model, pretrained on a given corpus.
🔶 : Domain-specific fine-tuned model: a pretrained model further fine-tuned on domain-specific datasets for better performance.
💬 : Chat model: a chat-style fine-tune, produced with IFT (instruction fine-tuning on task-instruction datasets), RLHF (reinforcement learning from human feedback), DPO (direct preference optimization, which slightly changes the model's loss with an added policy; the objective is sketched after this legend), or similar methods.
🤝 : Base merges and MoErges: models built by merging or MoErging (fusing several models) without additional fine-tuning; a minimal merge sketch also follows this legend. If you find a model with no icon, feel free to open an issue so its information can be added.
❓ : Unknown
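As background for the 💬 entry above: DPO (Rafailov et al., 2023) trains directly on preference pairs. The formula below is standard background, not specific to any model in this table:

$$
\mathcal{L}_{\mathrm{DPO}}(\pi_\theta;\pi_{\mathrm{ref}}) = -\,\mathbb{E}_{(x,\,y_w,\,y_l)\sim\mathcal{D}}\left[\log \sigma\!\left(\beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)}\right)\right]
$$

where $\pi_\theta$ is the model being trained, $\pi_{\mathrm{ref}}$ is a frozen reference model (typically the SFT checkpoint), $(y_w, y_l)$ are the preferred and rejected responses for prompt $x$, and $\beta$ controls how far the policy may drift from the reference.

And for the 🤝 entry: in its simplest form, merging is a weighted average of the parameter tensors of architecturally identical models. A minimal sketch (PyTorch; the function name and the uniform weights in the usage comment are illustrative, not any particular project's recipe):

```python
import torch

@torch.no_grad()
def linear_merge(state_dicts, weights):
    """Weighted average of several models' parameters (the simplest merge).

    Assumes every state dict comes from the same architecture, so all
    tensors share names and shapes.
    """
    assert len(state_dicts) == len(weights)
    merged = {}
    for name in state_dicts[0]:
        # Accumulate the weighted sum of this tensor across all models.
        merged[name] = sum(w * sd[name].float() for w, sd in zip(weights, state_dicts))
    return merged

# Usage: a uniform two-way merge.
# merged = linear_merge([model_a.state_dict(), model_b.state_dict()], [0.5, 0.5])
```

MoErges go further, routing between expert copies instead of averaging them, but the averaging above is the base case.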
Model Name | Type | Params (×100M) | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Architecture |
---|---|---|---|---|---|---|---|---|---|---|
CollectiveCognition-v1-Mistral-7B 📑 | 💬 | 70 | 60.1 | 62.37 | 85.5 | 62.76 | 54.48 | 77.58 | 17.89 | MistralForCausalLM |
zephyr-python-ru 📑 | 💬 | 0 | 60.08 | 56.14 | 82.03 | 60.18 | 52.8 | 76.8 | 32.52 | Unknown |
ccy0-2g7e-wqsa-0 📑 | 🔶 | 72.4 | 60.07 | 58.19 | 82.19 | 59.59 | 49.99 | 78.22 | 32.22 | MistralForCausalLM |
firefly-zephyr-6x7b 📑 | 🔶 | 354.3 | 60.06 | 60.75 | 82.8 | 60.03 | 48.84 | 77.03 | 30.93 | MixtralForCausalLM |
freeze_KoSoLAR-10.7B-v0.2_1.4_dedup 📑 | 💬 | 108 | 60.06 | 58.45 | 81.26 | 64.83 | 44.5 | 79.08 | 32.22 | LlamaForCausalLM |
Metis-0.1 📑 | 💬 | 72.4 | 60.02 | 60.15 | 82.85 | 61.42 | 45.24 | 77.27 | 33.21 | MistralForCausalLM |
Velara 📑 | 🔶 | 113.9 | 60.01 | 58.96 | 82.83 | 59.45 | 44.7 | 73.8 | 40.33 | MistralForCausalLM |
WizardLM-33B-V1.0-Uncensored 📑 | ❓ | 323.2 | 59.99 | 63.65 | 83.84 | 59.36 | 56.8 | 77.66 | 18.65 | Unknown |
zephyr-dpo-v2 📑 | 🔶 | 72.4 | 59.99 | 57.85 | 82.72 | 58.61 | 56.16 | 74.35 | 30.25 | MistralForCausalLM |
CodeLlama-70b-Instruct-hf 📑 | 🟢 | 689.8 | 59.98 | 55.03 | 77.24 | 56.4 | 50.44 | 74.51 | 46.25 | LlamaForCausalLM |
psyonic-cetacean-20B 📑 | 🔶 | 199.9 | 59.97 | 63.57 | 86.2 | 59.66 | 57.55 | 78.14 | 14.71 | LlamaForCausalLM |
AIRIC-The-Mistral 📑 | 💬 | 72.4 | 59.95 | 59.98 | 82.98 | 60.67 | 48.24 | 76.95 | 30.86 | LlamaForCausalLM |
Novocode7b-v3 📑 | 🔶 | 70 | 59.94 | 57.51 | 81.17 | 61.91 | 48.29 | 74.51 | 36.24 | MistralForCausalLM |
Xenon-2 📑 | 💬 | 72.4 | 59.93 | 57.51 | 83.28 | 60.25 | 60.92 | 78.22 | 19.41 | MistralForCausalLM |
TarsChattyBasev0.0 📑 | 🔶 | 72.4 | 59.92 | 64.93 | 84.57 | 58.04 | 61.71 | 78.61 | 11.68 | MistralForCausalLM |
Starling-LM-11B-alpha 📑 | 🔶 | 107.3 | 59.92 | 61.26 | 81.99 | 61.5 | 41.53 | 78.06 | 35.18 | MistralForCausalLM |
neural-chat-7b-v3-1 📑 | 🔶 | 72.4 | 59.9 | 64.25 | 82.49 | 60.79 | 56.4 | 77.35 | 18.12 | MistralForCausalLM |
Yi-6B-Infinity-Chat 📑 | 🔶 | 60.6 | 59.83 | 56.57 | 77.66 | 64.05 | 50.75 | 73.95 | 36.01 | LlamaForCausalLM |
samantha-1.2-mistral-7b 📑 | 💬 | 71.1 | 59.83 | 64.08 | 85.08 | 63.91 | 50.4 | 78.53 | 16.98 | Unknown |
Hermes-2-SOLAR-10.7B-Symbolic 📑 | 🔶 | 107.3 | 59.81 | 61.69 | 82.57 | 65.06 | 54.85 | 80.74 | 13.95 | LlamaForCausalLM |
WizardMath-70B-V1.0 ✅ 📑 | 💬 | 700 | 59.81 | 67.49 | 86.03 | 68.44 | 52.23 | 81.77 | 2.88 | LlamaForCausalLM |
AISquare-Instruct-SOLAR-10.7b-v0.5.32 📑 | 💬 | 107 | 59.79 | 61.86 | 84.66 | 63.13 | 51.19 | 82.79 | 15.09 | LlamaForCausalLM |
mistral-7b-sft-beta 📑 | 🔶 | 70 | 59.78 | 57.42 | 82.23 | 61.42 | 43.58 | 77.58 | 36.47 | MistralForCausalLM |
H4rmoniousAnthea 📑 | 💬 | 72.4 | 59.76 | 65.87 | 84.09 | 63.67 | 55.08 | 76.87 | 12.96 | MistralForCausalLM |
bun_mistral_7b_v2 📑 | 🔶 | 72.4 | 59.76 | 59.9 | 82.65 | 61.77 | 40.67 | 78.3 | 35.25 | MistralForCausalLM |
Llama-2-70B-chat-GPTQ 📑 | 🔶 | 728.2 | 59.75 | 62.63 | 84.81 | 62.74 | 50.98 | 78.69 | 18.65 | Unknown |
apricot-wildflower-20 📑 | 🔶 | 72.4 | 59.74 | 59.64 | 81.76 | 63.38 | 41.76 | 77.9 | 33.97 | MistralForCausalLM |
Mistral-7B-AEZAKMI-v2 📑 | 🔶 | 72.4 | 59.69 | 58.11 | 82.53 | 59.89 | 51.5 | 73.64 | 32.45 | MistralForCausalLM |
ToxicHermes-2.5-Mistral-7B 📑 | 💬 | 72.4 | 59.69 | 64.59 | 83.75 | 63.67 | 50.84 | 77.9 | 17.36 | MistralForCausalLM |
mistral-indo-7b 📑 | 🔶 | 72.4 | 59.68 | 61.09 | 81.19 | 62.99 | 42.34 | 78.37 | 32.07 | MistralForCausalLM |
Synatra-RP-Orca-2-7b-v0.1 📑 | 💬 | 67.4 | 59.65 | 57.68 | 77.37 | 56.1 | 52.52 | 74.59 | 39.65 | LlamaForCausalLM |
Orca-2-13B-no_robots 📑 | 💬 | 130.2 | 59.63 | 59.13 | 79.57 | 60.28 | 51.17 | 80.35 | 27.29 | Unknown |
Yarn-Mistral-7b-64k 📑 | 🔶 | 70 | 59.63 | 59.9 | 82.51 | 62.96 | 41.86 | 77.27 | 33.28 | MistralForCausalLM |
DiamondForce 📑 | 🔶 | 130.2 | 59.63 | 62.12 | 83.43 | 58.1 | 46.46 | 79.01 | 28.66 | LlamaForCausalLM |
SynthIA-7B-v1.5 📑 | 🔶 | 70 | 59.59 | 62.71 | 83.37 | 63.48 | 51.32 | 79.24 | 17.44 | MistralForCausalLM |
ShiningValiantXS 📑 | 🔶 | 130.2 | 59.56 | 58.96 | 81.93 | 56.75 | 48.7 | 76.95 | 34.04 | LlamaForCausalLM |
Synatra-RP-Orca-2-7b-v0.1 📑 | 🔶 | 67.4 | 59.55 | 57.42 | 77.31 | 56.12 | 52.55 | 74.43 | 39.5 | LlamaForCausalLM |
internlm-20b ✅ 📑 | 🟢 | 200 | 59.55 | 60.49 | 82.13 | 61.85 | 52.61 | 76.72 | 23.5 | InternLMForCausalLM |
Mistral-7B-Discord-0.2 📑 | 🔶 | 72.4 | 59.55 | 60.58 | 82.49 | 62.82 | 42.73 | 77.74 | 30.93 | MistralForCausalLM |
TarsMeta 📑 | 🔶 | 72.4 | 59.54 | 52.9 | 78.2 | 52.63 | 47.88 | 72.77 | 52.84 | MistralForCausalLM |
Chupacabra-v3 📑 | 🔶 | 72.4 | 59.52 | 66.21 | 81.29 | 59.36 | 57.85 | 77.43 | 15.01 | Unknown |
WizardLM-30B-fp16 📑 | 🔶 | 300 | 59.51 | 62.54 | 83.28 | 59.03 | 52.49 | 77.51 | 22.21 | LlamaForCausalLM |
gpt4-alpaca-lora-30b-HF 📑 | ❓ | 300 | 59.51 | 64.85 | 85.72 | 58.51 | 52.24 | 80.19 | 15.54 | LlamaForCausalLM |
zephyr-7b-alpha 📑 | 🔶 | 72.4 | 59.5 | 61.01 | 84.04 | 61.39 | 57.9 | 78.61 | 14.03 | MistralForCausalLM |
HelpSteer-filtered-7B 📑 | 🔶 | 72.4 | 59.49 | 59.56 | 83.32 | 63.52 | 41.11 | 76.01 | 33.43 | MistralForCausalLM |
Yarn-Mistral-7b-128k 📑 | 🔶 | 70 | 59.42 | 59.64 | 82.5 | 63.02 | 41.78 | 76.95 | 32.6 | MistralForCausalLM |
Camelidae-8x13B 📑 | 💬 | 130 | 59.4 | 61.18 | 82.73 | 57.21 | 43.37 | 77.35 | 34.57 | LlamaForCausalLM |
Tess-XS-v1.1 📑 | 🔶 | 72.4 | 59.39 | 63.91 | 84.06 | 63.07 | 49.92 | 79.16 | 16.22 | MistralForCausalLM |
deepseek-llm-7b-chat ✅ 📑 | 💬 | 70 | 59.38 | 55.8 | 79.38 | 51.75 | 47.98 | 74.82 | 46.55 | LlamaForCausalLM |
openchat-3.5-0106-128k 📑 | 🔶 | 72.4 | 59.38 | 64.25 | 77.31 | 57.58 | 46.5 | 77.66 | 32.98 | MistralForCausalLM |
OpenAssistant-SFT-7-Llama-30B-HF 📑 | 🔶 | 300 | 59.34 | 60.58 | 82.17 | 57.93 | 46.94 | 78.61 | 29.8 | LlamaForCausalLM |
llemma_34b 📑 | 🔶 | 340 | 59.34 | 55.29 | 75.08 | 58.93 | 40.31 | 75.53 | 50.87 | LlamaForCausalLM |
SynthIA-7B-v1.3 📑 | 🔶 | 70 | 59.34 | 62.12 | 83.45 | 62.65 | 51.37 | 78.85 | 17.59 | MistralForCausalLM |
fin-llama-33b-merged 📑 | ❓ | 330 | 59.33 | 65.02 | 86.2 | 58.73 | 49.75 | 80.03 | 16.22 | LlamaForCausalLM |
MysticFusion-13B 📑 | 🔶 | 130.2 | 59.31 | 61.35 | 84.43 | 57.29 | 51.98 | 76.01 | 24.79 | LlamaForCausalLM |
SuperPlatty-30B 📑 | 🔶 | 323.2 | 59.3 | 65.78 | 83.95 | 62.57 | 53.52 | 80.35 | 9.63 | Unknown |
SAM 📑 | 🔶 | 72.4 | 59.3 | 59.39 | 82.31 | 62.15 | 52.64 | 76.4 | 22.9 | MistralForCausalLM |
deepseek-llm-7b-chat ✅ 📑 | 🔶 | 70 | 59.27 | 55.72 | 79.38 | 51.77 | 47.92 | 74.9 | 45.94 | LlamaForCausalLM |
Mistral-7B-claude-instruct 📑 | 💬 | 70 | 59.27 | 63.23 | 84.99 | 63.84 | 47.47 | 78.14 | 17.97 | MistralForCausalLM |
Venomia-1.1-m7 📑 | 🔶 | 0 | 59.27 | 58.45 | 83.04 | 56.39 | 47.21 | 74.43 | 36.09 | MistralForCausalLM |
Synatra-7B-v0.3-RP 📑 | 💬 | 70 | 59.26 | 62.2 | 82.29 | 60.8 | 52.64 | 76.48 | 21.15 | MistralForCausalLM |
TimeCrystal-l2-13B 📑 | 🔶 | 130 | 59.26 | 61.18 | 83.71 | 56.46 | 51.3 | 75.37 | 27.52 | LlamaForCausalLM |
Xenon-1 📑 | 💬 | 72.4 | 59.21 | 55.29 | 81.56 | 61.22 | 56.68 | 78.69 | 21.83 | MistralForCausalLM |
Qwen-7B ✅ 📑 | 🟢 | 77.2 | 59.19 | 51.37 | 78.47 | 59.84 | 47.79 | 72.69 | 44.96 | QWenLMHeadModel |
vigostral-7b-chat 📑 | 🔶 | 70 | 59.18 | 62.63 | 84.34 | 63.53 | 49.24 | 78.61 | 16.76 | MistralForCausalLM |
Borealis-10.7B-DPO 📑 | 🔶 | 107 | 59.18 | 57.94 | 81.21 | 60.74 | 46.37 | 75.45 | 33.36 | LlamaForCausalLM |
UltraQwen-7B 📑 | 🔶 | 77.2 | 59.17 | 51.71 | 77.93 | 59.16 | 48.2 | 73.95 | 44.05 | LlamaForCausalLM |
PiVoT-0.1-Evil-a 📑 | 💬 | 72.4 | 59.16 | 59.64 | 81.48 | 58.94 | 39.23 | 75.3 | 40.41 | MistralForCausalLM |
Karen_TheEditor_V2_STRICT_Mistral_7B 📑 | 💬 | 72.4 | 59.13 | 59.56 | 81.79 | 59.56 | 49.36 | 74.35 | 30.17 | MistralForCausalLM |
Mistral-v0.1-PeanutButter-v0.0.0-7B 📑 | 💬 | 72.4 | 59.09 | 62.2 | 84.1 | 64.14 | 46.94 | 78.69 | 18.5 | Unknown |
Noromaid-7B-0.4-DPO 📑 | 🔶 | 72.4 | 59.08 | 62.29 | 84.32 | 63.2 | 42.28 | 76.95 | 25.47 | MistralForCausalLM |
finetuned-Mistral-5000-v1.0 📑 | 🔶 | 0 | 59.08 | 59.9 | 82.37 | 61.68 | 41.17 | 78.3 | 31.08 | MistralForCausalLM |
zephyr-7b-beta 📑 | 🔶 | 72.4 | 59.08 | 62.03 | 84.53 | 61.06 | 57.44 | 78.06 | 11.37 | MistralForCausalLM |
openchat_3.5-16k 📑 | 🔶 | 72.4 | 59.03 | 63.31 | 83.58 | 61.9 | 43.47 | 80.11 | 21.83 | MistralForCausalLM |
Platypus-30B 📑 | 💬 | 325.3 | 59.03 | 64.59 | 84.26 | 64.23 | 45.35 | 81.37 | 14.4 | LlamaForCausalLM |
Platypus-30B 📑 | 🔶 | 323.2 | 59.03 | 64.59 | 84.24 | 64.19 | 45.35 | 81.37 | 14.4 | Unknown |
orca_mini_v3_13B-GPTQ 📑 | 🔶 | 162.3 | 59.01 | 61.95 | 81.56 | 56.1 | 49.22 | 75.77 | 29.49 | LlamaForCausalLM |
zephyr-alpha-Nebula-v2-7B 📑 | 💬 | 72.4 | 59.01 | 58.62 | 83.05 | 56.68 | 58.28 | 73.56 | 23.88 | MistralForCausalLM |
wizard-mistral-v0.1 📑 | 🔶 | 72.4 | 59.01 | 61.77 | 83.51 | 63.99 | 47.46 | 78.3 | 19.03 | MistralForCausalLM |
samantha-1.1-llama-33b 📑 | ❓ | 323.2 | 58.98 | 67.83 | 85.55 | 58.79 | 61.19 | 76.48 | 4.02 | Unknown |
Mistral-7b-FFT-Test3 📑 | 🔶 | 70 | 58.96 | 60.24 | 82.36 | 62.2 | 44.36 | 77.82 | 26.76 | MistralForCausalLM |
Hercules-1.0-Mistral-7B 📑 | 💬 | 72.4 | 58.95 | 57.08 | 81.13 | 58.98 | 49.47 | 77.19 | 29.87 | MistralForCausalLM |
rezephyr-dpo 📑 | 🔶 | 72.4 | 58.95 | 57.59 | 81.75 | 60.55 | 44.32 | 77.03 | 32.45 | MistralForCausalLM |
Tess-XS-v1.0 📑 | 🔶 | 72.4 | 58.95 | 61.43 | 83.82 | 64.1 | 47.12 | 78.93 | 18.27 | Unknown |
Noromaid-7B-0.4-DPO 📑 | 🔶 | 72.4 | 58.93 | 62.2 | 84.41 | 63.14 | 42.34 | 76.95 | 24.56 | MistralForCausalLM |
CodeLlama-70b-hf ✅ 📑 | 🟢 | 689.8 | 58.93 | 56.74 | 78.21 | 59.67 | 39.79 | 75.22 | 43.97 | LlamaForCausalLM |
chronoboros-33B 📑 | 🔶 | 330 | 58.92 | 63.91 | 85.0 | 59.44 | 49.83 | 80.35 | 15.01 | LlamaForCausalLM |
Mistral-7B-v0.1-Open-Platypus 📑 | 💬 | 70 | 58.92 | 62.37 | 85.08 | 63.79 | 47.33 | 77.66 | 17.29 | MistralForCausalLM |
SwahiliInstruct-v0.1 📑 | 🔶 | 72.4 | 58.92 | 57.59 | 80.92 | 57.0 | 58.08 | 74.66 | 25.25 | MistralForCausalLM |
speechless-code-mistral-7b-v1.0 📑 | 🔶 | 70 | 58.85 | 60.58 | 83.75 | 62.98 | 47.9 | 78.69 | 19.18 | MistralForCausalLM |
mistral_7b_norobots 📑 | 💬 | 70 | 58.85 | 58.96 | 80.57 | 57.66 | 41.91 | 75.61 | 38.36 | Unknown |
airochronos-33B 📑 | ❓ | 325.3 | 58.84 | 64.42 | 85.21 | 59.79 | 50.59 | 79.32 | 13.72 | LlamaForCausalLM |
Mistral-11B-SynthIAirOmniMix 📑 | 🔶 | 107.3 | 58.84 | 62.46 | 83.13 | 63.47 | 55.69 | 76.4 | 11.9 | MistralForCausalLM |
Nebula-v2-7B 📑 | 💬 | 72.4 | 58.82 | 58.7 | 83.06 | 57.61 | 46.72 | 75.14 | 31.69 | Unknown |
scarlett-33b 📑 | 🔶 | 330 | 58.81 | 67.75 | 85.48 | 58.98 | 61.05 | 76.8 | 2.81 | LlamaForCausalLM |
Mistral-7b-FFT-Test3 📑 | 🔶 | 70 | 58.79 | 60.41 | 82.31 | 62.45 | 44.33 | 77.58 | 25.63 | MistralForCausalLM |
Noromaid-13b-v0.3 📑 | 🔶 | 130.2 | 58.77 | 62.8 | 84.42 | 56.86 | 50.73 | 74.74 | 23.05 | LlamaForCausalLM |
airochronos-33B 📑 | 🔶 | 325.3 | 58.75 | 64.25 | 85.2 | 59.83 | 50.56 | 79.08 | 13.57 | LlamaForCausalLM |
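A note on reading the table: the Average column is consistent with the unweighted mean of the six benchmark scores, rounded to two decimals (we checked this against the first rows; tiny discrepancies elsewhere would just be rounding). A minimal check in Python, with the first row's scores hard-coded:

```python
def average_score(arc, hellaswag, mmlu, truthfulqa, winogrande, gsm8k):
    """Unweighted mean of the six benchmark scores, rounded to two decimals."""
    scores = (arc, hellaswag, mmlu, truthfulqa, winogrande, gsm8k)
    return round(sum(scores) / len(scores), 2)

# CollectiveCognition-v1-Mistral-7B (first row of the table):
print(average_score(62.37, 85.5, 62.76, 54.48, 77.58, 17.89))  # prints 60.1
```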