| field | Qwen/Qwen3-32B | google/gemma-3-27b-it | openai/gpt-oss-20b | openai/gpt-oss-120b |
|---|---|---|---|---|
| model_type | 🟦 : RL-tuned (Preference optimization) | 🟦 : RL-tuned (Preference optimization) | 🟦 : RL-tuned (Preference optimization) | 🟦 : RL-tuned (Preference optimization) |
| revision | main | main | main | main |
| add_special_tokens | False | False | False | False |
| llm_jp_eval_version | v2.0.0 | v2.0.0 | v2.0.0 | v2.0.0 |
| vllm_version | v0.11.0 | v0.11.0 | v0.11.0 | v0.11.0 |
| precision | bfloat16 | bfloat16 | bfloat16 | bfloat16 |
| aime2024_ool | 1 | 1 | 0.933333 | 0.933333 |
| aime2024_mathematical_equivalence | 0 | 0 | 0.066667 | 0.066667 |
| aime2025_ool | 1 | 1 | 0.933333 | 0.966667 |
| aime2025_mathematical_equivalence | 0 | 0 | 0.066667 | 0.033333 |
| aio_ool | 0.0005 | 0 | 0.3365 | 0.0675 |
| aio_char_f1 | 0.55818 | 0.72863 | 0.385885 | 0.68535 |
| aio_exact_match | 0.414 | 0.617 | 0.2855 | 0.569 |
| alt-j-to-e_ool | 0 | 0 | 0 | 0 |
| alt-j-to-e_bleu_en | 16.778147 | 18.919199 | 15.352339 | 16.050141 |
| alt-j-to-e_bert_score_en_f1 | 0.9551 | 0.95807 | 0.945328 | 0.951103 |
| alt-j-to-e_comet_wmt22 | 0.887409 | 0.890268 | 0.873082 | 0.884334 |
| alt-e-to-j_ool | 0 | 0 | 0 | 0 |
| alt-e-to-j_bleu_ja | 12.386707 | 13.740488 | 12.765287 | 14.011578 |
| alt-e-to-j_bert_score_ja_f1 | 0.860379 | 0.869719 | 0.857876 | 0.868918 |
| alt-e-to-j_comet_wmt22 | 0.910851 | 0.915218 | 0.89929 | 0.914104 |
| bigbenchhard_direct_ool | 0 | 0 | 0.000614 | 0.000307 |
| bigbenchhard_direct_exact_match | 0.427123 | 0.560129 | 0.792966 | 0.817847 |
| bigbenchhard_cot_ool | 0 | 0 | 0.000461 | 0.000307 |
| bigbenchhard_cot_exact_match | 0.66211 | 0.839349 | 0.783443 | 0.7773 |
| bigbenchhard_ja_direct_ool | 0 | 0 | 0.000461 | 0.000768 |
| bigbenchhard_ja_direct_exact_match | 0.380894 | 0.565197 | 0.702964 | 0.67194 |
| bigbenchhard_ja_cot_ool | 0 | 0 | 0.000922 | 0 |
| bigbenchhard_ja_cot_exact_match | 0.112425 | 0.201812 | 0.670097 | 0.667025 |
| chabsa_ool | 0 | 0 | 0.034161 | 0.009317 |
| chabsa_set_f1 | 0.578173 | 0.553882 | 0.537822 | 0.559135 |
| commonsensemoralja_ool | 0 | 0 | 0.038327 | 0 |
| commonsensemoralja_exact_match | 0.896543 | 0.891032 | 0.854459 | 0.900301 |
| drop_ool | 0 | 0 | 0.00839 | 0.00021 |
| drop_drop_f1 | 0.637032 | 0.740032 | 0.764056 | 0.817229 |
| drop_exact_match | 0.496592 | 0.588568 | 0.603566 | 0.653173 |
| gsm8k_ool | 1 | 1 | 0.034875 | 0.006823 |
| gsm8k_mathematical_equivalence | 0 | 0 | 0.920394 | 0.958302 |
| gpqa_diamond_en_ool | 0 | 0 | 0.556122 | 0.47449 |
| gpqa_diamond_en_exact_match | 0.321429 | 0.357143 | 0.372449 | 0.428571 |
| gpqa_extended_en_ool | 0 | 0 | 0.530726 | 0.400372 |
| gpqa_extended_en_exact_match | 0.404097 | 0.361266 | 0.35568 | 0.480447 |
| gpqa_main_en_ool | 0 | 0 | 0.507901 | 0.413093 |
| gpqa_main_en_exact_match | 0.426637 | 0.354402 | 0.379233 | 0.474041 |
| gpqa_diamond_ja_ool | 0.005076 | 0 | 0.522843 | 0.436548 |
| gpqa_diamond_ja_exact_match | 0.340102 | 0.329949 | 0.370558 | 0.472081 |
| gpqa_extended_ja_ool | 0.001845 | 0 | 0.51845 | 0.389299 |
| gpqa_extended_ja_exact_match | 0.389299 | 0.343173 | 0.352399 | 0.457565 |
| gpqa_main_ja_ool | 0 | 0 | 0.514541 | 0.404922 |
| gpqa_main_ja_exact_match | 0.409396 | 0.344519 | 0.369128 | 0.447427 |
| jamc-qa_ool | 0.001299 | 0.001299 | 0.333911 | 0.074491 |
| jamc-qa_exact_match | 0.45301 | 0.479428 | 0.298831 | 0.49372 |
| jamp_ool | 0.005747 | 0 | 0.005747 | 0.011494 |
| jamp_exact_match | 0.678161 | 0.704023 | 0.816092 | 0.816092 |
| janli_ool | 0 | 0 | 0.001389 | 0 |
| janli_exact_match | 0.859722 | 0.794444 | 0.966667 | 0.997222 |
| jcommonsenseqa_ool | 0 | 0 | 0.018767 | 0 |
| jcommonsenseqa_exact_match | 0.945487 | 0.948168 | 0.941912 | 0.966041 |
| jemhopqa_ool | 0 | 0 | 0.416667 | 0.075 |
| jemhopqa_exact_match | 0.4 | 0.466667 | 0.3 | 0.525 |
| jemhopqa_char_f1 | 0.5265 | 0.595167 | 0.355833 | 0.640167 |
| jhumaneval_ool | 0 | 0 | 0.048387 | 0.016129 |
| jhumaneval_code_exec_sandbox | 0.862903 | 0.830645 | 0.903226 | 0.951613 |
| jhumaneval_pylint_check | 1 | 1 | 0.991935 | 1 |
| jmmlu_ool | 0.000986 | 0.000141 | 0.071016 | 0.017331 |
| jmmlu_exact_match | 0.73679 | 0.693392 | 0.770185 | 0.843455 |
| jnli_ool | 0 | 0 | 0.004108 | 0 |
| jnli_exact_match | 0.824569 | 0.799507 | 0.772391 | 0.785949 |
| jsem_ool | 0.000189 | 0 | 0.006505 | 0 |
| jsem_exact_match | 0.766326 | 0.760705 | 0.77081 | 0.790388 |
| jsick_ool | 0 | 0 | 0.001624 | 0 |
| jsick_exact_match | 0.813274 | 0.762127 | 0.817942 | 0.838238 |
| jsquad_ool | 0 | 0.000225 | 0.006078 | 0.000225 |
| jsquad_exact_match | 0.801441 | 0.731878 | 0.780955 | 0.803917 |
| jsquad_char_f1 | 0.918411 | 0.867332 | 0.897701 | 0.913125 |
| jsts_ool | 0 | 0 | 0 | 0 |
| jsts_pearson | 0.913643 | 0.907968 | 0.889155 | 0.90593 |
| jsts_spearman | 0.887386 | 0.8721 | 0.852762 | 0.872966 |
| kuci_ool | 0 | 0 | 0.051307 | 0.000097 |
| kuci_exact_match | 0.809348 | 0.802352 | 0.725002 | 0.806238 |
| mawps_ool | 0.986 | 0.108 | 0.008 | 0.002 |
| mawps_mathematical_equivalence | 0.014 | 0.834 | 0.974 | 0.984 |
| mbpp_ool | 0.033951 | 0.001029 | 0.092593 | 0.04321 |
| mbpp_code_exec_sandbox | 0.736626 | 0.778807 | 0.880658 | 0.920782 |
| mbpp_pylint_check | 0.992798 | 0.997942 | 0.997942 | 0.99177 |
| mgsm_ool | 0.288 | 0.156 | 0.048 | 0.024 |
| mgsm_mathematical_equivalence | 0.316 | 0.32 | 0.852 | 0.876 |
| mmlu_en_ool | 0.000071 | 0.000142 | 0.062242 | 0.01346 |
| mmlu_en_exact_match | 0.793477 | 0.764777 | 0.804159 | 0.868181 |
| mmlu_prox_ja_ool | 0.000085 | 0.00068 | 0.260736 | 0.134875 |
| mmlu_prox_ja_exact_match | 0.505145 | 0.427162 | 0.57675 | 0.677779 |
| mmlu_prox_en_ool | 0.00017 | 0.00017 | 0.200527 | 0.120163 |
| mmlu_prox_en_exact_match | 0.564249 | 0.493154 | 0.637214 | 0.717918 |
| mif_eval_ja_ool | 0 | 0 | 0 | 0 |
| mif_eval_ja_mifeval_strict | 0.372093 | 0.348837 | 0.331395 | 0.447674 |
| mif_eval_ja_mifeval_loose | 0.383721 | 0.354651 | 0.331395 | 0.47093 |
| mif_eval_en_ool | 0 | 0 | 0 | 0 |
| mif_eval_en_mifeval_strict | 0.560074 | 0.434381 | 0.526802 | 0.582255 |
| mif_eval_en_mifeval_loose | 0.578558 | 0.445471 | 0.534196 | 0.5878 |
| mmmlu_ool | 0 | 0 | 0.000499 | 0.000214 |
| mmmlu_exact_match | 0.697764 | 0.672767 | 0.744054 | 0.815055 |
| niilc_ool | 0.010101 | 0 | 0.176768 | 0.010101 |
| niilc_exact_match | 0.277778 | 0.333333 | 0.227273 | 0.338384 |
| niilc_char_f1 | 0.522121 | 0.601061 | 0.453485 | 0.591616 |
| openbookqa_ool | 0 | 0.002 | 0.008 | 0 |
| openbookqa_exact_match | 0.936 | 0.908 | 0.949 | 0.96 |
| polymath-en_ool | 0.72 | 0.754 | 0.69 | 0.658 |
| polymath-en_polymath_weighted_accuracy | 0.077867 | 0.068267 | 0.103467 | 0.117867 |
| polymath-ja_ool | 0.736 | 0.744 | 0.69 | 0.658 |
| polymath-ja_polymath_weighted_accuracy | 0.062933 | 0.060267 | 0.098133 | 0.103467 |
| triviaqa_ool | 0 | 0 | 0.160444 | 0.019004 |
| triviaqa_triviaqa_exact_match | 0.586603 | 0.771456 | 0.596355 | 0.776416 |
| triviaqa_triviaqa_f1 | 0.642895 | 0.815548 | 0.645651 | 0.827087 |
| winogrande_xl_ool | 0 | 0 | 0.037885 | 0.004736 |
| winogrande_xl_exact_match | 0.768745 | 0.74191 | 0.726914 | 0.824783 |
| wiki_coreference_ool | 0 | 0 | 0 | 0 |
| wiki_coreference_set_f1 | 0.068354 | 0.094741 | 0.009167 | 0.029504 |
| wiki_dependency_ool | 0 | 0 | 0 | 0.01 |
| wiki_dependency_set_f1 | 0.439033 | 0.411106 | 0.035826 | 0.225615 |
| wiki_ner_ool | 0 | 0 | 0 | 0 |
| wiki_ner_set_f1 | 0.097345 | 0.053097 | 0.070796 | 0.106195 |
| wiki_pas_ool | 0 | 0 | 0 | 0.005025 |
| wiki_pas_set_f1 | 0.097636 | 0.089383 | 0.002513 | 0.045152 |
| wiki_reading_ool | 0 | 0.005 | 0.385 | 0.22 |
| wiki_reading_char_f1 | 0.8424 | 0.87375 | 0.5587 | 0.7202 |
| wikicorpus-j-to-e_ool | 0 | 0 | 0.000169 | 0.000056 |
| wikicorpus-j-to-e_bleu_en | 10.248035 | 9.96003 | 9.819195 | 10.138424 |
| wikicorpus-j-to-e_bert_score_en_f1 | 0.899525 | 0.893946 | 0.889417 | 0.898587 |
| wikicorpus-j-to-e_comet_wmt22 | 0.758347 | 0.736639 | 0.732488 | 0.759053 |
| wikicorpus-e-to-j_ool | 0 | 0 | 0.00031 | 0 |
| wikicorpus-e-to-j_bleu_ja | 8.880658 | 10.132498 | 7.9465 | 9.250703 |
| wikicorpus-e-to-j_bert_score_ja_f1 | 0.803786 | 0.813366 | 0.776674 | 0.802655 |
| wikicorpus-e-to-j_comet_wmt22 | 0.829614 | 0.842425 | 0.785652 | 0.835889 |
| xlsum_ja_ool | 0 | 0 | 0 | 0 |
| xlsum_ja_bert_score_ja_f1 | 0.700242 | 0.730625 | 0.584866 | 0.586902 |
| xlsum_ja_bleu_ja | 2.645953 | 3.179444 | 0.255531 | 0.46678 |
| xlsum_ja_rouge1 | 30.030526 | 39.369007 | 3.845819 | 4.366977 |
| xlsum_ja_rouge2 | 11.491074 | 13.99904 | 1.275326 | 1.549355 |
| xlsum_ja_rougeLsum | 25.426821 | 31.77734 | 3.266381 | 3.471434 |
| xlsum_ja_rouge2_scaling | 0.115121 | 0.140116 | 0.012714 | 0.015445 |
| hle_ool | 0.561437 | 0.686673 | 0.672968 | 0.50189 |
| hle_hle_exact_match | 0.021739 | 0.019376 | 0.016068 | 0.025047 |
| hle_char_f1 | 0.096895 | 0.076361 | 0.066701 | 0.112944 |
| jhle_ool | 0.033557 | 0.017897 | 0.310962 | 0.120805 |
| jhle_hle_exact_match | 0.053691 | 0.049217 | 0.038031 | 0.04698 |
| jhle_char_f1 | 0.208881 | 0.196734 | 0.142617 | 0.194251 |
| NLI | 0.78841 | 0.764161 | 0.82878 | 0.845578 |
| QA | 0.470786 | 0.575335 | 0.420218 | 0.59507 |
| RC | 0.801441 | 0.731878 | 0.780955 | 0.803917 |
| CR | 0.855031 | 0.845865 | 0.812072 | 0.874341 |
| HE-JA | 0.447455 | 0.408597 | 0.460158 | 0.537192 |
| HE-EN | 0.495375 | 0.465445 | 0.501972 | 0.564887 |
| EL | 0.578173 | 0.553882 | 0.537822 | 0.559135 |
| FA | 0.308954 | 0.304416 | 0.1354 | 0.225333 |
| MR | 0.067257 | 0.183219 | 0.44019 | 0.448519 |
| MT | 0.846555 | 0.846137 | 0.822628 | 0.848345 |
| CG | 0.799764 | 0.804726 | 0.891942 | 0.936197 |
| SUM | 0.115121 | 0.140116 | 0.012714 | 0.015445 |
| IF | 0.466083 | 0.391609 | 0.429099 | 0.514965 |
| BBH | 0.395638 | 0.541622 | 0.737368 | 0.733528 |
| AVG | 0.531146 | 0.539786 | 0.557951 | 0.607318 |
| architecture | Qwen3ForCausalLM | Gemma3ForConditionalGeneration | GptOssForCausalLM | GptOssForCausalLM |
| license | apache-2.0 | gemma | apache-2.0 | apache-2.0 |
| params | 32.762 | 27.432 | 21.512 | 120.412 |
| likes | 606 | 1,775 | 4,133 | 4,299 |
| num_few_shot | 4 | 4 | 4 | 4 |
| apply_chat_template | true | true | true | true |
| enable_thinking | false | false | true | true |
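
To work with these rows programmatically, the sketch below loads the leaderboard into pandas and ranks the models by their overall `AVG` score. It is a minimal sketch only: it assumes the four rows have been exported to a local `leaderboard.csv` file (a hypothetical file name, since the dataset's actual files are not listed here) with one row per model and column names matching the table above.

```python
# Minimal sketch, assuming the rows above are available locally as
# "leaderboard.csv" (hypothetical file name), one row per model,
# columns named as in the table (model, params, AVG, MT, CG, SUM, BBH, ...).
import pandas as pd

df = pd.read_csv("leaderboard.csv")

# Rank models by the overall average score and show a few headline columns.
summary = (
    df[["model", "params", "AVG", "MT", "CG", "SUM", "BBH"]]
    .sort_values("AVG", ascending=False)
    .reset_index(drop=True)
)
print(summary.to_string(index=False))
```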