Qwen2.5-Omni-7B

MERA · Created at 22.01.2026 05:10

Ratings for leaderboard tasks

Board   Result  Attempted score  Coverage  Place in the rating
Multi   0.317   0.317            1         3
Images  0.226   0.226            1         12
Audio   0.474   0.474            1         3
Video   0.442   0.442            1         9

Tasks

Task Modality Result Metric
0.455  EM / JudgeScore
0.273  EM / F1
0.612  EM / JudgeScore
0.216  EM / JudgeScore
0.23  EM / JudgeScore
0.625  EM / JudgeScore
0.146  EM / JudgeScore
0.055  EM / JudgeScore
0.524  EM / JudgeScore
0.379  EM / JudgeScore
0.187  EM / JudgeScore
0.316  EM / JudgeScore
0.385  EM / JudgeScore
0.378  EM / JudgeScore
0.485  EM / JudgeScore
0.094  EM / JudgeScore
    culture 0.057 / 0.114
    business 0.076 / 0.152
    medicine 0.059 / 0.114
    social_sciences 0.092 / 0.175
    fundamental_sciences 0.056 / 0.101
    applied_sciences 0.074 / 0.155
0.183  EM / JudgeScore
    biology 0.136 / 0.272
    chemistry 0.109 / 0.237
    physics 0.18 / 0.331
    economics 0.108 / 0.2
    ru 0.077 / 0.159
    all 0.132 / 0.245
0.158  EM / JudgeScore
    biology 0.053 / 0.14
    chemistry 0.045 / 0.179
    physics 0.101 / 0.247
    science 0.171 / 0.317
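Most tasks above report EM (exact match) alongside JudgeScore. As a minimal sketch of how a corpus-level EM is typically computed (MERA's official normalization rules may differ; the toy data below is not from this run):

```python
def exact_match(prediction: str, reference: str) -> int:
    """1 if the prediction matches the reference after simple normalization, else 0."""
    norm = lambda s: " ".join(s.strip().lower().split())
    return int(norm(prediction) == norm(reference))

# Corpus-level EM is the mean over examples (toy data for illustration):
preds = ["Москва", " москва ", "Санкт-Петербург"]
refs = ["москва", "москва", "москва"]
em = sum(exact_match(p, r) for p, r in zip(preds, refs)) / len(refs)
print(round(em, 3))  # 0.667
```

JudgeScore, by contrast, is presumably assigned by an LLM judge, so it cannot be reproduced from a closed-form rule like this.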

Information about the submission

MERA version: v1.0.0
Torch version: 2.8.0
Codebase version: 7e640aa
CUDA version: 12.8
Model weights precision: bfloat16
Seed: 1234
Batch size: 1
Transformers version: 4.57.1
GPUs: 1 x NVIDIA A100-SXM4-80GB
Architecture: openai-chat-completions
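The "openai-chat-completions" architecture means the model was queried through an OpenAI-compatible chat-completions interface. As an illustrative sketch only (the prompt and the specific parameter values are placeholders mirroring the run settings, not captured traffic), a single greedy request body would look like:

```python
import json

# Illustrative payload for an OpenAI-compatible /v1/chat/completions request;
# the prompt and max_tokens value are placeholders chosen to mirror the
# generation settings listed below (greedy decoding, stop on a blank line).
payload = {
    "model": "Qwen2.5-Omni-7B",
    "messages": [{"role": "user", "content": "<task prompt goes here>"}],
    "temperature": 0,   # do_sample=false / temperature=0 in the run settings
    "stop": ["\n\n"],   # corresponds to until=["\n\n"]
    "max_tokens": 256,  # corresponds to max_gen_toks=256
}
body = json.dumps(payload, ensure_ascii=False)
```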

Team:

MERA

Model name:

Qwen2.5-Omni-7B

Model size:

7.0B

Model type:

Open

SFT

Inference parameters

Generation parameters (per task):
ruslun - until=["\n\n"];do_sample=false;temperature=0;
ruenvaqa - until=["\n\n"];do_sample=false;temperature=0;
aquaria - until=["\n\n"];do_sample=false;temperature=0;
realvideoqa - until=["\n\n"];do_sample=false;temperature=0;
ruhhh_video - until=["\n\n"];do_sample=false;temperature=0;
commonvideoqa - until=["\n\n"];do_sample=false;temperature=0;
runaturalsciencevqa_biology - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=64;
runaturalsciencevqa_chemistry - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=64;
runaturalsciencevqa_earth_science - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=64;
runaturalsciencevqa_physics - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=64;
labtabvqa - until=["\n\n"];do_sample=false;temperature=0;
ruhhh_image - until=["\n\n"];do_sample=false;temperature=0;
rumathvqa - until=["\n\n"];do_sample=false;temperature=0;
weird - until=["\n\n"];do_sample=false;temperature=0;
realvqa - until=["\n\n"];do_sample=false;temperature=0;
ruclevr - until=["\n\n"];do_sample=false;temperature=0;
rucommonvqa - until=["\n\n"];do_sample=false;temperature=0;
schoolsciencevqa_biology - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
schoolsciencevqa_chemistry - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
schoolsciencevqa_earth_science - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
schoolsciencevqa_economics - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
schoolsciencevqa_history_all - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
schoolsciencevqa_history_ru - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
schoolsciencevqa_physics - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
unisciencevqa_applied_sciences - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
unisciencevqa_business - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
unisciencevqa_cultural_studies - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
unisciencevqa_fundamental_sciences - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
unisciencevqa_health_and_medicine - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
unisciencevqa_social_sciences - until=["<|endoftext|>"];temperature=0;do_sample=false;max_gen_toks=256;
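Each per-task string above is a ";"-separated list of key=value settings whose values happen to be valid JSON literals. A hypothetical helper (this parser is illustrative, not part of MERA's codebase) to turn one of them into a Python dict:

```python
import json

def parse_gen_params(spec: str) -> dict:
    """Parse a ';'-separated 'key=value' generation-parameter string."""
    params = {}
    for pair in filter(None, (p.strip() for p in spec.split(";"))):
        key, _, value = pair.partition("=")
        try:
            params[key] = json.loads(value)  # handles lists, booleans, numbers
        except json.JSONDecodeError:
            params[key] = value              # fall back to the raw string
    return params

spec = 'until=["\\n\\n"];do_sample=false;temperature=0'
print(parse_gen_params(spec))
# {'until': ['\n\n'], 'do_sample': False, 'temperature': 0}
```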

Context size: 32768