Task name | Result | Metric |
---|---|---|
LCS | 0.132 | Accuracy |
RCB | 0.331 / 0.194 | Accuracy / F1 macro |
USE | 0.025 | Grade norm |
RWSD | 0.523 | Accuracy |
PARus | 0.504 | Accuracy |
ruTiE | 0.488 | Accuracy |
MultiQ | 0.115 / 0.036 | F1 / Exact match |
CheGeKa | 0.037 / 0 | F1 / Exact match |
ruModAr | 0.001 | Exact match |
ruMultiAr | 0.025 | Exact match |
MathLogicQA | 0.258 | Accuracy |
ruWorldTree | 0.246 / 0.22 | Accuracy / F1 macro |
ruOpenBookQA | 0.223 / 0.208 | Accuracy / F1 macro |
Task name | Result | Metric |
---|---|---|
BPS | 0.492 | Accuracy |
ruMMLU | 0.246 | Accuracy |
SimpleAr | 0.029 | Exact match |
ruHumanEval | 0.001 / 0.003 / 0.006 | Pass@k |
ruHHH | 0.472 | Accuracy |
ruHateSpeech | 0.543 | Accuracy |
ruDetox | 0.286 | Overall average score (J) |
ruEthics | see correlation tables below | 5 MCC |
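The three ruHumanEval numbers are reported under a single Pass@k metric; the leaderboard does not spell out the k values here, but they presumably correspond to pass@1 / pass@5 / pass@10. Below is a minimal sketch of the unbiased pass@k estimator commonly used for HumanEval-style benchmarks (Chen et al., 2021); the function name and the example numbers are illustrative, not taken from the MERA codebase.

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: probability that at least one of k
    samples, drawn without replacement from n generations of which c
    are correct, passes the unit tests."""
    if n - c < k:
        return 1.0  # every possible k-subset contains a correct sample
    return 1.0 - comb(n - c, k) / comb(n, k)

# Example: 10 generations per task, 1 of them correct
print(round(pass_at_k(10, 1, 1), 3))  # 0.1
print(round(pass_at_k(10, 1, 5), 3))  # 0.5
```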
ruHHH — detailed results by criterion:

Model, team | Honest | Helpful | Harmless |
---|---|---|---|
ruGPT-3.5 13B, MERA | 0.475 | 0.475 | 0.466 |
ruMMLU — accuracy by subject:

Model, team | Anatomy | Virology | Astronomy | Marketing | Nutrition | Sociology | Management | Philosophy | Prehistory | Human aging | Econometrics | Formal logic | Global facts | Jurisprudence | Miscellaneous | Moral disputes | Business ethics | Biology (college) | Physics (college) | Human Sexuality | Moral scenarios | World religions | Abstract algebra | Medicine (college) | Machine learning | Medical genetics | Professional law | PR | Security studies | Chemistry (school) | Computer security | International law | Logical fallacies | Politics | Clinical knowledge | Conceptual physics | Math (college) | Biology (high school) | Physics (high school) | Chemistry (high school) | Geography (high school) | Professional medicine | Electrical engineering | Elementary mathematics | Psychology (high school) | Statistics (high school) | History (high school) | Math (high school) | Professional accounting | Professional psychology | Computer science (college) | World history (high school) | Macroeconomics | Microeconomics | Computer science (high school) | European history | Government and politics |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ruGPT-3.5 13B, MERA | 0.2 | 0.25 | 0.3 | 0.343 | 0.238 | 0 | 0.267 | 0.294 | 0.2 | 0.4 | 0.455 | 0.3 | 0.1 | 0.269 | 0.136 | 0.1 | 0.1 | 0.259 | 0.4 | 0.2 | 0.2 | 0.269 | 0.1 | 0.333 | 0.2 | 0.455 | 0.313 | 0.214 | 0.1 | 0.182 | 0.2 | 0.333 | 0.1 | 0.3 | 0.182 | 0.2 | 0.4 | 0.381 | 0.3 | 0.2 | 0.203 | 0.4 | 0.3 | 0.2 | 0.25 | 0.2 | 0.4 | 0 | 0.1 | 0.1 | 0.318 | 0.313 | 0.176 | 0.333 | 0.208 | 0.212 | 0.148 |
ruDetox — detailed results:

Model, team | SIM | FL | STA |
---|---|---|---|
ruGPT-3.5 13B, MERA | 0.562 | 0.704 | 0.678 |
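The ruDetox columns are the standard detoxification criteria: meaning preservation (SIM), fluency (FL) and style transfer accuracy (STA). Assuming ruDetox follows the RUSSE-2022 detoxification evaluation, the overall score J in the summary table is the per-sample mean of their product, which is why it is not simply the product of the column averages (0.562 · 0.704 · 0.678 ≈ 0.268 vs. the reported 0.286):

$$ J = \frac{1}{n}\sum_{i=1}^{n} \mathrm{STA}(x_i)\cdot \mathrm{SIM}(x_i)\cdot \mathrm{FL}(x_i) $$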
ruEthics — correlations between the model's answers and the five ethical criteria (one table per target question):

Model, team | Virtue | Law | Moral | Justice | Utilitarianism |
---|---|---|---|---|---|
ruGPT-3.5 13B, MERA | -0.036 | -0.023 | -0.025 | -0.017 | -0.016 |

Model, team | Virtue | Law | Moral | Justice | Utilitarianism |
---|---|---|---|---|---|
ruGPT-3.5 13B, MERA | 0.045 | 0.035 | 0.034 | 0.045 | 0.04 |

Model, team | Virtue | Law | Moral | Justice | Utilitarianism |
---|---|---|---|---|---|
ruGPT-3.5 13B, MERA | 0.034 | -0.021 | 0.029 | 0.049 | 0.067 |
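The ruEthics figures are correlation coefficients, which is why they can be negative; the metric listed for the task is the Matthews correlation coefficient (MCC). As a reference (a standard definition, not quoted from the MERA docs), for a binary answer against a binary criterion label it takes the usual confusion-matrix form and ranges over [-1, 1]:

$$ \mathrm{MCC} = \frac{TP\cdot TN - FP\cdot FN}{\sqrt{(TP+FP)(TP+FN)(TN+FP)(TN+FN)}} $$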
ruHateSpeech — accuracy by target group:

Model, team | Women | Men | LGBT | Nationalities | Migrants | Other |
---|---|---|---|---|---|---|
ruGPT-3.5 13B, MERA | 0.537 | 0.657 | 0.647 | 0.514 | 0.286 | 0.508 |