Task name | Result | Metric |
---|---|---|
LCS | 0.142 | Accuracy |
RCB | 0.521 / 0.424 | Accuracy / F1 macro |
USE | 0.018 | Grade norm |
RWSD | 0.569 | Accuracy |
PARus | 0.744 | Accuracy |
ruTiE | 0.614 | Accuracy |
MultiQ | 0.261 / 0.161 | F1 / Exact match |
CheGeKa | 0.035 / 0 | F1 / Exact match |
ruModAr | 0.59 | Exact match |
ruMultiAr | 0.254 | Exact match |
MathLogicQA | 0.373 | Accuracy |
ruWorldTree | 0.844 / 0.844 | Accuracy / F1 macro |
ruOpenBookQA | 0.795 / 0.795 | Accuracy / F1 macro |
Task name | Result | Metric |
---|---|---|
BPS | 0.336 | Accuracy |
ruMMLU | 0.712 | Accuracy |
SimpleAr | 0.955 | Exact match |
ruHumanEval | 0.01 / 0.052 / 0.104 | Pass@k |
ruHHH | 0.663 | Accuracy |
ruHateSpeech | 0.725 | Accuracy |
ruDetox | 0.138 | Overall average score (J) |
ruEthics | see the correlation tables below | MCC |
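The ruHumanEval row reports three Pass@k values (presumably for increasing k). For reference, a minimal sketch of the standard unbiased pass@k estimator used with HumanEval-style code benchmarks, assuming n samples are generated per task and c of them pass the unit tests — illustrative only, not MERA's scoring code:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: probability that at least one of k
    samples, drawn without replacement from n generations of which
    c are correct, passes the unit tests."""
    if n - c < k:
        return 1.0  # fewer than k incorrect samples, so some draw must pass
    return 1.0 - comb(n - c, k) / comb(n, k)

# With 10 generations per task and 1 passing sample,
# pass@1 is 1/10 and pass@10 is 1.
```

The per-task values are averaged over the benchmark to give the figures in the table.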
Model, team | Honest | Helpful | Harmless |
---|---|---|---|
lightblue/suzume-llama-3-8B-multilingual, BODBE LLM | 0.59 | 0.593 | 0.81 |
Model, team | Anatomy | Virology | Astronomy | Marketing | Nutrition | Sociology | Management | Philosophy | Prehistory | Human aging | Econometrics | Formal logic | Global facts | Jurisprudence | Miscellaneous | Moral disputes | Business ethics | Biology (college) | Physics (college) | Human sexuality | Moral scenarios | World religions | Abstract algebra | Medicine (college) | Machine learning | Medical genetics | Professional law | PR | Security studies | Chemistry (school) | Computer security | International law | Logical fallacies | Politics | Clinical knowledge | Conceptual physics | Math (college) | Biology (high school) | Physics (high school) | Chemistry (high school) | Geography (high school) | Professional medicine | Electrical engineering | Elementary mathematics | Psychology (high school) | Statistics (high school) | History (high school) | Math (high school) | Professional accounting | Professional psychology | Computer science (college) | World history (high school) | Macroeconomics | Microeconomics | Computer science (high school) | European history | Government and politics |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
lightblue/suzume-llama-3-8B-multilingual, BODBE LLM | 0.7 | 0.875 | 0.9 | 0.571 | 0.762 | 0.8 | 0.8 | 0.765 | 0.9 | 0.9 | 0.636 | 0.8 | 0.6 | 0.692 | 0.455 | 0.4 | 0.8 | 0.889 | 0.4 | 0.9 | 0.4 | 0.731 | 0.8 | 0.725 | 0.6 | 0.636 | 0.75 | 0.786 | 0.9 | 0.636 | 0.4 | 0.778 | 0.8 | 0.8 | 0.818 | 0.9 | 0.8 | 0.857 | 0.7 | 0.6 | 0.785 | 0.7 | 0.8 | 0.7 | 0.813 | 0.4 | 0.9 | 0.7 | 0.5 | 0.7 | 0.5 | 0.813 | 0.853 | 1 | 0.417 | 0.394 | 0.704 |
Model, team | SIM (meaning preservation) | FL (fluency) | STA (style transfer accuracy) |
---|---|---|---|
lightblue/suzume-llama-3-8B-multilingual, BODBE LLM | 0.513 | 0.701 | 0.311 |
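In detoxification evaluation, the overall J score is typically the per-sample product SIM·FL·STA averaged over the dataset, which is why the aggregate need not equal the product of the three column means. A minimal sketch under that assumption (the helper name is mine; the exact MERA scoring code may differ):

```python
def joint_score(sim, fl, sta):
    """Assumed aggregation for the detox J score:
    J = mean over samples of SIM_i * FL_i * STA_i."""
    assert len(sim) == len(fl) == len(sta) and sim
    return sum(s * f, ) if False else sum(
        s * f * t for s, f, t in zip(sim, fl, sta)
    ) / len(sim)
```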
Model, team | Virtue | Law | Moral | Justice | Utilitarianism |
---|---|---|---|---|---|
lightblue/suzume-llama-3-8B-multilingual, BODBE LLM | -0.287 | -0.309 | -0.283 | -0.261 | -0.227 |

Model, team | Virtue | Law | Moral | Justice | Utilitarianism |
---|---|---|---|---|---|
lightblue/suzume-llama-3-8B-multilingual, BODBE LLM | -0.327 | -0.314 | -0.337 | -0.273 | -0.25 |

Model, team | Virtue | Law | Moral | Justice | Utilitarianism |
---|---|---|---|---|---|
lightblue/suzume-llama-3-8B-multilingual, BODBE LLM | -0.338 | -0.318 | -0.356 | -0.305 | -0.274 |
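The ruEthics tables above report per-criterion correlations between the model's answers and each ethical category (negative values indicate anti-correlation); MERA computes these as Matthews correlation coefficients. A minimal stdlib sketch of MCC for binary labels — illustrative only, not the official scoring code:

```python
from math import sqrt

def mcc(y_true, y_pred):
    """Matthews correlation coefficient for binary (0/1) labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Conventionally 0 when any marginal is empty (undefined correlation).
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom
```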
Model, team | Women | Men | LGBT | Nationalities | Migrants | Other |
---|---|---|---|---|---|---|
lightblue/suzume-llama-3-8B-multilingual, BODBE LLM | 0.759 | 0.743 | 0.588 | 0.595 | 0.429 | 0.803 |