Llama-3.1-Nemotron-70B-Instruct

llmarena.ru, created at 21.03.2025 16:53

Overall result: 0.525
Place in the rating: 83

Ratings for leaderboard tasks

Task name | Result | Metric
LCS | 0.168 | Accuracy
RCB | 0.58 / 0.528 | Accuracy / F1 macro
USE | 0.314 | Grade norm
RWSD | 0.523 | Accuracy
PARus | 0.93 | Accuracy
ruTiE | 0.818 | Accuracy
MultiQ | 0.615 / 0.444 | F1 / Exact match
CheGeKa | 0.351 / 0.255 | F1 / Exact match
ruModAr | 0.447 | Exact match
MaMuRAMu | 0.813 | Accuracy
ruMultiAr | 0.314 | Exact match
ruCodeEval | 0 / 0 / 0 | Pass@k
MathLogicQA | 0.478 | Accuracy
ruWorldTree | 0.975 / 0.781 | Accuracy / F1 macro
ruOpenBookQA | 0.89 / 0.724 | Accuracy / F1 macro

Evaluation on open tasks:

Task name | Result | Metric
BPS | 0.925 | Accuracy
ruMMLU | 0.756 | Accuracy
SimpleAr | 0.98 | Exact match
ruHumanEval | 0 / 0 / 0 | Pass@k
ruHHH | 0.82 |
ruHateSpeech | 0.785 |
ruDetox | 0.181 |

ruEthics | Correct | Good | Ethical
Virtue | 0.227 | 0.258 | 0.256
Law | 0.218 | 0.239 | 0.235
Moral | 0.248 | 0.269 | 0.267
Justice | 0.209 | 0.218 | 0.225
Utilitarianism | 0.199 | 0.231 | 0.222
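The ruCodeEval and ruHumanEval rows each report three Pass@k values. For reference, below is a minimal sketch of the standard unbiased pass@k estimator (Chen et al., 2021) that such scores are typically computed with; the specific k values and per-task sample counts shown in the example are assumptions, not values taken from this submission.

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k: probability that at least one of k drawn samples passes,
    given n generated samples per task of which c pass the unit tests."""
    if n - c < k:  # every size-k draw must contain at least one passing sample
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# With no passing samples (c = 0), pass@k is 0 for every k, which is consistent
# with the 0 / 0 / 0 rows above (k values of 1, 5, 10 are assumed here).
print(pass_at_k(10, 0, 1), pass_at_k(10, 0, 5), pass_at_k(10, 0, 10))
```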

Information about the submission:

MERA version: v.1.2.0
Torch version: 2.5.1
Codebase version: 30667dc
CUDA version: 12.4
Precision of the model weights: auto
Seed: 1234
Batch size: 1
Transformers version: 4.49.0
Number of GPUs and their type: 0
Architecture: local-chat-completions
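
The software versions above can be cross-checked in Python. A small sketch, assuming the same environment is installed locally; the expected values in the comments are the ones reported for this submission.

```python
# Environment cross-check against the submission metadata above.
import torch
import transformers

print(torch.__version__)           # reported: 2.5.1
print(torch.version.cuda)          # reported: 12.4
print(transformers.__version__)    # reported: 4.49.0
print(torch.cuda.device_count())   # reported: 0 GPUs (inference ran through a chat-completions endpoint)
```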

Team:

llmarena.ru

Name of the ML model:

Llama-3.1-Nemotron-70B-Instruct

Model size:

70.0B

Model type:

API

Open

SFT

Additional links:

https://huggingface.co/nvidia/Llama-3.1-Nemotron-70B-Instruct

Description of the training:

This model was trained with RLHF (specifically, REINFORCE), using Llama-3.1-Nemotron-70B-Reward as the reward model and HelpSteer2-Preference prompts, starting from Llama-3.1-70B-Instruct as the initial policy.
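
To make the recipe concrete, here is a minimal, illustrative sketch of a single REINFORCE policy-gradient update of the kind described above. This is not NVIDIA's training code: reward_fn stands in for scoring with Llama-3.1-Nemotron-70B-Reward, the prompt is assumed to come from HelpSteer2-Preference, and the optimizer, baseline, and generation settings are placeholder assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Initial policy per the description above; loading a 70B model like this is
# only schematic (real training shards the model across many GPUs).
policy_name = "meta-llama/Llama-3.1-70B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(policy_name)
policy = AutoModelForCausalLM.from_pretrained(policy_name)
optimizer = torch.optim.AdamW(policy.parameters(), lr=1e-6)  # placeholder hyperparameters

def reinforce_step(prompt: str, reward_fn, baseline: float = 0.0) -> float:
    """One REINFORCE update: sample a response, score it with a reward model
    (standing in for Llama-3.1-Nemotron-70B-Reward), and scale the log-probability
    of the sampled tokens by (reward - baseline)."""
    inputs = tokenizer(prompt, return_tensors="pt")
    prompt_len = inputs["input_ids"].shape[1]

    with torch.no_grad():  # sampling itself needs no gradients
        generated = policy.generate(**inputs, do_sample=True, max_new_tokens=256)

    response_text = tokenizer.decode(generated[0, prompt_len:], skip_special_tokens=True)
    reward = reward_fn(prompt, response_text)  # scalar score from the reward model

    # Log-probabilities of the sampled response under the current policy.
    logits = policy(generated).logits[:, :-1, :]
    logprobs = torch.log_softmax(logits, dim=-1)
    token_logprobs = logprobs.gather(-1, generated[:, 1:].unsqueeze(-1)).squeeze(-1)
    response_logprob = token_logprobs[:, prompt_len - 1:].sum()

    loss = -(reward - baseline) * response_logprob  # REINFORCE objective
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return reward
```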

License:

Llama 3.1 Community License Agreement

Inference parameters

Generation Parameters:
rucodeeval: do_sample=true; temperature=0.6; until=["\nclass", "\ndef", "\n#", "\nif", "\nprint"]
ruhumaneval: do_sample=true; temperature=0.6; until=["\nclass", "\ndef", "\n#", "\nif", "\nprint"]

System prompt:
Solve the task according to the instruction below. Do not give any explanations or clarifications for your answer. Do not write anything extra. Write only what the instruction asks for. If the instruction asks you to solve an arithmetic problem, write only the numerical answer, without the solution steps or explanations. If the instruction asks you to output a letter, a digit, or a word, output only that. If the instruction asks you to choose one of the answer options and output the letter or digit that corresponds to it, output only that letter or digit, give no explanations, add no punctuation, only 1 character in the answer. If the instruction asks you to complete the code of a Python function, write the code right away, keeping the indentation as if you were continuing the function from the instruction, give no explanations, write no comments, use only the arguments from the function signature in the instruction, and do not try to read data via the input function. Do not apologize, do not start a dialogue. Output only the answer and nothing else.
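
Below is a minimal sketch of how this system prompt and the generation parameters above could be sent to the model through an OpenAI-compatible endpoint, consistent with the local-chat-completions architecture reported for this submission. The base_url, API key, served model name, and the example user task are placeholders, and some servers cap the number of stop sequences, so this is an assumption-laden illustration rather than the harness code actually used for the run.

```python
from openai import OpenAI

# Placeholder endpoint for a locally served OpenAI-compatible model.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

SYSTEM_PROMPT = "..."  # the full system prompt quoted above

response = client.chat.completions.create(
    model="nvidia/Llama-3.1-Nemotron-70B-Instruct",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "def add(a: int, b: int) -> int:\n    "},  # placeholder code task
    ],
    temperature=0.6,                                       # do_sample=true, temperature=0.6
    stop=["\nclass", "\ndef", "\n#", "\nif", "\nprint"],   # "until" stop sequences for the code tasks
)
print(response.choices[0].message.content)
```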

Ratings by subcategory

USE (metric: Grade Norm)
Subcategory: 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 8_0 8_1 8_2 8_3 8_4
Llama-3.1-Nemotron-70B-Instruct, llmarena.ru: 0.433 0.333 0.767 0.3 0.1 0.533 0.1 - 0.067 0.033 0.133 0.033 0.233 0.133 0.033 0.483 0 0.067 0.1 0.1 0.1 0.633 0.233 0.267 0.233 0.625 0.2 0.5 0.467 0.467 0.6
ruHHH
Subcategory: Honest Helpful Harmless
Llama-3.1-Nemotron-70B-Instruct, llmarena.ru: 0.787 0.78 0.897
ruMMLU
Subcategory: Anatomy Virology Astronomy Marketing Nutrition Sociology Management Philosophy Prehistory Human aging Econometrics Formal logic Global facts Jurisprudence Miscellaneous Moral disputes Business ethics Biology (college) Physics (college) Human Sexuality Moral scenarios World religions Abstract algebra Medicine (college) Machine learning Medical genetics Professional law PR Security studies Chemistry (college) Computer security International law Logical fallacies Politics Clinical knowledge Conceptual physics Math (college) Biology (high school) Physics (high school) Chemistry (high school) Geography (high school) Professional medicine Electrical engineering Elementary mathematics Psychology (high school) Statistics (high school) History (high school) Math (high school) Professional accounting Professional psychology Computer science (college) World history (high school) Macroeconomics Microeconomics Computer science (high school) European history Government and politics
Llama-3.1-Nemotron-70B-Instruct, llmarena.ru: 0.704 0.548 0.849 0.88 0.853 0.886 0.806 0.817 0.852 0.758 0.614 0.556 0.54 0.815 0.852 0.76 0.78 0.903 0.567 0.84 0.74 0.848 0.45 0.723 0.571 0.88 0.623 0.685 0.776 0.51 0.78 0.934 0.767 0.929 0.77 0.799 0.51 0.903 0.609 0.734 0.864 0.842 0.724 0.66 0.882 0.699 0.907 0.47 0.525 0.752 0.63 0.886 0.813 0.853 0.87 0.861 0.943
ruDetox
Subcategory: SIM FL STA
Llama-3.1-Nemotron-70B-Instruct, llmarena.ru: 0.412 0.606 0.736
MaMuRAMu
Subcategory: Anatomy Virology Astronomy Marketing Nutrition Sociology Management Philosophy Pre-History Gerontology Econometrics Formal logic Global facts Jurisprudence Miscellaneous Moral disputes Business ethics Biology (college) Physics (college) Human sexuality Moral scenarios World religions Abstract algebra Medicine (college) Machine Learning Genetics Professional law PR Security Chemistry (college) Computer security International law Logical fallacies Politics Clinical knowledge Conceptual physics Math (college) Biology (high school) Physics (high school) Chemistry (high school) Geography (high school) Professional medicine Electrical Engineering Elementary mathematics Psychology (high school) Statistics (high school) History (high school) Math (high school) Professional Accounting Professional psychology Computer science (college) World history (high school) Macroeconomics Microeconomics Computer science (high school) European history Government and politics
Llama-3.1-Nemotron-70B-Instruct, llmarena.ru: 0.689 0.891 0.7 0.648 0.882 0.828 0.69 0.737 0.904 0.754 0.833 0.758 0.65 0.783 0.819 0.728 0.682 0.822 0.772 0.86 0.877 0.864 0.867 0.864 0.822 0.864 0.821 0.789 0.895 0.822 0.844 0.859 0.83 0.912 0.727 0.875 0.822 0.822 0.702 0.8 0.91 0.841 0.822 0.867 0.897 0.889 0.897 0.886 0.754 0.93 0.889 0.884 0.835 0.74 0.651 0.76 0.9
ruEthics ("Correct" criterion)
Subcategory: Virtue Law Moral Justice Utilitarianism
Llama-3.1-Nemotron-70B-Instruct, llmarena.ru: 0.227 0.218 0.248 0.209 0.199
ruEthics ("Good" criterion)
Subcategory: Virtue Law Moral Justice Utilitarianism
Llama-3.1-Nemotron-70B-Instruct, llmarena.ru: 0.258 0.239 0.269 0.218 0.231
ruEthics ("Ethical" criterion)
Subcategory: Virtue Law Moral Justice Utilitarianism
Llama-3.1-Nemotron-70B-Instruct, llmarena.ru: 0.256 0.235 0.267 0.225 0.222
ruHateSpeech
Target group: Women Men LGBT Nationalities Migrants Other
Llama-3.1-Nemotron-70B-Instruct, llmarena.ru: 0.815 0.657 0.706 0.784 0.714 0.836