Norwegian NLU 🇳🇴

Last updated: 2 May 2024, 11:26:44 CET
Each row below is one model, with space-separated fields in this order:

  Model ID: the Hugging Face model ID. "(few-shot)" marks generative models evaluated few-shot; "(val)" marks evaluation on the validation split.
  Parameters: number of model parameters, in millions ("unknown" for closed models).
  Vocabulary size: in thousands of tokens.
  Context: maximum context length, in tokens.
  Commercial: whether the model may be used commercially.
  Speed: inference speed, reported as two "mean ± uncertainty" values (higher is better).
  Rank: the aggregate ScandEval rank (lower is better).
  NorNE-nb, NorNE-nn (named entity recognition): micro-F1 without the MISC category / micro-F1 with MISC.
  NoReC (sentiment classification): Matthews correlation coefficient / macro-F1.
  ScaLA-nb, ScaLA-nn (linguistic acceptability): Matthews correlation coefficient / macro-F1.
  NorQuAD (question answering): exact match / F1.
  Finally, six fields give the benchmark version used for each of the six tasks, in the same order.

All task scores are "mean ± uncertainty / mean ± uncertainty" pairs. A parsing sketch for this row format follows the table.
ltg/norbert3-large 354 50 508 True 5,048 ± 824 / 1,354 ± 429 1.12 93.12 ± 0.83 / 90.13 ± 1.02 89.39 ± 0.52 / 86.03 ± 0.65 64.62 ± 1.36 / 75.40 ± 0.97 77.97 ± 3.04 / 88.19 ± 1.89 76.30 ± 1.56 / 87.68 ± 0.86 66.03 ± 1.19 / 79.64 ± 1.09 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
NbAiLab/nb-bert-large 355 50 512 True 6,343 ± 1,236 / 1,444 ± 456 1.18 91.32 ± 0.90 / 87.09 ± 1.01 87.29 ± 0.74 / 83.06 ± 1.22 63.40 ± 1.38 / 74.60 ± 1.12 78.43 ± 1.47 / 88.75 ± 0.86 79.59 ± 1.72 / 89.56 ± 0.88 59.20 ± 1.13 / 73.80 ± 1.05 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
ltg/norbert3-base 124 50 508 True 11,405 ± 1,970 / 2,856 ± 917 1.32 92.36 ± 0.55 / 89.79 ± 0.73 88.49 ± 0.97 / 85.45 ± 1.21 59.73 ± 2.46 / 70.77 ± 3.26 74.40 ± 2.03 / 86.34 ± 1.28 68.85 ± 3.21 / 83.17 ± 2.09 57.67 ± 1.86 / 72.51 ± 1.52 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
AI-Sweden-Models/roberta-large-1160k 355 50 512 True 5,741 ± 987 / 1,554 ± 494 1.36 92.01 ± 0.98 / 92.36 ± 0.70 87.17 ± 1.24 / 88.75 ± 0.89 60.11 ± 2.96 / 70.55 ± 3.99 72.85 ± 5.60 / 85.73 ± 3.14 65.56 ± 1.91 / 82.23 ± 0.97 60.38 ± 0.95 / 75.42 ± 1.16 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
AI-Sweden-Models/roberta-large-1350k 355 50 512 True 5,744 ± 969 / 1,539 ± 492 1.37 92.49 ± 0.81 / 92.58 ± 0.61 87.22 ± 1.19 / 88.71 ± 1.02 58.77 ± 3.69 / 69.56 ± 4.55 76.30 ± 2.09 / 87.19 ± 1.40 64.11 ± 5.18 / 80.68 ± 3.72 60.69 ± 1.15 / 75.73 ± 1.06 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
gpt-4-0613 (few-shot, val) unknown 100 8192 True 597 ± 197 / 93 ± 33 1.49 81.16 ± 2.46 / 63.39 ± 4.07 75.75 ± 4.49 / 60.44 ± 5.46 72.72 ± 3.20 / 81.35 ± 2.22 77.30 ± 2.97 / 88.39 ± 1.60 57.18 ± 3.91 / 76.40 ± 2.66 47.50 ± 2.86 / 75.24 ± 1.32 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 12.9.0
KennethEnevoldsen/dfm-sentence-encoder-large-1 355 50 512 True 6,245 ± 1,260 / 1,416 ± 453 1.51 86.39 ± 1.06 / 85.20 ± 0.97 83.22 ± 1.44 / 82.50 ± 1.19 59.61 ± 1.28 / 71.40 ± 1.68 67.88 ± 7.37 / 82.75 ± 4.57 62.44 ± 4.39 / 80.43 ± 2.39 55.69 ± 1.60 / 70.77 ± 1.64 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
NbAiLab/nb-roberta-base-scandi-1e4 278 250 512 True 15,074 ± 2,990 / 3,347 ± 1,080 1.54 92.09 ± 0.51 / 89.67 ± 0.54 86.85 ± 1.94 / 83.35 ± 2.01 59.84 ± 1.40 / 72.11 ± 1.25 73.33 ± 2.17 / 85.89 ± 1.29 71.06 ± 1.62 / 84.78 ± 1.05 43.67 ± 1.71 / 57.42 ± 1.56 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
danish-foundation-models/encoder-large-v1 355 50 512 True 6,671 ± 1,380 / 1,497 ± 482 1.54 88.66 ± 1.23 / 84.60 ± 1.44 84.59 ± 1.98 / 80.07 ± 2.11 55.59 ± 10.43 / 67.82 ± 9.45 71.43 ± 1.67 / 84.61 ± 1.15 53.30 ± 13.11 / 74.52 ± 7.96 57.38 ± 1.41 / 72.48 ± 1.49 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
KennethEnevoldsen/dfm-sentence-encoder-large-2 355 50 512 True 6,569 ± 1,320 / 1,492 ± 476 1.55 86.78 ± 0.95 / 85.37 ± 0.78 83.28 ± 1.29 / 82.22 ± 1.30 58.73 ± 2.31 / 70.10 ± 2.66 70.73 ± 3.19 / 84.71 ± 1.82 59.58 ± 7.22 / 78.86 ± 3.51 56.04 ± 1.56 / 71.02 ± 1.42 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
NbAiLab/nb-roberta-base-scandi 278 250 512 True 15,079 ± 2,948 / 3,359 ± 1,091 1.56 92.24 ± 0.44 / 89.66 ± 0.60 87.58 ± 0.68 / 84.23 ± 0.85 59.98 ± 1.33 / 71.70 ± 1.69 70.18 ± 1.41 / 83.83 ± 0.91 70.81 ± 1.50 / 84.45 ± 0.95 44.27 ± 1.46 / 58.30 ± 1.82 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
google/rembert 576 250 256 True 11,736 ± 2,822 / 2,102 ± 677 1.57 88.70 ± 2.05 / 84.95 ± 2.83 86.11 ± 1.67 / 82.16 ± 1.96 54.19 ± 3.15 / 65.18 ± 4.55 69.83 ± 2.01 / 84.72 ± 1.10 54.84 ± 12.59 / 75.13 ± 9.44 58.18 ± 1.84 / 71.29 ± 1.51 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
meta-llama/Meta-Llama-3-70B (few-shot, val) 70554 128 8192 True 312 ± 55 / 177 ± 51 1.57 75.31 ± 3.84 / 64.90 ± 4.05 75.94 ± 4.62 / 62.81 ± 5.25 66.74 ± 4.50 / 74.89 ± 3.69 59.82 ± 3.52 / 79.17 ± 2.10 47.56 ± 3.52 / 71.91 ± 1.79 60.87 ± 4.82 / 82.30 ± 2.52 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
pere/roberta-base-exp-8 278 250 512 True 15,112 ± 2,969 / 3,347 ± 1,093 1.64 88.99 ± 0.96 / 86.13 ± 1.08 82.99 ± 1.53 / 78.66 ± 1.85 57.37 ± 1.31 / 70.45 ± 1.13 69.92 ± 2.01 / 83.51 ± 1.33 70.05 ± 1.76 / 84.13 ± 1.17 41.98 ± 3.70 / 55.88 ± 3.70 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
microsoft/mdeberta-v3-base 279 251 512 True 20,637 ± 3,925 / 4,497 ± 1,502 1.67 91.90 ± 0.54 / 89.55 ± 0.57 86.81 ± 1.35 / 83.46 ± 1.68 53.69 ± 4.28 / 63.69 ± 6.95 70.55 ± 1.64 / 84.79 ± 0.86 61.21 ± 1.20 / 79.87 ± 0.76 48.82 ± 1.20 / 63.72 ± 1.06 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
NbAiLab/nb-bert-base 178 120 512 True 14,050 ± 3,222 / 2,727 ± 886 1.68 93.01 ± 0.68 / 89.36 ± 0.86 88.43 ± 0.78 / 84.38 ± 0.81 60.84 ± 1.48 / 72.16 ± 1.38 73.89 ± 1.31 / 86.19 ± 0.93 72.10 ± 2.07 / 85.37 ± 1.33 33.01 ± 3.06 / 45.28 ± 3.64 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
pere/roberta-base-exp-32 278 250 512 True 15,081 ± 2,950 / 3,365 ± 1,092 1.69 91.66 ± 0.74 / 89.26 ± 0.97 87.74 ± 0.77 / 84.51 ± 1.03 57.43 ± 1.55 / 70.43 ± 1.41 63.31 ± 11.58 / 80.18 ± 5.59 62.79 ± 11.35 / 79.65 ± 6.48 41.05 ± 2.59 / 54.44 ± 3.33 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
KennethEnevoldsen/dfm-sentence-encoder-medium-3 178 120 512 True 14,050 ± 3,278 / 2,749 ± 894 1.70 91.17 ± 0.52 / 91.04 ± 0.58 87.30 ± 1.07 / 88.83 ± 0.84 59.10 ± 1.47 / 70.50 ± 2.06 74.32 ± 1.76 / 86.47 ± 1.18 72.94 ± 1.63 / 86.07 ± 0.99 34.06 ± 2.05 / 45.46 ± 2.65 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
sentence-transformers/use-cmlm-multilingual 471 501 512 True 30,231 ± 8,171 / 4,863 ± 1,598 1.70 90.08 ± 0.76 / 87.12 ± 1.08 86.04 ± 0.78 / 81.89 ± 0.98 56.35 ± 1.25 / 69.31 ± 1.02 59.38 ± 2.52 / 78.02 ± 1.71 46.54 ± 3.21 / 71.78 ± 2.00 55.05 ± 1.24 / 70.46 ± 1.22 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
vesteinn/ScandiBERT-no-faroese 124 50 512 True 15,436 ± 2,820 / 3,704 ± 1,187 1.70 91.09 ± 0.65 / 88.45 ± 0.70 85.72 ± 1.92 / 81.88 ± 2.16 50.90 ± 3.01 / 60.96 ± 5.41 69.34 ± 3.13 / 83.11 ± 2.17 66.24 ± 2.41 / 82.36 ± 1.49 48.45 ± 0.63 / 62.96 ± 0.80 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
intfloat/multilingual-e5-large 560 250 512 True 6,732 ± 1,273 / 1,633 ± 523 1.71 89.86 ± 0.93 / 90.64 ± 0.80 84.32 ± 1.08 / 86.52 ± 0.94 61.52 ± 1.87 / 72.72 ± 1.95 62.34 ± 2.48 / 79.94 ± 1.46 34.88 ± 11.23 / 66.51 ± 5.72 53.01 ± 1.05 / 70.46 ± 0.86 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
FacebookAI/xlm-roberta-large 560 250 512 True 17,897 ± 3,921 / 3,463 ± 1,141 1.73 91.66 ± 1.24 / 89.34 ± 1.62 86.19 ± 0.97 / 82.86 ± 1.29 50.25 ± 15.36 / 63.55 ± 13.05 55.51 ± 18.28 / 74.00 ± 13.38 43.89 ± 14.81 / 68.88 ± 10.32 57.57 ± 2.43 / 72.69 ± 2.22 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
pere/roberta-debug-8 278 250 512 True 15,103 ± 2,954 / 3,356 ± 1,090 1.73 91.16 ± 0.71 / 88.71 ± 0.94 84.75 ± 1.23 / 80.56 ± 1.59 55.25 ± 2.36 / 66.95 ± 2.79 68.03 ± 2.37 / 82.12 ± 1.69 66.90 ± 2.07 / 82.33 ± 1.34 41.65 ± 1.17 / 55.71 ± 1.73 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
vesteinn/FoBERT 124 50 512 True 15,623 ± 2,828 / 3,737 ± 1,191 1.73 90.65 ± 0.66 / 88.03 ± 0.77 84.88 ± 1.55 / 81.01 ± 1.95 52.44 ± 2.90 / 62.48 ± 4.62 68.77 ± 2.01 / 83.10 ± 1.42 65.40 ± 2.43 / 81.72 ± 1.68 43.13 ± 2.05 / 56.76 ± 2.21 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
ltg/norbert3-small 41 50 508 True 13,515 ± 2,514 / 3,042 ± 1,004 1.76 90.02 ± 0.72 / 86.99 ± 0.87 86.52 ± 1.17 / 83.03 ± 1.53 51.36 ± 3.24 / 61.35 ± 5.46 67.29 ± 2.13 / 82.23 ± 1.36 56.67 ± 2.29 / 76.74 ± 1.73 48.63 ± 1.31 / 63.28 ± 1.09 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
pere/roberta-debug-32 278 250 512 True 14,958 ± 2,903 / 3,331 ± 1,077 1.85 89.07 ± 1.19 / 85.81 ± 1.57 83.27 ± 1.68 / 78.80 ± 2.22 53.23 ± 1.67 / 65.23 ± 2.65 70.06 ± 2.33 / 83.61 ± 1.61 66.81 ± 1.83 / 82.19 ± 1.20 34.17 ± 2.16 / 47.61 ± 2.55 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
setu4993/LaBSE 471 501 512 True 25,418 ± 6,435 / 4,536 ± 1,452 1.86 90.58 ± 0.91 / 87.84 ± 1.00 85.21 ± 1.05 / 81.57 ± 1.30 54.26 ± 1.75 / 67.25 ± 1.47 59.44 ± 1.47 / 78.80 ± 1.00 49.30 ± 1.39 / 74.02 ± 1.04 46.42 ± 1.44 / 61.82 ± 1.47 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
abhishek/autotrain-llama3-oh-sft-v0-2 (few-shot) 70554 128 8192 False 2,668 ± 802 / 411 ± 135 1.94 80.97 ± 1.73 / 74.56 ± 3.82 79.69 ± 0.64 / 75.12 ± 2.70 63.91 ± 1.74 / 75.51 ± 1.63 39.82 ± 3.91 / 60.91 ± 4.26 26.86 ± 2.91 / 53.05 ± 3.88 47.06 ± 3.13 / 75.80 ± 2.01 12.9.1 12.9.1 12.9.1 12.9.1 12.9.1 12.9.1
pere/roberta-base-exp-32B 278 250 512 True 15,103 ± 2,982 / 3,357 ± 1,081 2.01 90.60 ± 1.16 / 87.83 ± 1.34 86.76 ± 0.82 / 83.48 ± 0.98 52.19 ± 2.87 / 65.60 ± 2.58 54.98 ± 14.19 / 75.67 ± 6.70 58.33 ± 10.94 / 77.88 ± 5.29 29.17 ± 1.36 / 41.75 ± 1.59 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
gpt-3.5-turbo-0613 (few-shot, val) unknown 100 4095 True 921 ± 293 / 113 ± 37 2.03 77.70 ± 2.64 / 68.71 ± 2.97 73.92 ± 2.53 / 67.96 ± 2.67 58.88 ± 3.23 / 71.00 ± 2.87 54.29 ± 4.27 / 73.02 ± 3.26 32.82 ± 3.43 / 56.05 ± 4.14 45.35 ± 2.97 / 73.47 ± 1.69 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 12.9.0
meta-llama/Meta-Llama-3-70B-Instruct (few-shot, val) 70554 128 8192 True 1,673 ± 583 / 275 ± 85 2.06 80.50 ± 2.85 / 76.71 ± 2.48 76.47 ± 3.13 / 73.94 ± 2.95 59.29 ± 5.92 / 69.99 ± 4.80 47.28 ± 3.57 / 69.23 ± 3.04 32.76 ± 3.80 / 60.66 ± 3.10 39.71 ± 2.59 / 71.60 ± 1.57 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
gpt-3.5-turbo-0613 (few-shot) unknown 100 4096 True 837 ± 294 / 126 ± 43 2.08 74.92 ± 1.24 / 64.00 ± 2.37 75.34 ± 1.15 / 68.02 ± 1.41 57.64 ± 1.33 / 71.34 ± 1.15 49.93 ± 1.78 / 69.26 ± 1.76 34.22 ± 2.98 / 57.61 ± 3.23 44.39 ± 1.06 / 71.71 ± 0.83 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
meta-llama/Llama-2-70b-hf (few-shot, val) 68977 32 4096 True 1,892 ± 650 / 318 ± 105 2.21 62.46 ± 4.62 / 60.77 ± 4.36 64.68 ± 3.04 / 61.69 ± 2.31 59.68 ± 3.30 / 70.62 ± 3.08 27.34 ± 12.20 / 50.42 ± 9.24 3.95 ± 4.66 / 36.08 ± 3.27 57.44 ± 4.59 / 78.69 ± 3.09 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
152334H/miqu-1-70b-sf (few-shot, val) 68977 32 32764 True 2,126 ± 676 / 319 ± 104 2.23 66.75 ± 2.07 / 58.61 ± 3.33 66.81 ± 2.57 / 57.71 ± 3.04 60.58 ± 4.96 / 70.33 ± 4.12 47.53 ± 4.07 / 72.24 ± 2.31 17.14 ± 4.72 / 51.14 ± 4.36 41.92 ± 3.36 / 72.51 ± 1.91 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
vesteinn/DanskBERT 124 50 512 True 15,749 ± 2,665 / 4,014 ± 1,281 2.24 86.82 ± 0.53 / 84.15 ± 0.59 79.91 ± 1.17 / 76.65 ± 1.46 47.84 ± 2.44 / 60.67 ± 3.47 51.99 ± 11.45 / 72.87 ± 8.55 30.57 ± 8.63 / 62.90 ± 7.32 36.75 ± 2.18 / 50.77 ± 2.11 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
AI-Sweden-Models/bert-large-nordic-pile-1M-steps 369 64 512 True 6,571 ± 1,331 / 1,493 ± 479 2.25 87.50 ± 0.53 / 87.45 ± 0.54 80.57 ± 1.41 / 82.71 ± 1.13 47.11 ± 2.05 / 60.70 ± 3.19 52.62 ± 3.99 / 75.01 ± 2.43 25.06 ± 6.94 / 60.75 ± 4.12 38.40 ± 2.57 / 50.51 ± 3.52 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
intfloat/multilingual-e5-base 278 250 512 True 14,965 ± 2,890 / 3,322 ± 1,074 2.27 88.26 ± 1.11 / 89.24 ± 0.95 81.37 ± 1.67 / 83.89 ± 1.34 54.61 ± 1.51 / 67.19 ± 1.65 50.35 ± 1.78 / 73.06 ± 1.46 22.15 ± 9.02 / 58.20 ± 5.38 31.77 ± 1.55 / 45.47 ± 2.02 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 12.6.1
microsoft/xlm-align-base 278 250 512 True 14,744 ± 2,870 / 3,265 ± 1,053 2.29 90.07 ± 1.08 / 87.56 ± 1.39 85.65 ± 0.96 / 82.40 ± 1.16 54.46 ± 1.16 / 68.25 ± 0.76 12.16 ± 5.91 / 50.55 ± 4.73 8.99 ± 2.25 / 48.57 ± 3.67 49.24 ± 1.30 / 64.35 ± 1.24 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
timpal0l/sol (few-shot) 10732 32 4096 False 3,701 ± 876 / 771 ± 247 2.29 65.14 ± 1.62 / 52.85 ± 3.03 65.88 ± 1.53 / 52.89 ± 2.29 57.06 ± 1.54 / 71.09 ± 1.17 26.41 ± 3.80 / 51.92 ± 5.30 19.58 ± 1.29 / 53.93 ± 2.90 51.60 ± 1.97 / 77.87 ± 1.44 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
cardiffnlp/twitter-xlm-roberta-base 278 250 512 True 34,475 ± 7,465 / 6,712 ± 2,223 2.31 87.70 ± 0.66 / 85.32 ± 0.94 81.41 ± 1.46 / 77.41 ± 1.57 48.34 ± 2.10 / 60.68 ± 3.61 55.30 ± 2.89 / 75.77 ± 1.87 37.46 ± 2.69 / 67.68 ± 1.66 24.49 ± 6.03 / 35.93 ± 8.56 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Geotrend/bert-base-25lang-cased 151 85 512 True 13,908 ± 3,201 / 2,700 ± 872 2.33 87.99 ± 1.24 / 84.84 ± 1.42 83.10 ± 1.12 / 79.18 ± 1.45 36.21 ± 1.82 / 49.48 ± 2.69 46.43 ± 1.81 / 71.65 ± 1.39 39.82 ± 2.81 / 68.68 ± 1.81 40.01 ± 1.58 / 53.12 ± 1.81 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
google-bert/bert-base-multilingual-uncased 167 106 512 True 13,993 ± 3,217 / 2,752 ± 893 2.34 82.90 ± 1.44 / 79.06 ± 1.52 77.33 ± 2.00 / 72.83 ± 1.96 37.28 ± 2.13 / 48.69 ± 3.26 49.41 ± 1.57 / 73.96 ± 0.87 43.58 ± 2.23 / 71.20 ± 1.61 40.35 ± 2.26 / 54.01 ± 2.42 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
jonfd/electra-small-nordic 22 96 128 True 5,989 ± 120 / 3,809 ± 1,230 2.36 84.95 ± 0.58 / 81.68 ± 0.65 79.57 ± 1.49 / 74.62 ± 1.92 40.15 ± 1.15 / 46.26 ± 0.47 72.87 ± 1.11 / 85.86 ± 0.63 63.77 ± 1.27 / 81.62 ± 0.65 14.16 ± 4.55 / 19.70 ± 6.40 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
google-bert/bert-base-multilingual-cased 178 120 512 True 14,083 ± 3,264 / 2,738 ± 889 2.37 88.72 ± 0.76 / 85.49 ± 0.92 83.08 ± 1.22 / 79.36 ± 1.38 35.87 ± 1.85 / 48.94 ± 3.37 44.22 ± 3.29 / 70.31 ± 2.86 39.55 ± 7.01 / 68.65 ± 3.36 40.55 ± 2.19 / 53.62 ± 2.68 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
upstage/SOLAR-10.7B-v1.0 (few-shot) 10732 32 4096 True 3,780 ± 906 / 799 ± 261 2.37 68.11 ± 1.83 / 57.57 ± 3.10 68.19 ± 1.01 / 56.90 ± 2.54 55.33 ± 1.95 / 69.71 ± 1.56 10.15 ± 3.24 / 36.27 ± 1.44 7.51 ± 2.97 / 35.89 ± 1.30 55.33 ± 3.29 / 80.42 ± 1.68 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3
Nexusflow/Starling-LM-7B-beta (few-shot) 7242 32 8192 False 5,876 ± 1,021 / 1,677 ± 546 2.38 66.22 ± 2.15 / 48.98 ± 4.65 64.14 ± 1.26 / 49.59 ± 4.31 55.48 ± 1.77 / 69.68 ± 1.45 26.13 ± 1.28 / 56.08 ± 2.05 17.32 ± 0.77 / 54.57 ± 1.49 49.75 ± 1.22 / 77.08 ± 0.60 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
abhishek/autotrain-llama3-orpo-v2 (few-shot) 8030 128 8192 False 8,532 ± 2,749 / 1,177 ± 399 2.39 74.62 ± 1.74 / 66.51 ± 2.71 74.22 ± 1.16 / 67.02 ± 2.57 51.30 ± 3.11 / 66.17 ± 3.21 26.03 ± 2.08 / 61.04 ± 2.00 19.90 ± 1.99 / 58.22 ± 1.98 45.31 ± 3.83 / 71.62 ± 3.39 12.8.0 12.8.0 12.8.0 12.8.0 12.8.0 12.8.0
four-two-labs/orpo-llama-3-swe (few-shot) 8030 128 8192 False 4,974 ± 1,208 / 1,032 ± 342 2.40 61.63 ± 1.64 / 48.09 ± 2.89 61.30 ± 1.94 / 50.08 ± 2.60 48.85 ± 2.22 / 64.93 ± 2.00 24.15 ± 6.12 / 56.29 ± 6.81 21.33 ± 3.03 / 58.05 ± 2.59 53.66 ± 4.34 / 75.19 ± 3.59 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
meta-llama/Meta-Llama-3-8B-Instruct (few-shot) 8030 128 8192 True 4,909 ± 1,215 / 978 ± 319 2.40 74.47 ± 1.47 / 65.57 ± 2.39 72.93 ± 1.00 / 65.44 ± 2.55 50.62 ± 3.52 / 65.69 ± 3.50 27.77 ± 1.63 / 61.75 ± 1.77 20.35 ± 1.92 / 57.74 ± 2.28 42.90 ± 3.57 / 69.90 ± 3.17 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
birgermoell/Munin-NeuralBeagle-NorskGPT (few-shot, val) 7242 32 32768 False 2,903 ± 407 / 1,157 ± 350 2.42 63.33 ± 2.69 / 53.24 ± 3.41 68.84 ± 1.87 / 53.85 ± 3.78 58.28 ± 3.11 / 69.79 ± 2.39 18.65 ± 3.84 / 45.34 ± 2.61 10.72 ± 5.52 / 43.91 ± 3.48 44.39 ± 3.95 / 70.76 ± 3.10 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.5.2
birgermoell/WestLake-Munin-Cat-NorskGPT (few-shot, val) 7242 32 32768 False 2,856 ± 391 / 1,142 ± 342 2.42 63.33 ± 2.69 / 53.24 ± 3.41 68.84 ± 1.87 / 53.85 ± 3.78 58.28 ± 3.11 / 69.79 ± 2.39 18.65 ± 3.84 / 45.34 ± 2.61 10.72 ± 5.52 / 43.91 ± 3.48 44.39 ± 3.95 / 70.76 ± 3.10 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.5.2
Geotrend/bert-base-en-no-cased 111 33 512 True 14,081 ± 3,231 / 2,748 ± 891 2.43 89.07 ± 1.08 / 85.89 ± 1.04 82.69 ± 0.90 / 79.22 ± 0.93 34.97 ± 1.74 / 48.21 ± 2.02 39.58 ± 5.70 / 67.91 ± 3.00 31.27 ± 9.57 / 62.81 ± 7.17 41.89 ± 1.64 / 55.17 ± 2.07 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Geotrend/bert-base-en-fr-de-no-da-cased 118 42 512 True 13,973 ± 3,205 / 2,725 ± 884 2.44 88.05 ± 0.85 / 84.88 ± 1.05 83.08 ± 1.50 / 79.06 ± 1.70 35.34 ± 1.88 / 48.31 ± 2.28 31.45 ± 12.12 / 63.68 ± 6.21 36.12 ± 8.59 / 66.98 ± 4.91 41.59 ± 2.45 / 54.67 ± 2.65 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
meta-llama/Meta-Llama-3-8B (few-shot) 8030 128 8192 True 4,687 ± 1,121 / 967 ± 313 2.45 61.48 ± 1.83 / 47.65 ± 2.94 61.58 ± 2.21 / 50.10 ± 2.68 49.87 ± 1.88 / 66.15 ± 1.44 21.20 ± 6.57 / 52.29 ± 7.43 19.65 ± 4.32 / 56.66 ± 4.40 53.35 ± 4.33 / 74.98 ± 3.70 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
ZurichNLP/unsup-simcse-xlm-roberta-base 278 250 512 True 34,520 ± 7,443 / 6,730 ± 2,224 2.46 86.56 ± 1.18 / 87.30 ± 0.95 80.57 ± 1.55 / 83.03 ± 1.19 49.62 ± 2.72 / 62.56 ± 3.88 38.45 ± 15.28 / 67.14 ± 7.71 11.38 ± 7.59 / 51.68 ± 4.73 31.50 ± 2.41 / 47.61 ± 2.55 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
bineric/NorskGPT-Llama3-8b (few-shot) 8030 128 8192 False 3,695 ± 1,277 / 532 ± 183 2.46 66.94 ± 2.67 / 60.25 ± 3.14 67.69 ± 1.87 / 61.57 ± 2.05 48.40 ± 3.28 / 61.42 ± 3.56 24.26 ± 2.68 / 57.31 ± 2.64 18.43 ± 1.34 / 52.28 ± 2.63 46.80 ± 2.74 / 74.57 ± 2.20 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
bineric/NorskGPT-Mistral-7b (few-shot) 7242 32 32768 False 2,443 ± 451 / 761 ± 237 2.49 63.28 ± 1.99 / 47.72 ± 3.74 61.25 ± 1.05 / 45.04 ± 2.92 56.90 ± 1.49 / 70.81 ± 1.30 13.86 ± 1.95 / 44.84 ± 2.31 10.17 ± 1.89 / 46.48 ± 2.46 49.03 ± 4.22 / 74.38 ± 3.92 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.5.1
Geotrend/bert-base-en-da-cased 111 33 512 True 14,062 ± 3,216 / 2,733 ± 885 2.50 88.55 ± 0.85 / 85.29 ± 1.04 83.09 ± 1.06 / 79.27 ± 1.21 35.16 ± 1.75 / 48.41 ± 2.71 31.82 ± 8.92 / 62.98 ± 6.52 32.94 ± 5.88 / 64.36 ± 4.59 39.46 ± 1.70 / 52.33 ± 1.94 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
microsoft/infoxlm-large 560 250 512 True 6,696 ± 1,260 / 1,630 ± 515 2.50 91.90 ± 0.62 / 89.48 ± 0.83 86.59 ± 1.49 / 82.92 ± 1.66 30.56 ± 13.68 / 45.96 ± 11.45 9.79 ± 5.13 / 46.75 ± 6.05 6.36 ± 2.82 / 48.52 ± 4.11 60.47 ± 1.01 / 74.70 ± 0.92 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
timpal0l/dolphin-2.9-llama3-8b-flashback (few-shot, val) 8030 128 8192 False 5,018 ± 1,216 / 996 ± 324 2.50 64.51 ± 3.28 / 51.06 ± 4.78 65.66 ± 3.82 / 53.90 ± 4.32 52.90 ± 4.31 / 65.38 ± 3.73 29.34 ± 4.34 / 59.36 ± 4.64 17.42 ± 4.38 / 52.01 ± 3.50 38.49 ± 4.41 / 67.16 ± 3.41 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
microsoft/infoxlm-base 278 250 512 True 34,735 ± 7,558 / 6,846 ± 2,312 2.51 90.14 ± 0.97 / 87.71 ± 1.24 84.12 ± 1.85 / 80.21 ± 2.19 44.42 ± 13.10 / 57.73 ± 11.86 11.20 ± 2.99 / 48.77 ± 5.01 7.12 ± 2.39 / 49.23 ± 4.14 47.69 ± 1.95 / 62.39 ± 1.74 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
KBLab/megatron-bert-large-swedish-cased-165k 370 64 512 True 7,138 ± 1,111 / 2,067 ± 660 2.52 85.99 ± 0.83 / 83.09 ± 0.94 79.47 ± 1.14 / 75.61 ± 1.34 39.53 ± 0.99 / 50.90 ± 2.17 27.39 ± 2.48 / 61.03 ± 2.27 23.56 ± 2.23 / 60.05 ± 1.05 39.01 ± 1.18 / 51.83 ± 1.58 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
RJuro/munin-neuralbeagle-7b (few-shot, val) 7242 32 32768 False 2,493 ± 466 / 773 ± 243 2.53 61.18 ± 2.76 / 56.36 ± 3.30 65.16 ± 3.97 / 55.74 ± 4.71 55.61 ± 4.02 / 68.27 ± 3.49 20.84 ± 5.41 / 49.36 ± 4.98 9.12 ± 3.51 / 43.06 ± 3.74 42.92 ± 3.08 / 69.13 ± 2.85 9.3.2 9.3.2 9.3.2 9.3.2 9.3.2 12.5.2
timpal0l/BeagleCatMunin2 (few-shot, val) 7242 32 32768 False 2,477 ± 459 / 767 ± 241 2.54 61.17 ± 3.56 / 54.24 ± 3.45 65.44 ± 2.83 / 54.34 ± 3.95 58.69 ± 3.28 / 70.83 ± 2.49 15.03 ± 2.70 / 40.22 ± 1.66 5.95 ± 4.55 / 39.18 ± 2.91 42.42 ± 2.92 / 69.53 ± 3.17 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.5.2
AI-Nordics/bert-large-swedish-cased 335 31 512 True 7,199 ± 1,139 / 2,051 ± 651 2.55 83.32 ± 0.99 / 80.48 ± 0.89 77.97 ± 1.09 / 74.84 ± 1.18 38.44 ± 1.67 / 52.60 ± 1.95 37.54 ± 1.13 / 64.46 ± 1.44 23.10 ± 3.66 / 58.14 ± 3.65 39.97 ± 0.84 / 51.67 ± 1.11 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Geotrend/bert-base-da-cased 104 23 512 True 15,432 ± 2,838 / 3,642 ± 1,189 2.56 87.52 ± 0.63 / 83.86 ± 0.68 82.66 ± 1.64 / 78.65 ± 2.01 32.73 ± 1.37 / 46.52 ± 1.86 36.41 ± 8.89 / 65.20 ± 6.41 30.37 ± 5.50 / 62.12 ± 5.66 37.71 ± 1.11 / 49.90 ± 1.47 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
meta-llama/Llama-2-70b-chat-hf (few-shot, val) 68977 32 4096 True 1,979 ± 621 / 320 ± 105 2.58 60.21 ± 1.86 / 47.06 ± 3.08 62.99 ± 2.66 / 48.82 ± 5.49 55.12 ± 5.10 / 66.55 ± 5.07 27.12 ± 4.90 / 54.26 ± 6.80 6.82 ± 5.06 / 46.18 ± 4.14 38.50 ± 3.93 / 69.99 ± 2.23 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
Mabeck/Heidrun-Mistral-7B-chat (few-shot) 7242 32 32768 False 5,822 ± 1,283 / 1,336 ± 430 2.62 61.41 ± 1.71 / 52.32 ± 2.63 59.49 ± 1.26 / 49.45 ± 3.31 49.19 ± 1.64 / 63.36 ± 1.52 15.17 ± 2.64 / 50.25 ± 4.51 10.78 ± 1.99 / 50.08 ± 4.20 48.99 ± 2.91 / 73.08 ± 2.26 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 12.5.0
merge-crew/da-sv-dare-ties-density-0.9 (few-shot, val) 7242 32 32768 True 2,443 ± 458 / 750 ± 240 2.62 48.24 ± 3.18 / 42.53 ± 3.52 61.50 ± 1.54 / 50.90 ± 4.58 49.40 ± 3.40 / 60.71 ± 3.33 24.12 ± 3.24 / 59.38 ± 2.25 13.20 ± 3.16 / 54.42 ± 3.04 47.93 ± 3.46 / 69.52 ± 3.06 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
timpal0l/BeagleCatMunin (few-shot, val) 7242 32 32768 False 2,495 ± 458 / 775 ± 244 2.63 54.04 ± 2.86 / 48.50 ± 2.85 62.21 ± 3.31 / 50.38 ± 4.32 54.74 ± 3.71 / 67.81 ± 2.80 14.51 ± 1.97 / 40.94 ± 1.63 5.38 ± 4.69 / 37.62 ± 2.92 42.83 ± 3.31 / 69.15 ± 2.50 9.3.2 9.3.2 9.3.2 9.3.2 9.3.2 12.5.2
clips/mfaq 278 250 128 True 5,591 ± 187 / 3,349 ± 1,105 2.64 89.46 ± 1.18 / 86.62 ± 1.53 79.71 ± 1.02 / 75.64 ± 1.17 52.91 ± 2.29 / 64.64 ± 3.28 27.55 ± 12.16 / 53.28 ± 8.27 15.20 ± 9.06 / 51.91 ± 4.95 12.36 ± 8.54 / 17.38 ± 11.78 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
KennethEnevoldsen/munin_mistral-7b (few-shot, val) 7242 32 32768 False 2,543 ± 466 / 787 ± 247 2.65 51.82 ± 4.16 / 44.64 ± 4.66 62.55 ± 3.84 / 49.66 ± 5.87 56.37 ± 4.27 / 69.55 ± 4.52 6.04 ± 5.92 / 36.34 ± 3.96 -0.02 ± 0.04 / 33.47 ± 0.88 48.85 ± 4.11 / 70.75 ± 3.73 12.5.2 12.5.2 12.3.1 12.3.2 12.3.2 12.3.2
merge-crew/da-sv-dare-ties-density-0.6 (few-shot, val) 7242 32 32768 True 2,515 ± 465 / 785 ± 247 2.66 47.26 ± 3.76 / 40.22 ± 3.43 59.35 ± 2.82 / 45.26 ± 3.91 54.93 ± 3.49 / 68.45 ± 2.61 9.00 ± 2.87 / 37.53 ± 2.91 5.26 ± 3.15 / 39.01 ± 3.54 45.95 ± 3.12 / 68.00 ± 3.07 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
mlabonne/NeuralBeagle14-7B (few-shot, val) 7242 32 8192 False 2,549 ± 472 / 784 ± 245 2.67 62.47 ± 2.56 / 57.71 ± 3.02 66.69 ± 2.91 / 58.83 ± 3.70 54.04 ± 2.91 / 66.46 ± 2.59 16.75 ± 4.54 / 49.11 ± 4.45 13.00 ± 4.46 / 49.33 ± 2.69 34.48 ± 2.13 / 65.43 ± 2.07 9.3.2 9.3.2 9.3.2 9.3.2 9.3.2 12.5.2
senseable/WestLake-7B-v2 (few-shot) 7242 32 32768 False 5,993 ± 1,028 / 1,742 ± 561 2.67 64.37 ± 2.17 / 52.81 ± 2.48 62.77 ± 0.83 / 51.80 ± 2.77 50.60 ± 4.90 / 66.76 ± 3.04 18.09 ± 2.04 / 52.56 ± 2.60 12.25 ± 2.18 / 50.79 ± 2.42 38.34 ± 2.39 / 69.54 ± 1.96 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
KBLab/megatron-bert-large-swedish-cased-110k 370 64 512 True 7,075 ± 1,093 / 2,057 ± 661 2.68 84.03 ± 0.79 / 80.97 ± 0.92 77.98 ± 1.36 / 74.25 ± 1.62 39.15 ± 3.29 / 53.00 ± 3.85 21.39 ± 2.73 / 58.08 ± 1.93 17.10 ± 3.43 / 57.00 ± 1.86 35.32 ± 1.71 / 47.41 ± 2.25 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
mhenrichsen/danskgpt-chat-v2.1 (few-shot) unknown 32 32768 True 5,085 ± 998 / 1,306 ± 404 2.69 62.43 ± 0.94 / 51.64 ± 2.45 60.68 ± 0.74 / 48.91 ± 2.93 53.41 ± 1.46 / 69.49 ± 1.05 -1.16 ± 1.31 / 33.56 ± 0.43 0.30 ± 0.71 / 34.08 ± 0.32 49.15 ± 2.79 / 76.77 ± 1.59 12.0.0 12.0.0 12.0.0 12.0.0 12.0.0 12.0.0
timpal0l/njord-alpha (few-shot) 7242 32 32768 True 5,431 ± 1,267 / 1,139 ± 365 2.69 50.47 ± 2.96 / 43.31 ± 2.54 51.97 ± 3.83 / 42.66 ± 5.03 48.03 ± 1.71 / 65.89 ± 1.68 22.65 ± 3.80 / 51.83 ± 5.03 17.10 ± 4.78 / 49.03 ± 6.45 44.72 ± 4.47 / 68.08 ± 4.22 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
KBLab/megatron-bert-base-swedish-cased-600k 135 64 512 True 15,726 ± 2,508 / 4,234 ± 1,365 2.71 82.20 ± 1.19 / 79.13 ± 1.26 76.64 ± 1.10 / 72.90 ± 1.43 40.20 ± 1.56 / 54.68 ± 2.46 24.45 ± 2.21 / 58.75 ± 1.80 19.18 ± 3.55 / 57.93 ± 2.05 30.69 ± 1.64 / 42.59 ± 1.94 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
birgermoell/BeagleCatMunin-Flashback-Bellman (few-shot, val) 7242 32 32768 False 2,890 ± 401 / 1,155 ± 348 2.72 53.96 ± 3.37 / 49.84 ± 3.30 63.45 ± 2.27 / 53.13 ± 3.43 52.70 ± 4.58 / 66.82 ± 3.41 14.87 ± 3.37 / 40.83 ± 1.91 2.48 ± 3.31 / 35.61 ± 1.83 41.43 ± 3.34 / 67.26 ± 2.73 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.5.2
merge-crew/da-sv-ties (few-shot, val) 7242 32 32768 True 2,457 ± 451 / 757 ± 237 2.72 47.61 ± 2.50 / 42.16 ± 2.82 60.57 ± 2.02 / 48.89 ± 4.48 44.46 ± 4.10 / 52.31 ± 4.53 23.99 ± 5.54 / 60.60 ± 2.74 11.60 ± 3.18 / 53.40 ± 2.75 47.02 ± 3.37 / 69.07 ± 2.64 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
birgermoell/Flashback-Bellman (few-shot, val) 7242 32 32768 False 2,887 ± 403 / 1,144 ± 345 2.73 56.44 ± 3.14 / 50.10 ± 4.61 66.56 ± 2.40 / 54.48 ± 4.93 53.24 ± 4.75 / 67.94 ± 3.75 11.96 ± 2.46 / 37.26 ± 1.15 2.50 ± 4.21 / 35.26 ± 1.79 39.21 ± 3.48 / 64.09 ± 3.49 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.5.2
birgermoell/NeuralBeagle-Flashback (few-shot, val) 7242 32 32768 False 2,904 ± 405 / 1,155 ± 349 2.73 51.78 ± 2.90 / 47.69 ± 3.44 61.22 ± 3.73 / 50.00 ± 4.37 53.06 ± 4.92 / 67.05 ± 4.22 10.27 ± 5.84 / 43.06 ± 3.15 8.06 ± 3.56 / 41.59 ± 3.99 40.64 ± 2.58 / 66.46 ± 2.62 9.3.0 9.3.0 9.3.0 9.3.0 9.3.0 12.5.2
merge-crew/da-sv-slerp (few-shot, val) 7242 32 32768 True 2,467 ± 469 / 762 ± 244 2.74 49.67 ± 3.12 / 43.26 ± 3.03 61.11 ± 1.93 / 50.15 ± 4.14 56.07 ± 5.22 / 68.93 ± 4.07 3.81 ± 3.09 / 34.47 ± 1.22 -1.29 ± 2.53 / 33.32 ± 0.91 44.98 ± 4.12 / 68.18 ± 3.39 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
merge-crew/da-sv-task-arithmetic (few-shot, val) 7242 32 32768 True 2,500 ± 469 / 762 ± 238 2.74 49.69 ± 2.90 / 43.57 ± 2.90 61.78 ± 2.03 / 49.91 ± 4.24 55.87 ± 5.21 / 68.97 ± 3.95 2.99 ± 3.04 / 34.16 ± 1.10 -1.29 ± 2.53 / 33.32 ± 0.91 44.62 ± 4.06 / 68.17 ± 3.48 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
KB/bert-base-swedish-cased 125 50 512 True 16,181 ± 2,451 / 4,620 ± 1,507 2.75 85.91 ± 0.98 / 83.05 ± 1.29 79.67 ± 1.62 / 76.00 ± 2.01 38.70 ± 2.53 / 50.88 ± 3.38 39.13 ± 2.97 / 67.97 ± 1.68 24.13 ± 6.88 / 60.76 ± 3.29 19.04 ± 4.63 / 27.73 ± 6.70 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
mhenrichsen/hestenettetLM (few-shot) 7242 32 32768 True 5,160 ± 804 / 1,654 ± 516 2.75 52.52 ± 1.85 / 43.46 ± 2.21 55.60 ± 3.22 / 45.25 ± 4.20 48.23 ± 3.31 / 65.51 ± 3.01 8.53 ± 3.72 / 38.61 ± 3.22 6.65 ± 1.40 / 39.32 ± 2.51 46.89 ± 3.29 / 70.96 ± 2.84 12.5.2 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2
DDSC/roberta-base-scandinavian 125 50 512 True 14,491 ± 2,800 / 3,182 ± 1,026 2.76 71.73 ± 15.69 / 68.50 ± 15.04 79.80 ± 0.72 / 75.76 ± 0.90 46.74 ± 5.96 / 60.25 ± 6.04 8.02 ± 12.19 / 50.30 ± 7.19 17.04 ± 13.78 / 56.87 ± 7.06 29.26 ± 1.27 / 42.50 ± 1.17 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
danish-foundation-models/munin-7b-v0.1dev0 (few-shot) 7242 32 8192 True 6,113 ± 1,044 / 1,790 ± 579 2.76 50.43 ± 2.27 / 42.19 ± 3.03 54.20 ± 1.81 / 43.92 ± 2.94 39.21 ± 5.64 / 56.54 ± 6.43 20.51 ± 4.43 / 52.48 ± 5.96 11.66 ± 4.10 / 48.13 ± 6.13 51.57 ± 3.87 / 73.95 ± 3.51 12.5.2 12.5.2 12.4.0 12.4.0 12.4.0 12.4.0
timpal0l/Llama-3-8B-flashback-v1 (few-shot) 8030 128 8192 True 4,849 ± 1,171 / 974 ± 316 2.76 56.02 ± 2.14 / 47.78 ± 2.24 56.10 ± 1.73 / 49.40 ± 2.75 44.77 ± 2.22 / 59.33 ± 2.66 10.73 ± 4.25 / 40.51 ± 3.74 6.05 ± 2.78 / 37.10 ± 2.77 50.82 ± 4.69 / 73.13 ± 3.87 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
birgermoell/Rapid-Cycling (few-shot, val) 7242 32 32768 False 2,346 ± 450 / 666 ± 249 2.77 55.93 ± 2.70 / 50.51 ± 3.15 63.85 ± 2.45 / 53.11 ± 4.11 50.41 ± 5.49 / 64.49 ± 4.37 15.74 ± 4.15 / 41.16 ± 2.21 2.23 ± 4.69 / 34.70 ± 1.39 39.81 ± 2.81 / 65.65 ± 2.64 9.3.2 9.3.2 9.3.2 9.3.2 9.3.2 12.5.2
facebook/xlm-v-base 778 902 512 True 25,396 ± 6,394 / 4,534 ± 1,421 2.77 89.99 ± 1.32 / 87.51 ± 1.20 78.60 ± 3.17 / 74.97 ± 3.56 17.93 ± 14.48 / 34.24 ± 10.14 43.46 ± 17.47 / 66.52 ± 12.26 10.97 ± 10.84 / 43.47 ± 9.81 43.74 ± 2.17 / 59.80 ± 1.86 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
KBLab/bert-base-swedish-cased 125 50 512 True 16,164 ± 2,392 / 4,574 ± 1,478 2.78 85.33 ± 1.01 / 82.13 ± 1.28 79.44 ± 1.66 / 75.74 ± 1.96 38.17 ± 2.21 / 50.44 ± 3.11 39.49 ± 3.36 / 68.13 ± 2.13 22.17 ± 7.22 / 60.16 ± 3.80 19.04 ± 4.63 / 27.73 ± 6.70 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
bineric/NorskGPT-Llama-7B-v0.1 (few-shot) 6738 32 4096 False 5,384 ± 879 / 1,746 ± 553 2.79 56.18 ± 3.05 / 49.39 ± 2.78 56.96 ± 1.64 / 48.30 ± 5.46 50.94 ± 1.41 / 66.55 ± 1.06 8.19 ± 1.95 / 45.17 ± 3.69 5.55 ± 1.71 / 48.92 ± 2.94 41.35 ± 2.33 / 69.72 ± 2.52 12.5.2 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2
jhu-clsp/bernice 278 250 128 True 5,567 ± 450 / 2,483 ± 798 2.79 84.11 ± 1.13 / 81.19 ± 1.37 77.82 ± 1.28 / 73.93 ± 1.46 39.63 ± 1.06 / 49.23 ± 2.13 45.75 ± 3.27 / 71.33 ± 1.67 33.74 ± 2.91 / 63.89 ± 3.31 5.35 ± 3.79 / 7.65 ± 5.41 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
mistralai/Mistral-7B-v0.1 (few-shot) 7242 32 32768 True 2,657 ± 524 / 880 ± 278 2.79 52.00 ± 1.91 / 43.55 ± 2.21 55.12 ± 3.14 / 45.34 ± 4.15 47.25 ± 4.11 / 64.53 ± 3.71 8.66 ± 4.12 / 38.87 ± 3.40 6.80 ± 1.59 / 39.72 ± 2.50 46.86 ± 3.27 / 70.86 ± 2.79 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 12.5.1
AI-Sweden-Models/tyr (few-shot, val) 7242 32 32768 False 6,079 ± 1,051 / 1,760 ± 570 2.80 58.60 ± 4.42 / 52.72 ± 4.93 63.15 ± 2.29 / 54.03 ± 4.33 51.85 ± 3.51 / 64.28 ± 2.54 0.66 ± 1.29 / 33.42 ± 0.80 0.53 ± 1.05 / 33.42 ± 0.73 43.22 ± 2.24 / 68.55 ± 2.42 12.5.2 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2
mlabonne/AlphaMonarch-7B (few-shot, val) 7242 32 8192 False 5,340 ± 1,262 / 1,157 ± 375 2.80 61.90 ± 2.57 / 57.16 ± 2.81 66.92 ± 2.52 / 57.81 ± 3.54 48.80 ± 4.56 / 63.38 ± 3.06 19.53 ± 5.49 / 51.96 ± 4.90 9.83 ± 4.57 / 47.95 ± 2.22 30.27 ± 2.28 / 62.04 ± 2.19 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
timpal0l/Mistral-7B-v0.1-flashback-v2 (few-shot) 7242 32 32768 True 5,054 ± 1,200 / 1,056 ± 339 2.81 48.97 ± 2.42 / 39.15 ± 2.78 51.52 ± 2.96 / 40.17 ± 3.62 49.05 ± 2.73 / 63.94 ± 2.42 14.37 ± 2.18 / 47.80 ± 4.36 9.96 ± 1.34 / 48.97 ± 3.77 44.07 ± 3.40 / 68.49 ± 2.97 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3
RJuro/munin-neuralbeagle-SkoleGPTOpenOrca-7b (few-shot, val) 7242 32 32768 False 3,008 ± 429 / 991 ± 323 2.82 53.68 ± 2.01 / 49.22 ± 2.67 61.92 ± 4.06 / 49.03 ± 3.97 47.78 ± 3.19 / 57.76 ± 2.55 0.91 ± 1.78 / 33.51 ± 0.85 1.24 ± 1.66 / 33.71 ± 0.94 47.76 ± 2.93 / 70.99 ± 2.39 9.3.2 9.3.2 9.3.2 9.3.2 9.3.2 12.5.2
Twitter/twhin-bert-base 279 250 512 True 11,514 ± 2,041 / 2,862 ± 918 2.82 84.11 ± 1.00 / 81.35 ± 1.22 77.22 ± 1.99 / 73.67 ± 2.26 37.02 ± 1.09 / 47.88 ± 2.50 35.42 ± 12.32 / 66.30 ± 6.78 6.87 ± 6.85 / 50.70 ± 3.62 25.98 ± 2.87 / 36.16 ± 3.35 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 12.6.1
ltg/norbert3-xs 15 50 508 True 14,208 ± 2,713 / 3,059 ± 1,002 2.82 87.63 ± 0.64 / 84.17 ± 0.81 80.19 ± 2.00 / 75.70 ± 2.31 49.92 ± 1.44 / 63.75 ± 1.45 7.93 ± 4.24 / 50.87 ± 2.29 5.06 ± 0.83 / 51.44 ± 1.05 22.46 ± 5.97 / 34.67 ± 8.87 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
sentence-transformers/paraphrase-multilingual-mpnet-base-v2 278 250 512 True 15,100 ± 3,019 / 3,369 ± 1,103 2.82 81.94 ± 0.73 / 78.39 ± 0.86 75.56 ± 1.01 / 71.27 ± 1.18 55.53 ± 1.05 / 68.89 ± 1.16 36.01 ± 2.02 / 64.39 ± 1.49 14.99 ± 8.03 / 54.08 ± 5.71 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Mabeck/Heidrun-Mistral-7B-base (few-shot) 7242 32 32768 True 3,823 ± 967 / 860 ± 280 2.84 50.10 ± 2.16 / 41.80 ± 2.77 54.81 ± 1.88 / 45.95 ± 3.21 48.64 ± 2.14 / 66.06 ± 1.47 10.31 ± 3.46 / 43.68 ± 5.10 1.11 ± 2.48 / 36.52 ± 2.31 42.20 ± 3.02 / 65.18 ± 2.86 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0
ThatsGroes/munin-SkoleGPTOpenOrca-7b-16bit (few-shot) 7242 32 32768 False 3,006 ± 479 / 1,053 ± 319 2.84 51.99 ± 1.85 / 37.40 ± 2.95 52.74 ± 1.13 / 36.83 ± 1.95 50.39 ± 1.38 / 66.42 ± 1.20 0.99 ± 1.03 / 33.56 ± 0.25 1.27 ± 1.30 / 34.04 ± 0.45 47.95 ± 3.19 / 72.60 ± 2.57 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 12.4.0
distilbert/distilbert-base-multilingual-cased 135 120 512 True 26,355 ± 5,946 / 5,266 ± 1,714 2.84 83.62 ± 0.75 / 80.61 ± 1.00 80.69 ± 0.69 / 76.61 ± 0.81 33.16 ± 2.13 / 46.93 ± 2.66 36.10 ± 2.45 / 66.11 ± 1.85 30.10 ± 2.50 / 64.29 ± 1.69 19.26 ± 1.57 / 30.04 ± 2.13 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Geotrend/distilbert-base-en-no-cased 69 33 512 True 26,597 ± 6,036 / 5,271 ± 1,697 2.85 83.93 ± 0.95 / 81.01 ± 0.94 79.39 ± 1.03 / 75.07 ± 1.03 32.32 ± 2.30 / 47.12 ± 2.85 36.15 ± 1.99 / 66.57 ± 1.11 30.17 ± 1.72 / 63.98 ± 1.36 19.71 ± 1.41 / 30.26 ± 1.56 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Geotrend/distilbert-base-25lang-cased 109 85 512 True 26,099 ± 5,881 / 5,178 ± 1,665 2.86 83.59 ± 1.36 / 80.55 ± 1.53 80.29 ± 1.02 / 76.08 ± 1.06 33.19 ± 1.75 / 46.63 ± 2.55 32.60 ± 6.93 / 65.19 ± 3.31 24.97 ± 6.47 / 61.39 ± 3.34 19.93 ± 1.76 / 30.69 ± 2.36 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Geotrend/distilbert-base-en-fr-de-no-da-cased 76 42 512 True 26,081 ± 5,875 / 5,209 ± 1,692 2.86 83.49 ± 0.83 / 80.32 ± 0.76 80.23 ± 1.09 / 76.10 ± 1.23 32.66 ± 1.96 / 46.26 ± 3.19 33.65 ± 6.63 / 65.22 ± 4.03 29.07 ± 2.20 / 63.35 ± 1.54 19.29 ± 1.27 / 29.94 ± 1.81 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
alpindale/Mistral-7B-v0.2-hf (few-shot) 7242 32 32768 True 1,841 ± 297 / 651 ± 193 2.86 50.63 ± 2.12 / 44.59 ± 1.80 52.69 ± 2.30 / 46.51 ± 3.63 44.05 ± 2.51 / 61.80 ± 2.28 11.60 ± 4.10 / 43.01 ± 5.07 9.26 ± 1.14 / 46.28 ± 3.60 45.23 ± 3.73 / 68.68 ± 3.29 12.5.2 12.5.2 12.5.1 12.5.1 12.5.1 12.5.1
sentence-transformers/paraphrase-xlm-r-multilingual-v1 278 250 512 True 20,154 ± 4,438 / 3,890 ± 1,256 2.86 81.26 ± 1.25 / 77.69 ± 1.29 74.05 ± 1.72 / 69.84 ± 1.91 49.93 ± 1.46 / 62.37 ± 2.34 38.26 ± 7.56 / 66.01 ± 3.69 25.17 ± 5.32 / 61.27 ± 3.01 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
sentence-transformers/stsb-xlm-r-multilingual 278 250 512 True 15,040 ± 2,953 / 3,417 ± 1,100 2.91 80.08 ± 1.46 / 75.93 ± 1.64 74.59 ± 1.98 / 70.26 ± 2.24 52.16 ± 0.99 / 66.79 ± 0.98 36.30 ± 6.44 / 65.52 ± 3.06 14.21 ± 6.44 / 52.78 ± 5.69 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Geotrend/distilbert-base-da-cased 61 23 512 True 28,950 ± 5,114 / 7,010 ± 2,267 2.94 82.84 ± 0.61 / 79.91 ± 0.64 78.83 ± 1.18 / 74.64 ± 1.40 30.70 ± 2.63 / 43.77 ± 2.62 34.24 ± 2.30 / 65.60 ± 1.50 27.20 ± 2.61 / 62.87 ± 1.41 16.44 ± 1.76 / 26.22 ± 2.65 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Geotrend/distilbert-base-en-da-cased 69 33 512 True 26,196 ± 5,956 / 5,220 ± 1,691 2.94 83.27 ± 1.20 / 80.29 ± 1.38 79.59 ± 0.97 / 75.31 ± 1.19 29.37 ± 2.58 / 44.05 ± 3.33 31.50 ± 6.37 / 64.62 ± 3.29 24.06 ± 7.24 / 61.01 ± 3.82 18.62 ± 0.81 / 29.69 ± 1.81 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
flax-community/nordic-roberta-wiki 125 50 512 True 16,227 ± 2,650 / 4,252 ± 1,393 2.94 85.42 ± 0.61 / 82.31 ± 0.65 78.92 ± 1.42 / 74.86 ± 1.50 36.27 ± 1.57 / 50.95 ± 1.70 48.07 ± 5.64 / 72.00 ± 4.07 29.81 ± 3.52 / 64.03 ± 2.35 0.44 ± 0.41 / 1.08 ± 0.99 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
KBLab/megatron-bert-base-swedish-cased-125k 135 64 512 True 15,763 ± 2,523 / 4,238 ± 1,370 2.96 77.98 ± 1.58 / 75.03 ± 1.70 75.00 ± 1.28 / 71.00 ± 1.64 33.88 ± 1.40 / 49.21 ± 1.98 24.23 ± 1.83 / 58.89 ± 1.43 18.18 ± 2.65 / 57.28 ± 1.84 20.56 ± 1.82 / 30.08 ± 2.54 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
jannikskytt/MeDa-Bert 111 32 511 True 16,114 ± 2,429 / 4,566 ± 1,482 2.96 71.69 ± 0.92 / 68.09 ± 0.91 60.00 ± 1.99 / 56.64 ± 1.98 38.94 ± 2.59 / 53.58 ± 3.33 30.32 ± 4.68 / 62.42 ± 3.11 7.99 ± 3.34 / 53.24 ± 1.64 24.02 ± 1.35 / 37.28 ± 1.24 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
mistralai/Mistral-7B-Instruct-v0.2 (few-shot) 7242 32 32768 False 2,538 ± 415 / 821 ± 253 2.96 53.42 ± 2.48 / 42.63 ± 1.66 54.34 ± 1.93 / 41.06 ± 2.40 38.79 ± 2.56 / 53.72 ± 3.01 17.06 ± 1.53 / 56.51 ± 2.06 11.00 ± 1.00 / 53.26 ± 2.32 35.74 ± 2.44 / 64.27 ± 2.42 9.2.0 9.2.0 9.2.0 9.3.1 9.3.1 12.4.0
occiglot/occiglot-7b-eu5-instruct (few-shot) 7242 32 32768 False 2,088 ± 352 / 706 ± 214 2.97 45.50 ± 2.71 / 40.02 ± 3.16 45.96 ± 2.67 / 41.28 ± 2.25 44.46 ± 3.40 / 62.00 ± 2.71 0.00 ± 0.00 / 33.41 ± 0.30 0.00 ± 0.00 / 33.86 ± 0.33 52.19 ± 2.88 / 74.97 ± 2.11 12.5.2 12.5.2 12.2.0 12.3.1 12.3.1 12.4.0
RuterNorway/Llama-2-13b-chat-norwegian (few-shot) unknown 32 4096 False 3,254 ± 1,068 / 484 ± 173 2.98 58.61 ± 1.58 / 47.74 ± 2.83 60.40 ± 1.25 / 47.53 ± 2.68 41.36 ± 3.50 / 58.47 ± 3.79 6.52 ± 2.11 / 38.10 ± 2.56 3.95 ± 2.52 / 42.37 ± 4.20 38.93 ± 2.43 / 65.76 ± 3.07 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1
mistralai/Mistral-7B-Instruct-v0.1 (few-shot) 7242 32 32768 False 5,443 ± 1,273 / 1,144 ± 364 2.99 50.08 ± 1.54 / 34.52 ± 1.17 51.27 ± 1.52 / 33.37 ± 2.37 43.65 ± 1.98 / 60.88 ± 1.36 14.09 ± 2.85 / 44.91 ± 3.95 8.28 ± 1.82 / 47.22 ± 3.72 37.23 ± 3.15 / 63.67 ± 2.98 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.4.0
neph1/bellman-7b-mistral-instruct-v0.2 (few-shot) 7242 32 32768 False 2,518 ± 463 / 779 ± 243 3.02 57.01 ± 1.93 / 44.65 ± 2.87 56.77 ± 0.98 / 41.67 ± 3.53 38.81 ± 2.67 / 56.39 ± 3.13 14.16 ± 2.24 / 54.43 ± 2.61 9.29 ± 2.65 / 50.59 ± 3.99 32.75 ± 1.68 / 59.21 ± 2.11 9.2.0 9.2.0 9.2.0 9.2.0 9.2.0 12.4.0
occiglot/occiglot-7b-eu5 (few-shot) 7242 32 32768 True 2,219 ± 427 / 717 ± 224 3.03 45.28 ± 3.06 / 41.73 ± 2.14 46.00 ± 4.26 / 42.96 ± 3.38 44.95 ± 3.19 / 61.88 ± 2.88 0.00 ± 0.00 / 33.41 ± 0.30 0.00 ± 0.00 / 33.86 ± 0.33 43.88 ± 4.07 / 66.65 ± 4.20 12.5.2 12.5.2 12.1.0 12.1.0 12.1.0 12.1.0
sarnikowski/convbert-medium-small-da-cased 24 29 512 True 13,821 ± 2,209 / 3,547 ± 1,184 3.03 79.50 ± 0.70 / 76.09 ± 0.70 73.03 ± 1.28 / 68.84 ± 1.39 32.40 ± 1.48 / 44.59 ± 1.66 41.65 ± 1.95 / 70.35 ± 0.97 25.53 ± 2.31 / 62.04 ± 1.19 5.41 ± 2.79 / 8.15 ± 4.18 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
KBLab/bert-base-swedish-cased-new 135 64 512 True 15,933 ± 2,541 / 4,289 ± 1,376 3.04 83.23 ± 1.19 / 80.34 ± 1.44 79.16 ± 1.50 / 75.55 ± 1.69 33.94 ± 3.74 / 47.96 ± 4.12 9.56 ± 5.01 / 52.24 ± 2.78 4.16 ± 2.97 / 50.07 ± 2.09 22.84 ± 2.52 / 33.72 ± 3.11 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
danish-foundation-models/encoder-medium-v1 111 32 512 True 16,130 ± 2,433 / 4,566 ± 1,473 3.04 68.66 ± 1.05 / 65.08 ± 1.07 61.77 ± 2.03 / 57.87 ± 2.08 36.56 ± 1.53 / 51.54 ± 2.45 31.23 ± 6.86 / 63.55 ± 5.48 5.40 ± 4.63 / 44.64 ± 6.37 22.56 ± 0.95 / 34.52 ± 1.15 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
mideind/IceBERT-xlmr-ic3 278 250 512 True 11,004 ± 2,244 / 2,324 ± 761 3.04 82.46 ± 1.46 / 83.74 ± 1.16 74.22 ± 0.57 / 77.50 ± 0.57 37.19 ± 1.76 / 47.27 ± 2.85 13.25 ± 6.73 / 48.39 ± 7.10 7.96 ± 5.56 / 45.68 ± 6.73 18.75 ± 3.84 / 30.21 ± 5.73 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2 118 250 512 True 29,201 ± 6,282 / 6,045 ± 2,027 3.05 78.31 ± 1.22 / 74.65 ± 1.36 72.13 ± 0.90 / 67.28 ± 1.09 47.53 ± 0.94 / 62.73 ± 1.07 26.92 ± 3.12 / 61.93 ± 2.04 14.63 ± 4.00 / 56.24 ± 2.51 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Twitter/twhin-bert-large 561 250 512 True 9,707 ± 1,664 / 2,549 ± 831 3.10 86.26 ± 0.71 / 83.48 ± 1.19 80.10 ± 2.44 / 76.17 ± 2.67 34.17 ± 2.42 / 43.74 ± 2.19 12.11 ± 10.47 / 50.33 ± 7.16 4.28 ± 4.18 / 45.75 ± 4.32 11.74 ± 10.45 / 16.38 ± 14.33 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
merge-crew/da-sv-dare-ties-density-0.3 (few-shot, val) 7242 32 32768 True 2,461 ± 476 / 773 ± 248 3.15 35.98 ± 3.79 / 27.51 ± 2.13 47.39 ± 2.31 / 36.42 ± 2.87 38.98 ± 5.51 / 58.23 ± 4.01 11.54 ± 5.04 / 49.91 ± 3.96 5.20 ± 3.47 / 46.19 ± 5.23 37.54 ± 3.00 / 56.56 ± 2.96 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
01-ai/Yi-6B (few-shot) 6061 64 4096 True 2,786 ± 532 / 784 ± 250 3.16 43.44 ± 1.89 / 33.41 ± 2.21 46.33 ± 3.12 / 34.05 ± 2.27 38.96 ± 2.34 / 56.27 ± 3.65 0.75 ± 1.07 / 33.42 ± 0.29 1.04 ± 1.93 / 33.14 ± 0.66 40.28 ± 3.58 / 62.78 ± 3.34 9.3.2 9.3.2 10.0.0 10.0.0 10.0.0 12.5.1
Addedk/mbert-swedish-distilled-cased 135 120 512 True 26,091 ± 5,835 / 5,209 ± 1,690 3.16 82.98 ± 1.32 / 79.80 ± 1.69 76.65 ± 1.24 / 72.29 ± 1.44 30.38 ± 2.29 / 42.84 ± 2.38 21.99 ± 6.74 / 54.54 ± 7.12 19.06 ± 4.26 / 56.45 ± 5.24 9.47 ± 5.36 / 15.24 ± 8.64 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
LumiOpen/Viking-33B (few-shot) 33119 131 4099 True 2,080 ± 700 / 331 ± 117 3.16 40.40 ± 2.29 / 30.41 ± 2.07 44.45 ± 3.61 / 34.06 ± 3.27 40.79 ± 1.70 / 57.84 ± 2.77 5.91 ± 2.51 / 47.81 ± 3.76 2.98 ± 2.86 / 45.49 ± 4.59 37.75 ± 3.23 / 59.72 ± 3.03 12.9.0 12.9.0 12.9.0 12.9.0 12.9.0 12.9.0
meta-llama/Llama-2-7b-hf (few-shot) 6738 32 4096 True 930 ± 310 / 128 ± 43 3.16 42.13 ± 3.82 / 37.17 ± 3.44 43.80 ± 2.85 / 37.48 ± 4.00 41.74 ± 2.25 / 57.91 ± 2.82 0.00 ± 0.00 / 33.41 ± 0.30 0.02 ± 0.04 / 33.88 ± 0.35 44.19 ± 4.13 / 66.18 ± 4.05 9.2.0 9.2.0 9.2.0 9.2.0 9.2.0 12.5.1
flax-community/swe-roberta-wiki-oscar 125 50 512 True 15,437 ± 2,628 / 3,834 ± 1,252 3.17 79.25 ± 1.22 / 76.73 ± 1.16 75.39 ± 1.03 / 71.63 ± 1.31 36.56 ± 3.06 / 51.25 ± 3.01 22.02 ± 5.34 / 57.45 ± 3.59 19.72 ± 3.67 / 56.70 ± 2.89 0.78 ± 0.74 / 1.49 ± 1.32 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
tollefj/nordavind-7b-instruct-warm (few-shot) 7248 33 2048 False 6,450 ± 961 / 2,082 ± 658 3.17 38.82 ± 5.36 / 30.48 ± 1.91 43.28 ± 3.13 / 33.87 ± 3.30 38.05 ± 1.85 / 47.06 ± 3.97 8.45 ± 2.47 / 46.75 ± 3.97 7.50 ± 1.65 / 48.14 ± 4.65 40.47 ± 2.77 / 64.21 ± 2.94 12.5.2 12.5.2 12.3.2 12.3.2 12.3.2 12.4.0
AI-Sweden-Models/gpt-sw3-20b (few-shot) 20918 64 2048 True 1,875 ± 673 / 261 ± 91 3.18 30.82 ± 5.81 / 25.27 ± 3.92 39.56 ± 4.73 / 32.12 ± 4.06 34.51 ± 1.27 / 42.18 ± 1.39 15.17 ± 1.41 / 49.46 ± 2.90 12.46 ± 3.29 / 48.89 ± 5.19 42.81 ± 3.10 / 66.15 ± 3.21 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1
sarnikowski/convbert-small-da-cased 13 29 512 True 14,273 ± 2,312 / 3,555 ± 1,187 3.18 76.07 ± 1.18 / 72.78 ± 1.16 70.94 ± 1.19 / 66.73 ± 1.21 32.49 ± 1.55 / 43.12 ± 0.71 35.43 ± 2.39 / 66.84 ± 1.17 21.11 ± 1.97 / 60.09 ± 0.93 1.84 ± 2.41 / 2.78 ± 3.65 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
meta-llama/Llama-2-7b-chat-hf (few-shot) 6738 32 4096 False 2,643 ± 455 / 800 ± 247 3.19 44.99 ± 2.49 / 38.59 ± 2.84 49.09 ± 1.90 / 39.09 ± 4.02 41.56 ± 3.37 / 57.09 ± 3.80 3.04 ± 2.84 / 36.81 ± 2.42 4.03 ± 2.49 / 40.55 ± 4.14 33.77 ± 2.11 / 61.99 ± 2.34 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.4.0
norallm/normistral-7b-warm-instruct (few-shot) unknown 33 2048 True 6,194 ± 949 / 1,967 ± 619 3.21 46.49 ± 2.30 / 31.87 ± 1.92 51.46 ± 2.36 / 35.87 ± 2.14 37.98 ± 1.72 / 43.91 ± 2.51 7.86 ± 2.74 / 47.20 ± 2.71 7.23 ± 1.97 / 46.62 ± 3.92 33.31 ± 1.03 / 59.27 ± 1.53 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
AI-Sweden-Models/gpt-sw3-20b-instruct (few-shot) 20918 64 2048 True 1,831 ± 587 / 268 ± 90 3.22 23.95 ± 1.93 / 21.43 ± 1.89 26.55 ± 1.50 / 25.06 ± 1.61 40.89 ± 3.60 / 59.86 ± 3.59 9.45 ± 1.58 / 44.62 ± 3.29 8.32 ± 1.89 / 42.30 ± 2.78 43.19 ± 1.82 / 67.93 ± 1.55 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1
sarnikowski/electra-small-discriminator-da-256-cased 13 29 512 True 20,340 ± 3,185 / 5,178 ± 1,700 3.22 73.15 ± 1.21 / 70.05 ± 1.16 66.34 ± 1.25 / 62.07 ± 1.31 29.97 ± 0.99 / 42.12 ± 0.47 40.79 ± 2.06 / 69.48 ± 1.44 25.08 ± 1.86 / 61.74 ± 0.81 1.93 ± 2.05 / 3.62 ± 3.99 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
DDSC/roberta-base-danish 125 50 512 True 15,004 ± 2,964 / 3,290 ± 1,092 3.23 76.14 ± 2.58 / 72.24 ± 2.54 72.88 ± 1.50 / 68.61 ± 1.62 32.29 ± 9.23 / 49.08 ± 8.36 0.45 ± 1.61 / 49.14 ± 1.42 -0.08 ± 1.79 / 45.89 ± 3.49 23.91 ± 2.24 / 36.47 ± 2.77 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Qwen/Qwen1.5-4B-Chat (few-shot) 3950 152 32768 False 4,347 ± 893 / 1,135 ± 365 3.23 44.83 ± 1.58 / 40.11 ± 2.00 46.29 ± 1.65 / 41.63 ± 3.45 32.70 ± 1.59 / 45.73 ± 2.82 3.57 ± 1.55 / 37.05 ± 2.34 1.61 ± 2.11 / 37.85 ± 3.99 42.55 ± 3.36 / 67.11 ± 2.50 12.5.2 12.5.2 10.0.1 12.1.0 12.1.0 12.5.2
AI-Sweden-Models/gpt-sw3-6.7b-v2 (few-shot) 7111 64 2048 True 2,351 ± 448 / 707 ± 216 3.24 29.62 ± 4.17 / 24.40 ± 2.42 32.30 ± 5.27 / 29.23 ± 3.22 34.67 ± 5.23 / 54.62 ± 5.71 8.37 ± 1.71 / 48.94 ± 2.72 7.76 ± 2.86 / 46.16 ± 4.77 44.62 ± 3.31 / 67.57 ± 3.03 9.2.0 9.2.0 9.2.0 9.2.0 9.2.0 12.5.1
Qwen/Qwen1.5-4B (few-shot) 3950 152 32768 True 3,248 ± 739 / 761 ± 252 3.25 32.12 ± 5.17 / 32.01 ± 2.80 36.86 ± 3.53 / 35.46 ± 3.07 36.97 ± 1.94 / 55.08 ± 2.30 5.27 ± 3.31 / 40.91 ± 5.24 1.40 ± 1.87 / 33.64 ± 0.74 40.00 ± 2.26 / 62.87 ± 1.60 12.5.2 12.5.2 10.0.1 12.1.0 12.1.0 12.1.0
sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking 135 120 512 True 33,753 ± 8,349 / 5,937 ± 1,946 3.25 77.81 ± 0.76 / 74.83 ± 0.79 72.22 ± 0.95 / 68.32 ± 1.13 44.59 ± 1.89 / 59.87 ± 1.84 8.98 ± 3.55 / 52.49 ± 2.08 5.72 ± 3.29 / 50.40 ± 3.21 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
sentence-transformers/quora-distilbert-multilingual 135 120 512 True 26,458 ± 5,992 / 5,274 ± 1,731 3.25 77.81 ± 0.76 / 74.83 ± 0.79 72.22 ± 0.95 / 68.32 ± 1.13 44.59 ± 1.89 / 59.87 ± 1.84 8.98 ± 3.55 / 52.49 ± 2.08 5.72 ± 3.29 / 50.40 ± 3.21 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Maltehb/danish-bert-botxo 111 32 512 True 16,091 ± 2,427 / 4,575 ± 1,485 3.27 72.62 ± 0.81 / 69.33 ± 0.93 58.73 ± 1.81 / 55.12 ± 1.67 40.65 ± 1.63 / 55.20 ± 2.63 29.47 ± 2.30 / 62.25 ± 1.69 12.95 ± 3.01 / 55.31 ± 1.87 0.91 ± 0.93 / 2.56 ± 2.15 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
danish-foundation-models/munin-7b-alpha (few-shot) 7242 32 32768 True 6,116 ± 1,049 / 1,784 ± 577 3.28 48.89 ± 3.42 / 35.46 ± 2.58 51.95 ± 1.59 / 36.45 ± 2.57 20.54 ± 6.01 / 36.30 ± 6.77 4.39 ± 3.94 / 35.23 ± 2.81 1.20 ± 1.64 / 34.54 ± 1.31 47.16 ± 4.15 / 70.08 ± 3.96 12.5.2 12.5.2 12.4.0 12.4.0 12.4.0 12.4.0
Addedk/kbbert-distilled-cased 82 50 512 True 29,698 ± 4,287 / 8,677 ± 2,776 3.32 81.82 ± 0.85 / 78.30 ± 1.00 75.89 ± 1.11 / 72.08 ± 1.15 33.42 ± 1.96 / 48.63 ± 3.34 14.99 ± 4.11 / 52.87 ± 4.48 13.63 ± 4.52 / 53.34 ± 4.61 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Maltehb/aelaectra-danish-electra-small-cased 14 32 128 True 4,593 ± 114 / 3,034 ± 973 3.32 71.85 ± 1.11 / 68.20 ± 1.23 67.14 ± 1.18 / 62.61 ± 1.22 29.00 ± 1.28 / 41.72 ± 0.52 33.57 ± 2.58 / 65.22 ± 1.59 21.79 ± 1.60 / 60.32 ± 0.99 0.03 ± 0.07 / 0.05 ± 0.10 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
dbmdz/bert-base-historic-multilingual-cased 111 32 512 True 20,047 ± 4,407 / 3,844 ± 1,259 3.35 68.63 ± 1.64 / 64.83 ± 1.55 67.70 ± 2.68 / 63.70 ± 2.54 25.68 ± 2.17 / 41.65 ± 2.77 6.73 ± 5.40 / 48.20 ± 3.68 3.35 ± 2.61 / 47.52 ± 3.20 22.57 ± 1.57 / 34.64 ± 1.94 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
norallm/normistral-7b-warm (few-shot) 7248 33 2048 True 3,175 ± 456 / 1,186 ± 354 3.35 42.29 ± 4.36 / 31.45 ± 1.88 46.29 ± 3.44 / 35.99 ± 4.20 27.05 ± 3.33 / 45.30 ± 3.46 1.63 ± 2.58 / 38.29 ± 4.05 2.57 ± 1.78 / 40.92 ± 4.00 39.18 ± 2.84 / 61.85 ± 3.07 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0
AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct (few-shot) 7111 64 2048 True 2,383 ± 451 / 718 ± 221 3.36 24.67 ± 1.69 / 24.58 ± 1.95 29.03 ± 2.12 / 29.83 ± 2.15 34.39 ± 5.34 / 50.45 ± 6.08 2.42 ± 1.83 / 35.49 ± 2.63 5.11 ± 2.68 / 38.37 ± 3.31 42.52 ± 2.05 / 68.98 ± 2.23 9.2.0 9.2.0 9.2.0 9.2.0 9.2.0 12.4.0
AI-Sweden-Models/gpt-sw3-1.3b-instruct (few-shot) 1445 64 2048 True 4,544 ± 1,000 / 1,106 ± 359 3.38 33.08 ± 2.22 / 34.51 ± 2.24 38.28 ± 2.63 / 40.50 ± 2.72 35.58 ± 2.13 / 44.49 ± 2.52 0.82 ± 1.46 / 34.78 ± 1.54 1.43 ± 1.70 / 34.19 ± 1.10 36.06 ± 1.76 / 58.71 ± 1.63 12.5.2 12.5.2 9.3.1 12.1.0 12.1.0 12.4.0
Maltehb/aelaectra-danish-electra-small-uncased 14 32 128 True 5,995 ± 135 / 3,839 ± 1,247 3.43 59.76 ± 3.01 / 55.95 ± 2.74 51.44 ± 2.28 / 48.14 ± 1.95 33.41 ± 1.40 / 45.12 ± 1.87 32.87 ± 1.49 / 65.82 ± 0.78 20.09 ± 1.88 / 59.27 ± 1.08 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
dbmdz/bert-medium-historic-multilingual-cased 42 32 512 True 24,291 ± 4,887 / 5,096 ± 1,655 3.44 69.65 ± 1.48 / 66.15 ± 1.66 66.78 ± 1.28 / 62.75 ± 1.40 26.33 ± 1.84 / 40.67 ± 0.71 6.62 ± 3.40 / 48.37 ± 4.02 5.16 ± 3.07 / 45.99 ± 4.76 15.75 ± 1.30 / 27.15 ± 1.91 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
birgermoell/roberta-swedish-scandi 125 50 512 True 15,385 ± 2,815 / 3,578 ± 1,177 3.48 72.74 ± 2.03 / 69.79 ± 2.34 69.74 ± 1.81 / 65.59 ± 2.06 29.68 ± 1.91 / 43.64 ± 2.18 15.83 ± 10.06 / 55.51 ± 5.25 8.70 ± 4.78 / 52.69 ± 2.75 1.04 ± 1.17 / 1.93 ± 2.12 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
timpal0l/Mistral-7B-v0.1-flashback-v2-instruct (few-shot) 7242 32 32768 False 5,172 ± 813 / 1,647 ± 518 3.50 50.34 ± 3.17 / 45.09 ± 2.65 52.06 ± 2.41 / 46.88 ± 2.39 32.19 ± 2.52 / 43.19 ± 4.63 -0.22 ± 0.43 / 33.41 ± 0.30 0.00 ± 0.00 / 33.86 ± 0.33 20.57 ± 0.64 / 40.19 ± 1.13 12.5.2 12.5.2 12.3.2 12.3.2 12.3.2 12.4.0
jjzha/dajobbert-base-uncased 110 32 512 True 16,243 ± 2,428 / 4,593 ± 1,484 3.51 65.95 ± 0.72 / 62.62 ± 0.66 55.29 ± 1.26 / 51.50 ± 1.21 33.31 ± 2.87 / 48.75 ± 3.38 20.34 ± 4.81 / 58.57 ± 2.56 8.07 ± 2.44 / 53.43 ± 1.32 0.00 ± 0.00 / 0.14 ± 0.16 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
google/gemma-2b-it (few-shot) 2506 256 8192 False 6,471 ± 1,142 / 1,961 ± 584 3.59 39.78 ± 2.67 / 31.15 ± 2.70 43.58 ± 2.49 / 36.60 ± 2.76 22.01 ± 3.00 / 40.48 ± 2.81 2.76 ± 1.35 / 44.34 ± 3.05 1.45 ± 1.35 / 39.55 ± 3.53 32.42 ± 1.72 / 56.65 ± 0.92 12.5.2 12.5.2 12.1.0 12.1.0 12.1.0 12.4.0
google/gemma-2b (few-shot) 2506 256 8192 True 6,087 ± 1,046 / 1,902 ± 563 3.63 15.53 ± 5.69 / 15.49 ± 5.08 19.78 ± 4.54 / 18.86 ± 4.22 32.89 ± 1.65 / 42.58 ± 3.16 1.18 ± 1.00 / 33.34 ± 0.26 0.00 ± 0.00 / 32.79 ± 0.34 33.33 ± 3.73 / 53.15 ± 4.42 12.5.2 12.5.2 12.1.0 12.1.0 12.1.0 12.1.0
HPLT/gpt-13b-nordic-prerelease (few-shot) 14030 131 4099 True 3,520 ± 736 / 823 ± 273 3.64 28.94 ± 5.63 / 27.01 ± 4.91 33.83 ± 5.52 / 30.49 ± 4.07 27.32 ± 3.13 / 38.30 ± 2.19 1.46 ± 1.07 / 49.06 ± 1.04 -0.59 ± 1.36 / 45.94 ± 2.35 25.62 ± 4.99 / 40.88 ± 7.54 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
LumiOpen/Viking-13B (few-shot) 14030 131 4099 True 3,480 ± 727 / 822 ± 274 3.64 28.87 ± 5.63 / 27.09 ± 4.90 34.01 ± 5.51 / 30.65 ± 4.07 27.31 ± 3.11 / 38.30 ± 2.19 1.53 ± 1.06 / 49.09 ± 1.02 -0.63 ± 1.33 / 45.92 ± 2.34 25.53 ± 4.94 / 40.76 ± 7.49 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
sentence-transformers/distiluse-base-multilingual-cased-v2 135 120 512 True 33,247 ± 8,123 / 6,017 ± 1,977 3.64 63.79 ± 2.11 / 67.14 ± 1.91 60.96 ± 1.11 / 64.65 ± 1.00 32.83 ± 1.48 / 43.32 ± 0.69 1.09 ± 1.93 / 48.72 ± 1.29 0.18 ± 1.93 / 47.30 ± 1.44 0.00 ± 0.00 / 0.00 ± 0.00 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
sentence-transformers/distiluse-base-multilingual-cased 135 120 512 True 19,206 ± 4,451 / 3,658 ± 1,187 3.64 63.79 ± 2.11 / 67.14 ± 1.91 60.96 ± 1.11 / 64.65 ± 1.00 32.83 ± 1.48 / 43.32 ± 0.69 1.09 ± 1.93 / 48.72 ± 1.29 0.18 ± 1.93 / 47.30 ± 1.44 0.00 ± 0.00 / 0.00 ± 0.00 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
AI-Sweden-Models/gpt-sw3-1.3b (few-shot) 1445 64 2048 True 4,608 ± 988 / 1,115 ± 354 3.69 13.49 ± 7.98 / 14.80 ± 7.68 14.74 ± 8.45 / 15.09 ± 7.85 27.28 ± 4.39 / 49.18 ± 4.23 3.09 ± 0.79 / 42.87 ± 3.49 1.86 ± 1.90 / 38.18 ± 1.44 34.91 ± 2.65 / 54.30 ± 2.96 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.5.1
AI-Sweden-Models/gpt-sw3-356m (few-shot) 471 64 2048 True 5,758 ± 1,348 / 1,215 ± 391 3.69 27.37 ± 4.07 / 27.94 ± 4.04 31.22 ± 3.87 / 31.39 ± 3.99 34.21 ± 1.63 / 47.17 ± 2.76 0.92 ± 1.55 / 40.71 ± 2.58 1.25 ± 2.30 / 43.49 ± 3.20 18.52 ± 2.78 / 32.10 ± 4.23 9.3.1 9.3.1 9.3.1 9.3.2 9.3.2 12.5.1
AI-Sweden-Models/gpt-sw3-6.7b (few-shot) 7111 64 2048 True 2,285 ± 443 / 671 ± 205 3.69 22.35 ± 7.84 / 23.89 ± 4.74 21.98 ± 7.52 / 27.22 ± 4.97 18.23 ± 9.28 / 38.93 ± 6.44 1.68 ± 1.35 / 39.93 ± 2.78 2.49 ± 1.88 / 40.26 ± 3.23 41.80 ± 3.14 / 64.25 ± 3.13 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0
AI-Sweden-Models/gpt-sw3-356m-instruct (few-shot) 471 64 2048 True 5,855 ± 1,373 / 1,223 ± 391 3.70 24.38 ± 2.13 / 25.79 ± 2.29 31.28 ± 1.60 / 33.48 ± 1.60 30.88 ± 2.75 / 46.07 ± 3.07 -0.30 ± 1.46 / 34.93 ± 1.25 0.45 ± 0.57 / 33.74 ± 1.25 23.99 ± 1.59 / 42.69 ± 1.94 12.5.2 12.5.2 9.3.2 12.1.0 12.1.0 12.4.0
LumiOpen/Viking-7B (few-shot) 7550 131 4096 True 5,723 ± 1,025 / 1,670 ± 559 3.71 20.17 ± 4.56 / 19.89 ± 3.61 24.06 ± 6.19 / 22.95 ± 5.53 33.84 ± 2.91 / 54.38 ± 2.61 1.05 ± 1.41 / 46.50 ± 2.79 -0.47 ± 2.02 / 45.21 ± 3.32 22.58 ± 3.26 / 41.07 ± 4.94 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
dbmdz/bert-mini-historic-multilingual-cased 12 32 512 True 47,122 ± 9,661 / 9,714 ± 3,152 3.72 61.55 ± 1.55 / 58.24 ± 1.47 59.90 ± 1.56 / 56.03 ± 1.41 24.59 ± 1.57 / 40.34 ± 0.99 3.45 ± 2.10 / 50.80 ± 1.16 2.72 ± 1.56 / 48.79 ± 1.92 3.99 ± 2.02 / 7.49 ± 3.66 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
EuropeanParliament/EUBERT 94 66 512 True 20,070 ± 3,977 / 4,400 ± 1,435 3.76 49.92 ± 0.61 / 49.17 ± 0.71 44.37 ± 1.15 / 43.43 ± 1.21 19.81 ± 2.15 / 40.90 ± 2.60 8.64 ± 3.57 / 53.17 ± 1.93 3.11 ± 1.16 / 50.44 ± 0.73 15.89 ± 1.29 / 26.33 ± 2.41 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
NbAiLab/nb-gpt-j-6B-alpaca (few-shot) 6055 50 1024 False 2,607 ± 592 / 680 ± 208 3.76 23.82 ± 4.25 / 22.08 ± 2.50 26.04 ± 6.38 / 24.47 ± 3.69 32.60 ± 1.84 / 47.47 ± 3.33 0.34 ± 1.43 / 44.47 ± 2.44 2.26 ± 2.27 / 45.41 ± 3.25 21.33 ± 0.98 / 42.76 ± 1.02 9.3.2 9.3.2 10.0.1 10.0.1 10.0.1 12.4.0
norallm/normistral-7b-scratch (few-shot) 7248 33 2048 True 3,192 ± 454 / 1,198 ± 357 3.77 14.58 ± 6.07 / 15.44 ± 5.52 21.06 ± 7.77 / 21.99 ± 7.14 32.02 ± 1.59 / 36.85 ± 2.01 1.49 ± 1.40 / 35.35 ± 1.51 0.98 ± 1.85 / 35.28 ± 2.43 22.87 ± 1.85 / 38.93 ± 2.59 10.0.0 10.0.0 10.0.0 10.0.0 10.0.0 10.0.0
sentence-transformers/distiluse-base-multilingual-cased-v1 135 120 512 True 34,042 ± 8,482 / 5,951 ± 1,950 3.77 60.76 ± 1.53 / 58.35 ± 1.41 59.62 ± 1.19 / 55.90 ± 1.12 25.98 ± 1.33 / 40.58 ± 0.60 2.65 ± 2.08 / 48.09 ± 1.42 3.47 ± 1.47 / 48.98 ± 1.75 0.20 ± 0.09 / 0.82 ± 0.39 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
mhenrichsen/danskgpt-tiny-chat (few-shot) 1100 32 2048 False 1,745 ± 978 / 686 ± 159 3.78 28.74 ± 4.18 / 28.29 ± 4.37 30.34 ± 6.08 / 30.02 ± 6.42 27.49 ± 3.13 / 48.00 ± 3.89 -2.17 ± 1.06 / 33.52 ± 0.37 0.26 ± 1.08 / 34.12 ± 0.45 19.10 ± 2.33 / 38.96 ± 2.78 9.1.2 9.1.2 9.1.2 9.1.2 9.1.2 12.4.0
KBLab/albert-base-swedish-cased-alpha 14 50 512 True 15,925 ± 2,281 / 4,780 ± 1,554 3.79 66.97 ± 1.30 / 62.99 ± 1.05 63.90 ± 2.54 / 59.95 ± 2.51 18.85 ± 4.01 / 36.76 ± 2.79 5.83 ± 2.36 / 51.59 ± 1.57 4.02 ± 2.29 / 51.66 ± 1.21 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
alexanderfalk/danbert-small-cased 83 52 512 True 30,013 ± 4,309 / 8,840 ± 2,859 3.94 42.18 ± 1.03 / 39.53 ± 1.04 37.39 ± 1.81 / 34.88 ± 1.76 24.39 ± 1.60 / 40.44 ± 1.78 7.29 ± 2.49 / 49.37 ± 3.58 2.57 ± 1.78 / 46.00 ± 3.32 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
HPLT/gpt-7b-nordic-prerelease (few-shot) 7550 131 4096 True 5,404 ± 931 / 1,638 ± 542 3.95 20.25 ± 6.26 / 20.45 ± 5.54 28.99 ± 5.03 / 27.50 ± 4.36 17.44 ± 3.49 / 35.51 ± 4.10 3.20 ± 1.84 / 34.58 ± 0.97 2.61 ± 1.80 / 34.49 ± 1.46 21.50 ± 2.60 / 40.73 ± 3.86 12.5.2 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2
jannesg/bertsson 124 50 512 True 15,314 ± 2,786 / 3,666 ± 1,201 3.97 49.30 ± 0.97 / 46.11 ± 1.04 46.11 ± 2.15 / 43.01 ± 2.05 23.21 ± 2.32 / 44.26 ± 2.95 2.26 ± 1.47 / 45.07 ± 4.04 -0.66 ± 1.81 / 44.50 ± 3.45 0.68 ± 0.32 / 1.50 ± 0.80 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
3ebdola/Dialectal-Arabic-XLM-R-Base 278 250 512 True 12,783 ± 2,537 / 2,712 ± 885 4.00 55.55 ± 3.71 / 52.52 ± 3.51 53.53 ± 3.03 / 50.23 ± 3.00 12.69 ± 4.51 / 32.23 ± 3.53 2.79 ± 1.16 / 47.71 ± 2.00 1.66 ± 2.18 / 46.60 ± 2.87 0.00 ± 0.00 / 0.60 ± 0.72 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Qwen/Qwen1.5-1.8B-Chat (few-shot) 1837 152 32768 False 8,304 ± 1,846 / 1,933 ± 617 4.03 26.99 ± 3.92 / 24.61 ± 2.74 25.74 ± 3.76 / 24.51 ± 2.96 19.85 ± 1.97 / 35.75 ± 1.74 1.96 ± 1.33 / 44.22 ± 2.93 -0.01 ± 1.39 / 39.57 ± 2.97 16.33 ± 2.17 / 31.16 ± 3.40 12.5.2 12.5.2 11.0.0 12.1.0 12.1.0 12.5.0
Qwen/Qwen1.5-1.8B (few-shot) 1837 152 32768 True 5,666 ± 1,328 / 1,256 ± 408 4.06 12.10 ± 5.58 / 12.85 ± 4.80 13.42 ± 6.02 / 13.82 ± 5.16 22.82 ± 3.11 / 43.88 ± 3.10 2.70 ± 2.16 / 47.68 ± 3.18 2.21 ± 1.46 / 42.80 ± 4.36 16.31 ± 2.22 / 30.78 ± 3.64 12.5.2 12.5.2 10.0.1 12.1.0 12.1.0 12.1.0
dbmdz/bert-tiny-historic-multilingual-cased 5 32 512 True 78,027 ± 15,466 / 17,064 ± 5,335 4.06 46.11 ± 3.68 / 43.54 ± 3.44 35.18 ± 4.90 / 33.22 ± 4.64 19.19 ± 2.87 / 37.36 ± 1.97 2.76 ± 1.42 / 50.99 ± 0.83 0.42 ± 1.04 / 49.39 ± 0.80 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
allenai/OLMo-7B (few-shot) 6888 50 2051 True 5,403 ± 1,133 / 1,294 ± 423 4.08 34.42 ± 3.22 / 27.79 ± 2.40 35.17 ± 4.05 / 30.70 ± 3.37 21.46 ± 5.63 / 38.36 ± 6.18 0.34 ± 1.25 / 33.60 ± 0.50 0.26 ± 0.58 / 34.69 ± 3.11 0.12 ± 0.04 / 9.85 ± 0.17 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
RuterNorway/Llama-2-7b-chat-norwegian (few-shot) unknown 32 4096 False 10,890 ± 2,686 / 2,186 ± 750 4.10 21.04 ± 2.63 / 20.44 ± 2.47 18.71 ± 2.67 / 19.91 ± 2.89 12.22 ± 1.17 / 23.50 ± 3.03 -1.18 ± 1.40 / 35.70 ± 2.67 0.36 ± 1.28 / 37.66 ± 4.07 26.86 ± 1.65 / 50.11 ± 1.80 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.5.2
mhenrichsen/danskgpt-tiny (few-shot) 1100 32 2048 True 8,597 ± 1,983 / 1,926 ± 600 4.13 27.37 ± 6.89 / 27.19 ± 7.19 27.59 ± 6.34 / 28.03 ± 6.94 18.09 ± 6.14 / 31.83 ± 6.77 -0.19 ± 1.93 / 41.38 ± 3.18 -0.80 ± 0.89 / 40.66 ± 3.78 5.84 ± 1.36 / 16.14 ± 2.48 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 12.5.1
Qwen/Qwen1.5-0.5B-Chat (few-shot) 620 152 32768 False 11,740 ± 3,000 / 2,209 ± 721 4.21 29.52 ± 1.48 / 29.79 ± 1.62 31.27 ± 1.30 / 31.91 ± 1.31 11.49 ± 1.38 / 27.12 ± 1.98 0.29 ± 1.58 / 40.21 ± 4.22 -0.12 ± 1.48 / 39.92 ± 3.90 7.80 ± 1.19 / 17.09 ± 2.72 12.5.2 12.5.2 11.0.0 12.1.0 12.1.0 12.4.0
allenai/OLMo-7B-Twin-2T (few-shot) 6888 50 2051 True 5,484 ± 1,125 / 1,317 ± 425 4.24 9.06 ± 6.86 / 8.39 ± 6.41 17.16 ± 6.21 / 16.00 ± 5.94 25.52 ± 3.72 / 41.44 ± 4.44 0.68 ± 1.54 / 45.09 ± 2.63 0.17 ± 2.27 / 42.02 ± 4.30 0.46 ± 0.07 / 6.96 ± 0.13 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
RabotaRu/HRBert-mini 80 200 512 True 54,951 ± 11,500 / 11,401 ± 3,819 4.25 31.87 ± 2.26 / 30.31 ± 2.14 32.47 ± 1.48 / 30.59 ± 1.43 15.07 ± 1.97 / 35.80 ± 1.15 1.26 ± 1.26 / 48.42 ± 1.75 0.49 ± 1.58 / 45.93 ± 3.88 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
Qwen/Qwen1.5-0.5B (few-shot) 620 152 32768 True 11,371 ± 2,924 / 2,122 ± 692 4.26 34.46 ± 2.01 / 33.09 ± 2.32 33.41 ± 2.21 / 33.91 ± 2.33 6.31 ± 3.46 / 20.67 ± 2.69 -1.59 ± 1.08 / 36.27 ± 3.71 0.61 ± 1.41 / 38.84 ± 5.10 5.95 ± 1.53 / 16.20 ± 1.93 12.5.2 12.5.2 10.0.1 12.1.0 12.1.0 12.1.0
AI-Sweden-Models/gpt-sw3-126m-instruct (few-shot) 186 64 2048 True 7,717 ± 1,553 / 2,013 ± 625 4.30 27.66 ± 2.00 / 28.61 ± 2.15 30.88 ± 2.13 / 31.97 ± 2.10 5.13 ± 3.33 / 20.41 ± 3.12 0.00 ± 0.00 / 33.25 ± 0.30 0.00 ± 0.00 / 32.79 ± 0.34 7.55 ± 1.17 / 15.63 ± 2.64 11.0.0 11.0.0 9.3.2 11.0.0 11.0.0 12.4.0
fresh-xlm-roberta-base 278 250 512 True 2,214 ± 94 / 1,494 ± 229 4.31 25.49 ± 3.39 / 23.54 ± 3.23 25.94 ± 1.70 / 24.10 ± 1.64 12.60 ± 2.97 / 32.27 ± 3.43 0.50 ± 1.27 / 36.93 ± 4.00 1.83 ± 1.64 / 37.67 ± 4.40 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
allenai/OLMo-1B (few-shot) 1177 50 2051 True 8,536 ± 1,926 / 1,940 ± 619 4.32 30.79 ± 1.95 / 32.18 ± 1.98 31.12 ± 2.36 / 33.10 ± 2.68 9.95 ± 3.92 / 29.01 ± 2.80 -0.95 ± 1.87 / 39.37 ± 3.33 -0.04 ± 1.73 / 42.36 ± 4.61 0.00 ± 0.00 / 3.06 ± 0.05 12.5.2 12.5.2 12.1.0 12.1.0 12.1.0 12.1.0
fresh-electra-small 14 31 512 True 7,840 ± 1,538 / 3,024 ± 438 4.35 18.38 ± 2.01 / 17.08 ± 1.97 12.76 ± 1.29 / 11.65 ± 1.21 15.29 ± 5.37 / 34.15 ± 4.23 0.17 ± 0.84 / 36.29 ± 2.91 0.37 ± 0.69 / 35.08 ± 2.81 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0
NbAiLab/nb-gpt-j-6B-v2 (few-shot) 6051 50 1024 False 2,556 ± 580 / 681 ± 214 4.36 5.29 ± 4.68 / 4.93 ± 4.38 6.77 ± 6.18 / 6.78 ± 5.66 20.84 ± 6.06 / 35.78 ± 5.94 0.45 ± 1.09 / 34.65 ± 1.93 0.48 ± 0.66 / 32.86 ± 0.34 2.43 ± 0.61 / 22.78 ± 2.29 9.3.1 9.3.1 10.0.1 11.0.0 11.0.0 12.5.1
NbAiLab/nb-gpt-j-6B@sharded (few-shot) unknown 50 1024 True 2,630 ± 605 / 684 ± 217 4.48 0.22 ± 0.21 / 1.66 ± 1.38 0.24 ± 0.40 / 1.43 ± 1.97 20.64 ± 5.63 / 36.75 ± 3.29 -0.99 ± 0.88 / 33.37 ± 0.27 -0.15 ± 0.72 / 32.83 ± 0.34 0.53 ± 0.31 / 22.14 ± 2.25 9.3.1 9.3.1 10.0.1 10.0.1 10.0.1 12.5.1
AI-Sweden-Models/gpt-sw3-126m (few-shot) 186 64 2048 True 8,958 ± 1,815 / 2,240 ± 696 4.57 13.55 ± 6.73 / 15.90 ± 5.66 9.38 ± 4.88 / 11.18 ± 4.52 7.78 ± 3.76 / 21.70 ± 5.02 -1.46 ± 1.07 / 43.30 ± 2.30 -2.97 ± 1.29 / 44.41 ± 3.18 2.32 ± 0.68 / 6.65 ± 1.90 9.2.0 9.2.0 9.2.0 9.2.0 9.2.0 12.5.1
NorGLM/NorGPT-369M (few-shot) unknown 64 1024 True 19,896 ± 5,099 / 3,848 ± 1,251 4.72 3.14 ± 2.12 / 2.91 ± 2.02 3.00 ± 1.58 / 2.65 ± 1.41 3.41 ± 2.11 / 14.87 ± 2.38 0.22 ± 0.42 / 33.42 ± 0.29 0.27 ± 0.79 / 38.20 ± 3.48 0.00 ± 0.00 / 2.27 ± 0.89 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
ai-forever/mGPT (few-shot) unknown 100 1024 True 13,551 ± 4,259 / 2,563 ± 838 4.72 0.08 ± 0.16 / 0.07 ± 0.14 0.00 ± 0.00 / 0.00 ± 0.00 4.76 ± 1.84 / 16.95 ± 5.07 0.67 ± 1.94 / 40.42 ± 4.43 -0.88 ± 1.89 / 40.70 ± 4.30 0.00 ± 0.00 / 0.74 ± 0.05 9.3.1 9.3.1 10.0.1 11.0.0 11.0.0 12.5.1
RJuro/kanelsnegl-v0.2 (few-shot) 7242 32 512 True 1,373 ± 120 / 709 ± 172 4.79 0.00 ± 0.00 / 0.00 ± 0.00 0.00 ± 0.00 / 0.00 ± 0.00 1.27 ± 1.21 / 9.77 ± 0.51 0.00 ± 0.00 / 33.25 ± 0.30 0.00 ± 0.00 / 32.79 ± 0.34 0.00 ± 0.00 / 32.25 ± 0.29 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
RJuro/kanelsnegl-v0.1 (few-shot) 7242 32 512 True 9,757 ± 2,047 / 2,200 ± 705 4.83 0.00 ± 0.00 / 0.00 ± 0.00 0.00 ± 0.00 / 0.00 ± 0.00 0.95 ± 0.80 / 9.68 ± 0.28 0.00 ± 0.00 / 33.25 ± 0.30 0.00 ± 0.00 / 32.79 ± 0.34 0.00 ± 0.00 / 33.45 ± 0.27 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 12.5.1
Sigurdur/icebreaker (few-shot) 110 32 1024 False 48,619 ± 7,681 / 13,831 ± 4,404 4.83 0.00 ± 0.00 / 0.00 ± 0.00 0.00 ± 0.00 / 0.00 ± 0.00 0.00 ± 0.00 / 9.59 ± 0.29 0.00 ± 0.00 / 33.25 ± 0.30 0.00 ± 0.00 / 32.79 ± 0.34 0.00 ± 0.00 / 0.47 ± 0.03 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
Sigurdur/qa-icebreaker (few-shot) 110 32 1024 False 44,889 ± 6,944 / 13,506 ± 4,256 4.83 0.00 ± 0.00 / 0.00 ± 0.00 0.00 ± 0.00 / 0.00 ± 0.00 1.00 ± 1.27 / 13.84 ± 2.10 0.00 ± 0.00 / 33.25 ± 0.30 0.00 ± 0.00 / 32.79 ± 0.34 0.00 ± 0.00 / 0.53 ± 0.06 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
peter-sk/gpt-neox-da (few-shot) 1515 50 1024 True 6,025 ± 1,442 / 1,342 ± 431 4.86 0.29 ± 0.29 / 0.29 ± 0.27 0.25 ± 0.17 / 0.27 ± 0.21 -1.43 ± 1.45 / 20.90 ± 4.96 -0.42 ± 1.10 / 35.77 ± 3.09 1.11 ± 2.21 / 39.28 ± 4.12 0.00 ± 0.00 / 3.15 ± 0.55 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
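Because the rows above are flattened to plain text but follow a fixed layout, they can be split into structured fields mechanically. Below is a minimal sketch of such a parser; it is not part of the leaderboard's own tooling, and the names (ScorePair, parse_row, and the field names) are illustrative assumptions, not an official API. It relies only on the column order described in the legend above.

```python
# Minimal sketch: parse one flattened leaderboard row into structured fields.
# Assumes the row layout described in the legend; not an official ScandEval API.
from dataclasses import dataclass

TASKS = ["NorNE-nb", "NorNE-nn", "NoReC", "ScaLA-nb", "ScaLA-nn", "NorQuAD"]

@dataclass
class ScorePair:
    primary: float        # first metric of the pair (e.g. micro-F1 w/o MISC, MCC, EM)
    primary_std: float
    secondary: float      # second metric of the pair (e.g. micro-F1, macro-F1, F1)
    secondary_std: float

def _num(token: str) -> float:
    # Numbers in the table use thousands separators, e.g. "5,048".
    return float(token.replace(",", ""))

def parse_pair(tokens: list[str]) -> ScorePair:
    # tokens look like: ["93.12", "±", "0.83", "/", "90.13", "±", "1.02"]
    return ScorePair(_num(tokens[0]), _num(tokens[2]), _num(tokens[4]), _num(tokens[6]))

def parse_row(line: str) -> dict:
    tokens = line.split()
    versions = tokens[-6:]                # six benchmark-version fields
    score_tokens = tokens[-48:-6]         # six score pairs, 7 tokens each
    scores = {task: parse_pair(score_tokens[i * 7:(i + 1) * 7])
              for i, task in enumerate(TASKS)}
    rank = _num(tokens[-49])
    speed = parse_pair(tokens[-56:-49])   # same "a ± b / c ± d" layout as scores
    commercial = tokens[-57] == "True"
    context = int(_num(tokens[-58]))
    vocab_k = tokens[-59]                 # thousands; may be "unknown"
    params_m = tokens[-60]                # millions; may be "unknown"
    model_id = " ".join(tokens[:-60])     # keeps "(few-shot, val)" annotations intact
    return {"model_id": model_id, "parameters_m": params_m, "vocab_k": vocab_k,
            "context": context, "commercial": commercial, "speed": speed,
            "rank": rank, "scores": scores, "versions": dict(zip(TASKS, versions))}

# Example: the top-ranked row from the table above.
row = parse_row("ltg/norbert3-large 354 50 508 True 5,048 ± 824 / 1,354 ± 429 "
                "1.12 93.12 ± 0.83 / 90.13 ± 1.02 89.39 ± 0.52 / 86.03 ± 0.65 "
                "64.62 ± 1.36 / 75.40 ± 0.97 77.97 ± 3.04 / 88.19 ± 1.89 "
                "76.30 ± 1.56 / 87.68 ± 0.86 66.03 ± 1.19 / 79.64 ± 1.09 "
                "0.0.0 0.0.0 0.0.0 0.0.0 0.0.0 0.0.0")
print(row["model_id"], row["rank"], row["scores"]["NorQuAD"].primary)
```

With every line parsed this way, the rows can be filtered or re-sorted, for example keeping only commercially usable models, or ordering by a single task instead of the aggregate rank.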