Swedish NLU 🇸🇪

Last updated: 02/05/2024 11:26:33 CET
Each row lists the model ID, parameter count (in millions), vocabulary size (in thousands of tokens), maximum context length (in tokens), whether the model is commercially usable, benchmarked speed (two measurements; higher is faster), and the overall rank (lower is better; the table is sorted by it), followed by one score column per dataset and the benchmark version each dataset was evaluated with. Scores are given as mean ± 95% confidence interval, and each dataset column holds two metrics separated by a slash: micro-F1 with and without the MISC category for SUC3 (named entity recognition), Matthews correlation coefficient / macro-F1 for SweReC (sentiment classification) and ScaLA-sv (linguistic acceptability), and exact match / F1 for ScandiQA-sv (question answering).

| Model ID | Params (M) | Vocab (k) | Context (tokens) | Commercial | Speed | Rank | SUC3 | SweReC | ScaLA-sv | ScandiQA-sv | SUC3 ver. | SweReC ver. | ScaLA-sv ver. | ScandiQA-sv ver. |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| gpt-4-0613 (few-shot, val) | unknown | 100 | 8192 | True | 597 ± 197 / 93 ± 33 | 1.20 | 76.86 ± 1.89 / 54.97 ± 4.44 | 79.19 ± 1.87 / 80.56 ± 1.82 | 80.93 ± 1.67 / 89.90 ± 0.93 | 53.81 ± 1.28 / 65.15 ± 1.11 | 0.0.0 | 0.0.0 | 0.0.0 | 12.9.0 |
| meta-llama/Meta-Llama-3-70B (few-shot, val) | 70554 | 128 | 8192 | True | 312 ± 55 / 177 ± 51 | 1.24 | 74.61 ± 2.99 / 56.50 ± 6.30 | 78.61 ± 1.40 / 78.64 ± 1.53 | 63.20 ± 3.34 / 80.61 ± 2.52 | 61.98 ± 1.65 / 66.85 ± 1.42 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| AI-Sweden-Models/roberta-large-1160k | 355 | 50 | 512 | True | 5,741 ± 987 / 1,554 ± 494 | 1.26 | 82.65 ± 1.04 / 80.43 ± 0.93 | 77.25 ± 1.20 / 73.96 ± 2.59 | 77.90 ± 1.45 / 88.63 ± 0.76 | 49.64 ± 1.11 / 55.64 ± 1.07 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| AI-Sweden-Models/roberta-large-1350k | 355 | 50 | 512 | True | 5,744 ± 969 / 1,539 ± 492 | 1.27 | 82.97 ± 0.98 / 81.14 ± 1.14 | 77.37 ± 1.25 / 73.57 ± 3.43 | 73.81 ± 4.60 / 86.33 ± 2.48 | 49.50 ± 1.32 / 55.34 ± 1.33 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| KBLab/megatron-bert-large-swedish-cased-165k | 370 | 64 | 512 | True | 7,138 ± 1,111 / 2,067 ± 660 | 1.34 | 81.05 ± 1.34 / 76.08 ± 1.45 | 78.00 ± 0.89 / 75.01 ± 2.18 | 76.79 ± 1.70 / 87.59 ± 1.06 | 45.71 ± 1.09 / 51.70 ± 0.89 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| KBLab/megatron-bert-large-swedish-cased-110k | 370 | 64 | 512 | True | 7,075 ± 1,093 / 2,057 ± 661 | 1.36 | 80.39 ± 1.34 / 74.83 ± 1.44 | 78.45 ± 0.79 / 77.12 ± 0.86 | 76.28 ± 1.86 / 87.37 ± 1.15 | 44.56 ± 0.52 / 50.85 ± 0.56 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| AI-Sweden-Models/bert-large-nordic-pile-1M-steps | 369 | 64 | 512 | True | 6,571 ± 1,331 / 1,493 ± 479 | 1.41 | 80.65 ± 2.12 / 78.69 ± 1.68 | 77.43 ± 1.07 / 75.95 ± 2.00 | 76.56 ± 1.06 / 87.86 ± 0.70 | 41.54 ± 1.50 / 46.79 ± 1.60 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| AI-Nordics/bert-large-swedish-cased | 335 | 31 | 512 | True | 7,199 ± 1,139 / 2,051 ± 651 | 1.43 | 78.61 ± 1.45 / 72.84 ± 1.51 | 77.47 ± 0.80 / 75.77 ± 2.13 | 72.87 ± 2.36 / 85.57 ± 1.43 | 43.11 ± 0.99 / 49.29 ± 1.05 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| KB/bert-base-swedish-cased | 125 | 50 | 512 | True | 16,181 ± 2,451 / 4,620 ± 1,507 | 1.43 | 81.95 ± 1.55 / 76.66 ± 1.60 | 75.58 ± 1.17 / 73.35 ± 2.22 | 78.86 ± 0.83 / 89.07 ± 0.50 | 38.56 ± 1.53 / 43.79 ± 1.43 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| ltg/norbert3-large | 354 | 50 | 508 | True | 5,048 ± 824 / 1,354 ± 429 | 1.43 | 79.01 ± 1.13 / 73.76 ± 1.48 | 75.32 ± 1.55 / 69.39 ± 3.64 | 69.11 ± 1.50 / 84.32 ± 0.70 | 48.88 ± 0.87 / 54.15 ± 0.83 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| google/rembert | 576 | 250 | 256 | True | 11,736 ± 2,822 / 2,102 ± 677 | 1.44 | 78.23 ± 1.53 / 72.58 ± 1.51 | 75.99 ± 1.15 / 71.01 ± 4.17 | 72.17 ± 0.94 / 85.94 ± 0.54 | 46.00 ± 2.13 / 51.05 ± 2.40 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| intfloat/multilingual-e5-large | 560 | 250 | 512 | True | 6,732 ± 1,273 / 1,633 ± 523 | 1.44 | 80.36 ± 1.12 / 78.57 ± 1.27 | 79.65 ± 1.05 / 78.90 ± 1.32 | 63.15 ± 1.65 / 81.06 ± 0.95 | 46.99 ± 1.18 / 53.49 ± 0.89 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| KBLab/bert-base-swedish-cased | 125 | 50 | 512 | True | 16,164 ± 2,392 / 4,574 ± 1,478 | 1.45 | 81.23 ± 1.58 / 75.95 ± 1.72 | 75.73 ± 0.72 / 73.61 ± 1.47 | 78.60 ± 0.98 / 88.95 ± 0.57 | 38.56 ± 1.53 / 43.79 ± 1.43 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| microsoft/mdeberta-v3-base | 279 | 251 | 512 | True | 20,637 ± 3,925 / 4,497 ± 1,502 | 1.45 | 78.84 ± 2.19 / 72.86 ± 2.04 | 75.24 ± 0.99 / 72.06 ± 2.67 | 72.30 ± 1.04 / 85.77 ± 0.65 | 44.74 ± 1.04 / 50.62 ± 0.85 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| gpt-3.5-turbo-0613 (few-shot, val) | unknown | 100 | 4095 | True | 921 ± 293 / 113 ± 37 | 1.49 | 73.04 ± 2.74 / 61.64 ± 3.63 | 72.77 ± 2.64 / 72.56 ± 2.45 | 58.06 ± 3.84 / 76.06 ± 2.51 | 58.02 ± 2.11 / 66.84 ± 1.38 | 0.0.0 | 0.0.0 | 0.0.0 | 12.9.0 |
| gpt-3.5-turbo-0613 (few-shot) | unknown | 100 | 4096 | True | 837 ± 294 / 126 ± 43 | 1.50 | 71.43 ± 1.58 / 58.93 ± 3.29 | 77.50 ± 1.54 / 76.51 ± 1.63 | 55.99 ± 2.64 / 75.08 ± 2.22 | 55.46 ± 0.90 / 64.95 ± 0.64 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| sentence-transformers/use-cmlm-multilingual | 471 | 501 | 512 | True | 30,231 ± 8,171 / 4,863 ± 1,598 | 1.51 | 80.05 ± 1.13 / 74.21 ± 1.26 | 75.09 ± 1.30 / 72.93 ± 2.37 | 61.83 ± 1.28 / 79.96 ± 0.82 | 45.69 ± 1.11 / 51.07 ± 1.04 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| KBLab/megatron-bert-base-swedish-cased-600k | 135 | 64 | 512 | True | 15,726 ± 2,508 / 4,234 ± 1,365 | 1.52 | 78.91 ± 1.24 / 72.93 ± 1.08 | 76.09 ± 0.81 / 72.74 ± 2.11 | 70.08 ± 2.11 / 83.40 ± 1.46 | 41.14 ± 1.18 / 47.18 ± 0.98 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| danish-foundation-models/encoder-large-v1 | 355 | 50 | 512 | True | 6,671 ± 1,380 / 1,497 ± 482 | 1.55 | 74.18 ± 2.01 / 68.89 ± 2.46 | 75.11 ± 1.19 / 74.74 ± 0.94 | 64.11 ± 3.27 / 81.63 ± 1.66 | 46.79 ± 1.61 / 52.40 ± 1.77 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| NbAiLab/nb-roberta-base-scandi | 278 | 250 | 512 | True | 15,079 ± 2,948 / 3,359 ± 1,091 | 1.56 | 80.02 ± 1.62 / 74.04 ± 1.75 | 76.21 ± 1.60 / 73.41 ± 2.08 | 71.92 ± 1.07 / 85.01 ± 0.74 | 33.80 ± 0.78 / 38.58 ± 0.70 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| KBLab/megatron-bert-base-swedish-cased-125k | 135 | 64 | 512 | True | 15,763 ± 2,523 / 4,238 ± 1,370 | 1.58 | 79.29 ± 0.94 / 73.18 ± 0.95 | 75.85 ± 0.54 / 70.58 ± 1.96 | 70.43 ± 1.03 / 83.85 ± 0.65 | 37.56 ± 0.64 / 44.01 ± 0.43 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| NbAiLab/nb-roberta-base-scandi-1e4 | 278 | 250 | 512 | True | 15,074 ± 2,990 / 3,347 ± 1,080 | 1.58 | 79.90 ± 1.41 / 73.80 ± 1.53 | 76.20 ± 1.16 / 74.01 ± 2.34 | 73.62 ± 1.17 / 86.19 ± 0.80 | 32.38 ± 1.23 / 37.12 ± 1.20 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| vesteinn/ScandiBERT-no-faroese | 124 | 50 | 512 | True | 15,436 ± 2,820 / 3,704 ± 1,187 | 1.58 | 79.08 ± 2.32 / 73.06 ± 2.01 | 72.53 ± 0.98 / 67.74 ± 3.02 | 73.01 ± 1.43 / 85.98 ± 0.81 | 36.92 ± 2.25 / 41.99 ± 2.38 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| KennethEnevoldsen/dfm-sentence-encoder-large-1 | 355 | 50 | 512 | True | 6,245 ± 1,260 / 1,416 ± 453 | 1.60 | 71.65 ± 1.55 / 69.08 ± 1.45 | 74.92 ± 0.98 / 72.01 ± 2.31 | 63.43 ± 2.30 / 81.00 ± 1.37 | 46.20 ± 1.03 / 51.94 ± 0.92 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| FacebookAI/xlm-roberta-large | 560 | 250 | 512 | True | 17,897 ± 3,921 / 3,463 ± 1,141 | 1.61 | 80.33 ± 2.50 / 75.03 ± 3.79 | 76.63 ± 0.98 / 74.25 ± 3.20 | 49.72 ± 19.88 / 69.94 ± 13.64 | 46.64 ± 1.42 / 52.21 ± 1.45 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| KBLab/bert-base-swedish-cased-new | 135 | 64 | 512 | True | 15,933 ± 2,541 / 4,289 ± 1,376 | 1.62 | 79.99 ± 1.32 / 74.07 ± 1.54 | 76.04 ± 0.97 / 72.61 ± 1.82 | 73.52 ± 2.31 / 85.57 ± 1.53 | 30.60 ± 1.30 / 35.83 ± 1.02 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| meta-llama/Llama-2-70b-hf (few-shot, val) | 68977 | 32 | 4096 | True | 1,892 ± 650 / 318 ± 105 | 1.63 | 64.76 ± 3.91 / 61.08 ± 5.41 | 75.46 ± 1.99 / 74.35 ± 3.70 | 43.27 ± 5.03 / 65.62 ± 4.94 | 63.04 ± 1.52 / 66.95 ± 1.31 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| setu4993/LaBSE | 471 | 501 | 512 | True | 25,418 ± 6,435 / 4,536 ± 1,452 | 1.63 | 77.78 ± 1.69 / 72.08 ± 1.81 | 73.58 ± 1.37 / 70.43 ± 2.49 | 60.36 ± 2.98 / 79.72 ± 1.52 | 41.71 ± 1.08 / 47.07 ± 0.98 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| KennethEnevoldsen/dfm-sentence-encoder-large-2 | 355 | 50 | 512 | True | 6,569 ± 1,320 / 1,492 ± 476 | 1.65 | 71.86 ± 1.73 / 69.05 ± 1.65 | 74.67 ± 1.43 / 71.15 ± 3.57 | 62.77 ± 3.14 / 80.75 ± 1.67 | 44.77 ± 3.06 / 50.58 ± 2.94 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| 152334H/miqu-1-70b-sf (few-shot, val) | 68977 | 32 | 32764 | True | 2,126 ± 676 / 319 ± 104 | 1.66 | 62.96 ± 3.44 / 52.14 ± 4.04 | 75.25 ± 2.41 / 78.80 ± 1.96 | 53.28 ± 3.33 / 75.37 ± 1.80 | 56.42 ± 1.65 / 65.04 ± 1.17 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| KennethEnevoldsen/dfm-sentence-encoder-medium-3 | 178 | 120 | 512 | True | 14,050 ± 3,278 / 2,749 ± 894 | 1.66 | 81.35 ± 1.26 / 79.18 ± 1.23 | 71.16 ± 1.21 / 69.78 ± 3.24 | 63.89 ± 1.18 / 81.45 ± 0.75 | 37.18 ± 2.04 / 42.09 ± 2.23 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| vesteinn/FoBERT | 124 | 50 | 512 | True | 15,623 ± 2,828 / 3,737 ± 1,191 | 1.67 | 78.58 ± 1.52 / 72.45 ± 1.57 | 73.41 ± 0.98 / 68.72 ± 3.80 | 71.14 ± 1.62 / 84.55 ± 0.97 | 31.62 ± 1.35 / 36.20 ± 1.16 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| NbAiLab/nb-bert-base | 178 | 120 | 512 | True | 14,050 ± 3,222 / 2,727 ± 886 | 1.70 | 80.38 ± 0.99 / 75.35 ± 0.88 | 71.21 ± 1.11 / 67.49 ± 2.90 | 64.03 ± 1.94 / 81.39 ± 1.29 | 35.33 ± 2.25 / 39.61 ± 2.31 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| ltg/norbert3-base | 124 | 50 | 508 | True | 11,405 ± 1,970 / 2,856 ± 917 | 1.71 | 78.21 ± 0.92 / 72.63 ± 0.98 | 71.05 ± 1.70 / 70.72 ± 2.74 | 56.02 ± 2.92 / 77.31 ± 1.61 | 42.52 ± 1.02 / 47.31 ± 0.99 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| pere/roberta-debug-8 | 278 | 250 | 512 | True | 15,103 ± 2,954 / 3,356 ± 1,090 | 1.72 | 74.48 ± 2.35 / 67.89 ± 2.16 | 74.58 ± 1.29 / 70.97 ± 2.41 | 69.07 ± 2.22 / 83.17 ± 1.50 | 31.66 ± 1.18 / 37.05 ± 1.10 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| intfloat/multilingual-e5-base | 278 | 250 | 512 | True | 14,965 ± 2,890 / 3,322 ± 1,074 | 1.73 | 79.02 ± 0.74 / 77.59 ± 0.76 | 76.06 ± 0.89 / 69.85 ± 3.16 | 50.19 ± 1.23 / 74.23 ± 0.99 | 40.65 ± 1.29 / 46.62 ± 1.30 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| pere/roberta-debug-32 | 278 | 250 | 512 | True | 14,958 ± 2,903 / 3,331 ± 1,077 | 1.73 | 72.25 ± 2.16 / 65.94 ± 2.04 | 75.04 ± 1.08 / 72.35 ± 2.45 | 70.16 ± 1.47 / 84.29 ± 0.90 | 31.89 ± 0.99 / 36.93 ± 0.90 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| flax-community/swe-roberta-wiki-oscar | 125 | 50 | 512 | True | 15,437 ± 2,628 / 3,834 ± 1,252 | 1.76 | 75.40 ± 1.45 / 70.45 ± 1.62 | 76.22 ± 0.78 / 75.25 ± 1.16 | 65.73 ± 1.73 / 81.50 ± 1.14 | 29.34 ± 1.44 / 34.01 ± 1.50 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| four-two-labs/orpo-llama-3-swe (few-shot) | 8030 | 128 | 8192 | False | 4,974 ± 1,208 / 1,032 ± 342 | 1.78 | 60.93 ± 2.85 / 38.87 ± 3.50 | 79.74 ± 0.68 / 75.13 ± 1.85 | 26.02 ± 4.38 / 52.19 ± 5.44 | 59.84 ± 0.92 / 65.92 ± 0.82 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| meta-llama/Meta-Llama-3-8B (few-shot) | 8030 | 128 | 8192 | True | 4,687 ± 1,121 / 967 ± 313 | 1.78 | 60.36 ± 2.84 / 39.37 ± 3.56 | 79.74 ± 0.75 / 75.11 ± 1.91 | 28.24 ± 4.19 / 55.29 ± 5.35 | 59.73 ± 1.13 / 65.72 ± 0.94 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| pere/roberta-base-exp-32 | 278 | 250 | 512 | True | 15,081 ± 2,950 / 3,365 ± 1,092 | 1.79 | 79.75 ± 0.94 / 73.45 ± 0.86 | 74.73 ± 1.15 / 70.83 ± 3.72 | 53.55 ± 16.68 / 75.79 ± 8.05 | 32.20 ± 0.86 / 36.88 ± 0.81 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| pere/roberta-base-exp-8 | 278 | 250 | 512 | True | 15,112 ± 2,969 / 3,347 ± 1,093 | 1.80 | 73.44 ± 2.81 / 67.31 ± 3.11 | 73.63 ± 1.53 / 68.42 ± 4.24 | 58.91 ± 17.49 / 77.13 ± 10.99 | 32.39 ± 1.02 / 37.33 ± 0.86 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| timpal0l/Llama-3-8B-flashback-v1 (few-shot) | 8030 | 128 | 8192 | True | 4,849 ± 1,171 / 974 ± 316 | 1.80 | 53.94 ± 3.15 / 34.56 ± 3.93 | 81.20 ± 1.26 / 80.02 ± 2.24 | 36.46 ± 3.11 / 65.43 ± 3.12 | 58.73 ± 1.11 / 64.90 ± 1.02 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| timpal0l/dolphin-2.9-llama3-8b-flashback (few-shot, val) | 8030 | 128 | 8192 | False | 5,018 ± 1,216 / 996 ± 324 | 1.83 | 65.33 ± 2.38 / 46.88 ± 3.97 | 74.99 ± 3.45 / 76.76 ± 1.80 | 32.65 ± 5.08 / 61.25 ± 4.41 | 55.71 ± 1.34 / 64.54 ± 1.00 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| Nexusflow/Starling-LM-7B-beta (few-shot) | 7242 | 32 | 8192 | False | 5,876 ± 1,021 / 1,677 ± 546 | 1.85 | 60.38 ± 1.60 / 36.17 ± 3.66 | 77.49 ± 0.98 / 72.07 ± 1.56 | 29.32 ± 2.34 / 54.43 ± 2.67 | 56.79 ± 0.83 / 65.84 ± 0.48 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| timpal0l/sol (few-shot) | 10732 | 32 | 4096 | False | 3,701 ± 876 / 771 ± 247 | 1.86 | 57.51 ± 2.30 / 37.74 ± 3.15 | 77.31 ± 1.01 / 70.55 ± 2.26 | 25.06 ± 5.02 / 49.04 ± 4.68 | 60.16 ± 1.77 / 67.43 ± 1.02 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| upstage/SOLAR-10.7B-v1.0 (few-shot) | 10732 | 32 | 4096 | True | 3,780 ± 906 / 799 ± 261 | 1.89 | 59.65 ± 2.22 / 39.33 ± 3.33 | 77.48 ± 1.23 / 70.13 ± 2.81 | 16.94 ± 2.36 / 40.98 ± 1.82 | 62.65 ± 0.56 / 68.15 ± 0.56 | 12.5.3 | 12.5.3 | 12.5.3 | 12.5.3 |
| microsoft/infoxlm-large | 560 | 250 | 512 | True | 6,696 ± 1,260 / 1,630 ± 515 | 1.90 | 79.53 ± 2.77 / 74.53 ± 2.70 | 75.42 ± 1.08 / 72.68 ± 3.19 | 18.44 ± 10.88 / 53.57 ± 7.20 | 48.19 ± 1.10 / 53.67 ± 0.81 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Addedk/kbbert-distilled-cased | 82 | 50 | 512 | True | 29,698 ± 4,287 / 8,677 ± 2,776 | 1.91 | 80.12 ± 1.41 / 73.78 ± 1.37 | 71.28 ± 1.09 / 69.73 ± 2.94 | 51.58 ± 2.89 / 73.82 ± 2.21 | 28.16 ± 0.76 / 33.47 ± 0.62 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| cardiffnlp/twitter-xlm-roberta-base | 278 | 250 | 512 | True | 34,475 ± 7,465 / 6,712 ± 2,223 | 1.91 | 72.49 ± 1.68 / 67.03 ± 1.55 | 70.69 ± 1.08 / 67.03 ± 3.40 | 56.60 ± 3.25 / 76.70 ± 2.48 | 31.89 ± 1.88 / 37.63 ± 1.86 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| facebook/xlm-v-base | 778 | 902 | 512 | True | 25,396 ± 6,394 / 4,534 ± 1,421 | 1.91 | 68.39 ± 7.26 / 63.66 ± 6.41 | 73.43 ± 0.91 / 61.29 ± 1.81 | 45.09 ± 15.90 / 68.48 ± 11.31 | 38.04 ± 2.09 / 43.73 ± 1.83 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| merge-crew/da-sv-slerp (few-shot, val) | 7242 | 32 | 32768 | True | 2,467 ± 469 / 762 ± 244 | 1.91 | 46.57 ± 3.34 / 33.94 ± 3.73 | 76.53 ± 2.55 / 77.96 ± 3.04 | 33.43 ± 3.89 / 61.87 ± 4.02 | 59.87 ± 1.52 / 64.53 ± 1.41 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| merge-crew/da-sv-task-arithmetic (few-shot, val) | 7242 | 32 | 32768 | True | 2,500 ± 469 / 762 ± 238 | 1.91 | 47.28 ± 3.05 / 34.01 ± 3.73 | 76.62 ± 2.52 / 78.04 ± 2.98 | 33.23 ± 4.72 / 61.29 ± 4.67 | 60.00 ± 1.69 / 64.62 ± 1.44 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| timpal0l/BeagleCatMunin (few-shot, val) | 7242 | 32 | 32768 | False | 2,495 ± 458 / 775 ± 244 | 1.91 | 50.53 ± 3.30 / 37.77 ± 4.38 | 77.37 ± 2.25 / 78.66 ± 2.43 | 27.84 ± 4.72 / 49.46 ± 4.52 | 59.98 ± 1.65 / 65.44 ± 1.38 | 9.3.2 | 9.3.2 | 9.3.2 | 12.5.2 |
| RJuro/munin-neuralbeagle-7b (few-shot, val) | 7242 | 32 | 32768 | False | 2,493 ± 466 / 773 ± 243 | 1.92 | 62.96 ± 2.62 / 51.99 ± 5.66 | 77.13 ± 2.43 / 78.36 ± 1.88 | 15.73 ± 7.07 / 47.41 ± 5.31 | 58.43 ± 1.59 / 65.06 ± 1.19 | 9.3.2 | 9.3.2 | 9.3.2 | 12.5.2 |
| pere/roberta-base-exp-32B | 278 | 250 | 512 | True | 15,103 ± 2,982 / 3,357 ± 1,081 | 1.93 | 77.97 ± 0.82 / 72.10 ± 0.94 | 73.27 ± 0.75 / 71.87 ± 1.30 | 47.19 ± 16.37 / 72.10 ± 8.03 | 31.07 ± 0.93 / 36.17 ± 0.73 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| merge-crew/da-sv-dare-ties-density-0.9 (few-shot, val) | 7242 | 32 | 32768 | True | 2,443 ± 458 / 750 ± 240 | 1.94 | 46.61 ± 3.11 / 34.10 ± 4.61 | 76.38 ± 2.01 / 78.30 ± 2.42 | 34.16 ± 4.39 / 60.06 ± 4.67 | 58.77 ± 1.76 / 63.50 ± 1.47 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| timpal0l/Mistral-7B-v0.1-flashback-v2 (few-shot) | 7242 | 32 | 32768 | True | 5,054 ± 1,200 / 1,056 ± 339 | 1.94 | 44.14 ± 2.40 / 29.77 ± 4.06 | 80.14 ± 1.11 / 80.19 ± 0.78 | 34.23 ± 2.23 / 65.29 ± 2.17 | 57.07 ± 1.56 / 62.52 ± 1.11 | 12.5.3 | 12.5.3 | 12.5.3 | 12.5.3 |
| timpal0l/njord-alpha (few-shot) | 7242 | 32 | 32768 | True | 5,431 ± 1,267 / 1,139 ± 365 | 1.94 | 48.19 ± 2.55 / 37.50 ± 3.62 | 79.95 ± 0.87 / 81.24 ± 0.64 | 32.85 ± 2.28 / 61.74 ± 3.05 | 57.39 ± 1.52 / 63.58 ± 1.19 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| birgermoell/Rapid-Cycling (few-shot, val) | 7242 | 32 | 32768 | False | 2,346 ± 450 / 666 ± 249 | 1.95 | 53.66 ± 3.57 / 41.97 ± 4.83 | 77.72 ± 2.51 / 78.40 ± 2.65 | 16.22 ± 4.46 / 43.17 ± 3.88 | 59.75 ± 1.13 / 64.72 ± 1.04 | 9.3.2 | 9.3.2 | 9.3.2 | 12.5.2 |
| google-bert/bert-base-multilingual-cased | 178 | 120 | 512 | True | 14,083 ± 3,264 / 2,738 ± 889 | 1.96 | 76.29 ± 1.28 / 70.33 ± 1.16 | 61.78 ± 1.21 / 60.94 ± 3.28 | 47.74 ± 7.69 / 72.98 ± 4.74 | 41.17 ± 1.01 / 46.07 ± 1.12 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Mabeck/Heidrun-Mistral-7B-chat (few-shot) | 7242 | 32 | 32768 | False | 5,822 ± 1,283 / 1,336 ± 430 | 1.98 | 55.06 ± 2.38 / 41.39 ± 4.31 | 77.50 ± 0.90 / 73.87 ± 1.21 | 17.47 ± 2.33 / 47.73 ± 3.35 | 58.67 ± 0.96 / 64.58 ± 0.78 | 10.0.1 | 10.0.1 | 10.0.1 | 12.5.0 |
| birgermoell/Flashback-Bellman (few-shot, val) | 7242 | 32 | 32768 | False | 2,887 ± 403 / 1,144 ± 345 | 1.98 | 55.29 ± 3.95 / 41.59 ± 4.48 | 78.29 ± 1.83 / 78.77 ± 2.06 | 18.45 ± 3.00 / 46.38 ± 2.81 | 58.42 ± 1.64 / 63.83 ± 1.18 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.2 |
| meta-llama/Meta-Llama-3-70B-Instruct (few-shot, val) | 70554 | 128 | 8192 | True | 1,673 ± 583 / 275 ± 85 | 1.99 | 77.06 ± 2.72 / 67.75 ± 5.69 | 53.56 ± 7.15 / 67.07 ± 3.93 | 47.50 ± 3.37 / 71.31 ± 2.69 | 46.86 ± 1.77 / 60.96 ± 1.04 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| Geotrend/bert-base-25lang-cased | 151 | 85 | 512 | True | 13,908 ± 3,201 / 2,700 ± 872 | 2.00 | 75.62 ± 1.56 / 70.17 ± 1.46 | 62.50 ± 1.10 / 60.57 ± 2.75 | 38.18 ± 7.03 / 66.99 ± 4.92 | 40.96 ± 1.11 / 45.91 ± 1.20 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| ZurichNLP/unsup-simcse-xlm-roberta-base | 278 | 250 | 512 | True | 34,520 ± 7,443 / 6,730 ± 2,224 | 2.00 | 75.49 ± 1.09 / 74.57 ± 0.60 | 71.12 ± 0.90 / 60.88 ± 2.58 | 36.69 ± 15.58 / 65.77 ± 8.64 | 33.55 ± 2.07 / 38.90 ± 2.21 | 12.6.1 | 12.6.1 | 12.6.1 | 0.0.0 |
| AI-Sweden-Models/tyr (few-shot, val) | 7242 | 32 | 32768 | False | 6,079 ± 1,051 / 1,760 ± 570 | 2.02 | 56.21 ± 2.49 / 44.78 ± 4.19 | 78.30 ± 1.71 / 79.80 ± 2.03 | 14.35 ± 5.65 / 48.69 ± 4.30 | 61.08 ± 1.47 / 65.72 ± 1.07 | 12.5.2 | 12.3.2 | 12.3.2 | 12.3.2 |
| RJuro/munin-neuralbeagle-SkoleGPTOpenOrca-7b (few-shot, val) | 7242 | 32 | 32768 | False | 3,008 ± 429 / 991 ± 323 | 2.03 | 59.36 ± 2.75 / 47.08 ± 4.17 | 72.04 ± 3.27 / 63.83 ± 2.07 | 22.38 ± 7.17 / 54.70 ± 5.49 | 57.96 ± 2.00 / 64.06 ± 1.76 | 9.3.2 | 9.3.2 | 9.3.2 | 12.5.2 |
| google-bert/bert-base-multilingual-uncased | 167 | 106 | 512 | True | 13,993 ± 3,217 / 2,752 ± 893 | 2.03 | 70.85 ± 1.56 / 65.50 ± 1.71 | 63.30 ± 0.93 / 59.96 ± 1.80 | 48.97 ± 1.14 / 73.78 ± 0.61 | 38.00 ± 1.52 / 42.69 ± 1.62 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Mabeck/Heidrun-Mistral-7B-base (few-shot) | 7242 | 32 | 32768 | True | 3,823 ± 967 / 860 ± 280 | 2.05 | 48.43 ± 2.75 / 35.31 ± 2.80 | 79.43 ± 0.85 / 78.21 ± 1.69 | 17.37 ± 2.57 / 52.91 ± 4.93 | 57.05 ± 1.22 / 62.72 ± 0.89 | 11.0.0 | 11.0.0 | 11.0.0 | 11.0.0 |
| danish-foundation-models/munin-7b-v0.1dev0 (few-shot) | 7242 | 32 | 8192 | True | 6,113 ± 1,044 / 1,790 ± 579 | 2.05 | 47.10 ± 2.60 / 35.06 ± 3.65 | 73.05 ± 5.27 / 74.56 ± 4.19 | 30.29 ± 2.63 / 61.40 ± 3.22 | 57.39 ± 1.38 / 63.51 ± 1.04 | 12.5.2 | 12.4.0 | 12.4.0 | 12.4.0 |
| merge-crew/da-sv-dare-ties-density-0.6 (few-shot, val) | 7242 | 32 | 32768 | True | 2,515 ± 465 / 785 ± 247 | 2.05 | 45.12 ± 2.72 / 30.73 ± 4.55 | 78.74 ± 2.13 / 80.11 ± 2.64 | 19.74 ± 6.09 / 46.97 ± 5.83 | 60.15 ± 1.71 / 65.22 ± 1.28 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| mlabonne/NeuralBeagle14-7B (few-shot, val) | 7242 | 32 | 8192 | False | 2,549 ± 472 / 784 ± 245 | 2.05 | 61.25 ± 3.35 / 50.76 ± 5.94 | 76.03 ± 2.11 / 78.25 ± 1.95 | 16.28 ± 4.81 / 49.04 ± 3.60 | 50.96 ± 2.34 / 60.05 ± 1.18 | 9.3.2 | 9.3.2 | 9.3.2 | 12.5.2 |
| Geotrend/bert-base-en-da-cased | 111 | 33 | 512 | True | 14,062 ± 3,216 / 2,733 ± 885 | 2.06 | 74.88 ± 1.45 / 69.57 ± 1.83 | 61.89 ± 0.90 / 60.17 ± 3.06 | 40.22 ± 2.03 / 68.89 ± 2.06 | 39.95 ± 0.82 / 44.78 ± 0.99 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Geotrend/bert-base-en-fr-de-no-da-cased | 118 | 42 | 512 | True | 13,973 ± 3,205 / 2,725 ± 884 | 2.06 | 76.55 ± 1.28 / 70.38 ± 1.01 | 61.60 ± 1.38 / 62.28 ± 3.13 | 37.44 ± 6.65 / 66.67 ± 4.88 | 39.32 ± 1.25 / 43.87 ± 1.29 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Geotrend/bert-base-en-no-cased | 111 | 33 | 512 | True | 14,081 ± 3,231 / 2,748 ± 891 | 2.06 | 75.33 ± 0.99 / 69.89 ± 0.52 | 61.80 ± 1.76 / 58.93 ± 3.28 | 36.62 ± 5.98 / 66.91 ± 3.69 | 39.95 ± 1.95 / 44.71 ± 1.99 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| birgermoell/roberta-swedish-scandi | 125 | 50 | 512 | True | 15,385 ± 2,815 / 3,578 ± 1,177 | 2.06 | 68.55 ± 3.16 / 62.00 ± 2.58 | 69.96 ± 1.75 / 68.67 ± 3.21 | 52.88 ± 14.23 / 75.25 ± 7.45 | 27.99 ± 1.23 / 32.49 ± 1.27 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| merge-crew/da-sv-ties (few-shot, val) | 7242 | 32 | 32768 | True | 2,457 ± 451 / 757 ± 237 | 2.06 | 48.36 ± 3.07 / 34.48 ± 5.22 | 76.57 ± 2.19 / 78.11 ± 2.73 | 20.94 ± 5.55 / 44.72 ± 4.06 | 59.07 ± 1.90 / 63.87 ± 1.46 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| birgermoell/BeagleCatMunin-Flashback-Bellman (few-shot, val) | 7242 | 32 | 32768 | False | 2,890 ± 401 / 1,155 ± 348 | 2.07 | 52.96 ± 3.45 / 41.51 ± 4.30 | 76.99 ± 2.37 / 76.84 ± 2.99 | 14.27 ± 4.36 / 40.60 ± 3.04 | 59.92 ± 1.64 / 64.87 ± 1.47 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.2 |
| mhenrichsen/danskgpt-chat-v2.1 (few-shot) | unknown | 32 | 32768 | True | 5,085 ± 998 / 1,306 ± 404 | 2.07 | 54.37 ± 3.04 / 42.16 ± 4.00 | 75.98 ± 1.15 / 74.44 ± 1.12 | 17.98 ± 1.97 / 56.01 ± 2.08 | 55.07 ± 0.74 / 64.24 ± 0.61 | 12.0.0 | 12.0.0 | 12.0.0 | 12.0.0 |
| meta-llama/Llama-2-70b-chat-hf (few-shot, val) | 68977 | 32 | 4096 | True | 1,979 ± 621 / 320 ± 105 | 2.08 | 55.91 ± 3.25 / 39.73 ± 4.94 | 64.52 ± 3.15 / 70.51 ± 2.49 | 23.85 ± 7.34 / 56.89 ± 6.08 | 58.88 ± 1.51 / 65.82 ± 1.07 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| Twitter/twhin-bert-base | 279 | 250 | 512 | True | 11,514 ± 2,041 / 2,862 ± 918 | 2.09 | 70.17 ± 0.99 / 64.19 ± 1.42 | 66.62 ± 1.71 / 61.90 ± 2.61 | 46.72 ± 3.65 / 72.15 ± 2.62 | 31.38 ± 1.36 / 35.79 ± 1.33 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| flax-community/nordic-roberta-wiki | 125 | 50 | 512 | True | 16,227 ± 2,650 / 4,252 ± 1,393 | 2.09 | 72.90 ± 1.37 / 66.93 ± 1.30 | 61.11 ± 1.28 / 58.97 ± 2.27 | 55.05 ± 1.64 / 76.76 ± 0.93 | 29.04 ± 1.16 / 33.60 ± 1.06 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| timpal0l/BeagleCatMunin2 (few-shot, val) | 7242 | 32 | 32768 | False | 2,477 ± 459 / 767 ± 241 | 2.09 | 60.87 ± 3.71 / 47.40 ± 5.32 | 73.72 ± 2.20 / 67.79 ± 2.37 | 6.78 ± 4.34 / 35.90 ± 2.11 | 58.75 ± 1.46 / 65.08 ± 1.15 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.2 |
| Geotrend/bert-base-da-cased | 104 | 23 | 512 | True | 15,432 ± 2,838 / 3,642 ± 1,189 | 2.11 | 74.13 ± 1.17 / 68.93 ± 1.36 | 62.18 ± 1.26 / 59.44 ± 2.35 | 36.93 ± 6.47 / 65.97 ± 6.05 | 37.59 ± 1.99 / 41.94 ± 2.23 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| birgermoell/Munin-NeuralBeagle-NorskGPT (few-shot, val) | 7242 | 32 | 32768 | False | 2,903 ± 407 / 1,157 ± 350 | 2.11 | 63.85 ± 2.67 / 47.77 ± 4.72 | 73.72 ± 2.98 / 62.83 ± 1.64 | -0.56 ± 2.24 / 33.54 ± 1.03 | 60.10 ± 1.48 / 66.26 ± 1.19 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.2 |
| birgermoell/WestLake-Munin-Cat-NorskGPT (few-shot, val) | 7242 | 32 | 32768 | False | 2,856 ± 391 / 1,142 ± 342 | 2.11 | 63.85 ± 2.67 / 47.77 ± 4.72 | 73.72 ± 2.98 / 62.83 ± 1.64 | -0.56 ± 2.24 / 33.54 ± 1.03 | 60.10 ± 1.48 / 66.26 ± 1.19 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.2 |
| mistralai/Mistral-7B-v0.1 (few-shot) | 7242 | 32 | 32768 | True | 2,657 ± 524 / 880 ± 278 | 2.11 | 53.34 ± 2.55 / 40.48 ± 3.66 | 80.00 ± 0.70 / 79.80 ± 0.66 | 4.61 ± 2.18 / 34.51 ± 0.86 | 58.99 ± 1.05 / 64.65 ± 0.83 | 0.0.0 | 0.0.0 | 0.0.0 | 12.5.1 |
| KennethEnevoldsen/munin_mistral-7b (few-shot, val) | 7242 | 32 | 32768 | False | 2,543 ± 466 / 787 ± 247 | 2.12 | 52.34 ± 3.07 / 39.14 ± 4.60 | 77.66 ± 2.09 / 78.59 ± 2.41 | 6.00 ± 4.15 / 36.34 ± 2.20 | 60.16 ± 1.81 / 64.12 ± 1.59 | 12.5.2 | 12.3.1 | 12.3.1 | 12.3.2 |
| ThatsGroes/munin-SkoleGPTOpenOrca-7b-16bit (few-shot) | 7242 | 32 | 32768 | False | 3,006 ± 479 / 1,053 ± 319 | 2.12 | 44.64 ± 1.66 / 31.30 ± 2.96 | 77.98 ± 1.01 / 72.79 ± 2.47 | 16.57 ± 2.58 / 51.86 ± 3.69 | 57.31 ± 0.92 / 63.73 ± 1.04 | 11.0.0 | 11.0.0 | 11.0.0 | 12.4.0 |
| alpindale/Mistral-7B-v0.2-hf (few-shot) | 7242 | 32 | 32768 | True | 1,841 ± 297 / 651 ± 193 | 2.13 | 48.96 ± 2.72 / 39.25 ± 3.69 | 78.90 ± 0.95 / 78.62 ± 1.08 | 10.82 ± 3.46 / 38.95 ± 3.80 | 58.91 ± 1.02 / 64.72 ± 0.76 | 12.5.2 | 12.5.1 | 12.5.1 | 12.5.1 |
| meta-llama/Meta-Llama-3-8B-Instruct (few-shot) | 8030 | 128 | 8192 | True | 4,909 ± 1,215 / 978 ± 319 | 2.13 | 69.67 ± 1.30 / 52.94 ± 4.01 | 59.93 ± 4.70 / 67.54 ± 3.04 | 27.63 ± 3.19 / 60.85 ± 3.29 | 49.84 ± 1.61 / 60.85 ± 0.93 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| bineric/NorskGPT-Llama3-8b (few-shot) | 8030 | 128 | 8192 | False | 3,695 ± 1,277 / 532 ± 183 | 2.14 | 63.19 ± 2.83 / 51.22 ± 3.61 | 76.06 ± 0.64 / 61.59 ± 0.77 | 5.34 ± 1.42 / 34.32 ± 0.56 | 56.70 ± 0.87 / 66.00 ± 0.59 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| jonfd/electra-small-nordic | 22 | 96 | 128 | True | 5,989 ± 120 / 3,809 ± 1,230 | 2.15 | 71.07 ± 1.59 / 65.46 ± 1.28 | 66.42 ± 0.72 / 57.57 ± 1.23 | 69.19 ± 0.66 / 84.26 ± 0.36 | 11.85 ± 4.94 / 13.02 ± 5.55 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| mhenrichsen/hestenettetLM (few-shot) | 7242 | 32 | 32768 | True | 5,160 ± 804 / 1,654 ± 516 | 2.16 | 53.00 ± 2.53 / 39.09 ± 3.72 | 79.70 ± 0.65 / 79.45 ± 0.68 | 4.32 ± 2.19 / 34.43 ± 0.87 | 59.03 ± 1.03 / 64.74 ± 0.84 | 12.5.2 | 12.3.2 | 12.3.2 | 12.3.2 |
| danish-foundation-models/munin-7b-alpha (few-shot) | 7242 | 32 | 32768 | True | 6,116 ± 1,049 / 1,784 ± 577 | 2.17 | 42.23 ± 2.44 / 30.30 ± 4.71 | 78.80 ± 0.93 / 75.28 ± 1.78 | 15.47 ± 1.79 / 54.26 ± 3.41 | 56.75 ± 1.15 / 62.43 ± 0.95 | 12.5.2 | 12.4.0 | 12.4.0 | 12.4.0 |
| ltg/norbert3-small | 41 | 50 | 508 | True | 13,515 ± 2,514 / 3,042 ± 1,004 | 2.18 | 74.22 ± 1.37 / 68.68 ± 1.40 | 63.80 ± 1.56 / 57.65 ± 2.24 | 37.77 ± 5.16 / 65.87 ± 4.30 | 31.45 ± 0.94 / 35.81 ± 0.94 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| timpal0l/Mistral-7B-v0.1-flashback-v2-instruct (few-shot) | 7242 | 32 | 32768 | False | 5,172 ± 813 / 1,647 ± 518 | 2.19 | 46.74 ± 4.30 / 33.57 ± 4.51 | 77.06 ± 1.82 / 79.02 ± 1.37 | 14.00 ± 1.59 / 53.89 ± 3.10 | 56.74 ± 0.52 / 63.45 ± 0.49 | 12.5.2 | 12.3.2 | 12.3.2 | 12.4.0 |
| vesteinn/DanskBERT | 124 | 50 | 512 | True | 15,749 ± 2,665 / 4,014 ± 1,281 | 2.19 | 72.33 ± 0.82 / 67.15 ± 0.85 | 67.77 ± 1.19 / 62.98 ± 2.57 | 33.79 ± 7.61 / 64.01 ± 6.84 | 32.71 ± 0.77 / 37.46 ± 0.64 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| jhu-clsp/bernice | 278 | 250 | 128 | True | 5,567 ± 450 / 2,483 ± 798 | 2.20 | 71.34 ± 0.91 / 65.04 ± 1.33 | 70.91 ± 1.23 / 67.12 ± 3.79 | 53.52 ± 1.22 / 76.15 ± 0.53 | 16.41 ± 4.10 / 18.47 ± 4.44 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| bineric/NorskGPT-Mistral-7b (few-shot) | 7242 | 32 | 32768 | False | 2,443 ± 451 / 761 ± 237 | 2.21 | 58.40 ± 2.62 / 40.55 ± 3.65 | 74.30 ± 1.26 / 60.35 ± 0.41 | 0.00 ± 0.00 / 33.37 ± 0.27 | 59.16 ± 1.23 / 65.78 ± 0.72 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.1 |
| DDSC/roberta-base-scandinavian | 125 | 50 | 512 | True | 14,491 ± 2,800 / 3,182 ± 1,026 | 2.22 | 58.84 ± 13.92 / 53.63 ± 12.63 | 72.28 ± 0.79 / 71.62 ± 1.38 | 37.61 ± 17.89 / 66.93 ± 9.43 | 30.59 ± 0.68 / 35.43 ± 0.61 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| RuterNorway/Llama-2-13b-chat-norwegian (few-shot) | unknown | 32 | 4096 | False | 3,254 ± 1,068 / 484 ± 173 | 2.22 | 50.85 ± 2.44 / 39.65 ± 3.83 | 74.17 ± 2.12 / 76.62 ± 1.83 | 7.51 ± 1.94 / 37.81 ± 1.76 | 57.32 ± 0.63 / 63.28 ± 0.71 | 9.3.1 | 9.3.1 | 9.3.1 | 9.3.1 |
| microsoft/xlm-align-base | 278 | 250 | 512 | True | 14,744 ± 2,870 / 3,265 ± 1,053 | 2.22 | 78.60 ± 1.91 / 73.04 ± 2.25 | 73.67 ± 1.48 / 68.61 ± 3.14 | 15.41 ± 4.59 / 53.29 ± 3.93 | 32.41 ± 3.14 / 37.13 ± 3.07 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| senseable/WestLake-7B-v2 (few-shot) | 7242 | 32 | 32768 | False | 5,993 ± 1,028 / 1,742 ± 561 | 2.22 | 58.90 ± 1.34 / 42.48 ± 3.97 | 67.74 ± 2.79 / 71.89 ± 1.89 | 16.52 ± 2.55 / 46.30 ± 2.62 | 49.41 ± 1.21 / 59.91 ± 0.48 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| meta-llama/Llama-2-7b-hf (few-shot) | 6738 | 32 | 4096 | True | 930 ± 310 / 128 ± 43 | 2.24 | 44.11 ± 4.26 / 31.64 ± 4.48 | 79.05 ± 1.08 / 75.52 ± 2.66 | 7.34 ± 3.19 / 43.83 ± 5.31 | 57.49 ± 0.95 / 63.16 ± 0.77 | 9.2.0 | 9.2.0 | 9.2.0 | 12.5.1 |
| sentence-transformers/stsb-xlm-r-multilingual | 278 | 250 | 512 | True | 15,040 ± 2,953 / 3,417 ± 1,100 | 2.24 | 68.94 ± 1.53 / 62.54 ± 1.20 | 72.77 ± 0.89 / 68.13 ± 1.56 | 40.21 ± 2.53 / 67.11 ± 1.86 | 20.09 ± 1.31 / 25.99 ± 1.19 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| occiglot/occiglot-7b-eu5 (few-shot) | 7242 | 32 | 32768 | True | 2,219 ± 427 / 717 ± 224 | 2.25 | 49.02 ± 3.23 / 41.69 ± 3.74 | 76.56 ± 1.52 / 78.16 ± 1.12 | 2.18 ± 2.34 / 36.26 ± 3.89 | 58.98 ± 0.95 / 63.65 ± 0.89 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 |
| sentence-transformers/paraphrase-xlm-r-multilingual-v1 | 278 | 250 | 512 | True | 20,154 ± 4,438 / 3,890 ± 1,256 | 2.28 | 70.22 ± 1.49 / 63.97 ± 1.48 | 71.33 ± 1.20 / 65.44 ± 3.64 | 39.60 ± 5.87 / 66.60 ± 3.19 | 18.65 ± 1.15 / 24.75 ± 0.98 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| LumiOpen/Viking-33B (few-shot) | 33119 | 131 | 4099 | True | 2,080 ± 700 / 331 ± 117 | 2.29 | 42.35 ± 1.51 / 28.31 ± 3.87 | 77.68 ± 1.11 / 78.86 ± 0.93 | 8.08 ± 1.69 / 50.52 ± 2.25 | 54.57 ± 1.25 / 60.34 ± 1.10 | 12.9.0 | 12.9.0 | 12.9.0 | 12.9.0 |
| Twitter/twhin-bert-large | 561 | 250 | 512 | True | 9,707 ± 1,664 / 2,549 ± 831 | 2.29 | 74.26 ± 1.65 / 68.20 ± 1.70 | 63.35 ± 5.43 / 60.33 ± 5.50 | 16.07 ± 10.73 / 52.48 ± 7.50 | 36.77 ± 3.78 / 41.72 ± 3.83 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| microsoft/infoxlm-base | 278 | 250 | 512 | True | 34,735 ± 7,558 / 6,846 ± 2,312 | 2.29 | 79.43 ± 1.07 / 74.17 ± 1.10 | 71.48 ± 2.63 / 65.72 ± 4.78 | 7.26 ± 2.18 / 45.42 ± 4.53 | 33.72 ± 1.71 / 38.23 ± 1.57 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| sentence-transformers/paraphrase-multilingual-mpnet-base-v2 | 278 | 250 | 512 | True | 15,100 ± 3,019 / 3,369 ± 1,103 | 2.30 | 65.14 ± 1.57 / 59.82 ± 1.39 | 73.47 ± 0.84 / 70.20 ± 2.49 | 36.62 ± 6.55 / 66.09 ± 5.35 | 18.65 ± 0.91 / 25.00 ± 0.87 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| occiglot/occiglot-7b-eu5-instruct (few-shot) | 7242 | 32 | 32768 | False | 2,088 ± 352 / 706 ± 214 | 2.31 | 47.67 ± 2.81 / 36.91 ± 3.50 | 71.73 ± 2.40 / 74.97 ± 1.84 | 7.90 ± 3.20 / 41.24 ± 4.78 | 57.78 ± 0.79 / 64.48 ± 0.73 | 12.5.2 | 12.2.0 | 12.3.1 | 12.4.0 |
| distilbert/distilbert-base-multilingual-cased | 135 | 120 | 512 | True | 26,355 ± 5,946 / 5,266 ± 1,714 | 2.32 | 70.08 ± 1.38 / 64.46 ± 1.31 | 59.66 ± 1.22 / 56.16 ± 2.13 | 33.71 ± 1.12 / 65.32 ± 0.86 | 31.48 ± 1.85 / 36.44 ± 1.87 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| clips/mfaq | 278 | 250 | 128 | True | 5,591 ± 187 / 3,349 ± 1,105 | 2.33 | 76.31 ± 1.29 / 70.91 ± 1.27 | 73.32 ± 1.13 / 70.21 ± 3.74 | 32.29 ± 10.98 / 62.21 ± 5.02 | 16.12 ± 5.80 / 19.52 ± 6.73 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| 01-ai/Yi-6B (few-shot) | 6061 | 64 | 4096 | True | 2,786 ± 532 / 784 ± 250 | 2.34 | 46.69 ± 2.39 / 32.97 ± 4.57 | 75.39 ± 1.06 / 71.95 ± 1.42 | 2.91 ± 2.80 / 35.26 ± 2.12 | 54.95 ± 0.86 / 60.77 ± 0.75 | 9.3.2 | 10.0.0 | 10.0.0 | 12.5.1 |
| mlabonne/AlphaMonarch-7B (few-shot, val) | 7242 | 32 | 8192 | False | 5,340 ± 1,262 / 1,157 ± 375 | 2.34 | 60.53 ± 3.06 / 48.45 ± 5.19 | 67.03 ± 3.61 / 70.77 ± 1.95 | 15.10 ± 4.60 / 48.57 ± 2.91 | 42.46 ± 1.63 / 53.50 ± 1.40 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| tollefj/nordavind-7b-instruct-warm (few-shot) | 7248 | 33 | 2048 | False | 6,450 ± 961 / 2,082 ± 658 | 2.34 | 47.24 ± 3.36 / 24.94 ± 3.21 | 77.91 ± 1.42 / 76.08 ± 2.54 | 5.55 ± 2.55 / 48.57 ± 3.21 | 51.41 ± 0.74 / 57.55 ± 0.69 | 12.5.2 | 12.3.2 | 12.3.2 | 12.4.0 |
| neph1/bellman-7b-mistral-instruct-v0.2 (few-shot) | 7242 | 32 | 32768 | False | 2,518 ± 463 / 779 ± 243 | 2.35 | 54.38 ± 2.92 / 39.66 ± 5.20 | 55.84 ± 2.51 / 66.96 ± 1.37 | 16.05 ± 2.15 / 54.22 ± 2.86 | 53.22 ± 0.88 / 61.85 ± 0.63 | 9.2.0 | 9.2.0 | 9.2.0 | 12.4.0 |
| Geotrend/distilbert-base-25lang-cased | 109 | 85 | 512 | True | 26,099 ± 5,881 / 5,178 ± 1,665 | 2.37 | 70.56 ± 1.36 / 64.49 ± 1.43 | 60.69 ± 0.46 / 56.69 ± 1.35 | 30.83 ± 1.47 / 63.39 ± 1.60 | 31.41 ± 1.05 / 36.45 ± 1.05 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| mistralai/Mistral-7B-Instruct-v0.1 (few-shot) | 7242 | 32 | 32768 | False | 5,443 ± 1,273 / 1,144 ± 364 | 2.37 | 45.01 ± 2.11 / 27.59 ± 3.35 | 73.33 ± 1.98 / 76.19 ± 1.59 | 11.59 ± 3.45 / 40.89 ± 4.15 | 52.12 ± 1.42 / 59.29 ± 1.17 | 9.3.1 | 9.3.1 | 9.3.1 | 12.4.0 |
| mistralai/Mistral-7B-Instruct-v0.2 (few-shot) | 7242 | 32 | 32768 | False | 2,538 ± 415 / 821 ± 253 | 2.37 | 47.92 ± 2.66 / 33.00 ± 3.24 | 62.90 ± 2.44 / 70.61 ± 1.19 | 19.95 ± 2.24 / 56.49 ± 2.10 | 52.51 ± 0.36 / 61.42 ± 0.52 | 9.2.0 | 9.2.0 | 9.3.1 | 12.4.0 |
| AI-Sweden-Models/gpt-sw3-20b (few-shot) | 20918 | 64 | 2048 | True | 1,875 ± 673 / 261 ± 91 | 2.38 | 31.86 ± 5.09 / 21.95 ± 3.90 | 78.88 ± 1.58 / 79.56 ± 1.43 | 12.26 ± 1.97 / 46.90 ± 4.11 | 53.58 ± 0.97 / 60.28 ± 0.81 | 9.3.1 | 9.3.1 | 9.3.1 | 9.3.1 |
| Geotrend/distilbert-base-en-da-cased | 69 | 33 | 512 | True | 26,196 ± 5,956 / 5,220 ± 1,691 | 2.40 | 69.62 ± 0.88 / 63.51 ± 1.33 | 59.42 ± 1.21 / 55.74 ± 1.26 | 29.01 ± 2.06 / 62.65 ± 1.37 | 31.82 ± 1.07 / 36.82 ± 1.14 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Geotrend/distilbert-base-en-fr-de-no-da-cased | 76 | 42 | 512 | True | 26,081 ± 5,875 / 5,209 ± 1,692 | 2.40 | 69.94 ± 1.11 / 63.93 ± 1.47 | 59.83 ± 1.11 / 55.15 ± 0.99 | 29.82 ± 1.23 / 63.32 ± 1.41 | 31.13 ± 1.15 / 36.20 ± 1.22 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Geotrend/distilbert-base-en-no-cased | 69 | 33 | 512 | True | 26,597 ± 6,036 / 5,271 ± 1,697 | 2.40 | 69.28 ± 1.15 / 63.61 ± 1.27 | 59.53 ± 1.69 / 57.93 ± 2.20 | 29.36 ± 1.50 / 63.60 ± 0.89 | 30.42 ± 1.54 / 35.34 ± 1.63 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| dbmdz/bert-base-historic-multilingual-cased | 111 | 32 | 512 | True | 20,047 ± 4,407 / 3,844 ± 1,259 | 2.40 | 68.83 ± 1.00 / 63.29 ± 1.48 | 64.25 ± 1.66 / 63.62 ± 2.92 | 28.62 ± 9.43 / 59.33 ± 5.91 | 28.78 ± 2.01 / 34.26 ± 2.03 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| merge-crew/da-sv-dare-ties-density-0.3 (few-shot, val) | 7242 | 32 | 32768 | True | 2,461 ± 476 / 773 ± 248 | 2.40 | 32.37 ± 3.05 / 24.60 ± 3.81 | 75.33 ± 2.41 / 77.99 ± 2.58 | 12.73 ± 6.32 / 45.51 ± 7.43 | 53.05 ± 1.83 / 58.32 ± 1.46 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| norallm/normistral-7b-warm (few-shot) | 7248 | 33 | 2048 | True | 3,175 ± 456 / 1,186 ± 354 | 2.41 | 48.78 ± 5.08 / 26.81 ± 3.42 | 76.09 ± 1.23 / 74.78 ± 1.97 | 2.53 ± 2.80 / 47.37 ± 2.29 | 48.93 ± 0.97 / 55.09 ± 0.85 | 11.0.0 | 11.0.0 | 11.0.0 | 11.0.0 |
| Geotrend/distilbert-base-da-cased | 61 | 23 | 512 | True | 28,950 ± 5,114 / 7,010 ± 2,267 | 2.42 | 69.25 ± 1.37 / 63.90 ± 1.27 | 58.47 ± 1.30 / 56.03 ± 2.36 | 29.80 ± 1.57 / 63.53 ± 0.90 | 30.61 ± 1.31 / 35.37 ± 1.52 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Addedk/mbert-swedish-distilled-cased | 135 | 120 | 512 | True | 26,091 ± 5,835 / 5,209 ± 1,690 | 2.45 | 73.41 ± 1.54 / 66.98 ± 1.68 | 62.10 ± 1.18 / 60.27 ± 2.82 | 34.86 ± 1.29 / 66.98 ± 0.77 | 18.10 ± 2.67 / 21.09 ± 3.18 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| bineric/NorskGPT-Llama-7B-v0.1 (few-shot) | 6738 | 32 | 4096 | False | 5,384 ± 879 / 1,746 ± 553 | 2.49 | 53.95 ± 1.89 / 42.16 ± 4.59 | 60.91 ± 2.35 / 59.47 ± 1.21 | 0.32 ± 0.62 / 33.39 ± 0.28 | 55.28 ± 0.62 / 63.41 ± 0.55 | 12.5.2 | 12.3.2 | 12.3.2 | 12.3.2 |
| mideind/IceBERT-xlmr-ic3 | 278 | 250 | 512 | True | 11,004 ± 2,244 / 2,324 ± 761 | 2.50 | 70.57 ± 1.07 / 70.41 ± 0.93 | 66.01 ± 1.43 / 57.28 ± 0.85 | 10.20 ± 5.38 / 50.80 ± 4.15 | 30.71 ± 0.94 / 36.08 ± 0.92 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2 | 118 | 250 | 512 | True | 29,201 ± 6,282 / 6,045 ± 2,027 | 2.50 | 66.50 ± 1.49 / 59.99 ± 1.40 | 72.19 ± 0.71 / 67.88 ± 2.34 | 28.75 ± 5.58 / 63.30 ± 2.60 | 15.91 ± 0.87 / 23.08 ± 0.95 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| AI-Sweden-Models/gpt-sw3-6.7b-v2 (few-shot) | 7111 | 64 | 2048 | True | 2,351 ± 448 / 707 ± 216 | 2.52 | 28.73 ± 3.63 / 20.43 ± 3.72 | 77.47 ± 1.36 / 78.60 ± 1.25 | 8.78 ± 2.01 / 42.28 ± 3.17 | 50.57 ± 0.94 / 56.51 ± 0.79 | 9.2.0 | 9.2.0 | 9.2.0 | 12.5.1 |
| meta-llama/Llama-2-7b-chat-hf (few-shot) | 6738 | 32 | 4096 | False | 2,643 ± 455 / 800 ± 247 | 2.54 | 39.72 ± 2.82 / 29.85 ± 2.99 | 66.18 ± 3.25 / 72.00 ± 1.75 | 6.74 ± 1.66 / 45.55 ± 4.31 | 54.05 ± 0.84 / 60.90 ± 0.82 | 9.3.1 | 9.3.1 | 9.3.1 | 12.4.0 |
| norallm/normistral-7b-warm-instruct (few-shot) | unknown | 33 | 2048 | True | 6,194 ± 949 / 1,967 ± 619 | 2.54 | 51.45 ± 3.13 / 26.49 ± 3.00 | 63.64 ± 3.74 / 65.08 ± 2.46 | 5.80 ± 1.74 / 51.04 ± 1.54 | 48.95 ± 1.00 / 57.09 ± 0.92 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| dbmdz/bert-medium-historic-multilingual-cased | 42 | 32 | 512 | True | 24,291 ± 4,887 / 5,096 ± 1,655 | 2.55 | 66.11 ± 1.24 / 61.03 ± 1.08 | 59.66 ± 1.84 / 55.24 ± 1.32 | 26.28 ± 8.44 / 59.64 ± 5.68 | 24.36 ± 0.92 / 30.54 ± 0.96 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Qwen/Qwen1.5-4B-Chat (few-shot) | 3950 | 152 | 32768 | False | 4,347 ± 893 / 1,135 ± 365 | 2.60 | 40.19 ± 2.97 / 31.88 ± 4.51 | 64.08 ± 2.44 / 69.62 ± 1.29 | 5.43 ± 2.02 / 38.32 ± 2.54 | 53.21 ± 1.08 / 59.57 ± 0.97 | 12.5.2 | 10.0.1 | 12.1.0 | 12.5.2 |
| birgermoell/NeuralBeagle-Flashback (few-shot, val) | 7242 | 32 | 32768 | False | 2,904 ± 405 / 1,155 ± 349 | 2.61 | 51.73 ± 4.51 / 40.50 ± 6.05 | 36.06 ± 3.31 / 53.46 ± 1.79 | 19.42 ± 5.08 / 46.92 ± 5.36 | 59.26 ± 1.66 / 64.40 ± 1.35 | 9.3.0 | 9.3.0 | 9.3.0 | 12.5.2 |
| HPLT/gpt-13b-nordic-prerelease (few-shot) | 14030 | 131 | 4099 | True | 3,520 ± 736 / 823 ± 273 | 2.65 | 32.19 ± 4.64 / 24.93 ± 4.09 | 72.26 ± 6.90 / 72.58 ± 5.87 | 2.39 ± 1.29 / 48.49 ± 2.46 | 48.92 ± 2.28 / 53.44 ± 2.49 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| LumiOpen/Viking-13B (few-shot) | 14030 | 131 | 4099 | True | 3,480 ± 727 / 822 ± 274 | 2.65 | 32.30 ± 4.52 / 24.91 ± 3.98 | 72.28 ± 6.64 / 72.80 ± 5.64 | 2.46 ± 1.31 / 48.51 ± 2.46 | 48.88 ± 2.35 / 53.41 ± 2.56 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| allenai/OLMo-7B (few-shot) | 6888 | 50 | 2051 | True | 5,403 ± 1,133 / 1,294 ± 423 | 2.65 | 37.36 ± 2.11 / 28.59 ± 3.03 | 72.08 ± 1.20 / 63.52 ± 3.36 | -0.86 ± 1.61 / 33.84 ± 0.59 | 45.16 ± 0.96 / 51.46 ± 0.93 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking | 135 | 120 | 512 | True | 33,753 ± 8,349 / 5,937 ± 1,946 | 2.69 | 65.50 ± 1.20 / 59.72 ± 1.22 | 68.33 ± 1.03 / 64.03 ± 2.47 | 14.81 ± 6.63 / 55.50 ± 4.28 | 16.11 ± 1.18 / 22.88 ± 1.34 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| sentence-transformers/quora-distilbert-multilingual | 135 | 120 | 512 | True | 26,458 ± 5,992 / 5,274 ± 1,731 | 2.69 | 65.50 ± 1.20 / 59.72 ± 1.22 | 68.36 ± 1.18 / 63.94 ± 2.47 | 14.81 ± 6.63 / 55.50 ± 4.28 | 16.11 ± 1.18 / 22.88 ± 1.34 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| DDSC/roberta-base-danish | 125 | 50 | 512 | True | 15,004 ± 2,964 / 3,290 ± 1,092 | 2.71 | 65.95 ± 1.70 / 60.53 ± 1.38 | 64.02 ± 2.78 / 62.27 ± 4.19 | 0.80 ± 0.78 / 47.24 ± 3.43 | 28.46 ± 0.90 / 33.13 ± 0.88 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| google/gemma-2b (few-shot) | 2506 | 256 | 8192 | True | 6,087 ± 1,046 / 1,902 ± 563 | 2.74 | 14.67 ± 4.71 / 14.85 ± 3.77 | 75.45 ± 1.10 / 64.08 ± 1.47 | 3.82 ± 1.23 / 44.81 ± 3.55 | 51.73 ± 0.88 / 57.35 ± 0.82 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 |
| AI-Sweden-Models/gpt-sw3-20b-instruct (few-shot) | 20918 | 64 | 2048 | True | 1,831 ± 587 / 268 ± 90 | 2.77 | 15.70 ± 1.54 / 14.65 ± 1.52 | 68.23 ± 3.81 / 71.17 ± 3.07 | 12.39 ± 1.39 / 50.99 ± 3.37 | 52.04 ± 0.97 / 60.86 ± 0.77 | 9.3.1 | 9.3.1 | 9.3.1 | 9.3.1 |
| sarnikowski/convbert-medium-small-da-cased | 24 | 29 | 512 | True | 13,821 ± 2,209 / 3,547 ± 1,184 | 2.80 | 58.01 ± 1.23 / 53.87 ± 1.25 | 57.67 ± 1.61 / 53.64 ± 0.68 | 13.40 ± 4.31 / 55.37 ± 2.61 | 24.92 ± 0.80 / 30.11 ± 0.83 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| ltg/norbert3-xs | 15 | 50 | 508 | True | 14,208 ± 2,713 / 3,059 ± 1,002 | 2.81 | 67.53 ± 1.66 / 62.96 ± 1.62 | 59.27 ± 2.14 / 55.26 ± 1.73 | 2.83 ± 2.01 / 49.25 ± 1.48 | 24.11 ± 2.76 / 27.79 ± 2.63 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| AI-Sweden-Models/gpt-sw3-1.3b-instruct (few-shot) | 1445 | 64 | 2048 | True | 4,544 ± 1,000 / 1,106 ± 359 | 2.82 | 19.04 ± 2.67 / 19.98 ± 2.64 | 73.34 ± 1.34 / 68.41 ± 2.31 | 2.90 ± 1.74 / 44.43 ± 4.49 | 47.45 ± 0.58 / 54.69 ± 0.56 | 12.5.2 | 9.3.1 | 12.1.0 | 12.4.0 |
| allenai/OLMo-7B-Twin-2T (few-shot) | 6888 | 50 | 2051 | True | 5,484 ± 1,125 / 1,317 ± 425 | 2.83 | 20.49 ± 7.78 / 19.50 ± 6.82 | 70.04 ± 2.28 / 60.77 ± 3.00 | 2.28 ± 1.77 / 36.86 ± 3.97 | 45.85 ± 1.19 / 51.08 ± 1.21 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| KBLab/albert-base-swedish-cased-alpha | 14 | 50 | 512 | True | 15,925 ± 2,281 / 4,780 ± 1,554 | 2.87 | 47.19 ± 9.01 / 44.34 ± 8.27 | 56.57 ± 1.41 / 53.47 ± 0.84 | 20.92 ± 4.12 / 59.05 ± 1.86 | 23.86 ± 1.25 / 30.47 ± 1.51 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| HPLT/gpt-7b-nordic-prerelease (few-shot) | 7550 | 131 | 4096 | True | 5,404 ± 931 / 1,638 ± 542 | 2.89 | 27.07 ± 6.33 / 25.24 ± 4.89 | 61.96 ± 2.69 / 67.81 ± 2.27 | 2.65 ± 1.46 / 40.25 ± 4.08 | 46.16 ± 0.91 / 52.35 ± 0.87 | 12.5.2 | 12.3.2 | 12.3.2 | 12.3.2 |
| sarnikowski/electra-small-discriminator-da-256-cased | 13 | 29 | 512 | True | 20,340 ± 3,185 / 5,178 ± 1,700 | 2.92 | 52.79 ± 1.21 / 48.47 ± 0.71 | 57.93 ± 1.56 / 53.71 ± 0.61 | 14.72 ± 2.01 / 55.92 ± 1.11 | 20.54 ± 0.97 / 26.37 ± 0.75 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| LumiOpen/Viking-7B (few-shot) | 7550 | 131 | 4096 | True | 5,723 ± 1,025 / 1,670 ± 559 | 2.93 | 21.84 ± 3.27 / 21.14 ± 3.21 | 63.60 ± 4.10 / 68.73 ± 3.10 | 0.65 ± 1.59 / 43.82 ± 3.40 | 46.51 ± 1.10 / 52.58 ± 0.98 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| google/gemma-7b-it (few-shot) | 8538 | 256 | 8192 | False | 1,792 ± 249 / 668 ± 203 | 2.94 | 59.26 ± 2.00 / 52.73 ± 2.71 | 28.63 ± 1.24 / 50.95 ± 0.75 | 11.43 ± 1.88 / 53.31 ± 1.74 | 46.67 ± 1.97 / 53.24 ± 1.72 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| AI-Sweden-Models/gpt-sw3-1.3b (few-shot) | 1445 | 64 | 2048 | True | 4,608 ± 988 / 1,115 ± 354 | 2.95 | 6.08 ± 5.75 / 8.77 ± 4.46 | 71.38 ± 1.76 / 73.21 ± 1.18 | 1.17 ± 1.07 / 49.78 ± 0.86 | 45.55 ± 0.85 / 51.69 ± 0.79 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.1 |
| AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct (few-shot) | 7111 | 64 | 2048 | True | 2,383 ± 451 / 718 ± 221 | 2.96 | 14.58 ± 1.30 / 14.79 ± 1.27 | 56.60 ± 3.37 / 62.73 ± 3.61 | 10.92 ± 1.83 / 52.63 ± 2.98 | 50.18 ± 0.54 / 57.90 ± 0.53 | 9.2.0 | 9.2.0 | 9.2.0 | 12.4.0 |
| sarnikowski/convbert-small-da-cased | 13 | 29 | 512 | True | 14,273 ± 2,312 / 3,555 ± 1,187 | 2.97 | 55.06 ± 0.96 / 51.37 ± 1.03 | 53.70 ± 1.46 / 51.98 ± 0.58 | 12.38 ± 3.23 / 55.18 ± 1.91 | 22.53 ± 0.86 / 27.59 ± 0.94 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| danish-foundation-models/encoder-medium-v1 | 111 | 32 | 512 | True | 16,130 ± 2,433 / 4,566 ± 1,473 | 2.99 | 49.62 ± 2.01 / 46.05 ± 2.11 | 58.70 ± 2.54 / 58.15 ± 3.35 | 2.23 ± 2.12 / 46.43 ± 4.85 | 25.45 ± 0.75 / 30.80 ± 0.81 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| EuropeanParliament/EUBERT | 94 | 66 | 512 | True | 20,070 ± 3,977 / 4,400 ± 1,435 | 3.01 | 38.36 ± 1.60 / 37.63 ± 1.73 | 59.00 ± 1.72 / 56.25 ± 1.72 | 19.40 ± 5.65 / 58.63 ± 3.13 | 19.23 ± 0.59 / 26.64 ± 0.90 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| Maltehb/danish-bert-botxo | 111 | 32 | 512 | True | 16,091 ± 2,427 / 4,575 ± 1,485 | 3.01 | 50.29 ± 1.22 / 47.15 ± 1.18 | 57.42 ± 1.88 / 56.53 ± 1.66 | 4.94 ± 1.62 / 51.57 ± 1.19 | 24.16 ± 1.28 / 30.28 ± 1.18 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| norallm/normistral-7b-scratch (few-shot) | 7248 | 33 | 2048 | True | 3,192 ± 454 / 1,198 ± 357 | 3.01 | 13.79 ± 8.46 / 14.43 ± 7.23 | 71.59 ± 2.78 / 59.82 ± 1.71 | -0.89 ± 1.22 / 43.82 ± 3.45 | 38.33 ± 1.79 / 44.00 ± 1.70 | 10.0.0 | 10.0.0 | 10.0.0 | 10.0.0 |
| jannesg/bertsson | 124 | 50 | 512 | True | 15,314 ± 2,786 / 3,666 ± 1,201 | 3.04 | 51.13 ± 2.13 / 46.67 ± 1.98 | 61.67 ± 1.40 / 59.12 ± 2.44 | 2.87 ± 1.53 / 48.92 ± 2.37 | 17.24 ± 1.13 / 25.10 ± 1.06 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| AI-Sweden-Models/gpt-sw3-6.7b (few-shot) | 7111 | 64 | 2048 | True | 2,285 ± 443 / 671 ± 205 | 3.07 | 18.83 ± 6.41 / 17.59 ± 4.55 | 53.68 ± 10.39 / 58.92 ± 10.87 | 3.49 ± 2.20 / 46.13 ± 4.13 | 49.81 ± 0.70 / 55.99 ± 0.69 | 11.0.0 | 11.0.0 | 11.0.0 | 11.0.0 |
| jannikskytt/MeDa-Bert | 111 | 32 | 511 | True | 16,114 ± 2,429 / 4,566 ± 1,482 | 3.09 | 48.32 ± 1.62 / 45.04 ± 1.50 | 53.98 ± 2.05 / 52.94 ± 1.88 | 3.33 ± 2.12 / 51.06 ± 1.15 | 23.15 ± 2.61 / 29.17 ± 2.19 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| sentence-transformers/distiluse-base-multilingual-cased-v1 | 135 | 120 | 512 | True | 34,042 ± 8,482 / 5,951 ± 1,950 | 3.09 | 49.86 ± 1.85 / 44.53 ± 1.58 | 60.06 ± 1.30 / 55.65 ± 1.34 | 3.18 ± 1.63 / 48.38 ± 1.42 | 16.08 ± 2.60 / 23.46 ± 3.10 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Maltehb/aelaectra-danish-electra-small-cased | 14 | 32 | 128 | True | 4,593 ± 114 / 3,034 ± 973 | 3.12 | 57.82 ± 1.40 / 54.55 ± 1.33 | 55.68 ± 1.09 / 52.81 ± 0.44 | 19.26 ± 1.80 / 58.62 ± 1.22 | 0.00 ± 0.00 / 0.00 ± 0.00 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| sentence-transformers/distiluse-base-multilingual-cased-v2 | 135 | 120 | 512 | True | 33,247 ± 8,123 / 6,017 ± 1,977 | 3.13 | 51.67 ± 1.46 / 53.62 ± 1.02 | 62.71 ± 0.81 / 57.24 ± 1.73 | 2.32 ± 1.83 / 48.77 ± 1.62 | 8.76 ± 1.03 / 17.62 ± 1.24 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| sentence-transformers/distiluse-base-multilingual-cased | 135 | 120 | 512 | True | 19,206 ± 4,451 / 3,658 ± 1,187 | 3.13 | 51.67 ± 1.46 / 53.62 ± 1.02 | 63.04 ± 0.70 / 56.50 ± 1.19 | 2.32 ± 1.83 / 48.77 ± 1.62 | 8.93 ± 1.00 / 17.75 ± 1.16 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| Qwen/Qwen1.5-1.8B-Chat (few-shot) | 1837 | 152 | 32768 | False | 8,304 ± 1,846 / 1,933 ± 617 | 3.14 | 20.94 ± 3.73 / 18.26 ± 2.84 | 52.54 ± 3.33 / 60.44 ± 3.13 | 0.34 ± 1.22 / 36.61 ± 1.57 | 43.55 ± 1.14 / 50.53 ± 1.40 | 12.5.2 | 11.0.0 | 12.1.0 | 12.5.0 |
| Qwen/Qwen1.5-1.8B (few-shot) | 1837 | 152 | 32768 | True | 5,666 ± 1,328 / 1,256 ± 408 | 3.16 | 18.01 ± 6.41 / 18.55 ± 4.65 | 51.91 ± 4.78 / 59.44 ± 4.65 | 1.49 ± 1.95 / 40.76 ± 4.07 | 44.83 ± 0.63 / 51.87 ± 0.72 | 12.5.2 | 10.0.1 | 12.1.0 | 12.1.0 |
| dbmdz/bert-mini-historic-multilingual-cased | 12 | 32 | 512 | True | 47,122 ± 9,661 / 9,714 ± 3,152 | 3.16 | 50.07 ± 4.14 / 47.04 ± 3.83 | 56.10 ± 1.85 / 52.92 ± 0.76 | 5.05 ± 2.27 / 51.08 ± 1.44 | 14.49 ± 1.13 / 22.34 ± 1.16 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| mhenrichsen/danskgpt-tiny-chat (few-shot) | 1100 | 32 | 2048 | False | 1,745 ± 978 / 686 ± 159 | 3.16 | 27.31 ± 4.23 / 26.33 ± 4.40 | 45.94 ± 12.82 / 55.94 ± 8.25 | -0.97 ± 1.64 / 36.69 ± 2.34 | 35.57 ± 2.45 / 41.66 ± 2.41 | 9.1.2 | 9.1.2 | 9.1.2 | 12.4.0 |
| NbAiLab/nb-gpt-j-6B-alpaca (few-shot) | 6055 | 50 | 1024 | False | 2,607 ± 592 / 680 ± 208 | 3.20 | 13.28 ± 4.32 / 13.40 ± 2.95 | 60.17 ± 8.39 / 65.99 ± 4.66 | 1.52 ± 1.94 / 45.19 ± 3.80 | 37.23 ± 1.07 / 46.83 ± 0.82 | 9.3.1 | 10.0.1 | 10.0.1 | 12.4.0 |
| google/gemma-2b-it (few-shot) | 2506 | 256 | 8192 | False | 6,471 ± 1,142 / 1,961 ± 584 | 3.20 | 33.51 ± 2.12 / 23.48 ± 2.69 | 43.97 ± 1.64 / 57.41 ± 1.18 | 0.53 ± 1.09 / 39.60 ± 1.99 | 39.39 ± 1.04 / 47.28 ± 1.02 | 12.5.2 | 12.1.0 | 12.1.0 | 12.4.0 |
| jjzha/dajobbert-base-uncased | 110 | 32 | 512 | True | 16,243 ± 2,428 / 4,593 ± 1,484 | 3.24 | 42.99 ± 1.57 / 39.98 ± 1.42 | 55.49 ± 1.28 / 55.99 ± 2.15 | 4.69 ± 2.28 / 51.01 ± 1.58 | 14.22 ± 1.90 / 20.83 ± 1.46 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| AI-Sweden-Models/gpt-sw3-356m-instruct (few-shot) | 471 | 64 | 2048 | True | 5,855 ± 1,373 / 1,223 ± 391 | 3.27 | 14.84 ± 1.63 / 15.90 ± 1.71 | 59.00 ± 3.60 / 54.09 ± 1.46 | 0.06 ± 1.21 / 34.76 ± 1.15 | 34.37 ± 1.36 / 40.44 ± 1.53 | 12.5.2 | 9.3.2 | 12.1.0 | 12.4.0 |
| Maltehb/aelaectra-danish-electra-small-uncased | 14 | 32 | 128 | True | 5,995 ± 135 / 3,839 ± 1,247 | 3.28 | 39.17 ± 4.06 / 36.74 ± 3.78 | 57.71 ± 1.40 / 53.54 ± 0.59 | 17.10 ± 2.57 / 57.41 ± 1.03 | 0.11 ± 0.11 / 0.13 ± 0.12 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Qwen/Qwen1.5-4B (few-shot) | 3950 | 152 | 32768 | True | 3,248 ± 739 / 761 ± 252 | 3.40 | 37.26 ± 4.28 / 29.89 ± 5.96 | 5.20 ± 7.35 / 30.65 ± 4.97 | 1.85 ± 1.54 / 33.71 ± 0.46 | 54.15 ± 0.58 / 60.15 ± 0.59 | 12.5.2 | 9.3.2 | 12.1.0 | 12.1.0 |
| RuterNorway/Llama-2-7b-chat-norwegian (few-shot) | unknown | 32 | 4096 | False | 10,890 ± 2,686 / 2,186 ± 750 | 3.40 | 22.38 ± 3.00 / 22.09 ± 2.85 | 31.11 ± 12.17 / 36.84 ± 11.52 | 0.09 ± 0.67 / 33.42 ± 0.30 | 44.36 ± 1.34 / 50.14 ± 1.15 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.2 |
| AI-Sweden-Models/gpt-sw3-356m (few-shot) | 471 | 64 | 2048 | True | 5,758 ± 1,348 / 1,215 ± 391 | 3.50 | 23.77 ± 3.70 / 23.06 ± 3.46 | 34.29 ± 11.64 / 36.76 ± 7.46 | 1.57 ± 1.70 / 40.84 ± 1.99 | 33.70 ± 1.46 / 38.82 ± 1.54 | 9.3.1 | 9.3.1 | 9.3.2 | 12.5.1 |
| Qwen/Qwen1.5-0.5B-Chat (few-shot) | 620 | 152 | 32768 | False | 11,740 ± 3,000 / 2,209 ± 721 | 3.50 | 18.57 ± 4.62 / 17.69 ± 4.61 | 40.23 ± 5.86 / 49.01 ± 4.77 | 0.21 ± 1.06 / 39.60 ± 3.61 | 29.49 ± 2.47 / 35.01 ± 2.72 | 12.5.2 | 11.0.0 | 12.1.0 | 12.4.0 |
| 3ebdola/Dialectal-Arabic-XLM-R-Base | 278 | 250 | 512 | True | 12,783 ± 2,537 / 2,712 ± 885 | 3.52 | 42.78 ± 3.26 / 40.46 ± 3.04 | 44.95 ± 2.30 / 48.17 ± 1.06 | 1.43 ± 1.34 / 48.66 ± 2.45 | 8.71 ± 2.58 / 16.77 ± 2.50 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| mhenrichsen/danskgpt-tiny (few-shot) | 1100 | 32 | 2048 | True | 8,597 ± 1,983 / 1,926 ± 600 | 3.55 | 23.92 ± 6.88 / 22.42 ± 6.73 | 31.93 ± 14.68 / 43.80 ± 8.79 | 0.46 ± 1.91 / 43.45 ± 3.64 | 30.81 ± 2.73 / 35.67 ± 2.95 | 0.0.0 | 0.0.0 | 0.0.0 | 12.5.1 |
| allenai/OLMo-1B (few-shot) | 1177 | 50 | 2051 | True | 8,536 ± 1,926 / 1,940 ± 619 | 3.57 | 29.39 ± 3.08 / 29.93 ± 3.14 | 38.95 ± 11.78 / 43.61 ± 8.46 | -1.35 ± 1.76 / 40.70 ± 4.25 | 17.85 ± 3.77 / 20.30 ± 4.04 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 |
| Qwen/Qwen1.5-0.5B (few-shot) | 620 | 152 | 32768 | True | 11,371 ± 2,924 / 2,122 ± 692 | 3.58 | 28.96 ± 2.39 / 26.49 ± 3.14 | 26.58 ± 5.12 / 28.64 ± 5.35 | -1.88 ± 1.46 / 35.45 ± 2.92 | 34.59 ± 1.06 / 40.95 ± 1.11 | 12.5.2 | 10.0.1 | 12.1.0 | 12.1.0 |
| dbmdz/bert-tiny-historic-multilingual-cased | 5 | 32 | 512 | True | 78,027 ± 15,466 / 17,064 ± 5,335 | 3.58 | 26.87 ± 2.26 / 25.42 ± 2.14 | 57.41 ± 1.89 / 53.50 ± 0.75 | -1.06 ± 1.33 / 48.72 ± 0.87 | 5.54 ± 1.44 / 10.83 ± 2.62 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| RabotaRu/HRBert-mini | 80 | 200 | 512 | True | 54,951 ± 11,500 / 11,401 ± 3,819 | 3.70 | 24.61 ± 1.58 / 23.05 ± 1.43 | 52.31 ± 1.22 / 51.51 ± 0.49 | 1.32 ± 1.87 / 46.80 ± 4.29 | 2.86 ± 0.73 / 8.44 ± 2.01 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| alexanderfalk/danbert-small-cased | 83 | 52 | 512 | True | 30,013 ± 4,309 / 8,840 ± 2,859 | 3.71 | 22.47 ± 1.55 / 20.74 ± 1.42 | 53.88 ± 1.97 / 52.30 ± 1.04 | 1.55 ± 1.62 / 44.06 ± 3.56 | 1.12 ± 0.85 / 3.07 ± 1.91 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| fresh-electra-small | 14 | 31 | 512 | True | 7,840 ± 1,538 / 3,024 ± 438 | 3.84 | 10.54 ± 1.12 / 9.71 ± 1.00 | 55.54 ± 2.75 / 52.75 ± 1.09 | -0.15 ± 0.52 / 35.33 ± 3.00 | 0.02 ± 0.03 / 0.04 ± 0.05 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| fresh-xlm-roberta-base | 278 | 250 | 512 | True | 2,214 ± 94 / 1,494 ± 229 | 3.86 | 11.91 ± 3.04 / 11.31 ± 2.90 | 51.11 ± 5.32 / 50.09 ± 3.33 | 0.86 ± 0.82 / 39.16 ± 3.63 | 2.00 ± 0.63 / 7.20 ± 1.68 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| AI-Sweden-Models/gpt-sw3-126m-instruct (few-shot) | 186 | 64 | 2048 | True | 7,717 ± 1,553 / 2,013 ± 625 | 4.07 | 23.05 ± 2.31 / 24.35 ± 1.99 | 12.47 ± 7.10 / 23.03 ± 8.78 | 0.08 ± 0.16 / 33.34 ± 0.30 | 20.43 ± 2.69 / 24.25 ± 2.67 | 9.3.2 | 9.3.2 | 11.0.0 | 12.4.0 |
| NbAiLab/nb-gpt-j-6B-v2 (few-shot) | 6051 | 50 | 1024 | False | 2,556 ± 580 / 681 ± 214 | 4.11 | 0.31 ± 0.55 / 0.29 ± 0.50 | 27.42 ± 12.16 / 38.74 ± 10.05 | 0.07 ± 1.06 / 35.80 ± 1.73 | 17.82 ± 11.21 / 31.12 ± 8.39 | 9.3.1 | 10.0.1 | 11.0.0 | 12.5.1 |
| NbAiLab/nb-gpt-j-6B@sharded (few-shot) | unknown | 50 | 1024 | True | 2,630 ± 605 / 684 ± 217 | 4.19 | 0.01 ± 0.02 / 0.11 ± 0.12 | 33.50 ± 13.13 / 39.30 ± 11.93 | -0.02 ± 0.60 / 34.92 ± 2.99 | 4.79 ± 3.55 / 18.06 ± 2.80 | 9.3.1 | 10.0.1 | 10.0.1 | 12.5.1 |
| RJuro/kanelsnegl-v0.1 (few-shot) | 7242 | 32 | 512 | True | 9,757 ± 2,047 / 2,200 ± 705 | 4.27 | 0.00 ± 0.00 / 0.00 ± 0.00 | 34.63 ± 9.69 / 40.92 ± 6.88 | 0.00 ± 0.00 / 33.30 ± 0.27 | 0.00 ± 0.00 / 8.92 ± 2.90 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.1 |
| AI-Sweden-Models/gpt-sw3-126m (few-shot) | 186 | 64 | 2048 | True | 8,958 ± 1,815 / 2,240 ± 696 | 4.32 | 5.66 ± 4.11 / 8.37 ± 3.24 | 8.15 ± 8.87 / 24.31 ± 7.12 | -0.81 ± 1.16 / 36.81 ± 2.47 | 16.40 ± 2.88 / 19.18 ± 3.18 | 9.2.0 | 9.2.0 | 9.2.0 | 12.5.1 |
| RJuro/kanelsnegl-v0.2 (few-shot) | 7242 | 32 | 512 | True | 1,373 ± 120 / 709 ± 172 | 4.38 | 0.00 ± 0.00 / 0.00 ± 0.00 | 28.62 ± 12.67 / 35.36 ± 8.35 | 0.00 ± 0.00 / 33.30 ± 0.27 | 0.00 ± 0.00 / 19.59 ± 6.84 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| NorGLM/NorGPT-369M (few-shot) | unknown | 64 | 1024 | True | 19,896 ± 5,099 / 3,848 ± 1,251 | 4.60 | 1.47 ± 1.90 / 1.32 ± 1.69 | 5.50 ± 4.49 / 28.77 ± 3.76 | -2.19 ± 1.29 / 40.52 ± 3.02 | 0.10 ± 0.06 / 4.36 ± 0.44 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| ai-forever/mGPT (few-shot) | unknown | 100 | 1024 | True | 13,551 ± 4,259 / 2,563 ± 838 | 4.66 | 0.00 ± 0.00 / 0.00 ± 0.00 | 0.00 ± 0.00 / 19.32 ± 0.16 | 0.49 ± 1.29 / 39.12 ± 3.92 | 6.24 ± 3.13 / 7.85 ± 3.67 | 9.3.1 | 10.0.1 | 11.0.0 | 12.5.1 |
| peter-sk/gpt-neox-da (few-shot) | 1515 | 50 | 1024 | True | 6,025 ± 1,442 / 1,342 ± 431 | 4.70 | 0.26 ± 0.16 / 0.26 ± 0.14 | 4.75 ± 2.54 / 27.85 ± 1.59 | -0.60 ± 1.56 / 40.53 ± 2.93 | 0.06 ± 0.09 / 1.07 ± 0.35 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| Sigurdur/qa-icebreaker (few-shot) | 110 | 32 | 1024 | False | 44,889 ± 6,944 / 13,506 ± 4,256 | 4.80 | 0.00 ± 0.00 / 0.00 ± 0.00 | -0.10 ± 1.04 / 22.69 ± 0.52 | 0.00 ± 0.00 / 33.30 ± 0.27 | 0.00 ± 0.00 / 0.02 ± 0.01 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| Sigurdur/icebreaker (few-shot) | 110 | 32 | 1024 | False | 48,619 ± 7,681 / 13,831 ± 4,404 | 4.85 | 0.00 ± 0.00 / 0.00 ± 0.00 | -3.60 ± 3.63 / 20.29 ± 1.99 | 0.00 ± 0.00 / 33.30 ± 0.27 | 0.00 ± 0.00 / 0.05 ± 0.03 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
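Since the leaderboard is just structured data, it can also be analysed programmatically. The snippet below is a minimal sketch rather than an official tool: it assumes a hypothetical CSV export named `swedish_nlu.csv` whose column names match the table headers above (a real export may name its columns differently), parses the paired "mean ± CI / mean ± CI" cells, and prints the five best-ranked models.

```python
import csv
import re

# Hypothetical path to a CSV export of this leaderboard; adjust the file
# name and column names to match the actual export.
CSV_PATH = "swedish_nlu.csv"

# A cell such as "76.86 ± 1.89 / 54.97 ± 4.44" packs two metrics, each
# reported as mean ± confidence interval; means may be negative (MCC)
# and may contain thousands separators (the Speed column).
CELL = re.compile(
    r"(?P<m1>-?[\d,]+(?:\.\d+)?) ± (?P<e1>[\d,]+(?:\.\d+)?)"
    r" / (?P<m2>-?[\d,]+(?:\.\d+)?) ± (?P<e2>[\d,]+(?:\.\d+)?)"
)

def parse_cell(cell: str) -> tuple[float, float]:
    """Return the two metric means from a 'mean ± CI / mean ± CI' cell."""
    match = CELL.fullmatch(cell.strip())
    if match is None:
        raise ValueError(f"unrecognised cell format: {cell!r}")
    return (float(match["m1"].replace(",", "")),
            float(match["m2"].replace(",", "")))

with open(CSV_PATH, newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# The Rank column aggregates all four datasets; lower is better, so the
# head of the sorted list is the leaderboard winner.
rows.sort(key=lambda row: float(row["Rank"]))
for row in rows[:5]:
    suc3_primary, _ = parse_cell(row["SUC3"])
    print(f"{row['Model ID']:<60} rank={row['Rank']:<5} SUC3={suc3_primary:.2f}")
```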