Dutch NLU 🇳🇱

Last updated: 29/04/2024 11:26:26 CET
Models are sorted by the aggregate Rank column, where lower is better. Each dataset column shows two scores separated by a slash (the dataset's primary and secondary metric), with the ± values giving the uncertainty of each score; the four version columns record the benchmark version used for each dataset evaluation. Parameter counts are given in millions and vocabulary sizes in thousands.

| Model ID | Parameters (M) | Vocabulary size (k) | Context (tokens) | Commercial | Speed | Rank | CoNLL-nl | Dutch Social | ScaLA-nl | SQuAD-nl | CoNLL-nl version | Dutch Social version | ScaLA-nl version | SQuAD-nl version |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| intfloat/multilingual-e5-large | 559 | 250 | 512 | True | 6,732 ± 1,273 / 1,633 ± 523 | 1.39 | 82.31 ± 2.14 / 86.91 ± 1.34 | 32.64 ± 2.91 / 49.90 ± 3.42 | 58.51 ± 4.12 / 78.17 ± 2.32 | 45.32 ± 1.91 / 57.53 ± 1.78 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| setu4993/LaBSE | 470 | 501 | 512 | True | 25,418 ± 6,435 / 4,536 ± 1,452 | 1.47 | 82.02 ± 1.04 / 84.71 ± 0.59 | 33.99 ± 4.05 / 50.69 ± 4.23 | 60.77 ± 1.53 / 79.80 ± 0.87 | 41.55 ± 1.08 / 51.73 ± 1.18 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| DTAI-KULeuven/robbert-2022-dutch-base | 118 | 43 | 512 | True | 11,307 ± 2,134 / 2,580 ± 834 | 1.75 | 79.84 ± 1.41 / 84.42 ± 1.03 | 24.58 ± 5.35 / 42.93 ± 4.78 | 68.76 ± 1.47 / 83.77 ± 0.93 | 27.63 ± 1.32 / 36.98 ± 1.39 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| pdelobelle/robbert-v2-dutch-base | 116 | 40 | 512 | True | 15,481 ± 2,820 / 3,708 ± 1,186 | 1.77 | 78.30 ± 1.97 / 83.07 ± 1.30 | 26.68 ± 2.90 / 44.41 ± 2.97 | 63.83 ± 3.09 / 80.68 ± 2.18 | 28.34 ± 1.30 / 37.79 ± 1.37 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| meta-llama/Meta-Llama-3-70B (few-shot, val) | 70554 | 128 | 8317 | True | 312 ± 55 / 177 ± 51 | 1.78 | 72.91 ± 3.24 / 68.06 ± 4.62 | 19.08 ± 3.37 / 42.04 ± 2.31 | 54.33 ± 3.49 / 75.54 ± 2.31 | 63.99 ± 2.07 / 77.63 ± 1.16 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| ZurichNLP/unsup-simcse-xlm-roberta-base | 277 | 250 | 512 | True | 34,520 ± 7,443 / 6,730 ± 2,224 | 1.82 | 78.45 ± 1.88 / 83.50 ± 0.85 | 22.67 ± 7.22 / 44.07 ± 6.51 | 54.92 ± 9.62 / 76.14 ± 5.00 | 31.82 ± 2.84 / 40.85 ± 3.02 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| gpt-4-1106-preview (few-shot) | unknown | 100 | 128000 | True | 573 ± 185 / 87 ± 32 | 1.90 | 66.44 ± 2.18 / 56.97 ± 2.87 | 15.05 ± 1.85 / 35.62 ± 1.91 | 74.01 ± 1.29 / 86.71 ± 0.78 | 57.81 ± 1.23 / 74.51 ± 0.62 | 12.3.2 | 12.3.2 | 12.3.2 | 12.3.2 |
| intfloat/multilingual-e5-base | 277 | 250 | 512 | True | 14,965 ± 2,890 / 3,322 ± 1,074 | 1.96 | 79.12 ± 1.90 / 83.05 ± 1.09 | 27.67 ± 2.85 / 44.90 ± 2.69 | 39.28 ± 12.28 / 67.90 ± 5.94 | 35.71 ± 1.70 / 46.63 ± 1.40 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| DTAI-KULeuven/robbertje-1-gb-non-shuffled | 74 | 40 | 512 | True | 21,007 ± 3,892 / 4,922 ± 1,588 | 2.03 | 74.50 ± 1.61 / 81.38 ± 0.87 | 32.23 ± 1.76 / 50.11 ± 2.55 | 54.57 ± 1.72 / 75.82 ± 1.07 | 6.31 ± 0.28 / 11.55 ± 0.22 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| FacebookAI/xlm-roberta-large | 559 | 250 | 512 | True | 17,897 ± 3,921 / 3,463 ± 1,141 | 2.03 | 83.49 ± 1.51 / 86.12 ± 1.21 | 8.82 ± 7.93 / 30.82 ± 4.71 | 64.80 ± 8.79 / 80.93 ± 6.29 | 50.72 ± 1.20 / 61.66 ± 1.16 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| 152334H/miqu-1-70b-sf (few-shot, val) | 68977 | 32 | 32889 | True | 2,126 ± 676 / 319 ± 104 | 2.05 | 67.00 ± 3.69 / 56.41 ± 4.29 | 15.33 ± 4.14 / 36.14 ± 2.91 | 55.48 ± 4.37 / 77.55 ± 2.24 | 61.02 ± 1.67 / 76.87 ± 1.15 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| DTAI-KULeuven/robbert-2023-dutch-base | 124 | 50 | 512 | True | 11,230 ± 1,939 / 2,750 ± 897 | 2.06 | 82.22 ± 1.28 / 86.32 ± 0.75 | 28.20 ± 3.75 / 44.38 ± 3.44 | 55.12 ± 11.67 / 76.12 ± 7.45 | 9.74 ± 0.34 / 44.34 ± 0.99 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| meta-llama/Meta-Llama-3-70B-Instruct (few-shot, val) | 70554 | 128 | 8317 | True | 1,673 ± 583 / 275 ± 85 | 2.07 | 74.64 ± 3.67 / 71.84 ± 4.01 | 18.90 ± 2.04 / 41.93 ± 1.60 | 49.54 ± 4.22 / 74.03 ± 2.52 | 44.77 ± 1.67 / 71.44 ± 1.30 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| DTAI-KULeuven/robbertje-1-gb-merged | 74 | 40 | 512 | True | 21,027 ± 3,902 / 4,932 ± 1,591 | 2.11 | 72.51 ± 0.97 / 80.14 ± 0.74 | 32.26 ± 2.51 / 47.44 ± 2.78 | 50.00 ± 2.09 / 73.38 ± 1.43 | 5.97 ± 0.44 / 11.08 ± 0.47 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| jhu-clsp/bernice | 277 | 250 | 128 | True | 5,567 ± 450 / 2,483 ± 798 | 2.13 | 78.74 ± 1.42 / 84.14 ± 0.87 | 22.58 ± 5.79 / 41.55 ± 4.55 | 55.39 ± 2.71 / 76.38 ± 2.03 | 5.95 ± 3.06 / 7.23 ± 3.67 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| DTAI-KULeuven/robbert-2023-dutch-large | 354 | 50 | 512 | True | 5,444 ± 911 / 1,413 ± 457 | 2.14 | 81.05 ± 2.44 / 85.20 ± 1.69 | 16.35 ± 6.39 / 36.89 ± 5.03 | 65.18 ± 1.83 / 82.29 ± 0.90 | 11.44 ± 0.50 / 52.98 ± 1.31 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| DTAI-KULeuven/robbertje-1-gb-shuffled | 74 | 40 | 512 | True | 20,616 ± 3,755 / 4,819 ± 1,542 | 2.18 | 73.55 ± 2.27 / 80.69 ± 1.31 | 26.02 ± 3.29 / 42.90 ± 2.60 | 57.03 ± 1.80 / 77.24 ± 1.09 | 6.64 ± 0.36 / 12.04 ± 0.28 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| microsoft/mdeberta-v3-base | 278 | 251 | 512 | True | 20,637 ± 3,925 / 4,497 ± 1,502 | 2.19 | 84.47 ± 1.84 / 87.98 ± 1.21 | 5.16 ± 5.21 / 27.85 ± 3.29 | 71.23 ± 1.62 / 85.45 ± 0.83 | 46.43 ± 0.72 / 57.80 ± 0.84 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| gpt-3.5-turbo-0613 (few-shot, val) | unknown | 100 | 4095 | True | 921 ± 293 / 113 ± 37 | 2.20 | 68.96 ± 3.80 / 58.45 ± 3.71 | 8.81 ± 3.30 / 30.88 ± 2.25 | 58.95 ± 4.48 / 78.64 ± 2.32 | 55.57 ± 2.33 / 68.26 ± 1.85 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Nexusflow/Starling-LM-7B-beta (few-shot) | 7242 | 32 | 8192 | False | 5,876 ± 1,021 / 1,677 ± 546 | 2.22 | 64.47 ± 2.31 / 40.89 ± 2.81 | 13.83 ± 1.91 / 41.53 ± 1.23 | 45.69 ± 1.76 / 72.13 ± 1.39 | 58.03 ± 1.37 / 73.17 ± 0.58 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| google/rembert | 575 | 250 | 512 | True | 11,736 ± 2,822 / 2,102 ± 677 | 2.22 | 75.49 ± 1.75 / 81.37 ± 1.31 | 4.79 ± 3.93 / 27.49 ± 2.37 | 66.47 ± 2.04 / 83.16 ± 1.01 | 55.70 ± 1.62 / 68.38 ± 1.47 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| upstage/SOLAR-10.7B-v1.0 (few-shot) | 10732 | 32 | 4096 | True | 3,780 ± 906 / 799 ± 261 | 2.27 | 65.37 ± 1.61 / 46.10 ± 1.53 | 11.93 ± 1.80 / 34.67 ± 2.84 | 41.67 ± 1.53 / 69.81 ± 1.38 | 67.75 ± 0.62 / 78.01 ± 0.45 | 12.5.3 | 12.5.3 | 12.5.3 | 12.5.3 |
| cardiffnlp/twitter-xlm-roberta-base | 277 | 250 | 512 | True | 34,475 ± 7,465 / 6,712 ± 2,223 | 2.31 | 77.15 ± 1.38 / 81.92 ± 1.32 | 18.78 ± 6.76 / 37.09 ± 4.14 | 56.72 ± 3.83 / 77.53 ± 2.17 | 14.61 ± 4.26 / 20.91 ± 5.21 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| meta-llama/Llama-2-70b-hf (few-shot, val) | 68977 | 32 | 4221 | True | 1,892 ± 650 / 318 ± 105 | 2.32 | 66.50 ± 3.72 / 57.66 ± 3.78 | 7.82 ± 4.30 / 34.91 ± 2.53 | 49.55 ± 4.95 / 73.43 ± 3.38 | 65.26 ± 1.55 / 77.36 ± 1.41 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| sentence-transformers/paraphrase-xlm-r-multilingual-v1 | 277 | 250 | 512 | True | 20,154 ± 4,438 / 3,890 ± 1,256 | 2.38 | 70.59 ± 1.60 / 78.25 ± 1.22 | 21.37 ± 8.79 / 40.62 ± 7.64 | 45.86 ± 2.06 / 71.32 ± 1.40 | 5.20 ± 0.30 / 10.40 ± 0.38 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| meta-llama/Meta-Llama-3-8B-Instruct (few-shot) | 8030 | 128 | 8192 | True | 4,909 ± 1,215 / 978 ± 319 | 2.42 | 68.72 ± 1.81 / 54.89 ± 2.10 | 14.67 ± 2.51 / 41.36 ± 2.04 | 32.91 ± 2.56 / 64.93 ± 1.97 | 45.36 ± 1.31 / 67.50 ± 0.69 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| meta-llama/Llama-2-70b-chat-hf (few-shot, val) | 68977 | 32 | 4221 | True | 1,979 ± 621 / 320 ± 105 | 2.44 | 64.00 ± 3.52 / 48.94 ± 3.83 | 13.30 ± 3.75 / 30.50 ± 2.48 | 30.88 ± 4.62 / 59.62 ± 4.50 | 54.14 ± 1.55 / 70.96 ± 1.01 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| DTAI-KULeuven/robbertje-1-gb-bort | 45 | 40 | 512 | True | 31,087 ± 5,833 / 7,147 ± 2,339 | 2.46 | 66.74 ± 1.53 / 75.07 ± 0.86 | 24.93 ± 6.85 / 41.47 ± 4.05 | 37.19 ± 6.22 / 66.68 ± 3.26 | 5.23 ± 0.43 / 10.67 ± 0.44 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| meta-llama/Meta-Llama-3-8B (few-shot) | 8030 | 128 | 8192 | True | 4,687 ± 1,121 / 967 ± 313 | 2.48 | 62.26 ± 2.20 / 42.41 ± 2.02 | 10.45 ± 2.69 / 33.45 ± 1.99 | 30.30 ± 3.94 / 62.28 ± 2.89 | 62.99 ± 1.00 / 73.73 ± 0.98 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| senseable/WestLake-7B-v2 (few-shot) | 7242 | 32 | 32768 | False | 5,993 ± 1,028 / 1,742 ± 561 | 2.52 | 64.25 ± 2.23 / 46.52 ± 1.72 | 13.66 ± 1.99 / 39.45 ± 1.52 | 28.59 ± 1.48 / 61.24 ± 1.46 | 49.64 ± 0.86 / 68.04 ± 0.55 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| robinsmits/Qwen1.5-7B-Dutch-Chat (few-shot) | 7719 | 152 | 32768 | False | 4,686 ± 1,131 / 996 ± 326 | 2.55 | 57.81 ± 2.68 / 47.15 ± 2.77 | 14.62 ± 2.25 / 41.08 ± 1.81 | 25.34 ± 2.37 / 54.46 ± 3.43 | 56.81 ± 1.44 / 70.49 ± 0.68 | 12.5.3 | 12.5.3 | 12.5.3 | 12.5.3 |
| robinsmits/Qwen1.5-7B-Dutch-Chat-Sft-Bf16 (few-shot) | 7719 | 152 | 32768 | False | 2,413 ± 463 / 700 ± 220 | 2.59 | 56.83 ± 2.31 / 46.81 ± 2.87 | 14.79 ± 1.96 / 41.48 ± 1.53 | 23.58 ± 2.69 / 50.85 ± 3.74 | 55.90 ± 1.80 / 70.07 ± 0.77 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| mlabonne/NeuralBeagle14-7B (few-shot, val) | 7242 | 32 | 8192 | False | 2,549 ± 472 / 784 ± 245 | 2.60 | 63.53 ± 3.80 / 50.43 ± 2.90 | 11.25 ± 4.22 / 39.00 ± 3.14 | 27.76 ± 4.44 / 62.44 ± 2.43 | 50.94 ± 1.12 / 70.12 ± 0.96 | 9.3.2 | 9.3.2 | 9.3.2 | 12.5.2 |
| ReBatch/Llama-3-8B-dutch (few-shot) | 8030 | 128 | 8317 | False | 3,800 ± 1,275 / 566 ± 194 | 2.64 | 60.14 ± 2.00 / 44.91 ± 2.19 | 11.07 ± 1.98 / 34.77 ± 1.31 | 15.67 ± 3.75 / 40.14 ± 2.65 | 59.93 ± 1.17 / 71.20 ± 1.30 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 |
| microsoft/xlm-align-base | 277 | 250 | 512 | True | 14,744 ± 2,870 / 3,265 ± 1,053 | 2.68 | 78.85 ± 2.48 / 83.35 ± 2.28 | 11.80 ± 7.64 / 33.49 ± 6.73 | 14.56 ± 8.02 / 53.64 ± 5.14 | 42.08 ± 7.94 / 51.94 ± 9.08 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Rijgersberg/Mistral-7B-v0.1-chat-nl (few-shot) | 7242 | 32 | 32768 | False | 5,907 ± 1,028 / 1,695 ± 549 | 2.69 | 56.73 ± 1.95 / 38.97 ± 1.84 | 11.08 ± 1.46 / 32.20 ± 1.43 | 19.41 ± 2.55 / 57.17 ± 2.38 | 58.91 ± 0.92 / 71.22 ± 0.72 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| mlabonne/AlphaMonarch-7B (few-shot, val) | 7242 | 32 | 8192 | False | 5,340 ± 1,262 / 1,157 ± 375 | 2.69 | 64.71 ± 5.15 / 53.58 ± 3.82 | 11.14 ± 3.37 / 38.64 ± 2.36 | 25.22 ± 5.45 / 61.28 ± 2.51 | 46.34 ± 1.07 / 66.56 ± 1.49 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| sentence-transformers/quora-distilbert-multilingual | 135 | 120 | 512 | True | 26,458 ± 5,992 / 5,274 ± 1,731 | 2.69 | 67.89 ± 1.61 / 74.48 ± 1.24 | 23.25 ± 6.95 / 44.88 ± 6.27 | 21.36 ± 7.80 / 59.50 ± 3.54 | 4.50 ± 0.39 / 9.94 ± 0.33 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| mistralai/Mistral-7B-v0.1 (few-shot) | 7242 | 32 | 32768 | True | 2,657 ± 524 / 880 ± 278 | 2.70 | 58.15 ± 1.14 / 40.78 ± 1.91 | 7.94 ± 1.25 / 31.02 ± 3.45 | 25.41 ± 3.46 / 61.11 ± 2.36 | 62.56 ± 1.10 / 73.16 ± 0.93 | 9.1.2 | 9.1.2 | 9.1.2 | 12.5.1 |
| alpindale/Mistral-7B-v0.2-hf (few-shot) | 7242 | 32 | 32768 | True | 1,841 ± 297 / 651 ± 193 | 2.75 | 56.76 ± 1.52 / 42.03 ± 1.98 | 7.11 ± 1.17 / 26.36 ± 2.97 | 23.55 ± 2.76 / 59.14 ± 3.18 | 61.89 ± 1.10 / 72.41 ± 1.08 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| sentence-transformers/stsb-xlm-r-multilingual | 278 | 250 | 512 | True | 15,040 ± 2,953 / 3,417 ± 1,100 | 2.77 | 66.85 ± 1.32 / 72.84 ± 0.82 | 20.56 ± 1.44 / 39.67 ± 0.86 | 35.56 ± 1.76 / 66.00 ± 1.15 | 5.04 ± 0.46 / 10.13 ± 0.40 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| mistralai/Mistral-7B-Instruct-v0.2 (few-shot) | 7242 | 32 | 32768 | False | 2,538 ± 415 / 821 ± 253 | 2.78 | 55.56 ± 2.66 / 39.56 ± 2.13 | 12.37 ± 1.64 / 37.37 ± 1.35 | 21.50 ± 1.70 / 59.10 ± 1.32 | 50.77 ± 0.95 / 66.54 ± 0.79 | 9.3.1 | 9.2.0 | 9.3.1 | 12.4.0 |
| Geotrend/distilbert-base-25lang-cased | 108 | 85 | 512 | True | 26,099 ± 5,881 / 5,178 ± 1,665 | 2.79 | 75.02 ± 1.48 / 81.57 ± 0.76 | 7.45 ± 2.99 / 29.70 ± 1.94 | 45.28 ± 0.55 / 71.89 ± 0.59 | 20.18 ± 1.26 / 27.86 ± 1.48 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| RuterNorway/Llama-2-13b-chat-norwegian (few-shot) | unknown | 32 | 4096 | False | 7,778 ± 1,755 / 1,703 ± 552 | 2.82 | 57.66 ± 1.29 / 43.77 ± 2.78 | 8.41 ± 1.47 / 25.59 ± 1.30 | 16.93 ± 2.60 / 55.72 ± 3.35 | 56.29 ± 1.11 / 68.94 ± 0.81 | 9.3.1 | 9.3.1 | 9.3.1 | 9.3.1 |
| BramVanroy/GEITje-7B-ultra (few-shot) | 7242 | 32 | 8192 | False | 2,475 ± 460 / 765 ± 238 | 2.84 | 42.25 ± 2.12 / 27.85 ± 1.09 | 12.78 ± 2.52 / 42.17 ± 1.91 | 18.23 ± 1.91 / 50.04 ± 2.54 | 53.41 ± 1.11 / 66.45 ± 0.46 | 9.3.2 | 9.3.2 | 9.3.2 | 12.4.0 |
| Rijgersberg/GEITje-7B-chat-v2 (few-shot) | 7242 | 32 | 32768 | False | 5,908 ± 1,022 / 1,694 ± 551 | 2.85 | 42.12 ± 4.00 / 31.12 ± 1.86 | 11.06 ± 2.30 / 40.32 ± 1.64 | 19.71 ± 3.65 / 49.65 ± 4.28 | 59.19 ± 0.91 / 70.06 ± 0.82 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| occiglot/occiglot-7b-eu5-instruct (few-shot) | 7242 | 32 | 32768 | False | 2,088 ± 352 / 706 ± 214 | 2.87 | 53.78 ± 1.86 / 41.29 ± 2.07 | 7.78 ± 1.43 / 24.33 ± 1.57 | 16.23 ± 2.49 / 55.09 ± 3.18 | 63.09 ± 1.18 / 73.88 ± 0.72 | 12.5.2 | 12.2.0 | 12.3.1 | 12.4.0 |
| Rijgersberg/GEITje-7B-chat (few-shot) | 7242 | 32 | 32768 | False | 5,920 ± 1,028 / 1,696 ± 550 | 2.90 | 50.69 ± 1.67 / 35.96 ± 2.63 | 8.16 ± 1.68 / 27.37 ± 1.95 | 20.45 ± 2.12 / 59.00 ± 1.21 | 54.48 ± 0.86 / 66.71 ± 0.59 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| Twitter/twhin-bert-base | 278 | 250 | 512 | True | 11,514 ± 2,041 / 2,862 ± 918 | 2.92 | 74.03 ± 3.05 / 80.59 ± 2.24 | 9.53 ± 5.28 / 32.06 ± 4.17 | 39.12 ± 12.90 / 68.36 ± 6.85 | 7.71 ± 0.42 / 12.90 ± 0.39 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| timpal0l/Mistral-7B-v0.1-flashback-v2 (few-shot) | 7242 | 32 | 32768 | True | 5,054 ± 1,200 / 1,056 ± 339 | 2.92 | 54.56 ± 2.96 / 37.86 ± 2.49 | 8.43 ± 1.27 / 24.23 ± 0.94 | 10.99 ± 2.55 / 50.46 ± 4.17 | 55.91 ± 1.08 / 66.78 ± 1.13 | 12.5.3 | 12.5.3 | 12.5.3 | 12.5.3 |
| occiglot/occiglot-7b-eu5 (few-shot) | 7242 | 32 | 32768 | True | 2,219 ± 427 / 717 ± 224 | 2.93 | 51.31 ± 2.32 / 42.95 ± 2.58 | 7.41 ± 1.24 / 26.93 ± 1.56 | 13.04 ± 1.93 / 53.54 ± 2.70 | 59.28 ± 1.15 / 69.67 ± 0.95 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 |
| Qwen/Qwen1.5-4B-Chat (few-shot) | 3950 | 152 | 32768 | False | 4,347 ± 893 / 1,135 ± 365 | 2.95 | 42.52 ± 2.25 / 37.46 ± 3.08 | 14.68 ± 1.40 / 40.53 ± 1.64 | 4.07 ± 2.16 / 35.24 ± 1.77 | 55.18 ± 0.74 / 66.50 ± 0.80 | 12.5.2 | 10.0.1 | 12.1.0 | 12.5.2 |
| Rijgersberg/GEITje-7B (few-shot) | 7242 | 32 | 32768 | True | 10,401 ± 2,529 / 2,123 ± 690 | 2.95 | 47.53 ± 1.90 / 32.42 ± 1.99 | 4.36 ± 2.96 / 28.11 ± 4.71 | 30.67 ± 4.45 / 63.78 ± 2.80 | 56.55 ± 0.70 / 67.56 ± 0.60 | 9.3.1 | 9.3.1 | 9.3.1 | 9.3.1 |
| Twitter/twhin-bert-large | 560 | 250 | 512 | True | 9,707 ± 1,664 / 2,549 ± 831 | 2.95 | 77.35 ± 2.80 / 82.50 ± 1.87 | 6.55 ± 5.33 / 28.68 ± 3.64 | 18.25 ± 8.41 / 54.00 ± 5.57 | 28.37 ± 4.84 / 36.84 ± 5.92 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| mistralai/Mistral-7B-Instruct-v0.1 (few-shot) | 7242 | 32 | 32768 | False | 5,443 ± 1,273 / 1,144 ± 364 | 2.98 | 52.72 ± 2.58 / 33.51 ± 1.22 | 7.91 ± 2.16 / 27.82 ± 1.97 | 18.14 ± 2.10 / 55.42 ± 3.05 | 52.75 ± 0.88 / 67.15 ± 1.08 | 9.3.1 | 9.3.1 | 9.3.1 | 12.4.0 |
| meta-llama/Llama-2-7b-chat-hf (few-shot) | 6738 | 32 | 4096 | False | 2,643 ± 455 / 800 ± 247 | 2.99 | 50.23 ± 2.34 / 37.12 ± 3.30 | 10.07 ± 1.84 / 35.66 ± 2.24 | 14.73 ± 1.62 / 54.59 ± 2.24 | 53.42 ± 0.80 / 66.24 ± 0.84 | 9.3.1 | 9.3.1 | 9.3.1 | 12.4.0 |
| meta-llama/Llama-2-7b-hf (few-shot) | 6738 | 32 | 4096 | True | 2,648 ± 467 / 799 ± 250 | 3.00 | 40.49 ± 4.32 / 30.86 ± 2.27 | 7.10 ± 1.85 / 27.42 ± 1.76 | 18.66 ± 2.39 / 55.25 ± 3.77 | 59.92 ± 0.61 / 70.24 ± 0.75 | 9.2.0 | 9.2.0 | 9.2.0 | 12.5.1 |
| EuropeanParliament/EUBERT | 93 | 66 | 512 | True | 20,070 ± 3,977 / 4,400 ± 1,435 | 3.03 | 49.54 ± 1.42 / 50.44 ± 1.10 | 14.86 ± 3.09 / 35.33 ± 1.77 | 27.90 ± 5.58 / 62.47 ± 3.34 | 20.65 ± 1.02 / 29.40 ± 1.29 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| AI-Sweden-Models/roberta-large-1350k | 354 | 50 | 512 | True | 5,744 ± 969 / 1,539 ± 492 | 3.11 | 73.03 ± 2.07 / 79.97 ± 1.63 | 3.65 ± 4.19 / 26.89 ± 2.64 | 2.00 ± 2.03 / 39.53 ± 4.47 | 42.85 ± 0.98 / 53.68 ± 0.89 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| 01-ai/Yi-6B (few-shot) | 6061 | 64 | 4096 | True | 2,786 ± 532 / 784 ± 250 | 3.13 | 46.34 ± 2.00 / 33.30 ± 1.78 | 8.96 ± 1.44 / 18.10 ± 2.39 | 0.88 ± 1.23 / 33.53 ± 0.48 | 55.33 ± 1.28 / 66.50 ± 0.94 | 9.3.2 | 10.0.0 | 10.0.0 | 12.5.1 |
| AI-Sweden-Models/roberta-large-1160k | 354 | 50 | 512 | True | 5,741 ± 987 / 1,554 ± 494 | 3.15 | 70.92 ± 1.61 / 78.52 ± 1.23 | 3.50 ± 3.15 / 27.25 ± 2.24 | 2.06 ± 1.79 / 41.06 ± 5.11 | 41.40 ± 1.92 / 51.93 ± 2.09 | 10.0.1 | 10.0.1 | 10.0.1 | 10.0.1 |
| BramVanroy/GEITje-7B-ultra-sft (few-shot) | 7242 | 32 | 8192 | False | 5,979 ± 1,044 / 1,724 ± 559 | 3.15 | 39.41 ± 2.93 / 30.59 ± 1.59 | 7.00 ± 3.04 / 35.01 ± 3.72 | 16.10 ± 2.34 / 52.05 ± 3.60 | 53.02 ± 0.97 / 65.63 ± 0.72 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| sentence-transformers/distiluse-base-multilingual-cased-v1 | 135 | 120 | 512 | True | 34,042 ± 8,482 / 5,951 ± 1,950 | 3.18 | 58.67 ± 1.07 / 68.27 ± 0.94 | 17.82 ± 4.47 / 37.11 ± 2.75 | 9.27 ± 4.66 / 52.04 ± 4.01 | 2.17 ± 0.34 / 8.02 ± 0.41 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking | 135 | 120 | 512 | True | 33,753 ± 8,349 / 5,937 ± 1,946 | 3.20 | 65.04 ± 1.07 / 70.94 ± 0.61 | 17.40 ± 6.56 / 39.25 ± 6.11 | -0.95 ± 1.35 / 49.00 ± 0.64 | 3.94 ± 0.39 / 9.50 ± 0.40 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| AI-Sweden-Models/gpt-sw3-20b (few-shot) | 20918 | 64 | 2048 | True | 4,880 ± 1,052 / 1,181 ± 380 | 3.21 | 35.30 ± 3.76 / 33.68 ± 1.80 | 15.67 ± 2.21 / 31.30 ± 4.51 | 1.76 ± 2.37 / 47.60 ± 1.68 | 45.05 ± 1.68 / 55.38 ± 1.66 | 9.3.1 | 9.3.1 | 9.3.1 | 9.3.1 |
| AI-Sweden-Models/gpt-sw3-20b-instruct (few-shot) | 20918 | 64 | 2048 | True | 1,831 ± 587 / 268 ± 90 | 3.22 | 24.44 ± 1.62 / 25.02 ± 1.72 | 18.40 ± 2.14 / 40.21 ± 2.51 | 4.85 ± 2.01 / 49.10 ± 2.56 | 39.83 ± 1.08 / 52.69 ± 1.15 | 9.3.1 | 9.3.1 | 9.3.1 | 9.3.1 |
| Qwen/Qwen1.5-4B (few-shot) | 3950 | 152 | 32768 | True | 3,248 ± 739 / 761 ± 252 | 3.23 | 35.74 ± 3.22 / 31.74 ± 2.24 | 12.55 ± 1.39 / 39.80 ± 1.38 | 0.23 ± 0.44 / 33.35 ± 0.31 | 51.30 ± 1.63 / 64.17 ± 0.87 | 12.5.2 | 10.0.1 | 12.1.0 | 12.1.0 |
| google/gemma-2b-it (few-shot) | 2506 | 256 | 8192 | False | 6,471 ± 1,142 / 1,961 ± 584 | 3.32 | 38.85 ± 3.77 / 32.18 ± 2.49 | 11.25 ± 1.90 / 28.36 ± 1.81 | -2.27 ± 1.37 / 37.91 ± 2.26 | 45.95 ± 1.11 / 56.54 ± 0.95 | 12.5.2 | 12.1.0 | 12.1.0 | 12.4.0 |
| dbmdz/bert-base-historic-multilingual-cased | 111 | 32 | 512 | True | 20,047 ± 4,407 / 3,844 ± 1,259 | 3.36 | 56.69 ± 1.80 / 68.42 ± 0.85 | 9.29 ± 3.04 / 30.73 ± 2.40 | 3.02 ± 1.45 / 50.08 ± 1.17 | 22.14 ± 1.13 / 31.59 ± 0.96 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| RuterNorway/Llama-2-7b-chat-norwegian (few-shot) | unknown | 32 | 4096 | False | 10,890 ± 2,686 / 2,186 ± 750 | 3.41 | 35.49 ± 3.10 / 29.35 ± 2.75 | 11.36 ± 1.56 / 30.66 ± 3.68 | 2.52 ± 2.14 / 42.60 ± 4.80 | 37.49 ± 1.37 / 47.34 ± 1.53 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.2 |
| sentence-transformers/distiluse-base-multilingual-cased | 135 | 120 | 512 | True | 19,206 ± 4,451 / 3,658 ± 1,187 | 3.41 | 56.98 ± 1.37 / 66.91 ± 1.60 | 9.66 ± 4.65 / 31.17 ± 3.34 | 19.37 ± 4.34 / 56.74 ± 3.13 | 3.11 ± 0.37 / 7.91 ± 0.25 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| LumiOpen/Viking-13B (few-shot) | 14030 | 131 | 4224 | True | 3,480 ± 727 / 822 ± 274 | 3.45 | 36.74 ± 3.36 / 32.36 ± 1.39 | 8.57 ± 2.44 / 34.17 ± 2.59 | 3.01 ± 1.94 / 46.03 ± 4.19 | 32.32 ± 1.55 / 40.73 ± 1.64 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| jpostma/DagoBERT | 116 | 40 | 512 | True | 11,241 ± 2,115 / 2,565 ± 830 | 3.45 | 42.28 ± 1.41 / 47.68 ± 1.08 | 8.01 ± 2.88 / 31.60 ± 2.41 | 31.21 ± 1.62 / 64.82 ± 0.69 | 3.65 ± 0.33 / 9.49 ± 0.31 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| allenai/OLMo-7B (few-shot) | 6888 | 50 | 2176 | True | 5,403 ± 1,133 / 1,294 ± 423 | 3.49 | 37.37 ± 2.22 / 30.45 ± 2.45 | 9.55 ± 1.82 / 23.90 ± 1.53 | 0.05 ± 1.35 / 35.78 ± 2.30 | 34.81 ± 1.54 / 46.37 ± 1.51 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| google/gemma-2b (few-shot) | 2506 | 256 | 8192 | True | 6,087 ± 1,046 / 1,902 ± 563 | 3.56 | 16.90 ± 4.91 / 17.38 ± 4.30 | 9.95 ± 0.78 / 27.94 ± 1.43 | 0.41 ± 1.03 / 33.54 ± 0.32 | 49.15 ± 1.55 / 59.16 ± 1.44 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 |
| Qwen/Qwen1.5-1.8B-Chat (few-shot) | 1837 | 152 | 32768 | False | 8,304 ± 1,846 / 1,933 ± 617 | 3.70 | 23.44 ± 5.09 / 25.00 ± 2.33 | 6.82 ± 1.82 / 30.97 ± 2.65 | 4.11 ± 1.73 / 43.70 ± 3.47 | 33.16 ± 1.61 / 46.66 ± 1.27 | 12.5.2 | 11.0.0 | 12.1.0 | 12.5.0 |
| 3ebdola/Dialectal-Arabic-XLM-R-Base | 277 | 250 | 512 | True | 12,783 ± 2,537 / 2,712 ± 885 | 3.76 | 44.46 ± 2.24 / 60.04 ± 1.09 | 8.39 ± 4.20 / 30.69 ± 2.83 | 2.07 ± 1.34 / 48.42 ± 1.31 | 4.30 ± 1.26 / 9.24 ± 1.13 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| dbmdz/bert-tiny-historic-multilingual-cased | 5 | 32 | 512 | True | 78,027 ± 15,466 / 17,064 ± 5,335 | 3.76 | 41.38 ± 2.82 / 56.29 ± 1.61 | 8.45 ± 2.80 / 29.85 ± 1.86 | 1.55 ± 1.97 / 49.24 ± 1.16 | 4.40 ± 0.22 / 6.62 ± 0.38 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| Qwen/Qwen1.5-0.5B-Chat (few-shot) | 620 | 152 | 32768 | False | 11,740 ± 3,000 / 2,209 ± 721 | 3.80 | 18.66 ± 4.43 / 17.56 ± 4.28 | 8.59 ± 3.20 / 29.65 ± 5.10 | 0.34 ± 2.02 / 43.92 ± 3.15 | 26.74 ± 1.57 / 35.03 ± 2.14 | 12.5.2 | 11.0.0 | 12.1.0 | 12.5.0 |
| sentence-transformers/distiluse-base-multilingual-cased-v2 | 135 | 120 | 512 | True | 33,247 ± 8,123 / 6,017 ± 1,977 | 3.82 | 49.82 ± 2.71 / 62.06 ± 1.69 | 2.70 ± 3.10 / 26.12 ± 2.40 | 6.60 ± 3.84 / 50.71 ± 1.92 | 2.13 ± 0.10 / 6.80 ± 0.37 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| allenai/OLMo-7B-Twin-2T (few-shot) | 6888 | 50 | 2176 | True | 5,484 ± 1,125 / 1,317 ± 425 | 3.84 | 18.70 ± 5.76 / 19.58 ± 4.59 | 3.70 ± 1.69 / 17.91 ± 1.48 | 2.19 ± 2.08 / 45.43 ± 3.44 | 38.08 ± 1.07 / 48.44 ± 1.55 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 |
| sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2 | 118 | 250 | 512 | True | 29,201 ± 6,282 / 6,045 ± 2,027 | 3.87 | 59.61 ± 2.40 / 67.02 ± 1.16 | 0.00 ± 0.00 / 24.33 ± 0.14 | -0.04 ± 1.84 / 48.65 ± 1.21 | 3.28 ± 0.31 / 9.04 ± 0.28 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| Qwen/Qwen1.5-1.8B (few-shot) | 1837 | 152 | 32768 | True | 5,666 ± 1,328 / 1,256 ± 408 | 3.89 | 11.66 ± 6.46 / 15.15 ± 4.38 | 5.20 ± 1.78 / 35.43 ± 2.14 | 2.89 ± 1.91 / 41.36 ± 4.63 | 34.60 ± 2.17 / 48.83 ± 1.05 | 12.5.2 | 10.0.1 | 12.1.0 | 12.1.0 |
| Qwen/Qwen1.5-0.5B (few-shot) | 620 | 152 | 32768 | True | 11,371 ± 2,924 / 2,122 ± 692 | 3.93 | 28.30 ± 3.90 / 28.67 ± 3.15 | 4.54 ± 2.76 / 26.53 ± 3.74 | -0.42 ± 2.41 / 37.60 ± 3.89 | 20.81 ± 2.21 / 29.05 ± 2.31 | 12.5.2 | 10.0.1 | 12.1.0 | 12.1.0 |
| allenai/OLMo-1B (few-shot) | 1177 | 50 | 2176 | True | 8,536 ± 1,926 / 1,940 ± 619 | 4.16 | 22.58 ± 5.05 / 26.82 ± 3.69 | 4.92 ± 2.71 / 19.51 ± 4.22 | -1.27 ± 1.85 / 41.38 ± 3.59 | 6.64 ± 1.96 / 11.74 ± 1.62 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 |
| fresh-xlm-roberta-base | 277 | 250 | 512 | True | 2,214 ± 94 / 1,494 ± 229 | 4.45 | 13.09 ± 1.68 / 16.25 ± 2.85 | 0.92 ± 2.11 / 25.39 ± 1.86 | 1.93 ± 1.37 / 40.76 ± 4.83 | 0.26 ± 0.09 / 2.70 ± 1.10 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 |
| fresh-electra-small | 13 | 31 | 512 | True | 7,840 ± 1,538 / 3,024 ± 438 | 4.48 | 11.66 ± 1.16 / 13.45 ± 1.20 | 0.00 ± 0.00 / 24.33 ± 0.14 | -0.21 ± 1.89 / 35.79 ± 2.99 | 0.17 ± 0.04 / 0.17 ± 0.04 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 |
| RJuro/kanelsnegl-v0.1 (few-shot) | 7242 | 32 | 512 | True | 9,757 ± 2,047 / 2,200 ± 705 | 4.66 | 0.00 ± 0.00 / 0.00 ± 0.00 | 0.95 ± 1.17 / 9.87 ± 0.86 | 0.00 ± 0.00 / 33.34 ± 0.31 | 0.00 ± 0.00 / 5.43 ± 0.58 | 9.3.1 | 9.3.1 | 9.3.1 | 12.5.1 |
| ai-forever/mGPT (few-shot) | unknown | 100 | 1024 | True | 13,551 ± 4,259 / 2,563 ± 838 | 4.69 | 0.11 ± 0.21 / 0.27 ± 0.53 | -0.67 ± 1.33 / 8.96 ± 0.37 | -0.97 ± 1.56 / 34.83 ± 1.94 | 0.29 ± 0.21 / 1.56 ± 0.19 | 9.3.1 | 10.0.1 | 11.0.0 | 12.5.1 |
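For readers who want to slice the leaderboard programmatically, the sketch below assumes the table above has been exported to a local CSV file; the filename `dutch_nlu.csv` and the lowercase column names are assumptions for illustration, not the actual export format. It shows how one might load the data with pandas, list the best-ranked models, and filter to commercially usable models under a parameter budget.

```python
import pandas as pd

# Load a local export of the leaderboard.
# Assumption: the file is named "dutch_nlu.csv" and has columns named
# "model", "parameters", "rank", and "commercial"; adjust to the real header.
df = pd.read_csv("dutch_nlu.csv")

# Some parameter counts are "unknown", so coerce them to NaN before filtering.
df["parameters"] = pd.to_numeric(df["parameters"], errors="coerce")

# Lower rank is better: show the ten best-ranked models.
print(df.sort_values("rank").head(10)[["model", "parameters", "rank"]])

# Commercially usable models with at most 1,000M (1B) parameters.
# The commercial column may be read as booleans or strings, so normalise it.
commercial = df["commercial"].astype(str).str.lower() == "true"
small_commercial = df[commercial & (df["parameters"] <= 1000)]
print(small_commercial.sort_values("rank")[["model", "parameters", "rank"]])
```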