German NLU 🇩🇪

Last updated: 29/04/2024 11:26:16 CET
Model ID | Parameters (M) | Vocabulary size (k) | Context (tokens) | Commercial | Speed | Rank | GermEval | SB10k | ScaLA-de | GermanQuAD | GermEval version | SB10k version | ScaLA-de version | GermanQuAD version
deepset/gbert-large 337 31 512 True 19,109 ± 3,442 / 4,438 ± 1,468 1.17 80.30 ± 0.94 / 79.06 ± 1.05 65.12 ± 1.34 / 76.67 ± 0.90 75.12 ± 1.07 / 87.22 ± 0.63 28.91 ± 1.58 / 56.61 ± 1.98 12.8.0 12.8.0 12.8.0 12.7.0
FacebookAI/xlm-roberta-large 561 250 512 True 17,897 ± 3,921 / 3,463 ± 1,141 1.34 80.64 ± 0.81 / 80.03 ± 0.97 63.02 ± 1.62 / 75.25 ± 1.08 54.83 ± 9.12 / 76.02 ± 5.19 29.09 ± 1.00 / 55.68 ± 1.11 12.7.0 12.7.0 12.7.0 12.7.0
google/rembert 575 250 512 True 11,736 ± 2,822 / 2,102 ± 677 1.34 77.62 ± 0.80 / 76.16 ± 0.79 60.65 ± 2.10 / 73.52 ± 1.58 62.60 ± 2.16 / 81.12 ± 1.14 33.62 ± 1.34 / 60.41 ± 1.36 12.7.0 12.7.0 12.7.0 12.7.0
intfloat/multilingual-e5-large 560 250 512 True 6,732 ± 1,273 / 1,633 ± 523 1.49 79.73 ± 1.38 / 78.52 ± 1.44 64.78 ± 1.34 / 76.30 ± 0.98 47.24 ± 13.67 / 71.51 ± 8.03 28.11 ± 0.94 / 54.87 ± 1.08 12.6.1 12.6.1 12.6.1 12.6.1
sentence-transformers/use-cmlm-multilingual 472 501 512 True 30,231 ± 8,171 / 4,863 ± 1,598 1.49 77.03 ± 1.06 / 76.61 ± 1.13 59.50 ± 1.14 / 72.91 ± 0.76 59.30 ± 1.32 / 78.31 ± 0.74 26.67 ± 0.90 / 50.81 ± 0.88 12.7.0 12.7.0 12.7.0 12.7.0
meta-llama/Meta-Llama-3-70B (few-shot, val) 70554 128 8221 True 312 ± 55 / 177 ± 51 1.56 69.04 ± 2.51 / 61.10 ± 3.39 63.51 ± 2.57 / 75.01 ± 1.74 37.41 ± 2.43 / 67.63 ± 1.05 38.29 ± 3.54 / 69.69 ± 2.78 12.7.0 12.7.0 12.7.0 12.7.0
gwlms/deberta-base-dewiki-v1 112 32 512 True 18,084 ± 2,413 / 5,561 ± 1,826 1.63 79.67 ± 1.47 / 78.63 ± 1.31 50.90 ± 2.77 / 66.61 ± 2.07 75.16 ± 1.45 / 87.41 ± 0.72 18.57 ± 2.14 / 41.99 ± 2.69 12.6.1 12.6.1 12.6.1 12.6.1
setu4993/LaBSE 471 501 512 True 25,418 ± 6,435 / 4,536 ± 1,452 1.67 79.44 ± 0.96 / 78.42 ± 0.98 58.65 ± 1.67 / 72.12 ± 1.18 52.19 ± 2.96 / 74.17 ± 1.82 23.66 ± 0.97 / 45.11 ± 1.50 12.7.0 12.7.0 12.7.0 12.7.0
upstage/SOLAR-10.7B-v1.0 (few-shot) 10732 32 4096 True 3,780 ± 906 / 799 ± 261 1.67 68.11 ± 1.32 / 56.25 ± 1.65 59.79 ± 1.60 / 71.47 ± 1.54 35.45 ± 3.06 / 66.13 ± 1.28 37.27 ± 1.23 / 68.54 ± 1.94 12.5.3 12.5.3 12.5.3 12.5.3
gpt-4-1106-preview (few-shot, val) unknown 100 127998 True 576 ± 221 / 81 ± 28 1.69 68.94 ± 2.50 / 44.89 ± 2.85 57.46 ± 3.32 / 69.93 ± 2.43 49.58 ± 7.26 / 74.12 ± 3.50 30.04 ± 1.30 / 58.77 ± 1.50 12.6.1 12.6.1 12.6.1 12.6.1
gwlms/teams-base-dewiki-v1-discriminator 111 32 512 True 30,608 ± 4,466 / 8,477 ± 2,725 1.72 79.59 ± 0.89 / 78.76 ± 0.74 47.85 ± 2.41 / 64.86 ± 1.71 68.48 ± 2.29 / 83.93 ± 1.25 19.90 ± 1.38 / 41.74 ± 2.30 12.6.1 12.6.1 12.6.1 12.6.1
meta-llama/Llama-2-70b-hf (few-shot, val) 68977 32 4125 True 1,892 ± 650 / 318 ± 105 1.76 63.71 ± 2.43 / 57.08 ± 2.70 58.17 ± 2.51 / 71.34 ± 1.62 36.33 ± 5.00 / 64.51 ± 3.38 36.06 ± 2.89 / 69.62 ± 2.81 12.7.0 12.7.0 12.7.0 12.7.0
microsoft/mdeberta-v3-base 278 251 512 True 20,637 ± 3,925 / 4,497 ± 1,502 1.76 77.42 ± 1.30 / 76.39 ± 1.16 50.90 ± 6.20 / 66.89 ± 4.34 59.38 ± 2.00 / 79.12 ± 1.09 20.28 ± 1.15 / 42.79 ± 1.37 12.7.0 12.7.0 12.7.0 12.7.0
dbmdz/bert-base-german-cased 111 31 512 True 37,150 ± 6,555 / 8,659 ± 2,843 1.82 78.54 ± 1.02 / 77.24 ± 0.95 53.91 ± 1.63 / 69.21 ± 1.06 59.23 ± 3.17 / 78.21 ± 2.03 13.71 ± 1.48 / 32.14 ± 2.81 12.7.0 12.7.0 12.7.0 12.7.0
dbmdz/bert-base-german-uncased 111 31 512 True 36,020 ± 6,698 / 8,035 ± 2,647 1.83 77.55 ± 0.54 / 75.74 ± 0.56 56.48 ± 1.78 / 70.91 ± 1.23 63.49 ± 4.58 / 80.65 ± 2.60 12.39 ± 0.86 / 30.22 ± 1.38 12.7.0 12.7.0 12.7.0 12.7.0
deepset/gbert-base 111 31 512 True 37,268 ± 6,577 / 8,719 ± 2,865 1.85 80.09 ± 0.84 / 78.71 ± 0.84 59.80 ± 2.24 / 73.18 ± 1.49 47.48 ± 7.30 / 70.97 ± 3.94 14.39 ± 0.71 / 32.78 ± 1.01 12.7.0 12.7.0 12.7.0 12.7.0
152334H/miqu-1-70b-sf (few-shot, val) 68977 32 32793 True 2,126 ± 676 / 319 ± 104 1.86 65.19 ± 2.58 / 56.17 ± 3.57 59.80 ± 2.15 / 71.98 ± 1.46 41.86 ± 5.44 / 69.70 ± 2.31 25.51 ± 3.79 / 63.19 ± 2.48 12.7.0 12.7.0 12.7.0 12.7.0
gwlms/bert-base-dewiki-v1 111 32 512 True 30,650 ± 4,495 / 8,500 ± 2,755 1.87 80.53 ± 1.03 / 79.35 ± 1.07 45.61 ± 1.73 / 63.19 ± 1.32 67.09 ± 2.48 / 82.65 ± 1.55 16.61 ± 0.99 / 35.77 ± 1.33 12.6.1 12.6.1 12.6.1 12.6.1
meta-llama/Meta-Llama-3-70B-Instruct (few-shot, val) 70554 128 8221 True 1,673 ± 583 / 275 ± 85 1.87 75.20 ± 2.15 / 64.06 ± 3.60 54.38 ± 3.31 / 68.02 ± 2.21 36.59 ± 4.24 / 67.36 ± 1.77 26.90 ± 2.67 / 58.28 ± 2.22 12.7.0 12.7.0 12.7.0 12.7.0
intfloat/multilingual-e5-base 278 250 512 True 14,965 ± 2,890 / 3,322 ± 1,074 1.91 74.79 ± 1.45 / 75.10 ± 1.37 63.29 ± 1.54 / 75.42 ± 1.03 45.32 ± 8.38 / 71.30 ± 4.11 16.42 ± 0.54 / 34.46 ± 1.03 12.6.1 12.6.1 12.6.1 12.6.1
VAGOsolutions/SauerkrautLM-7b-HerO (few-shot, val) 7242 32 32768 False 2,477 ± 467 / 744 ± 232 1.92 59.70 ± 2.05 / 50.40 ± 2.63 60.22 ± 2.99 / 72.76 ± 2.10 35.99 ± 4.53 / 67.21 ± 2.14 29.68 ± 3.29 / 65.65 ± 3.10 12.6.1 12.6.1 12.6.1 12.6.1
facebook/xlm-v-base 778 902 512 True 25,396 ± 6,394 / 4,534 ± 1,421 1.92 76.45 ± 1.57 / 76.50 ± 1.16 58.25 ± 2.98 / 72.01 ± 1.97 34.43 ± 16.58 / 61.45 ± 12.19 21.08 ± 1.57 / 41.84 ± 1.98 12.7.0 12.7.0 12.7.0 12.7.0
meta-llama/Meta-Llama-3-8B-Instruct (few-shot) 8030 128 8192 True 4,909 ± 1,215 / 978 ± 319 1.92 68.18 ± 0.95 / 57.72 ± 1.15 58.33 ± 2.83 / 69.31 ± 3.16 29.12 ± 3.17 / 63.60 ± 1.63 28.68 ± 1.99 / 56.42 ± 3.34 12.6.1 12.6.1 12.6.1 12.6.1
Nexusflow/Starling-LM-7B-beta (few-shot) 7242 32 8192 False 5,876 ± 1,021 / 1,677 ± 546 1.93 65.01 ± 0.68 / 43.34 ± 2.80 51.80 ± 1.29 / 67.45 ± 0.87 36.18 ± 1.31 / 67.86 ± 0.51 32.12 ± 2.08 / 67.30 ± 1.66 12.5.2 12.5.2 12.5.2 12.5.2
VAGOsolutions/SauerkrautLM-7b-LaserChat (few-shot) 7242 32 8192 False 2,387 ± 456 / 717 ± 226 1.93 64.73 ± 1.02 / 48.09 ± 2.90 51.08 ± 1.67 / 67.20 ± 1.34 36.46 ± 1.89 / 67.99 ± 0.80 33.30 ± 1.87 / 67.32 ± 1.73 12.6.1 12.6.1 12.6.1 12.6.1
VAGOsolutions/FC-SauerkrautLM-7b-beta (few-shot) unknown 32 8192 False 2,160 ± 514 / 668 ± 256 1.98 56.70 ± 1.55 / 41.89 ± 2.00 53.39 ± 1.89 / 67.16 ± 1.65 35.64 ± 1.50 / 66.83 ± 0.97 34.22 ± 1.43 / 67.00 ± 1.27 12.6.1 12.6.1 12.6.1 12.6.1
gpt-3.5-turbo-0613 (few-shot, val) unknown 100 4095 True 921 ± 293 / 113 ± 37 1.98 61.50 ± 2.96 / 46.22 ± 3.41 55.50 ± 2.58 / 68.96 ± 2.00 38.96 ± 4.39 / 68.89 ± 2.54 30.20 ± 1.59 / 56.58 ± 1.78 0.0.0 0.0.0 0.0.0 0.0.0
occiglot/occiglot-7b-de-en-instruct (few-shot) 7242 32 32768 False 1,584 ± 217 / 635 ± 178 2.00 55.76 ± 1.16 / 40.04 ± 3.21 55.91 ± 2.49 / 70.31 ± 1.76 22.47 ± 3.37 / 56.77 ± 3.69 35.95 ± 1.89 / 66.86 ± 2.33 12.5.2 12.3.1 12.3.1 12.4.0
cstr/Spaetzle-v8-7b (few-shot, val) 7242 32 32768 False 5,980 ± 1,031 / 1,714 ± 552 2.03 58.90 ± 2.30 / 45.55 ± 3.30 61.34 ± 1.90 / 72.98 ± 1.30 31.58 ± 4.39 / 65.51 ± 2.23 24.91 ± 3.98 / 60.88 ± 3.31 12.5.2 12.5.2 12.5.2 12.5.2
mlabonne/NeuralBeagle14-7B (few-shot, val) 7242 32 8192 False 2,549 ± 472 / 784 ± 245 2.03 64.81 ± 3.03 / 53.01 ± 3.41 59.60 ± 2.81 / 72.42 ± 1.83 27.06 ± 4.53 / 63.33 ± 2.30 25.22 ± 3.84 / 60.93 ± 2.99 9.3.2 9.3.2 9.3.2 12.5.2
meta-llama/Meta-Llama-3-8B (few-shot) 8030 128 8192 True 4,687 ± 1,121 / 967 ± 313 2.05 56.00 ± 1.94 / 43.49 ± 2.05 56.40 ± 3.89 / 70.17 ± 2.91 22.01 ± 5.17 / 56.97 ± 3.54 35.39 ± 2.49 / 64.61 ± 2.42 12.6.1 12.6.1 12.6.1 12.6.1
meta-llama/Llama-2-70b-chat-hf (few-shot, val) 68977 32 4125 True 1,979 ± 621 / 320 ± 105 2.07 62.39 ± 2.72 / 50.86 ± 2.31 53.16 ± 3.17 / 64.24 ± 3.42 31.81 ± 5.15 / 62.15 ± 4.02 28.99 ± 2.22 / 60.53 ± 2.92 12.7.0 12.7.0 12.7.0 12.7.0
mistralai/Mistral-7B-v0.1 (few-shot) 7242 32 32768 True 2,657 ± 524 / 880 ± 278 2.14 55.37 ± 1.32 / 44.65 ± 2.48 54.27 ± 1.71 / 68.13 ± 1.16 23.12 ± 4.07 / 57.81 ± 3.70 31.89 ± 3.29 / 59.77 ± 4.31 9.1.2 9.1.2 9.1.2 12.5.1
ZurichNLP/unsup-simcse-xlm-roberta-base 277 250 512 True 34,520 ± 7,443 / 6,730 ± 2,224 2.16 74.50 ± 1.06 / 74.21 ± 0.83 58.23 ± 1.13 / 72.00 ± 0.75 34.74 ± 14.22 / 64.18 ± 7.54 11.19 ± 3.07 / 26.81 ± 6.54 12.7.0 12.7.0 12.7.0 12.7.0
jhu-clsp/bernice 277 250 128 True 5,567 ± 450 / 2,483 ± 798 2.18 72.25 ± 1.06 / 71.08 ± 1.17 62.00 ± 2.11 / 74.40 ± 1.38 48.10 ± 4.34 / 71.95 ± 3.29 0.00 ± 0.00 / 0.00 ± 0.00 0.0.0 0.0.0 0.0.0 0.0.0
alpindale/Mistral-7B-v0.2-hf (few-shot) 7242 32 32768 True 1,841 ± 297 / 651 ± 193 2.19 55.32 ± 1.55 / 48.33 ± 1.45 52.49 ± 2.16 / 67.50 ± 1.61 24.34 ± 2.29 / 59.66 ± 2.93 31.54 ± 3.00 / 59.96 ± 3.89 12.5.2 12.5.2 12.5.2 12.5.2
senseable/WestLake-7B-v2 (few-shot) 7242 32 32768 False 5,993 ± 1,028 / 1,742 ± 561 2.19 64.38 ± 1.60 / 50.26 ± 2.53 54.44 ± 1.45 / 69.32 ± 1.02 26.03 ± 2.23 / 61.88 ± 1.38 25.68 ± 2.81 / 62.48 ± 2.93 12.6.1 12.6.1 12.6.1 12.6.1
mlabonne/AlphaMonarch-7B (few-shot, val) 7242 32 8192 False 5,340 ± 1,262 / 1,157 ± 375 2.23 63.36 ± 2.68 / 51.59 ± 3.44 59.80 ± 3.18 / 72.32 ± 2.23 22.98 ± 8.11 / 60.88 ± 3.98 20.96 ± 3.59 / 57.36 ± 2.94 12.5.2 12.5.2 12.5.2 12.5.2
microsoft/xlm-align-base 277 250 512 True 14,744 ± 2,870 / 3,265 ± 1,053 2.26 79.38 ± 0.80 / 79.33 ± 0.74 58.58 ± 2.31 / 72.09 ± 1.64 15.34 ± 5.24 / 52.99 ± 1.90 16.58 ± 6.50 / 32.33 ± 11.35 0.0.0 0.0.0 0.0.0 0.0.0
occiglot/occiglot-7b-de-en (few-shot) 7242 32 32768 True 1,992 ± 319 / 706 ± 211 2.26 48.11 ± 2.01 / 39.66 ± 3.29 54.96 ± 2.69 / 69.64 ± 2.09 21.57 ± 4.18 / 55.63 ± 4.56 31.49 ± 3.11 / 61.33 ± 3.41 12.3.2 12.3.1 12.3.1 12.3.1
occiglot/occiglot-7b-eu5-instruct (few-shot) 7242 32 32768 False 2,088 ± 352 / 706 ± 214 2.28 52.63 ± 1.89 / 42.99 ± 2.40 43.16 ± 4.45 / 57.79 ± 4.61 27.09 ± 1.92 / 60.29 ± 1.99 34.01 ± 4.01 / 63.29 ± 3.97 12.5.2 12.2.0 12.3.1 12.4.0
cardiffnlp/twitter-xlm-roberta-base 277 250 512 True 34,475 ± 7,465 / 6,712 ± 2,223 2.31 74.89 ± 0.86 / 73.54 ± 0.84 63.01 ± 1.50 / 75.21 ± 1.03 36.60 ± 2.17 / 64.93 ± 1.39 0.65 ± 0.25 / 5.05 ± 1.30 12.7.0 12.7.0 12.7.0 12.7.0
clips/mfaq 277 250 128 True 5,591 ± 187 / 3,349 ± 1,105 2.31 76.68 ± 0.91 / 76.46 ± 0.98 59.51 ± 1.54 / 72.84 ± 1.06 32.54 ± 11.48 / 60.57 ± 7.28 1.53 ± 0.96 / 2.39 ± 1.52 0.0.0 0.0.0 0.0.0 0.0.0
seedboxai/KafkaLM-7B-German-V0.1-DPO (few-shot) 7242 32 4096 False 6,070 ± 1,042 / 1,769 ± 573 2.32 48.92 ± 2.76 / 38.62 ± 2.42 52.57 ± 1.74 / 61.25 ± 2.84 20.74 ± 3.20 / 56.59 ± 3.27 32.87 ± 1.83 / 62.31 ± 2.13 12.5.2 12.3.2 12.3.2 12.4.0
seedboxai/KafkaLM-7B-German-V0.1 (few-shot) 7242 32 32768 True 6,065 ± 1,038 / 1,766 ± 570 2.32 48.35 ± 2.96 / 38.58 ± 2.35 52.51 ± 1.72 / 61.27 ± 2.77 20.36 ± 3.59 / 56.14 ± 3.61 32.88 ± 1.78 / 62.19 ± 2.05 12.5.2 12.3.2 12.3.2 12.3.2
occiglot/occiglot-7b-eu5 (few-shot) 7242 32 32768 True 2,219 ± 427 / 717 ± 224 2.34 51.39 ± 1.35 / 44.47 ± 2.77 47.30 ± 4.44 / 62.28 ± 4.24 21.83 ± 1.98 / 57.05 ± 2.18 31.55 ± 3.67 / 60.39 ± 4.29 12.5.2 12.1.0 12.1.0 12.1.0
Twitter/twhin-bert-base 279 250 512 True 11,514 ± 2,041 / 2,862 ± 918 2.36 70.35 ± 1.34 / 69.09 ± 1.36 55.03 ± 2.38 / 69.65 ± 1.75 43.87 ± 8.85 / 69.71 ± 6.73 2.81 ± 0.73 / 10.15 ± 2.34 12.6.1 12.6.1 12.6.1 12.6.1
timpal0l/Mistral-7B-v0.1-flashback-v2 (few-shot) 7242 32 32768 True 5,054 ± 1,200 / 1,056 ± 339 2.36 50.66 ± 1.53 / 39.89 ± 2.43 54.79 ± 3.53 / 68.79 ± 3.00 20.17 ± 1.69 / 58.67 ± 1.13 27.86 ± 4.70 / 54.38 ± 5.91 12.5.3 12.5.3 12.5.3 12.5.3
Twitter/twhin-bert-large 562 250 512 True 9,707 ± 1,664 / 2,549 ± 831 2.37 74.36 ± 1.33 / 73.61 ± 1.47 53.52 ± 2.28 / 68.49 ± 1.49 22.26 ± 11.63 / 58.51 ± 6.63 11.68 ± 4.11 / 22.54 ± 7.80 12.7.0 12.7.0 12.7.0 12.7.0
RuterNorway/Llama-2-13b-chat-norwegian (few-shot) unknown 32 4096 False 7,778 ± 1,755 / 1,703 ± 552 2.41 56.71 ± 1.34 / 47.69 ± 2.04 49.77 ± 2.03 / 62.42 ± 3.31 19.92 ± 3.22 / 52.83 ± 5.45 27.87 ± 2.01 / 57.64 ± 2.05 9.3.1 9.3.1 9.3.1 9.3.1
mistralai/Mistral-7B-Instruct-v0.2 (few-shot) 7242 32 32768 False 2,538 ± 415 / 821 ± 253 2.45 55.15 ± 1.17 / 41.83 ± 1.49 47.85 ± 2.29 / 65.02 ± 1.69 24.29 ± 2.18 / 60.90 ± 1.65 24.00 ± 2.19 / 57.66 ± 1.94 9.3.1 9.2.0 9.3.1 12.4.0
DiscoResearch/DiscoLM_German_7b_v1 (few-shot) 7242 32 32768 False 1,972 ± 315 / 689 ± 204 2.49 42.39 ± 2.43 / 32.42 ± 1.53 48.67 ± 3.85 / 59.21 ± 4.18 8.72 ± 2.15 / 43.37 ± 3.69 36.12 ± 2.35 / 66.54 ± 2.34 12.5.2 12.0.0 12.1.0 12.4.0
mistralai/Mistral-7B-Instruct-v0.1 (few-shot) 7242 32 32768 False 5,443 ± 1,273 / 1,144 ± 364 2.49 51.79 ± 0.92 / 36.09 ± 1.73 47.27 ± 3.06 / 63.50 ± 2.88 22.15 ± 1.83 / 56.64 ± 4.04 24.35 ± 3.75 / 54.56 ± 4.42 9.3.1 9.3.1 9.3.1 12.4.0
mayflowergmbh/Wiedervereinigung-7b-dpo (few-shot, val) 7242 32 32768 False 2,374 ± 432 / 744 ± 230 2.53 52.17 ± 2.87 / 40.26 ± 2.43 51.92 ± 3.19 / 67.12 ± 2.11 29.06 ± 5.04 / 62.77 ± 2.22 14.59 ± 2.77 / 50.41 ± 3.79 12.5.2 12.1.0 12.1.0 12.4.0
meta-llama/Llama-2-7b-hf (few-shot) 6738 32 4125 True 2,648 ± 467 / 799 ± 250 2.53 43.02 ± 1.93 / 32.69 ± 1.98 50.21 ± 2.43 / 65.81 ± 1.82 15.79 ± 2.35 / 53.25 ± 4.45 28.57 ± 5.09 / 55.54 ± 6.14 12.8.0 12.8.0 12.8.0 12.8.0
01-ai/Yi-6B (few-shot) 6061 64 4096 True 2,786 ± 532 / 784 ± 250 2.55 44.97 ± 1.75 / 35.79 ± 1.66 53.14 ± 2.67 / 67.89 ± 2.15 7.64 ± 2.47 / 37.95 ± 2.67 30.19 ± 1.61 / 57.19 ± 2.48 9.3.2 10.0.0 10.0.0 12.5.1
Geotrend/distilbert-base-25lang-cased 109 85 512 True 26,099 ± 5,881 / 5,178 ± 1,665 2.60 72.97 ± 0.68 / 71.83 ± 0.80 41.51 ± 1.91 / 60.79 ± 1.27 45.39 ± 1.52 / 71.60 ± 0.73 1.89 ± 0.52 / 6.88 ± 1.73 12.6.1 12.6.1 12.6.1 12.6.1
sentence-transformers/stsb-xlm-r-multilingual 277 250 512 True 15,040 ± 2,953 / 3,417 ± 1,100 2.61 67.47 ± 1.09 / 66.34 ± 1.08 52.85 ± 1.53 / 68.48 ± 1.02 29.59 ± 6.40 / 60.98 ± 2.57 0.73 ± 0.17 / 4.35 ± 1.32 12.8.0 12.8.0 12.8.0 0.0.0
meta-llama/Llama-2-7b-chat-hf (few-shot) 6738 32 4096 False 2,643 ± 455 / 800 ± 247 2.66 50.00 ± 1.33 / 38.45 ± 1.68 46.54 ± 2.92 / 63.66 ± 2.14 15.30 ± 1.79 / 55.12 ± 1.92 25.57 ± 3.59 / 56.09 ± 3.74 9.3.1 9.3.1 9.3.1 12.4.0
microsoft/infoxlm-base 277 250 512 True 34,735 ± 7,558 / 6,846 ± 2,312 2.66 77.84 ± 0.92 / 77.81 ± 1.05 59.16 ± 2.05 / 72.70 ± 1.38 3.66 ± 2.14 / 49.63 ± 2.38 3.67 ± 2.18 / 10.66 ± 4.52 12.7.0 12.7.0 12.7.0 12.7.0
sentence-transformers/paraphrase-xlm-r-multilingual-v1 278 250 512 True 20,154 ± 4,438 / 3,890 ± 1,256 2.66 69.45 ± 0.82 / 68.02 ± 0.87 57.94 ± 2.12 / 71.92 ± 1.45 21.81 ± 12.05 / 58.13 ± 4.73 0.33 ± 0.15 / 3.03 ± 1.06 12.8.0 12.8.0 12.8.0 12.8.0
Qwen/Qwen1.5-4B-Chat (few-shot) 3950 152 32768 False 4,347 ± 893 / 1,135 ± 365 2.74 42.08 ± 1.65 / 36.90 ± 2.00 41.52 ± 3.53 / 57.69 ± 3.35 12.78 ± 3.75 / 46.43 ± 5.48 29.35 ± 2.51 / 59.90 ± 2.80 12.5.2 10.0.1 12.1.0 12.5.2
AI-Sweden-Models/roberta-large-1160k 354 50 512 True 5,741 ± 987 / 1,554 ± 494 2.76 68.39 ± 0.98 / 67.24 ± 1.10 45.91 ± 3.76 / 62.57 ± 3.33 2.79 ± 2.11 / 39.08 ± 4.38 18.83 ± 1.45 / 35.01 ± 2.44 10.0.1 10.0.1 10.0.1 10.0.1
AI-Sweden-Models/roberta-large-1350k 354 50 512 True 5,744 ± 969 / 1,539 ± 492 2.76 67.24 ± 1.51 / 66.01 ± 1.32 45.84 ± 2.11 / 63.35 ± 1.48 2.28 ± 1.62 / 35.18 ± 2.11 18.17 ± 1.85 / 33.37 ± 2.80 10.0.1 10.0.1 10.0.1 10.0.1
Rijgersberg/GEITje-7B (few-shot) 7242 32 32768 True 10,401 ± 2,529 / 2,123 ± 690 2.76 39.09 ± 2.92 / 31.71 ± 2.34 47.83 ± 2.81 / 60.24 ± 3.30 10.31 ± 2.60 / 46.65 ± 4.50 26.13 ± 3.79 / 53.13 ± 4.50 9.3.1 9.3.1 9.3.1 9.3.1
dbmdz/bert-base-historic-multilingual-cased 111 32 512 True 20,047 ± 4,407 / 3,844 ± 1,259 3.03 65.35 ± 1.08 / 63.62 ± 1.05 37.77 ± 2.20 / 58.20 ± 1.43 16.07 ± 3.83 / 54.09 ± 2.44 5.67 ± 1.14 / 16.63 ± 2.52 12.8.0 12.8.0 12.8.0 12.8.0
Qwen/Qwen1.5-4B (few-shot) 3950 152 32768 True 3,248 ± 739 / 761 ± 252 3.04 31.52 ± 2.96 / 29.20 ± 1.88 39.91 ± 3.29 / 53.66 ± 3.20 3.27 ± 2.51 / 34.30 ± 1.29 27.55 ± 3.12 / 57.60 ± 3.34 12.5.2 10.0.1 12.1.0 12.1.0
EuropeanParliament/EUBERT 94 66 512 True 20,070 ± 3,977 / 4,400 ± 1,435 3.06 49.95 ± 0.72 / 49.10 ± 0.71 40.29 ± 2.70 / 59.76 ± 1.78 25.88 ± 6.61 / 61.54 ± 3.28 2.59 ± 0.71 / 8.94 ± 2.03 12.6.1 12.6.1 12.6.1 12.6.1
sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2 118 250 512 True 29,201 ± 6,282 / 6,045 ± 2,027 3.08 60.54 ± 1.96 / 59.68 ± 1.94 54.99 ± 2.05 / 70.00 ± 1.37 0.52 ± 2.01 / 49.40 ± 1.06 0.80 ± 0.23 / 7.76 ± 2.10 12.8.0 12.8.0 12.8.0 12.8.0
sentence-transformers/quora-distilbert-multilingual 135 120 512 True 26,458 ± 5,992 / 5,274 ± 1,731 3.13 64.12 ± 0.92 / 62.50 ± 0.85 49.66 ± 1.68 / 66.33 ± 1.12 0.58 ± 1.23 / 49.27 ± 0.86 0.05 ± 0.06 / 0.36 ± 0.30 12.6.1 12.6.1 12.6.1 12.6.1
sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking 135 120 512 True 33,753 ± 8,349 / 5,937 ± 1,946 3.14 63.78 ± 0.75 / 62.34 ± 0.64 49.69 ± 1.56 / 66.32 ± 1.07 0.74 ± 0.83 / 48.97 ± 0.85 0.02 ± 0.02 / 0.20 ± 0.17 12.8.0 12.8.0 12.8.0 12.8.0
VAGOsolutions/SauerkrautLM-Gemma-2b (few-shot) 2506 256 8192 False 3,607 ± 565 / 1,212 ± 349 3.26 12.21 ± 2.76 / 11.93 ± 2.08 44.84 ± 2.70 / 57.27 ± 3.65 2.02 ± 2.19 / 37.47 ± 3.26 24.59 ± 2.70 / 49.79 ± 2.96 12.6.1 12.6.1 12.6.1 12.6.1
LumiOpen/Viking-13B (few-shot) 14030 131 4128 True 3,480 ± 727 / 822 ± 274 3.27 34.53 ± 1.24 / 29.89 ± 1.96 42.90 ± 2.66 / 56.64 ± 4.71 1.51 ± 1.64 / 43.36 ± 4.05 15.83 ± 1.42 / 29.77 ± 2.54 12.5.2 12.5.2 12.5.2 12.5.2
AI-Sweden-Models/gpt-sw3-20b (few-shot) 20918 64 2048 True 4,880 ± 1,052 / 1,181 ± 380 3.28 36.17 ± 2.52 / 27.29 ± 1.74 34.17 ± 7.08 / 46.97 ± 8.28 2.21 ± 1.64 / 38.29 ± 3.56 17.99 ± 4.02 / 38.26 ± 5.37 12.9.0 12.9.0 12.9.0 9.3.1
AI-Sweden-Models/gpt-sw3-20b-instruct (few-shot) 20918 64 2077 True 1,831 ± 587 / 268 ± 90 3.31 24.35 ± 1.72 / 21.90 ± 0.85 43.35 ± 3.81 / 60.49 ± 3.18 2.38 ± 1.21 / 37.27 ± 1.09 15.56 ± 2.24 / 34.68 ± 3.15 12.7.0 12.7.0 12.7.0 12.7.0
sentence-transformers/distiluse-base-multilingual-cased-v2 135 120 512 True 33,247 ± 8,123 / 6,017 ± 1,977 3.35 41.82 ± 3.25 / 42.03 ± 3.15 49.38 ± 1.72 / 66.10 ± 1.17 4.77 ± 2.08 / 49.48 ± 2.07 0.05 ± 0.06 / 0.92 ± 0.55 12.8.0 12.8.0 12.8.0 12.8.0
sentence-transformers/distiluse-base-multilingual-cased 135 120 512 True 19,206 ± 4,451 / 3,658 ± 1,187 3.35 40.20 ± 3.20 / 40.33 ± 3.13 48.71 ± 1.60 / 65.32 ± 1.57 5.53 ± 1.92 / 51.10 ± 1.38 0.06 ± 0.06 / 1.12 ± 1.10 12.6.1 12.6.1 12.6.1 12.6.1
google/gemma-2b-it (few-shot) 2506 256 8192 False 6,471 ± 1,142 / 1,961 ± 584 3.38 36.62 ± 1.56 / 28.22 ± 1.66 28.54 ± 2.70 / 50.10 ± 1.65 1.15 ± 1.66 / 38.16 ± 2.78 23.39 ± 1.00 / 51.61 ± 1.04 12.5.2 12.1.0 12.1.0 12.4.0
google/gemma-2b (few-shot) 2506 256 8192 True 6,087 ± 1,046 / 1,902 ± 563 3.38 16.95 ± 2.96 / 15.80 ± 2.16 44.96 ± 3.30 / 61.27 ± 2.88 0.77 ± 1.22 / 33.68 ± 0.59 17.92 ± 4.72 / 40.68 ± 6.34 12.5.2 12.1.0 12.1.0 12.1.0
Qwen/Qwen1.5-1.8B-Chat (few-shot) 1837 152 32768 False 8,304 ± 1,846 / 1,933 ± 617 3.39 28.04 ± 2.71 / 24.08 ± 1.58 36.21 ± 3.42 / 54.82 ± 3.32 3.12 ± 1.42 / 46.21 ± 2.93 16.33 ± 3.22 / 41.91 ± 4.34 12.5.2 11.0.0 12.1.0 12.5.0
allenai/OLMo-7B (few-shot) 6888 50 2080 True 5,403 ± 1,133 / 1,294 ± 423 3.43 30.85 ± 4.69 / 24.38 ± 3.10 49.77 ± 2.81 / 64.87 ± 2.42 2.67 ± 1.77 / 41.55 ± 4.54 4.09 ± 1.94 / 12.70 ± 2.66 12.5.2 12.5.2 12.5.2 12.5.2
sentence-transformers/distiluse-base-multilingual-cased-v1 135 120 512 True 34,042 ± 8,482 / 5,951 ± 1,950 3.45 28.29 ± 6.50 / 27.54 ± 6.33 51.70 ± 1.63 / 67.81 ± 1.08 2.12 ± 2.30 / 49.13 ± 2.33 0.03 ± 0.05 / 0.69 ± 0.53 12.8.0 12.8.0 12.8.0 12.8.0
RuterNorway/Llama-2-7b-chat-norwegian (few-shot) 6738 32 4096 False 10,890 ± 2,686 / 2,186 ± 750 3.46 27.22 ± 1.38 / 24.48 ± 1.76 33.54 ± 5.12 / 49.63 ± 5.78 0.45 ± 0.91 / 35.24 ± 3.71 20.44 ± 3.29 / 45.50 ± 3.33 9.3.1 9.3.1 9.3.1 12.5.2
Qwen/Qwen1.5-1.8B (few-shot) 1837 152 32768 True 5,666 ± 1,328 / 1,256 ± 408 3.62 9.23 ± 4.86 / 10.43 ± 3.83 38.30 ± 2.90 / 56.94 ± 2.83 0.39 ± 1.17 / 33.47 ± 0.34 16.67 ± 3.02 / 41.61 ± 3.00 12.5.2 10.0.1 12.1.0 12.1.0
dbmdz/bert-tiny-historic-multilingual-cased 5 32 512 True 78,027 ± 15,466 / 17,064 ± 5,335 3.78 33.18 ± 2.13 / 32.48 ± 2.13 33.61 ± 2.23 / 55.01 ± 2.11 1.83 ± 1.54 / 49.40 ± 1.24 0.00 ± 0.00 / 0.00 ± 0.00 12.6.1 12.6.1 12.6.1 12.6.1
3ebdola/Dialectal-Arabic-XLM-R-Base 277 250 512 True 12,783 ± 2,537 / 2,712 ± 885 3.80 30.18 ± 6.85 / 29.11 ± 6.64 32.66 ± 3.87 / 53.17 ± 4.09 2.10 ± 1.30 / 47.10 ± 2.29 0.46 ± 0.39 / 4.04 ± 2.36 12.8.0 12.8.0 12.8.0 12.8.0
allenai/OLMo-7B-Twin-2T (few-shot) 6888 50 2080 True 5,484 ± 1,125 / 1,317 ± 425 3.94 14.06 ± 5.31 / 12.90 ± 4.52 28.07 ± 6.33 / 38.61 ± 7.42 2.31 ± 1.88 / 44.45 ± 4.23 6.89 ± 2.35 / 17.95 ± 3.37 12.5.2 12.5.2 12.5.2 12.5.2
Qwen/Qwen1.5-0.5B-Chat (few-shot) 620 152 32768 False 11,740 ± 3,000 / 2,209 ± 721 4.05 24.67 ± 0.99 / 23.98 ± 0.73 9.31 ± 2.97 / 21.50 ± 2.70 1.11 ± 1.69 / 37.88 ± 4.05 13.60 ± 1.60 / 29.10 ± 1.94 12.5.2 11.0.0 12.1.0 12.5.0
Qwen/Qwen1.5-0.5B (few-shot) 620 152 32768 True 11,371 ± 2,924 / 2,122 ± 692 4.06 27.34 ± 1.95 / 24.46 ± 1.25 10.64 ± 5.31 / 26.79 ± 4.73 0.33 ± 1.20 / 35.20 ± 2.45 11.81 ± 2.10 / 27.38 ± 2.49 12.5.2 10.0.1 12.1.0 12.1.0
allenai/OLMo-1B (few-shot) 1177 50 2080 True 8,536 ± 1,926 / 1,940 ± 619 4.17 21.46 ± 2.04 / 20.83 ± 1.63 21.03 ± 6.33 / 38.33 ± 7.79 0.13 ± 1.48 / 43.17 ± 4.90 0.71 ± 0.53 / 6.02 ± 1.37 12.5.2 12.1.0 12.1.0 12.1.0
fresh-xlm-roberta-base 277 250 512 True 2,214 ± 94 / 1,494 ± 229 4.20 8.03 ± 1.35 / 8.63 ± 1.36 23.44 ± 7.21 / 42.87 ± 7.78 -0.17 ± 1.13 / 39.21 ± 4.70 0.00 ± 0.00 / 0.00 ± 0.00 12.6.1 12.6.1 12.6.1 12.6.1
fresh-electra-small 13 31 512 True 7,840 ± 1,538 / 3,024 ± 438 4.33 9.53 ± 0.71 / 9.93 ± 0.76 13.29 ± 8.29 / 29.94 ± 8.32 -0.15 ± 0.77 / 33.37 ± 0.37 0.00 ± 0.00 / 0.00 ± 0.00 12.8.0 12.8.0 12.8.0 12.8.0
RJuro/kanelsnegl-v0.1 (few-shot) 7242 32 512 True 9,757 ± 2,047 / 2,200 ± 705 4.82 0.00 ± 0.00 / 0.00 ± 0.00 0.00 ± 0.00 / 17.05 ± 0.35 0.00 ± 0.00 / 33.34 ± 0.31 0.00 ± 0.00 / 14.61 ± 0.65 9.3.1 9.3.1 9.3.1 12.5.1
ai-forever/mGPT (few-shot) unknown 100 1024 True 13,551 ± 4,259 / 2,563 ± 838 4.82 0.30 ± 0.60 / 0.26 ± 0.50 0.29 ± 1.28 / 17.22 ± 1.25 -0.11 ± 1.16 / 36.65 ± 3.98 0.00 ± 0.00 / 1.54 ± 0.12 9.3.1 10.0.1 11.0.0 12.5.1
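The leaderboard can also be worked with programmatically once exported to CSV. Below is a minimal sketch of loading such an export with pandas and listing the best-ranked models; the file name `german-nlu.csv`, the exact column names, and the derived `GermanQuAD_primary` column are assumptions for illustration, not part of the official export.

```python
# Minimal sketch: load a CSV export of this leaderboard and list the
# top models by rank. The file name "german-nlu.csv" and the column
# names ("Model ID", "Rank", "GermanQuAD") are assumptions; adjust them
# to match the actual export.
import pandas as pd

df = pd.read_csv("german-nlu.csv")

# Score cells look like "75.12 ± 1.07 / 87.22 ± 0.63"; keep only the first
# number (the primary metric) so the column can be sorted numerically.
df["GermanQuAD_primary"] = df["GermanQuAD"].str.split("±").str[0].astype(float)

# A lower rank is better, so sort ascending and show the ten best models.
top = df.sort_values("Rank").head(10)
print(top[["Model ID", "Rank", "GermanQuAD_primary"]].to_string(index=False))
```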