Swedish NLG 🇸🇪

Last updated: 03/05/2024 12:49:05 CET
Model ID · Parameters · Vocabulary size · Context · Commercial · Speed · Rank · SUC3 · SweReC · ScaLA-sv · ScandiQA-sv · SweDN · MMLU-sv · HellaSwag-sv · SUC3 version · SweReC version · ScaLA-sv version · ScandiQA-sv version · SweDN version · MMLU-sv version · HellaSwag-sv version

Each dataset column reports two metric scores, each with an uncertainty estimate, in the form "primary ± err / secondary ± err", and each dataset also has a column giving the benchmark version that produced the score. Models are sorted by Rank, where lower is better.
gpt-4-0613 (few-shot, val) unknown 100 8191 True 597 ± 197 / 93 ± 33 1.09 76.86 ± 1.89 / 54.97 ± 4.44 79.19 ± 1.87 / 80.56 ± 1.82 80.93 ± 1.67 / 89.90 ± 0.93 53.81 ± 1.28 / 65.15 ± 1.11 67.83 ± 0.15 / 22.67 ± 0.39 72.53 ± 1.82 / 79.26 ± 1.44 85.67 ± 2.59 / 89.14 ± 2.05 0.0.0 0.0.0 0.0.0 12.9.0 9.3.2 9.3.2 9.3.2
meta-llama/Meta-Llama-3-70B (few-shot, val) 70554 128 8192 True 312 ± 55 / 177 ± 51 1.40 74.61 ± 2.99 / 56.50 ± 6.30 78.61 ± 1.40 / 78.64 ± 1.53 63.20 ± 3.34 / 80.61 ± 2.52 61.98 ± 1.65 / 66.85 ± 1.42 67.60 ± 0.41 / 22.47 ± 0.82 61.55 ± 1.68 / 71.02 ± 1.21 66.21 ± 3.22 / 73.40 ± 2.77 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
152334H/miqu-1-70b-sf (few-shot, val) 68977 32 32764 True 2,126 ± 676 / 319 ± 104 1.78 62.96 ± 3.44 / 52.14 ± 4.04 75.25 ± 2.41 / 78.80 ± 1.96 53.28 ± 3.33 / 75.37 ± 1.80 56.42 ± 1.65 / 65.04 ± 1.17 67.60 ± 0.30 / 21.60 ± 0.76 53.56 ± 2.20 / 64.88 ± 1.66 59.70 ± 4.69 / 69.38 ± 3.70 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
meta-llama/Meta-Llama-3-70B-Instruct (few-shot, val) 70554 128 8192 True 1,673 ± 583 / 275 ± 85 1.83 77.06 ± 2.72 / 67.75 ± 5.69 53.56 ± 7.15 / 67.07 ± 3.93 47.50 ± 3.37 / 71.31 ± 2.69 46.86 ± 1.77 / 60.96 ± 1.04 68.25 ± 0.18 / 22.84 ± 0.44 61.31 ± 2.07 / 70.78 ± 1.60 66.73 ± 2.34 / 74.65 ± 1.78 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
gpt-3.5-turbo-0613 (few-shot, val) unknown 100 4095 True 921 ± 293 / 113 ± 37 1.86 73.04 ± 2.74 / 61.64 ± 3.63 72.77 ± 2.64 / 72.56 ± 2.45 58.06 ± 3.84 / 76.06 ± 2.51 58.02 ± 2.11 / 66.84 ± 1.38 66.92 ± 0.16 / 19.00 ± 0.28 40.73 ± 3.36 / 55.16 ± 2.75 50.51 ± 2.33 / 62.07 ± 1.95 0.0.0 0.0.0 0.0.0 12.9.0 11.0.0 0.0.0 0.0.0
meta-llama/Llama-2-70b-hf (few-shot, val) 68977 32 4096 True 1,892 ± 650 / 318 ± 105 1.90 64.76 ± 3.91 / 61.08 ± 5.41 75.46 ± 1.99 / 74.35 ± 3.70 43.27 ± 5.03 / 65.62 ± 4.94 63.04 ± 1.52 / 66.95 ± 1.31 68.43 ± 0.33 / 24.92 ± 0.74 46.16 ± 2.67 / 59.53 ± 2.04 50.41 ± 5.18 / 62.34 ± 3.86 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
timpal0l/sol (few-shot) 10732 32 4096 False 3,701 ± 876 / 771 ± 247 2.12 57.51 ± 2.30 / 37.74 ± 3.15 77.31 ± 1.01 / 70.55 ± 2.26 25.06 ± 5.02 / 49.04 ± 4.68 60.16 ± 1.77 / 67.43 ± 1.02 65.22 ± 0.19 / 18.86 ± 0.30 39.52 ± 0.58 / 54.47 ± 0.38 70.93 ± 1.29 / 78.03 ± 1.01 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
upstage/SOLAR-10.7B-v1.0 (few-shot) 10732 32 4096 True 3,780 ± 906 / 799 ± 261 2.16 59.65 ± 2.22 / 39.33 ± 3.33 77.48 ± 1.23 / 70.13 ± 2.81 16.94 ± 2.36 / 40.98 ± 1.82 62.65 ± 0.56 / 68.15 ± 0.56 65.19 ± 0.36 / 19.09 ± 0.55 39.82 ± 0.69 / 54.84 ± 0.53 68.87 ± 1.35 / 76.46 ± 1.04 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3
Nexusflow/Starling-LM-7B-beta (few-shot) 7242 32 8192 False 5,876 ± 1,021 / 1,677 ± 546 2.24 60.38 ± 1.60 / 36.17 ± 3.66 77.49 ± 0.98 / 72.07 ± 1.56 29.32 ± 2.34 / 54.43 ± 2.67 56.79 ± 0.83 / 65.84 ± 0.48 65.75 ± 0.16 / 20.23 ± 0.23 36.05 ± 1.10 / 51.86 ± 0.87 51.15 ± 1.71 / 63.20 ± 1.32 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
timpal0l/dolphin-2.9-llama3-8b-flashback (few-shot, val) 8030 128 8192 False 5,018 ± 1,216 / 996 ± 324 2.35 65.33 ± 2.38 / 46.88 ± 3.97 74.99 ± 3.45 / 76.76 ± 1.80 32.65 ± 5.08 / 61.25 ± 4.41 55.71 ± 1.34 / 64.54 ± 1.00 66.53 ± 0.29 / 19.24 ± 0.57 33.16 ± 2.11 / 49.26 ± 1.55 32.51 ± 2.97 / 48.24 ± 2.10 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
timpal0l/BeagleCatMunin (few-shot, val) 7242 32 32768 False 2,495 ± 458 / 775 ± 244 2.37 50.53 ± 3.30 / 37.77 ± 4.38 77.37 ± 2.25 / 78.66 ± 2.43 27.84 ± 4.72 / 49.46 ± 4.52 59.98 ± 1.65 / 65.44 ± 1.38 67.89 ± 0.44 / 23.94 ± 0.68 34.80 ± 2.79 / 50.82 ± 2.10 36.65 ± 5.07 / 51.56 ± 4.13 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
timpal0l/Llama-3-8B-flashback-v1 (few-shot) 8030 128 8192 True 4,807 ± 1,152 / 979 ± 319 2.37 59.03 ± 2.04 / 41.99 ± 4.35 81.13 ± 0.94 / 80.80 ± 1.09 33.06 ± 3.65 / 61.21 ± 3.26 58.21 ± 0.67 / 64.01 ± 0.68 64.42 ± 0.69 / 18.01 ± 0.55 35.24 ± 0.83 / 50.70 ± 0.57 25.14 ± 2.02 / 42.54 ± 1.99 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
four-two-labs/orpo-llama-3-swe (few-shot) 8030 128 8192 False 4,974 ± 1,208 / 1,032 ± 342 2.39 60.93 ± 2.85 / 38.87 ± 3.50 79.74 ± 0.68 / 75.13 ± 1.85 26.02 ± 4.38 / 52.19 ± 5.44 59.84 ± 0.92 / 65.92 ± 0.82 64.99 ± 0.26 / 18.65 ± 0.32 36.35 ± 1.03 / 51.91 ± 0.77 27.22 ± 2.24 / 45.02 ± 1.74 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
meta-llama/Meta-Llama-3-8B (few-shot) 8030 128 8192 True 4,687 ± 1,121 / 967 ± 313 2.39 60.36 ± 2.84 / 39.37 ± 3.56 79.74 ± 0.75 / 75.11 ± 1.91 28.24 ± 4.19 / 55.29 ± 5.35 59.73 ± 1.13 / 65.72 ± 0.94 64.81 ± 0.24 / 18.56 ± 0.35 35.86 ± 0.90 / 51.39 ± 0.69 26.49 ± 1.89 / 44.41 ± 1.56 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
RJuro/munin-neuralbeagle-7b (few-shot, val) 7242 32 32768 False 2,493 ± 466 / 773 ± 243 2.42 62.96 ± 2.62 / 51.99 ± 5.66 77.13 ± 2.43 / 78.36 ± 1.88 15.73 ± 7.07 / 47.41 ± 5.31 58.43 ± 1.59 / 65.06 ± 1.19 67.58 ± 0.22 / 22.52 ± 0.52 32.54 ± 2.61 / 49.30 ± 1.93 34.94 ± 3.79 / 50.39 ± 3.23 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
meta-llama/Llama-2-70b-chat-hf (few-shot, val) 68977 32 3843 True 1,979 ± 621 / 320 ± 105 2.44 55.91 ± 3.25 / 39.73 ± 4.94 64.52 ± 3.15 / 70.51 ± 2.49 23.85 ± 7.34 / 56.89 ± 6.08 58.88 ± 1.51 / 65.82 ± 1.07 67.57 ± 0.24 / 21.77 ± 0.61 37.60 ± 3.30 / 52.46 ± 2.31 31.78 ± 2.98 / 47.11 ± 2.71 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
merge-crew/da-sv-slerp (few-shot, val) 7242 32 32768 True 2,467 ± 469 / 762 ± 244 2.45 46.57 ± 3.34 / 33.94 ± 3.73 76.53 ± 2.55 / 77.96 ± 3.04 33.43 ± 3.89 / 61.87 ± 4.02 59.87 ± 1.52 / 64.53 ± 1.41 66.76 ± 0.41 / 21.92 ± 0.79 28.89 ± 2.22 / 46.41 ± 1.55 30.36 ± 2.80 / 47.15 ± 2.26 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
merge-crew/da-sv-task-arithmetic (few-shot, val) 7242 32 32768 True 2,500 ± 469 / 762 ± 238 2.45 47.28 ± 3.05 / 34.01 ± 3.73 76.62 ± 2.52 / 78.04 ± 2.98 33.23 ± 4.72 / 61.29 ± 4.67 60.00 ± 1.69 / 64.62 ± 1.44 66.68 ± 0.43 / 21.83 ± 0.75 29.95 ± 1.74 / 47.23 ± 1.24 31.12 ± 3.68 / 47.85 ± 2.80 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
birgermoell/Flashback-Bellman (few-shot, val) 7242 32 32768 False 2,887 ± 403 / 1,144 ± 345 2.47 55.29 ± 3.95 / 41.59 ± 4.48 78.29 ± 1.83 / 78.77 ± 2.06 18.45 ± 3.00 / 46.38 ± 2.81 58.42 ± 1.64 / 63.83 ± 1.18 67.54 ± 0.48 / 23.67 ± 0.69 29.44 ± 2.34 / 46.95 ± 1.75 37.45 ± 3.61 / 52.85 ± 2.76 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
RJuro/munin-neuralbeagle-SkoleGPTOpenOrca-7b (few-shot, val) 7242 32 32768 False 3,008 ± 429 / 991 ± 323 2.48 59.36 ± 2.75 / 47.08 ± 4.17 72.04 ± 3.27 / 63.83 ± 2.07 22.38 ± 7.17 / 54.70 ± 5.49 57.96 ± 2.00 / 64.06 ± 1.76 65.13 ± 0.34 / 19.26 ± 0.43 29.81 ± 2.25 / 47.46 ± 1.72 35.59 ± 3.75 / 51.76 ± 2.63 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
mlabonne/NeuralBeagle14-7B (few-shot, val) 7242 32 8192 False 2,549 ± 472 / 784 ± 245 2.48 61.25 ± 3.35 / 50.76 ± 5.94 76.03 ± 2.11 / 78.25 ± 1.95 16.28 ± 4.81 / 49.04 ± 3.60 50.96 ± 2.34 / 60.05 ± 1.18 68.35 ± 0.32 / 24.05 ± 0.66 32.30 ± 2.48 / 48.98 ± 1.96 38.78 ± 5.70 / 52.89 ± 4.91 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
timpal0l/BeagleCatMunin2 (few-shot, val) 7242 32 32768 False 2,477 ± 459 / 767 ± 241 2.48 60.87 ± 3.71 / 47.40 ± 5.32 73.72 ± 2.20 / 67.79 ± 2.37 6.78 ± 4.34 / 35.90 ± 2.11 58.75 ± 1.46 / 65.08 ± 1.15 68.06 ± 0.31 / 23.91 ± 0.64 33.71 ± 2.28 / 50.08 ± 1.68 41.45 ± 3.36 / 55.51 ± 2.69 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
timpal0l/Mistral-7B-v0.1-flashback-v2 (few-shot) 7242 32 32768 True 5,054 ± 1,200 / 1,056 ± 339 2.49 44.14 ± 2.40 / 29.77 ± 4.06 80.14 ± 1.11 / 80.19 ± 0.78 34.23 ± 2.23 / 65.29 ± 2.17 57.07 ± 1.56 / 62.52 ± 1.11 65.15 ± 0.31 / 18.72 ± 0.57 33.24 ± 0.85 / 49.69 ± 0.67 25.50 ± 2.25 / 43.44 ± 2.03 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3
AI-Sweden-Models/tyr (few-shot, val) 7242 32 32768 False 6,079 ± 1,051 / 1,760 ± 570 2.50 56.21 ± 2.49 / 44.78 ± 4.19 78.30 ± 1.71 / 79.80 ± 2.03 14.35 ± 5.65 / 48.69 ± 4.30 61.08 ± 1.47 / 65.72 ± 1.07 67.96 ± 0.40 / 24.14 ± 0.74 31.74 ± 2.48 / 48.52 ± 1.88 30.12 ± 2.07 / 47.58 ± 1.29 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2
meta-llama/Meta-Llama-3-8B-Instruct (few-shot) 8030 128 8192 True 4,909 ± 1,215 / 978 ± 319 2.50 69.67 ± 1.30 / 52.94 ± 4.01 59.93 ± 4.70 / 67.54 ± 3.04 27.63 ± 3.19 / 60.85 ± 3.29 49.84 ± 1.61 / 60.85 ± 0.93 66.60 ± 0.07 / 19.13 ± 0.31 33.54 ± 1.40 / 49.20 ± 1.13 30.32 ± 2.27 / 45.96 ± 1.87 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
bineric/NorskGPT-Llama3-8b (few-shot) 8030 128 8192 False 3,695 ± 1,277 / 532 ± 183 2.51 63.19 ± 2.83 / 51.22 ± 3.61 76.06 ± 0.64 / 61.59 ± 0.77 5.34 ± 1.42 / 34.32 ± 0.56 56.70 ± 0.87 / 66.00 ± 0.59 66.25 ± 0.16 / 20.28 ± 0.33 36.23 ± 0.91 / 51.68 ± 0.73 43.60 ± 1.33 / 56.86 ± 1.15 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
merge-crew/da-sv-dare-ties-density-0.9 (few-shot, val) 7242 32 32768 True 2,443 ± 458 / 750 ± 240 2.51 46.61 ± 3.11 / 34.10 ± 4.61 76.38 ± 2.01 / 78.30 ± 2.42 34.16 ± 4.39 / 60.06 ± 4.67 58.77 ± 1.76 / 63.50 ± 1.47 66.77 ± 0.46 / 22.42 ± 0.84 29.77 ± 2.44 / 46.25 ± 1.64 25.38 ± 3.56 / 39.34 ± 3.89 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
birgermoell/BeagleCatMunin-Flashback-Bellman (few-shot, val) 7242 32 32768 False 2,890 ± 401 / 1,155 ± 348 2.53 52.96 ± 3.45 / 41.51 ± 4.30 76.99 ± 2.37 / 76.84 ± 2.99 14.27 ± 4.36 / 40.60 ± 3.04 59.92 ± 1.64 / 64.87 ± 1.47 67.62 ± 0.42 / 23.90 ± 0.77 27.95 ± 2.57 / 45.86 ± 1.85 36.11 ± 3.54 / 51.60 ± 2.44 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
birgermoell/Rapid-Cycling (few-shot, val) 7242 32 32768 False 2,346 ± 450 / 666 ± 249 2.55 53.66 ± 3.57 / 41.97 ± 4.83 77.72 ± 2.51 / 78.40 ± 2.65 16.22 ± 4.46 / 43.17 ± 3.88 59.75 ± 1.13 / 64.72 ± 1.04 67.57 ± 0.48 / 23.65 ± 0.72 27.24 ± 2.07 / 45.51 ± 1.53 32.04 ± 4.21 / 48.67 ± 3.11 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
mhenrichsen/danskgpt-chat-v2.1 (few-shot) unknown 32 32768 True 5,085 ± 998 / 1,306 ± 404 2.55 54.37 ± 3.04 / 42.16 ± 4.00 75.98 ± 1.15 / 74.44 ± 1.12 17.98 ± 1.97 / 56.01 ± 2.08 55.07 ± 0.74 / 64.24 ± 0.61 64.42 ± 0.09 / 14.42 ± 0.51 32.81 ± 0.91 / 49.28 ± 0.64 36.24 ± 1.44 / 51.96 ± 1.13 12.0.0 12.0.0 12.0.0 12.0.0 12.0.0 12.0.0 12.0.0
merge-crew/da-sv-ties (few-shot, val) 7242 32 32768 True 2,457 ± 451 / 757 ± 237 2.56 48.36 ± 3.07 / 34.48 ± 5.22 76.57 ± 2.19 / 78.11 ± 2.73 20.94 ± 5.55 / 44.72 ± 4.06 59.07 ± 1.90 / 63.87 ± 1.46 66.59 ± 0.50 / 22.19 ± 0.78 31.44 ± 1.94 / 47.30 ± 1.54 26.04 ± 3.42 / 38.83 ± 4.24 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
birgermoell/Munin-NeuralBeagle-NorskGPT (few-shot, val) 7242 32 32768 False 2,903 ± 407 / 1,157 ± 350 2.57 63.85 ± 2.67 / 47.77 ± 4.72 73.72 ± 2.98 / 62.83 ± 1.64 -0.56 ± 2.24 / 33.54 ± 1.03 60.10 ± 1.48 / 66.26 ± 1.19 68.11 ± 0.21 / 23.63 ± 0.56 27.79 ± 2.32 / 45.82 ± 1.61 42.43 ± 2.76 / 56.52 ± 2.13 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
birgermoell/WestLake-Munin-Cat-NorskGPT (few-shot, val) 7242 32 32768 False 2,856 ± 391 / 1,142 ± 342 2.57 63.85 ± 2.67 / 47.77 ± 4.72 73.72 ± 2.98 / 62.83 ± 1.64 -0.56 ± 2.24 / 33.54 ± 1.03 60.10 ± 1.48 / 66.26 ± 1.19 68.11 ± 0.21 / 23.63 ± 0.56 27.79 ± 2.32 / 45.82 ± 1.61 42.43 ± 2.76 / 56.52 ± 2.13 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
senseable/WestLake-7B-v2 (few-shot) 7242 32 32768 False 5,993 ± 1,028 / 1,742 ± 561 2.57 58.90 ± 1.34 / 42.48 ± 3.97 67.74 ± 2.79 / 71.89 ± 1.89 16.52 ± 2.55 / 46.30 ± 2.62 49.41 ± 1.21 / 59.91 ± 0.48 66.09 ± 0.17 / 19.64 ± 0.27 31.76 ± 0.89 / 48.64 ± 0.69 45.84 ± 1.47 / 59.27 ± 1.16 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
bineric/NorskGPT-Mistral-7b (few-shot) 7242 32 32768 False 2,443 ± 451 / 761 ± 237 2.60 58.40 ± 2.62 / 40.55 ± 3.65 74.30 ± 1.26 / 60.35 ± 0.41 0.00 ± 0.00 / 33.37 ± 0.27 59.16 ± 1.23 / 65.78 ± 0.72 65.36 ± 0.14 / 18.81 ± 0.17 35.01 ± 0.99 / 51.07 ± 0.70 43.72 ± 0.69 / 57.66 ± 0.50 9.3.1 9.3.1 9.3.1 12.5.1 11.0.0 9.3.1 9.3.1
Mabeck/Heidrun-Mistral-7B-chat (few-shot) 7242 32 32768 False 5,822 ± 1,283 / 1,336 ± 430 2.62 55.06 ± 2.38 / 41.39 ± 4.31 77.50 ± 0.90 / 73.87 ± 1.21 17.47 ± 2.33 / 47.73 ± 3.35 58.67 ± 0.96 / 64.58 ± 0.78 64.18 ± 0.24 / 18.13 ± 0.35 31.04 ± 1.08 / 48.29 ± 0.82 23.57 ± 1.68 / 42.37 ± 1.34 10.0.1 10.0.1 10.0.1 12.5.0 12.5.0 10.0.1 10.0.1
timpal0l/njord-alpha (few-shot) 7242 32 32768 True 5,431 ± 1,267 / 1,139 ± 365 2.62 48.19 ± 2.55 / 37.50 ± 3.62 79.95 ± 0.87 / 81.24 ± 0.64 32.85 ± 2.28 / 61.74 ± 3.05 57.39 ± 1.52 / 63.58 ± 1.19 65.95 ± 0.25 / 20.56 ± 0.37 25.32 ± 0.99 / 42.09 ± 1.04 14.55 ± 2.32 / 31.99 ± 2.16 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
merge-crew/da-sv-dare-ties-density-0.6 (few-shot, val) 7242 32 32768 True 2,515 ± 465 / 785 ± 247 2.63 45.12 ± 2.72 / 30.73 ± 4.55 78.74 ± 2.13 / 80.11 ± 2.64 19.74 ± 6.09 / 46.97 ± 5.83 60.15 ± 1.71 / 65.22 ± 1.28 66.41 ± 0.46 / 21.90 ± 0.70 31.24 ± 3.01 / 47.77 ± 2.19 22.30 ± 3.50 / 39.45 ± 2.60 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
danish-foundation-models/munin-7b-v0.1dev0 (few-shot) 7242 32 8192 True 6,113 ± 1,044 / 1,790 ± 579 2.65 47.10 ± 2.60 / 35.06 ± 3.65 73.05 ± 5.27 / 74.56 ± 4.19 30.29 ± 2.63 / 61.40 ± 3.22 57.39 ± 1.38 / 63.51 ± 1.04 64.69 ± 0.53 / 19.03 ± 0.41 27.40 ± 0.76 / 44.17 ± 0.82 21.08 ± 3.28 / 38.46 ± 2.96 12.5.2 12.4.0 12.4.0 12.4.0 12.4.0 12.4.0 12.4.0
mlabonne/AlphaMonarch-7B (few-shot, val) 7242 32 8192 False 5,340 ± 1,262 / 1,157 ± 375 2.65 60.53 ± 3.06 / 48.45 ± 5.19 67.03 ± 3.61 / 70.77 ± 1.95 15.10 ± 4.60 / 48.57 ± 2.91 42.46 ± 1.63 / 53.50 ± 1.40 67.94 ± 0.21 / 22.99 ± 0.24 27.51 ± 3.08 / 45.43 ± 2.37 42.29 ± 5.08 / 55.43 ± 4.50 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
KennethEnevoldsen/munin_mistral-7b (few-shot, val) 7242 32 32768 False 2,543 ± 466 / 787 ± 247 2.71 52.34 ± 3.07 / 39.14 ± 4.60 77.66 ± 2.09 / 78.59 ± 2.41 6.00 ± 4.15 / 36.34 ± 2.20 60.16 ± 1.81 / 64.12 ± 1.59 65.54 ± 0.49 / 19.31 ± 0.71 31.83 ± 2.27 / 48.55 ± 1.67 20.55 ± 3.93 / 38.95 ± 3.23 12.5.2 12.3.1 12.3.1 12.3.2 12.3.2 12.3.2 12.3.2
Mabeck/Heidrun-Mistral-7B-base (few-shot) 7242 32 32768 True 3,823 ± 967 / 860 ± 280 2.71 48.43 ± 2.75 / 35.31 ± 2.80 79.43 ± 0.85 / 78.21 ± 1.69 17.37 ± 2.57 / 52.91 ± 4.93 57.05 ± 1.22 / 62.72 ± 0.89 63.81 ± 0.34 / 18.13 ± 0.31 31.72 ± 0.55 / 48.70 ± 0.45 15.69 ± 2.43 / 35.96 ± 2.03 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0
alpindale/Mistral-7B-v0.2-hf (few-shot) 7242 32 32768 True 1,841 ± 297 / 651 ± 193 2.72 48.96 ± 2.72 / 39.25 ± 3.69 78.90 ± 0.95 / 78.62 ± 1.08 10.82 ± 3.46 / 38.95 ± 3.80 58.91 ± 1.02 / 64.72 ± 0.76 64.78 ± 0.33 / 19.24 ± 0.42 34.52 ± 1.19 / 50.47 ± 0.93 20.96 ± 1.96 / 39.95 ± 1.66 12.5.2 12.5.1 12.5.1 12.5.1 12.5.1 12.5.1 12.5.1
mhenrichsen/hestenettetLM (few-shot) 7242 32 32768 True 5,160 ± 804 / 1,654 ± 516 2.73 53.00 ± 2.53 / 39.09 ± 3.72 79.70 ± 0.65 / 79.45 ± 0.68 4.32 ± 2.19 / 34.43 ± 0.87 59.03 ± 1.03 / 64.74 ± 0.84 64.89 ± 0.28 / 19.31 ± 0.40 35.48 ± 0.99 / 51.54 ± 0.72 20.54 ± 2.14 / 39.66 ± 1.80 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2
mistralai/Mistral-7B-v0.1 (few-shot) 7242 32 32768 True 2,657 ± 524 / 880 ± 278 2.73 53.34 ± 2.55 / 40.48 ± 3.66 80.00 ± 0.70 / 79.80 ± 0.66 4.61 ± 2.18 / 34.51 ± 0.86 58.99 ± 1.05 / 64.65 ± 0.83 64.87 ± 0.31 / 19.30 ± 0.43 35.52 ± 1.01 / 51.52 ± 0.73 19.67 ± 2.31 / 38.98 ± 1.98 0.0.0 0.0.0 0.0.0 12.5.1 11.0.0 9.1.2 9.1.2
ThatsGroes/munin-SkoleGPTOpenOrca-7b-16bit (few-shot) 7242 32 32768 False 3,006 ± 479 / 1,053 ± 319 2.77 44.64 ± 1.66 / 31.30 ± 2.96 77.98 ± 1.01 / 72.79 ± 2.47 16.57 ± 2.58 / 51.86 ± 3.69 57.31 ± 0.92 / 63.73 ± 1.04 63.23 ± 0.35 / 15.35 ± 0.57 28.15 ± 0.90 / 45.69 ± 0.72 23.58 ± 1.41 / 42.30 ± 1.04 11.0.0 11.0.0 11.0.0 12.4.0 12.4.0 11.0.0 11.0.0
danish-foundation-models/munin-7b-alpha (few-shot) 7242 32 32768 True 6,116 ± 1,049 / 1,784 ± 577 2.79 42.23 ± 2.44 / 30.30 ± 4.71 78.80 ± 0.93 / 75.28 ± 1.78 15.47 ± 1.79 / 54.26 ± 3.41 56.75 ± 1.15 / 62.43 ± 0.95 62.78 ± 0.76 / 16.74 ± 0.45 30.86 ± 1.12 / 47.83 ± 0.93 19.11 ± 2.74 / 38.55 ± 2.29 12.5.2 12.4.0 12.4.0 12.4.0 12.4.0 12.4.0 12.4.0
timpal0l/Mistral-7B-v0.1-flashback-v2-instruct (few-shot) 7242 32 32768 False 5,172 ± 813 / 1,647 ± 518 2.83 46.74 ± 4.30 / 33.57 ± 4.51 77.06 ± 1.82 / 79.02 ± 1.37 14.00 ± 1.59 / 53.89 ± 3.10 56.74 ± 0.52 / 63.45 ± 0.49 62.56 ± 0.85 / 15.85 ± 0.34 30.87 ± 1.35 / 47.77 ± 1.01 15.79 ± 1.57 / 35.66 ± 0.84 12.5.2 12.3.2 12.3.2 12.4.0 12.4.0 12.3.2 12.3.2
mistralai/Mistral-7B-Instruct-v0.2 (few-shot) 7242 32 32768 False 2,538 ± 415 / 821 ± 253 2.84 47.92 ± 2.66 / 33.00 ± 3.24 62.90 ± 2.44 / 70.61 ± 1.19 19.95 ± 2.24 / 56.49 ± 2.10 52.51 ± 0.36 / 61.42 ± 0.52 66.11 ± 0.18 / 19.64 ± 0.28 25.60 ± 1.10 / 43.53 ± 0.90 21.75 ± 1.61 / 40.57 ± 1.45 9.2.0 9.2.0 9.3.1 12.4.0 12.4.0 9.3.2 9.3.2
birgermoell/NeuralBeagle-Flashback (few-shot, val) 7242 32 32768 False 2,904 ± 405 / 1,155 ± 349 2.85 51.73 ± 4.51 / 40.50 ± 6.05 36.06 ± 3.31 / 53.46 ± 1.79 19.42 ± 5.08 / 46.92 ± 5.36 59.26 ± 1.66 / 64.40 ± 1.35 67.55 ± 0.53 / 23.64 ± 0.72 23.10 ± 2.38 / 42.58 ± 1.74 29.31 ± 5.03 / 47.11 ± 3.62 9.3.0 9.3.0 9.3.0 12.5.2 9.3.0 9.3.0 9.3.0
RuterNorway/Llama-2-13b-chat-norwegian (few-shot) unknown 32 4096 False 3,254 ± 1,068 / 484 ± 173 2.90 50.85 ± 2.44 / 39.65 ± 3.83 74.17 ± 2.12 / 76.62 ± 1.83 7.51 ± 1.94 / 37.81 ± 1.76 57.32 ± 0.63 / 63.28 ± 0.71 65.20 ± 0.45 / 19.06 ± 0.15 23.92 ± 0.88 / 42.25 ± 0.73 17.67 ± 1.53 / 37.32 ± 1.20 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1
occiglot/occiglot-7b-eu5-instruct (few-shot) 7242 32 32768 False 2,088 ± 352 / 706 ± 214 2.92 47.67 ± 2.81 / 36.91 ± 3.50 71.73 ± 2.40 / 74.97 ± 1.84 7.90 ± 3.20 / 41.24 ± 4.78 57.78 ± 0.79 / 64.48 ± 0.73 65.07 ± 0.34 / 19.59 ± 0.38 25.52 ± 1.30 / 43.68 ± 1.03 14.06 ± 1.68 / 35.12 ± 1.47 12.5.2 12.2.0 12.3.1 12.4.0 12.4.0 12.3.1 12.3.1
occiglot/occiglot-7b-eu5 (few-shot) 7242 32 32768 True 2,219 ± 427 / 717 ± 224 2.94 49.02 ± 3.23 / 41.69 ± 3.74 76.56 ± 1.52 / 78.16 ± 1.12 2.18 ± 2.34 / 36.26 ± 3.89 58.98 ± 0.95 / 63.65 ± 0.89 64.42 ± 0.45 / 18.79 ± 0.47 23.68 ± 1.41 / 42.15 ± 1.14 14.05 ± 1.60 / 34.81 ± 1.58 12.5.2 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0 12.2.0
01-ai/Yi-6B (few-shot) 6061 64 4096 True 2,786 ± 532 / 784 ± 250 2.97 46.69 ± 2.39 / 32.97 ± 4.57 75.39 ± 1.06 / 71.95 ± 1.42 2.91 ± 2.80 / 35.26 ± 2.12 54.95 ± 0.86 / 60.77 ± 0.75 62.70 ± 0.76 / 17.52 ± 0.40 25.28 ± 0.72 / 43.71 ± 0.56 19.20 ± 1.18 / 38.76 ± 0.96 9.3.2 10.0.0 10.0.0 12.5.1 12.0.0 10.0.1 10.0.1
mistralai/Mistral-7B-Instruct-v0.1 (few-shot) 7242 32 32768 False 5,443 ± 1,273 / 1,144 ± 364 2.97 45.01 ± 2.11 / 27.59 ± 3.35 73.33 ± 1.98 / 76.19 ± 1.59 11.59 ± 3.45 / 40.89 ± 4.15 52.12 ± 1.42 / 59.29 ± 1.17 63.10 ± 0.60 / 18.05 ± 0.36 24.03 ± 1.09 / 42.32 ± 0.70 15.37 ± 0.71 / 35.78 ± 0.69 9.3.1 9.3.1 9.3.1 12.4.0 12.4.0 9.3.1 9.3.1
neph1/bellman-7b-mistral-instruct-v0.2 (few-shot) 7242 32 32768 False 2,518 ± 463 / 779 ± 243 2.98 54.38 ± 2.92 / 39.66 ± 5.20 55.84 ± 2.51 / 66.96 ± 1.37 16.05 ± 2.15 / 54.22 ± 2.86 53.22 ± 0.88 / 61.85 ± 0.63 64.90 ± 0.14 / 16.99 ± 0.20 22.36 ± 1.17 / 41.14 ± 0.78 12.52 ± 1.41 / 33.90 ± 1.11 9.2.0 9.2.0 9.2.0 12.4.0 12.4.0 9.2.0 9.2.0
meta-llama/Llama-2-7b-hf (few-shot) 6738 32 4096 True 930 ± 310 / 128 ± 43 3.03 44.11 ± 4.26 / 31.64 ± 4.48 79.05 ± 1.08 / 75.52 ± 2.66 7.34 ± 3.19 / 43.83 ± 5.31 57.49 ± 0.95 / 63.16 ± 0.77 64.63 ± 0.39 / 18.68 ± 0.39 15.65 ± 0.55 / 36.32 ± 0.55 8.74 ± 1.34 / 29.87 ± 1.40 9.2.0 9.2.0 9.2.0 12.5.1 12.0.0 9.2.0 9.2.0
bineric/NorskGPT-Llama-7B-v0.1 (few-shot) 6738 32 4096 False 5,384 ± 879 / 1,746 ± 553 3.04 53.95 ± 1.89 / 42.16 ± 4.59 60.91 ± 2.35 / 59.47 ± 1.21 0.32 ± 0.62 / 33.39 ± 0.28 55.28 ± 0.62 / 63.41 ± 0.55 63.73 ± 0.18 / 15.64 ± 0.27 20.96 ± 0.77 / 40.70 ± 0.59 25.76 ± 1.39 / 43.71 ± 1.09 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2
merge-crew/da-sv-dare-ties-density-0.3 (few-shot, val) 7242 32 32768 True 2,461 ± 476 / 773 ± 248 3.11 32.37 ± 3.05 / 24.60 ± 3.81 75.33 ± 2.41 / 77.99 ± 2.58 12.73 ± 6.32 / 45.51 ± 7.43 53.05 ± 1.83 / 58.32 ± 1.46 64.74 ± 0.74 / 19.59 ± 0.87 15.60 ± 1.96 / 33.16 ± 1.77 9.81 ± 2.55 / 28.12 ± 2.70 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
meta-llama/Llama-2-7b-chat-hf (few-shot) 6738 32 4096 False 2,643 ± 455 / 800 ± 247 3.11 39.72 ± 2.82 / 29.85 ± 2.99 66.18 ± 3.25 / 72.00 ± 1.75 6.74 ± 1.66 / 45.55 ± 4.31 54.05 ± 0.84 / 60.90 ± 0.82 65.92 ± 0.05 / 18.51 ± 0.18 17.73 ± 0.98 / 37.55 ± 0.69 12.85 ± 0.93 / 33.37 ± 0.90 9.3.1 9.3.1 9.3.1 12.4.0 12.4.0 9.3.1 9.3.1
Qwen/Qwen1.5-4B-Chat (few-shot) 3950 152 32768 False 4,347 ± 893 / 1,135 ± 365 3.16 40.19 ± 2.97 / 31.88 ± 4.51 64.08 ± 2.44 / 69.62 ± 1.29 5.43 ± 2.02 / 38.32 ± 2.54 53.21 ± 1.08 / 59.57 ± 0.97 61.90 ± 0.87 / 17.34 ± 0.52 20.95 ± 0.97 / 40.87 ± 0.76 16.59 ± 1.45 / 36.76 ± 1.20 12.5.2 10.0.1 12.1.0 12.5.2 12.1.0 12.1.0 12.1.0
AI-Sweden-Models/gpt-sw3-20b (few-shot) 20918 64 2048 True 1,875 ± 673 / 261 ± 91 3.27 31.86 ± 5.09 / 21.95 ± 3.90 78.88 ± 1.58 / 79.56 ± 1.43 12.26 ± 1.97 / 46.90 ± 4.11 53.58 ± 0.97 / 60.28 ± 0.81 64.14 ± 0.46 / 18.76 ± 0.39 3.15 ± 0.80 / 27.43 ± 0.91 2.77 ± 1.26 / 26.43 ± 0.84 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1
tollefj/nordavind-7b-instruct-warm (few-shot) 7248 33 2048 False 6,450 ± 961 / 2,082 ± 658 3.33 47.24 ± 3.36 / 24.94 ± 3.21 77.91 ± 1.42 / 76.08 ± 2.54 5.55 ± 2.55 / 48.57 ± 3.21 51.41 ± 0.74 / 57.55 ± 0.69 61.11 ± 1.02 / 17.57 ± 0.33 1.49 ± 1.11 / 25.90 ± 0.71 3.97 ± 0.92 / 27.45 ± 0.68 12.5.2 12.3.2 12.3.2 12.4.0 12.4.0 12.3.2 12.3.2
AI-Sweden-Models/gpt-sw3-6.7b-v2 (few-shot) 7111 64 2048 True 2,351 ± 448 / 707 ± 216 3.34 28.73 ± 3.63 / 20.43 ± 3.72 77.47 ± 1.36 / 78.60 ± 1.25 8.78 ± 2.01 / 42.28 ± 3.17 50.57 ± 0.94 / 56.51 ± 0.79 62.41 ± 0.85 / 16.45 ± 0.64 5.23 ± 1.02 / 28.63 ± 0.82 5.39 ± 0.81 / 28.86 ± 0.60 9.2.0 9.2.0 9.2.0 12.5.1 11.0.0 9.2.0 9.2.0
LumiOpen/Viking-33B (few-shot) 33119 131 4099 True 2,080 ± 700 / 331 ± 117 3.35 42.35 ± 1.51 / 28.31 ± 3.87 77.68 ± 1.11 / 78.86 ± 0.93 8.08 ± 1.69 / 50.52 ± 2.25 54.57 ± 1.25 / 60.34 ± 1.10 58.30 ± 1.94 / 14.29 ± 0.83 1.73 ± 1.04 / 24.98 ± 0.69 -0.32 ± 1.25 / 25.53 ± 0.82 12.9.0 12.9.0 12.9.0 12.9.0 12.9.0 12.9.0 12.9.0
norallm/normistral-7b-warm-instruct (few-shot) 7248 33 2048 True 6,194 ± 949 / 1,967 ± 619 3.35 51.45 ± 3.13 / 26.49 ± 3.00 63.64 ± 3.74 / 65.08 ± 2.46 5.80 ± 1.74 / 51.04 ± 1.54 48.95 ± 1.00 / 57.09 ± 0.92 62.18 ± 0.57 / 16.32 ± 0.33 4.88 ± 0.59 / 25.13 ± 0.54 4.63 ± 1.09 / 27.29 ± 1.02 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
AI-Sweden-Models/gpt-sw3-20b-instruct (few-shot) 20918 64 2048 True 1,831 ± 587 / 268 ± 90 3.40 15.70 ± 1.54 / 14.65 ± 1.52 68.23 ± 3.81 / 71.17 ± 3.07 12.39 ± 1.39 / 50.99 ± 3.37 52.04 ± 0.97 / 60.86 ± 0.77 65.44 ± 0.22 / 19.75 ± 0.32 6.86 ± 0.91 / 29.83 ± 1.12 6.92 ± 0.75 / 28.96 ± 0.67 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1
norallm/normistral-7b-warm (few-shot) 7248 33 2048 True 3,175 ± 456 / 1,186 ± 354 3.43 48.78 ± 5.08 / 26.81 ± 3.42 76.09 ± 1.23 / 74.78 ± 1.97 2.53 ± 2.80 / 47.37 ± 2.29 48.93 ± 0.97 / 55.09 ± 0.85 57.49 ± 2.27 / 16.17 ± 0.78 1.28 ± 1.28 / 23.12 ± 0.63 1.27 ± 0.61 / 25.74 ± 0.70 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0
AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct (few-shot) 7111 64 2048 True 2,383 ± 451 / 718 ± 221 3.51 14.58 ± 1.30 / 14.79 ± 1.27 56.60 ± 3.37 / 62.73 ± 3.61 10.92 ± 1.83 / 52.63 ± 2.98 50.18 ± 0.54 / 57.90 ± 0.53 64.89 ± 0.15 / 18.79 ± 0.22 6.16 ± 0.81 / 28.35 ± 0.97 10.90 ± 0.86 / 32.01 ± 0.54 9.2.0 9.2.0 9.2.0 12.4.0 12.4.0 9.2.0 9.3.1
google/gemma-2b (few-shot) 2506 256 8192 True 6,087 ± 1,046 / 1,902 ± 563 3.51 14.67 ± 4.71 / 14.85 ± 3.77 75.45 ± 1.10 / 64.08 ± 1.47 3.82 ± 1.23 / 44.81 ± 3.55 51.73 ± 0.88 / 57.35 ± 0.82 59.72 ± 1.46 / 15.26 ± 0.64 10.98 ± 0.98 / 31.92 ± 0.80 4.24 ± 0.47 / 27.53 ± 0.44 12.5.2 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
HPLT/gpt-13b-nordic-prerelease (few-shot) 14030 131 4099 True 3,520 ± 736 / 823 ± 273 3.58 32.19 ± 4.64 / 24.93 ± 4.09 72.26 ± 6.90 / 72.58 ± 5.87 2.39 ± 1.29 / 48.49 ± 2.46 48.92 ± 2.28 / 53.44 ± 2.49 57.46 ± 1.64 / 13.21 ± 0.57 -0.49 ± 0.50 / 25.03 ± 0.45 0.50 ± 1.04 / 25.50 ± 0.72 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
LumiOpen/Viking-13B (few-shot) 14030 131 4099 True 3,480 ± 727 / 822 ± 274 3.58 32.30 ± 4.52 / 24.91 ± 3.98 72.28 ± 6.64 / 72.80 ± 5.64 2.46 ± 1.31 / 48.51 ± 2.46 48.88 ± 2.35 / 53.41 ± 2.56 57.44 ± 1.66 / 13.23 ± 0.56 -0.50 ± 0.53 / 25.03 ± 0.47 0.44 ± 1.06 / 25.47 ± 0.73 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
AI-Sweden-Models/gpt-sw3-1.3b-instruct (few-shot) 1445 64 2048 True 4,544 ± 1,000 / 1,106 ± 359 3.60 19.04 ± 2.67 / 19.98 ± 2.64 73.34 ± 1.34 / 68.41 ± 2.31 2.90 ± 1.74 / 44.43 ± 4.49 47.45 ± 0.58 / 54.69 ± 0.56 63.33 ± 0.86 / 17.11 ± 0.61 0.65 ± 1.12 / 25.94 ± 0.76 -0.18 ± 0.36 / 24.70 ± 0.60 12.5.2 9.3.1 12.1.0 12.4.0 12.4.0 12.1.0 12.1.0
Qwen/Qwen1.5-4B (few-shot) 3950 152 32768 True 3,248 ± 739 / 761 ± 252 3.62 37.26 ± 4.28 / 29.89 ± 5.96 5.20 ± 7.35 / 30.65 ± 4.97 1.85 ± 1.54 / 33.71 ± 0.46 54.15 ± 0.58 / 60.15 ± 0.59 58.24 ± 1.76 / 16.02 ± 0.88 22.04 ± 0.60 / 41.36 ± 0.54 14.76 ± 1.28 / 35.27 ± 1.32 12.5.2 9.3.2 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
Qwen/Qwen1.5-1.8B-Chat (few-shot) 1837 152 32768 False 8,304 ± 1,846 / 1,933 ± 617 3.65 20.94 ± 3.73 / 18.26 ± 2.84 52.54 ± 3.33 / 60.44 ± 3.13 0.34 ± 1.22 / 36.61 ± 1.57 43.55 ± 1.14 / 50.53 ± 1.40 61.19 ± 0.69 / 15.92 ± 0.24 10.74 ± 0.92 / 32.65 ± 0.68 4.83 ± 0.62 / 28.76 ± 0.55 12.5.2 11.0.0 12.1.0 12.5.0 12.5.0 12.1.0 12.1.0
AI-Sweden-Models/gpt-sw3-6.7b (few-shot) 7111 64 2048 True 2,285 ± 443 / 671 ± 205 3.72 18.83 ± 6.41 / 17.59 ± 4.55 53.68 ± 10.39 / 58.92 ± 10.87 3.49 ± 2.20 / 46.13 ± 4.13 49.81 ± 0.70 / 55.99 ± 0.69 61.05 ± 1.33 / 15.89 ± 0.85 1.22 ± 0.65 / 26.19 ± 0.64 0.60 ± 1.34 / 25.62 ± 0.72 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0
HPLT/gpt-7b-nordic-prerelease (few-shot) 7550 131 4096 True 5,404 ± 931 / 1,638 ± 542 3.74 27.07 ± 6.33 / 25.24 ± 4.89 61.96 ± 2.69 / 67.81 ± 2.27 2.65 ± 1.46 / 40.25 ± 4.08 46.16 ± 0.91 / 52.35 ± 0.87 55.11 ± 1.21 / 12.07 ± 0.32 0.32 ± 0.43 / 21.99 ± 0.56 -0.00 ± 0.01 / 25.00 ± 0.77 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2
Qwen/Qwen1.5-1.8B (few-shot) 1837 152 32768 True 5,666 ± 1,328 / 1,256 ± 408 3.74 18.01 ± 6.41 / 18.55 ± 4.65 51.91 ± 4.78 / 59.44 ± 4.65 1.49 ± 1.95 / 40.76 ± 4.07 44.83 ± 0.63 / 51.87 ± 0.72 54.82 ± 1.62 / 14.43 ± 0.68 11.54 ± 0.73 / 32.55 ± 0.60 7.19 ± 1.40 / 29.76 ± 1.22 12.5.2 10.0.1 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
AI-Sweden-Models/gpt-sw3-1.3b (few-shot) 1445 64 2048 True 4,608 ± 988 / 1,115 ± 354 3.75 6.08 ± 5.75 / 8.77 ± 4.46 71.38 ± 1.76 / 73.21 ± 1.18 1.17 ± 1.07 / 49.78 ± 0.86 45.55 ± 0.85 / 51.69 ± 0.79 60.11 ± 1.59 / 15.02 ± 0.84 2.20 ± 0.88 / 25.62 ± 0.86 0.67 ± 1.39 / 25.25 ± 0.51 9.3.1 9.3.1 9.3.1 12.5.1 11.0.0 9.3.1 9.3.1
LumiOpen/Viking-7B (few-shot) 7550 131 4096 True 5,723 ± 1,025 / 1,670 ± 559 3.75 21.84 ± 3.27 / 21.14 ± 3.21 63.60 ± 4.10 / 68.73 ± 3.10 0.65 ± 1.59 / 43.82 ± 3.40 46.51 ± 1.10 / 52.58 ± 0.98 56.54 ± 1.89 / 13.47 ± 0.77 0.25 ± 1.01 / 25.43 ± 0.56 -0.89 ± 1.37 / 24.52 ± 0.65 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
norallm/normistral-7b-scratch (few-shot) 7248 33 2048 True 3,192 ± 454 / 1,198 ± 357 3.87 13.79 ± 8.46 / 14.43 ± 7.23 71.59 ± 2.78 / 59.82 ± 1.71 -0.89 ± 1.22 / 43.82 ± 3.45 38.33 ± 1.79 / 44.00 ± 1.70 55.77 ± 0.83 / 14.15 ± 0.51 -0.39 ± 1.21 / 22.30 ± 0.78 -0.52 ± 1.01 / 25.20 ± 0.85 10.0.0 10.0.0 10.0.0 10.0.0 11.0.0 10.0.1 10.0.1
AI-Sweden-Models/gpt-sw3-356m-instruct (few-shot) 471 64 2048 True 5,855 ± 1,373 / 1,223 ± 391 3.88 14.84 ± 1.63 / 15.90 ± 1.71 59.00 ± 3.60 / 54.09 ± 1.46 0.06 ± 1.21 / 34.76 ± 1.15 34.37 ± 1.36 / 40.44 ± 1.53 61.28 ± 0.92 / 14.60 ± 0.79 0.48 ± 1.07 / 23.44 ± 0.67 0.33 ± 0.50 / 25.01 ± 0.76 12.5.2 9.3.2 12.1.0 12.4.0 12.4.0 12.1.0 12.1.0
mhenrichsen/danskgpt-tiny-chat (few-shot) 1100 32 2048 False 1,745 ± 978 / 686 ± 159 3.89 27.31 ± 4.23 / 26.33 ± 4.40 45.94 ± 12.82 / 55.94 ± 8.25 -0.97 ± 1.64 / 36.69 ± 2.34 35.57 ± 2.45 / 41.66 ± 2.41 55.79 ± 0.24 / 10.61 ± 0.29 0.14 ± 1.02 / 24.76 ± 0.75 0.52 ± 0.83 / 25.53 ± 0.62 9.1.2 9.1.2 9.1.2 12.4.0 12.4.0 9.1.2 9.1.2
allenai/OLMo-7B (few-shot) 6888 50 2051 True 5,403 ± 1,133 / 1,294 ± 423 3.91 37.36 ± 2.11 / 28.59 ± 3.03 72.08 ± 1.20 / 63.52 ± 3.36 -0.86 ± 1.61 / 33.84 ± 0.59 45.16 ± 0.96 / 51.46 ± 0.93 41.03 ± 0.33 / 4.86 ± 0.09 -0.83 ± 1.04 / 25.47 ± 0.54 -0.62 ± 0.73 / 24.51 ± 0.53 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
allenai/OLMo-7B-Twin-2T (few-shot) 6888 50 2051 True 5,484 ± 1,125 / 1,317 ± 425 3.97 20.49 ± 7.78 / 19.50 ± 6.82 70.04 ± 2.28 / 60.77 ± 3.00 2.28 ± 1.77 / 36.86 ± 3.97 45.85 ± 1.19 / 51.08 ± 1.21 39.53 ± 0.34 / 5.71 ± 0.10 0.69 ± 0.90 / 24.20 ± 0.89 0.12 ± 1.51 / 24.97 ± 1.28 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
RuterNorway/Llama-2-7b-chat-norwegian (few-shot) unknown 32 4096 False 10,890 ± 2,686 / 2,186 ± 750 4.01 22.38 ± 3.00 / 22.09 ± 2.85 31.11 ± 12.17 / 36.84 ± 11.52 0.09 ± 0.67 / 33.42 ± 0.30 44.36 ± 1.34 / 50.14 ± 1.15 55.44 ± 0.79 / 12.95 ± 0.51 1.12 ± 0.42 / 25.27 ± 0.68 -0.91 ± 0.96 / 24.26 ± 0.64 9.3.1 9.3.1 9.3.1 12.5.2 11.0.0 9.3.1 9.3.1
google/gemma-2b-it (few-shot) 2506 256 8192 False 6,471 ± 1,142 / 1,961 ± 584 4.05 33.51 ± 2.12 / 23.48 ± 2.69 43.97 ± 1.64 / 57.41 ± 1.18 0.53 ± 1.09 / 39.60 ± 1.99 39.39 ± 1.04 / 47.28 ± 1.02 40.55 ± 6.41 / 11.10 ± 1.63 11.06 ± 0.98 / 31.69 ± 0.81 1.03 ± 0.85 / 25.55 ± 0.60 12.5.2 12.1.0 12.1.0 12.4.0 12.4.0 12.1.0 12.1.0
NbAiLab/nb-gpt-j-6B-alpaca (few-shot) 6055 50 1024 False 2,607 ± 592 / 680 ± 208 4.10 13.28 ± 4.32 / 13.40 ± 2.95 60.17 ± 8.39 / 65.99 ± 4.66 1.52 ± 1.94 / 45.19 ± 3.80 37.23 ± 1.07 / 46.83 ± 0.82 46.68 ± 0.33 / 12.40 ± 0.17 -0.03 ± 1.31 / 23.73 ± 1.11 0.02 ± 0.88 / 25.04 ± 0.61 9.3.1 10.0.1 10.0.1 12.4.0 12.4.0 10.0.1 10.0.1
Qwen/Qwen1.5-0.5B (few-shot) 620 152 32768 True 11,371 ± 2,924 / 2,122 ± 692 4.11 28.96 ± 2.39 / 26.49 ± 3.14 26.58 ± 5.12 / 28.64 ± 5.35 -1.88 ± 1.46 / 35.45 ± 2.92 34.59 ± 1.06 / 40.95 ± 1.11 53.36 ± 1.44 / 12.82 ± 0.58 6.52 ± 1.02 / 28.83 ± 0.78 1.91 ± 1.30 / 26.10 ± 0.65 12.5.2 10.0.1 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
mhenrichsen/danskgpt-tiny (few-shot) 1100 32 2048 True 8,597 ± 1,983 / 1,926 ± 600 4.13 23.92 ± 6.88 / 22.42 ± 6.73 31.93 ± 14.68 / 43.80 ± 8.79 0.46 ± 1.91 / 43.45 ± 3.64 30.81 ± 2.73 / 35.67 ± 2.95 52.68 ± 0.76 / 11.19 ± 0.36 -0.85 ± 1.05 / 24.38 ± 0.51 -1.24 ± 0.90 / 24.30 ± 0.63 0.0.0 0.0.0 0.0.0 12.5.1 11.0.0 0.0.0 0.0.0
Qwen/Qwen1.5-0.5B-Chat (few-shot) 620 152 32768 False 11,740 ± 3,000 / 2,209 ± 721 4.14 18.57 ± 4.62 / 17.69 ± 4.61 40.23 ± 5.86 / 49.01 ± 4.77 0.21 ± 1.06 / 39.60 ± 3.61 29.49 ± 2.47 / 35.01 ± 2.72 53.29 ± 6.52 / 13.04 ± 1.68 2.59 ± 0.72 / 26.87 ± 0.72 -0.84 ± 1.01 / 24.44 ± 0.61 12.5.2 11.0.0 12.1.0 12.4.0 12.5.0 12.1.0 12.1.0
AI-Sweden-Models/gpt-sw3-356m (few-shot) 471 64 2048 True 5,758 ± 1,348 / 1,215 ± 391 4.16 23.77 ± 3.70 / 23.06 ± 3.46 34.29 ± 11.64 / 36.76 ± 7.46 1.57 ± 1.70 / 40.84 ± 1.99 33.70 ± 1.46 / 38.82 ± 1.54 51.36 ± 2.01 / 10.76 ± 0.54 -0.96 ± 1.08 / 21.85 ± 0.45 0.30 ± 0.48 / 25.10 ± 0.69 9.3.1 9.3.1 9.3.2 12.5.1 11.0.0 9.3.2 9.3.2
AI-Sweden-Models/gpt-sw3-126m-instruct (few-shot) 186 64 2048 True 7,717 ± 1,553 / 2,013 ± 625 4.31 23.05 ± 2.31 / 24.35 ± 1.99 12.47 ± 7.10 / 23.03 ± 8.78 0.08 ± 0.16 / 33.34 ± 0.30 20.43 ± 2.69 / 24.25 ± 2.67 59.80 ± 0.93 / 14.56 ± 0.36 0.72 ± 0.72 / 23.30 ± 0.96 0.11 ± 0.91 / 25.15 ± 0.81 9.3.2 9.3.2 11.0.0 12.4.0 12.4.0 11.0.0 11.0.0
allenai/OLMo-1B (few-shot) 1177 50 2051 True 8,536 ± 1,926 / 1,940 ± 619 4.32 29.39 ± 3.08 / 29.93 ± 3.14 38.95 ± 11.78 / 43.61 ± 8.46 -1.35 ± 1.76 / 40.70 ± 4.25 17.85 ± 3.77 / 20.30 ± 4.04 43.75 ± 0.28 / 4.67 ± 0.12 -0.22 ± 0.80 / 23.76 ± 0.84 0.75 ± 1.00 / 25.27 ± 0.56 12.5.2 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
RJuro/kanelsnegl-v0.1 (few-shot) 7242 32 512 True 9,757 ± 2,047 / 2,200 ± 705 4.51 0.00 ± 0.00 / 0.00 ± 0.00 34.63 ± 9.69 / 40.92 ± 6.88 0.00 ± 0.00 / 33.30 ± 0.27 0.00 ± 0.00 / 8.92 ± 2.90 59.04 ± 0.07 / 10.84 ± 0.09 -0.25 ± 0.97 / 21.96 ± 0.57 0.08 ± 0.78 / 24.93 ± 0.77 9.3.1 9.3.1 9.3.1 12.5.1 11.0.0 9.3.1 9.3.1
RJuro/kanelsnegl-v0.2 (few-shot) 7242 32 512 True 1,373 ± 120 / 709 ± 172 4.53 0.00 ± 0.00 / 0.00 ± 0.00 28.62 ± 12.67 / 35.36 ± 8.35 0.00 ± 0.00 / 33.30 ± 0.27 0.00 ± 0.00 / 19.59 ± 6.84 58.16 ± 0.07 / 8.81 ± 0.07 0.47 ± 0.86 / 22.03 ± 0.59 0.71 ± 0.64 / 25.02 ± 0.72 10.0.1 10.0.1 10.0.1 10.0.1 11.0.0 10.0.1 11.0.0
AI-Sweden-Models/gpt-sw3-126m (few-shot) 186 64 2048 True 8,958 ± 1,815 / 2,240 ± 696 4.61 5.66 ± 4.11 / 8.37 ± 3.24 8.15 ± 8.87 / 24.31 ± 7.12 -0.81 ± 1.16 / 36.81 ± 2.47 16.40 ± 2.88 / 19.18 ± 3.18 51.48 ± 1.14 / 10.63 ± 0.31 -0.49 ± 0.60 / 22.53 ± 0.75 1.17 ± 0.86 / 25.54 ± 0.87 9.2.0 9.2.0 9.2.0 12.5.1 11.0.0 9.2.0 9.2.0
NbAiLab/nb-gpt-j-6B-v2 (few-shot) 6051 50 1024 False 2,556 ± 580 / 681 ± 214 4.84 0.31 ± 0.55 / 0.29 ± 0.50 27.42 ± 12.16 / 38.74 ± 10.05 0.07 ± 1.06 / 35.80 ± 1.73 17.82 ± 11.21 / 31.12 ± 8.39 27.09 ± 0.29 / 6.80 ± 0.12 -0.67 ± 0.81 / 22.55 ± 0.71 0.86 ± 0.82 / 25.38 ± 0.51 9.3.1 10.0.1 11.0.0 12.5.1 11.0.0 11.0.0 11.0.0
peter-sk/gpt-neox-da (few-shot) 1515 50 1024 True 6,025 ± 1,442 / 1,342 ± 431 4.94 0.26 ± 0.16 / 0.26 ± 0.14 4.75 ± 2.54 / 27.85 ± 1.59 -0.60 ± 1.56 / 40.53 ± 2.93 0.06 ± 0.09 / 1.07 ± 0.35 41.84 ± 0.24 / 5.74 ± 0.09 -0.41 ± 1.39 / 24.48 ± 0.97 0.52 ± 0.81 / 25.32 ± 0.65 10.0.1 10.0.1 10.0.1 10.0.1 11.0.0 10.0.1 10.0.1
NorGLM/NorGPT-369M (few-shot) unknown 64 1024 True 19,896 ± 5,099 / 3,848 ± 1,251 4.95 1.47 ± 1.90 / 1.32 ± 1.69 5.50 ± 4.49 / 28.77 ± 3.76 -2.19 ± 1.29 / 40.52 ± 3.02 0.10 ± 0.06 / 4.36 ± 0.44 37.40 ± 0.61 / 6.53 ± 0.13 -0.53 ± 1.01 / 24.38 ± 1.08 0.25 ± 1.22 / 25.23 ± 0.73 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
NbAiLab/nb-gpt-j-6B@sharded (few-shot) unknown 50 1024 True 2,630 ± 605 / 684 ± 217 4.96 0.01 ± 0.02 / 0.11 ± 0.12 33.50 ± 13.13 / 39.30 ± 11.93 -0.02 ± 0.60 / 34.92 ± 2.99 4.79 ± 3.55 / 18.06 ± 2.80 26.97 ± 0.41 / 6.56 ± 0.18 -0.11 ± 1.16 / 23.32 ± 0.92 0.56 ± 1.22 / 24.79 ± 0.91 9.3.1 10.0.1 10.0.1 12.5.1 11.0.0 10.0.1 10.0.1
Sigurdur/icebreaker (few-shot) 110 32 1024 False 48,619 ± 7,681 / 13,831 ± 4,404 5.09 0.00 ± 0.00 / 0.00 ± 0.00 -3.60 ± 3.63 / 20.29 ± 1.99 0.00 ± 0.00 / 33.30 ± 0.27 0.00 ± 0.00 / 0.05 ± 0.03 39.68 ± 0.08 / 1.23 ± 0.02 -0.20 ± 0.77 / 24.13 ± 0.67 -0.25 ± 0.67 / 24.68 ± 0.44 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
ai-forever/mGPT (few-shot) unknown 100 1024 True 13,551 ± 4,259 / 2,563 ± 838 5.11 0.00 ± 0.00 / 0.00 ± 0.00 0.00 ± 0.00 / 19.32 ± 0.16 0.49 ± 1.29 / 39.12 ± 3.92 6.24 ± 3.13 / 7.85 ± 3.67 31.89 ± 0.27 / 2.03 ± 0.10 -0.37 ± 1.08 / 22.43 ± 0.55 0.36 ± 0.83 / 25.08 ± 0.77 9.3.1 10.0.1 11.0.0 12.5.1 12.0.0 11.0.0 11.0.0
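The table above can be exported as CSV. A minimal sketch of working with such an export, using only the standard library (the column names "Model ID" and "Rank" are assumed to match the table header; the three inlined rows are taken from the top of the leaderboard):

```python
import csv
import io

# Inline sample standing in for the downloaded CSV file; in practice you would
# open the exported file instead of this StringIO buffer.
csv_text = (
    "Model ID,Rank\n"
    "gpt-4-0613,1.09\n"
    "meta-llama/Meta-Llama-3-70B,1.40\n"
    "152334H/miqu-1-70b-sf,1.78\n"
)
rows = list(csv.DictReader(io.StringIO(csv_text)))

# Lower rank score is better on this leaderboard, so take the minimum.
best = min(rows, key=lambda r: float(r["Rank"]))
print(best["Model ID"])  # gpt-4-0613
```

The same pattern extends to filtering, e.g. keeping only rows whose "Commercial" column is "True" before ranking.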