[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-d64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary":3,"summaries-facets-categories":105,"summary-related-d64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary":3691},{"id":4,"title":5,"ai":6,"body":13,"categories":58,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":62,"navigation":86,"path":87,"published_at":88,"question":59,"scraped_at":89,"seo":90,"sitemap":91,"source_id":92,"source_name":93,"source_type":94,"source_url":95,"stem":96,"tags":97,"thumbnail_url":59,"tldr":102,"tweet":59,"unknown_tags":103,"__hash__":104},"summaries\u002Fsummaries\u002Fd64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary.md","OpenMythos: 770M RDT Matches 1.3B Transformer Power",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",5480,2000,15694,0.0020735,{"type":14,"value":15,"toc":51},"minimark",[16,21,25,28,31,35,38,41,45,48],[17,18,20],"h2",{"id":19},"recurrent-depth-transformers-scale-reasoning-via-inference-loops","Recurrent-Depth Transformers Scale Reasoning via Inference Loops",[22,23,24],"p",{},"Recurrent-Depth Transformers (RDTs), or Looped Transformers, differ from standard transformers by reusing a fixed set of weights iteratively across T loop steps (up to 16 in OpenMythos) in a single forward pass. This decouples reasoning depth from parameter count: deeper reasoning comes from more loops at inference, not more layers or params. The structure follows Prelude → Recurrent Block → Coda, where Prelude and Coda are one-time standard transformer layers.",[22,26,27],{},"In the Recurrent Block, update hidden state ht+1 = A·ht + B·e + Transformer(ht, e), with encoded input e re-injected each step to prevent drift. 
This mimics iterative draft refinement, enabling continuous latent-space reasoning without mid-loop token emissions, equivalent to chain-of-thought over vectors per Saunshi et al. (2025). Unlike standard transformers, which fail at depths unseen in training (e.g., a model trained on 5-hop reasoning fails at 10-hop), RDTs extend depth at inference without retraining: simply allocate more loops to harder problems.",[22,29,30],{},"Replace the standard FFN with Mixture-of-Experts (MoE) from DeepSeekMoE: sparse top-K routed experts per token plus shared experts, with routing that differs per loop so the tied weights still perform distinct computation at each step. Use Multi-head Latent Attention (MLA) from DeepSeek-V2, caching compressed low-rank KV latents for 10–20× KV-cache memory savings.",[17,32,34],{"id":33},"stability-and-adaptive-depth-prevent-explosion-or-overthinking","Stability and Adaptive Depth Prevent Explosion or Overthinking",[22,36,37],{},"Looping risks residual explosion (unbounded growth of ht) or overthinking (drifting past a correct solution). Enforce the Linear Time-Invariant (LTI) constraint from Parcae: spectral radius ρ(A) \u003C 1 by construction, ensuring stability independent of learning rate. Add Adaptive Computation Time (ACT) halting: a learned scalar per position stops the loop once that position's state has converged, so harder tokens receive more compute.",[22,39,40],{},"Depth-Wise LoRA adapters apply small rank-r matrices per iteration, differentiating per-loop behavior without bloating the parameter count, a middle ground between pure weight tying and fully unique layers.",[17,42,44],{"id":43},"half-the-params-equivalent-performance-via-predictable-scaling","Half the Params, Equivalent Performance via Predictable Scaling",[22,46,47],{},"At 770M params, the OpenMythos RDT matches a 1.3B standard transformer trained on identical data, per the Parcae (Prairie et al., 2026) scaling laws: optimal recurrence depth and token count follow power laws. 
This shifts the scaling focus from training-time parameters to inference-time loops, challenging bigger-is-better assumptions.",[22,49,50],{},"OpenMythos delivers PyTorch code for the RDT with MoE, LTI training, LoRA adapters, and baselines, making the Claude Mythos hypothesis falsifiable and runnable for experimenting with looped dynamics.",{"title":52,"searchDepth":53,"depth":53,"links":54},"",2,[55,56,57],{"id":19,"depth":53,"text":20},{"id":33,"depth":53,"text":34},{"id":43,"depth":53,"text":44},[],null,"md",false,{"content_references":63,"triage":81},[64,69,74,78],{"type":65,"title":66,"url":67,"context":68},"tool","OpenMythos","https:\u002F\u002Fgithub.com\u002Fkyegomez\u002FOpenMythos","mentioned",{"type":70,"title":71,"url":72,"context":73},"paper","Saunshi et al. (2025)","https:\u002F\u002Farxiv.org\u002Fabs\u002F2502.17416","cited",{"type":70,"title":75,"author":76,"url":77,"context":73},"Parcae","Prairie et al.","https:\u002F\u002Farxiv.org\u002Fabs\u002F2604.12946",{"type":79,"title":80,"context":68},"other","COCONUT (2024)",{"relevance":82,"novelty":82,"quality":83,"actionability":53,"composite":84,"reasoning":85},3,4,3.05,"Category: AI & LLMs. The article discusses a new architecture for transformers, which is relevant to AI engineering, but it lacks practical applications or examples for product builders to implement this technology. 
While it presents some novel insights into the structure and behavior of Recurrent-Depth Transformers, it does not provide actionable steps or frameworks that the audience can directly apply.",true,"\u002Fsummaries\u002Fd64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary","2026-04-19 19:47:49","2026-04-21 15:26:59",{"title":5,"description":52},{"loc":87},"d64cbc961f981052","MarkTechPost","article","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F04\u002F19\u002Fmeet-openmythos-an-open-source-pytorch-reconstruction-of-claude-mythos-where-770m-parameters-match-a-1-3b-transformer\u002F","summaries\u002Fd64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary",[98,99,100,101],"llm","machine-learning","open-source","python","OpenMythos reconstructs Claude Mythos as a Recurrent-Depth Transformer (RDT) in PyTorch: loop the same weights T=16 times for reasoning depth, achieving 1.3B-transformer performance at 770M params via MoE, stability fixes, and inference-time 
scaling.",[],"catU0v9NcZQXj7dgnu-iH80ub7d_pZ-fh6mDqyuTN3c",[106,109,112,115,118,121,123,125,127,129,131,133,136,138,140,142,144,146,148,150,152,154,157,160,162,164,167,169,171,174,176,178,180,182,184,186,188,190,192,194,196,198,200,202,204,206,208,210,212,214,216,218,220,222,224,226,228,230,232,234,236,238,240,242,244,246,248,250,252,254,256,258,260,262,264,266,268,270,272,274,276,278,280,282,284,286,288,290,292,294,296,298,300,302,304,306,308,310,312,314,316,318,320,322,324,326,328,330,332,334,336,338,340,342,344,346,348,350,352,354,356,358,360,362,364,366,368,370,372,374,376,378,380,382,384,386,388,390,392,394,396,398,400,402,404,406,408,410,412,414,416,418,420,422,424,426,428,431,433,435,437,439,441,443,445,447,449,451,453,455,457,459,461,463,465,467,469,471,473,475,477,479,481,483,485,487,489,491,493,495,497,499,501,503,505,507,509,511,513,515,517,519,521,523,525,527,529,531,533,535,537,539,541,543,545,547,549,551,553,555,557,559,561,563,565,567,569,571,573,575,577,579,581,583,585,587,589,591,593,595,597,599,601,603,605,607,609,611,613,615,617,619,621,623,625,627,629,631,633,635,637,639,641,643,645,647,649,651,653,655,657,659,661,663,665,667,669,671,673,675,677,679,681,683,685,687,689,691,693,695,697,699,701,703,705,707,709,711,713,715,717,719,721,723,725,727,729,731,733,735,737,739,741,743,745,747,749,751,753,755,757,759,761,763,765,767,769,771,773,775,777,779,781,783,785,787,789,791,793,795,797,799,801,803,805,807,809,811,813,815,817,819,821,823,825,827,829,831,833,835,837,839,841,843,845,847,849,851,853,855,857,859,861,863,865,867,869,871,873,875,877,879,881,883,885,887,889,891,893,895,897,899,901,903,905,907,909,911,913,915,917,919,921,923,925,927,929,931,933,935,937,939,941,943,945,947,949,951,953,955,957,959,961,963,965,967,969,971,973,975,977,979,981,983,985,987,989,991,993,995,997,999,1001,1003,1005,1007,1009,1011,1013,1015,1017,1019,1021,1023,1025,1027,1029,1031,1033,1035,1037,1039,1041,1043,1045,1047,1049,1051,1053,1055,1057,1059,1061,1063,1065,1067,10
69,1071,1073,1075,1077,1079,1081,1083,1085,1087,1089,1091,1093,1095,1097,1099,1101,1103,1105,1107,1109,1111,1113,1115,1117,1119,1121,1123,1125,1127,1129,1131,1133,1135,1137,1139,1141,1143,1145,1147,1149,1151,1153,1155,1157,1159,1161,1163,1165,1167,1169,1171,1173,1175,1177,1179,1181,1183,1185,1187,1189,1191,1193,1195,1197,1199,1201,1203,1205,1207,1209,1211,1213,1215,1217,1219,1221,1223,1225,1227,1229,1231,1233,1235,1237,1239,1241,1243,1245,1247,1249,1251,1253,1255,1257,1259,1261,1263,1265,1267,1269,1271,1273,1275,1277,1279,1281,1283,1285,1287,1289,1291,1293,1295,1297,1299,1301,1303,1305,1307,1309,1311,1313,1315,1317,1319,1321,1323,1325,1327,1329,1331,1333,1335,1337,1339,1341,1343,1345,1347,1349,1351,1353,1355,1357,1359,1361,1363,1365,1367,1369,1371,1373,1375,1377,1379,1381,1383,1385,1387,1389,1391,1393,1395,1397,1399,1401,1403,1405,1407,1409,1411,1413,1415,1417,1419,1421,1423,1425,1427,1429,1431,1433,1435,1437,1439,1441,1443,1445,1447,1449,1451,1453,1455,1457,1459,1461,1463,1465,1467,1469,1471,1473,1475,1477,1479,1481,1483,1485,1487,1489,1491,1493,1495,1497,1499,1501,1503,1505,1507,1509,1511,1513,1515,1517,1519,1521,1523,1525,1527,1529,1531,1533,1535,1537,1539,1541,1543,1545,1547,1549,1551,1553,1555,1557,1559,1561,1563,1565,1567,1569,1571,1573,1575,1577,1579,1581,1583,1585,1587,1589,1591,1593,1595,1597,1599,1601,1603,1605,1607,1609,1611,1613,1615,1617,1619,1621,1623,1625,1627,1629,1631,1633,1635,1637,1639,1641,1643,1645,1647,1649,1651,1653,1655,1657,1659,1661,1663,1665,1667,1669,1671,1673,1675,1677,1679,1681,1683,1685,1687,1689,1691,1693,1695,1697,1699,1701,1703,1705,1707,1709,1711,1713,1715,1717,1719,1721,1723,1725,1727,1729,1731,1733,1735,1737,1739,1741,1743,1745,1747,1749,1751,1753,1755,1757,1759,1761,1763,1765,1767,1769,1771,1773,1775,1777,1779,1781,1783,1785,1787,1789,1791,1793,1795,1797,1799,1801,1803,1805,1807,1809,1811,1813,1815,1817,1819,1821,1823,1825,1827,1829,1831,1833,1835,1837,1839,1841,1843,1845,1847,1849,1851,1853,1855,1857,1859,1861,1863,1865,1867,18
69,1871,1873,1875,1877,1879,1881,1883,1885,1887,1889,1891,1893,1895,1897,1899,1901,1903,1905,1907,1909,1911,1913,1915,1917,1919,1921,1923,1925,1927,1929,1931,1933,1935,1937,1939,1941,1943,1945,1947,1949,1951,1953,1955,1957,1959,1961,1963,1965,1967,1969,1971,1973,1975,1977,1979,1981,1983,1985,1987,1989,1991,1993,1995,1997,1999,2001,2003,2005,2007,2009,2011,2013,2015,2017,2019,2021,2023,2025,2027,2029,2031,2033,2035,2037,2039,2041,2043,2045,2047,2049,2051,2053,2055,2057,2059,2061,2063,2065,2067,2069,2071,2073,2075,2077,2079,2081,2083,2085,2087,2089,2091,2093,2095,2097,2099,2101,2103,2105,2107,2109,2111,2113,2115,2117,2119,2121,2123,2125,2127,2129,2131,2133,2135,2137,2139,2141,2143,2145,2147,2149,2151,2153,2155,2157,2159,2161,2163,2165,2167,2169,2171,2173,2175,2177,2179,2181,2183,2185,2187,2189,2191,2193,2195,2197,2199,2201,2203,2205,2207,2209,2211,2213,2215,2217,2219,2221,2223,2225,2227,2229,2231,2233,2235,2237,2239,2241,2243,2245,2247,2249,2251,2253,2255,2257,2259,2261,2263,2265,2267,2269,2271,2273,2275,2277,2279,2281,2283,2285,2287,2289,2291,2293,2295,2297,2299,2301,2303,2305,2307,2309,2311,2313,2315,2317,2319,2321,2323,2325,2327,2329,2331,2333,2335,2337,2339,2341,2343,2345,2347,2349,2351,2353,2355,2357,2359,2361,2363,2365,2367,2369,2371,2373,2375,2377,2379,2381,2383,2385,2387,2389,2391,2393,2395,2397,2399,2401,2403,2405,2407,2409,2411,2413,2415,2417,2419,2421,2423,2425,2427,2429,2431,2433,2435,2437,2439,2441,2443,2445,2447,2449,2451,2453,2455,2457,2459,2461,2463,2465,2467,2469,2471,2473,2475,2477,2479,2481,2483,2485,2487,2489,2491,2493,2495,2497,2499,2501,2503,2505,2507,2509,2511,2513,2515,2517,2519,2521,2523,2525,2527,2529,2531,2533,2535,2537,2539,2541,2543,2545,2547,2549,2551,2553,2555,2557,2559,2561,2563,2565,2567,2569,2571,2573,2575,2577,2579,2581,2583,2585,2587,2589,2591,2593,2595,2597,2599,2601,2603,2605,2607,2609,2611,2613,2615,2617,2619,2621,2623,2625,2627,2629,2631,2633,2635,2637,2639,2641,2643,2645,2647,2649,2651,2653,2655,2657,2659,2661,2663,2665,2667,26
69,2671,2673,2675,2677,2679,2681,2683,2685,2687,2689,2691,2693,2695,2697,2699,2701,2703,2705,2707,2709,2711,2713,2715,2717,2719,2721,2723,2725,2727,2729,2731,2733,2735,2737,2739,2741,2743,2745,2747,2749,2751,2753,2755,2757,2759,2761,2763,2765,2767,2769,2771,2773,2775,2777,2779,2781,2783,2785,2787,2789,2791,2793,2795,2797,2799,2801,2803,2805,2807,2809,2811,2813,2815,2817,2819,2821,2823,2825,2827,2829,2831,2833,2835,2837,2839,2841,2843,2845,2847,2849,2851,2853,2855,2857,2859,2861,2863,2865,2867,2869,2871,2873,2875,2877,2879,2881,2883,2885,2887,2889,2891,2893,2895,2897,2899,2901,2903,2905,2907,2909,2911,2913,2915,2917,2919,2921,2923,2925,2927,2929,2931,2933,2935,2937,2939,2941,2943,2945,2947,2949,2951,2953,2955,2957,2959,2961,2963,2965,2967,2969,2971,2973,2975,2977,2979,2981,2983,2985,2987,2989,2991,2993,2995,2997,2999,3001,3003,3005,3007,3009,3011,3013,3015,3017,3019,3021,3023,3025,3027,3029,3031,3033,3035,3037,3039,3041,3043,3045,3047,3049,3051,3053,3055,3057,3059,3061,3063,3065,3067,3069,3071,3073,3075,3077,3079,3081,3083,3085,3087,3089,3091,3093,3095,3097,3099,3101,3103,3105,3107,3109,3111,3113,3115,3117,3119,3121,3123,3125,3127,3129,3131,3133,3135,3137,3139,3141,3143,3145,3147,3149,3151,3153,3155,3157,3159,3161,3163,3165,3167,3169,3171,3173,3175,3177,3179,3181,3183,3185,3187,3189,3191,3193,3195,3197,3199,3201,3203,3205,3207,3209,3211,3213,3215,3217,3219,3221,3223,3225,3227,3229,3231,3233,3235,3237,3239,3241,3243,3245,3247,3249,3251,3253,3255,3257,3259,3261,3263,3265,3267,3269,3271,3273,3275,3277,3279,3281,3283,3285,3287,3289,3291,3293,3295,3297,3299,3301,3303,3305,3307,3309,3311,3313,3315,3317,3319,3321,3323,3325,3327,3329,3331,3333,3335,3337,3339,3341,3343,3345,3347,3349,3351,3353,3355,3357,3359,3361,3363,3365,3367,3369,3371,3373,3375,3377,3379,3381,3383,3385,3387,3389,3391,3393,3395,3397,3399,3401,3403,3405,3407,3409,3411,3413,3415,3417,3419,3421,3423,3425,3427,3429,3431,3433,3435,3437,3439,3441,3443,3445,3447,3449,3451,3453,3455,3457,3459,3461,3463,3465,3467,34
69,3471,3473,3475,3477,3479,3481,3483,3485,3487,3489,3491,3493,3495,3497,3499,3501,3503,3505,3507,3509,3511,3513,3515,3517,3519,3521,3523,3525,3527,3529,3531,3533,3535,3537,3539,3541,3543,3545,3547,3549,3551,3553,3555,3557,3559,3561,3563,3565,3567,3569,3571,3573,3575,3577,3579,3581,3583,3585,3587,3589,3591,3593,3595,3597,3599,3601,3603,3605,3607,3609,3611,3613,3615,3617,3619,3621,3623,3625,3627,3629,3631,3633,3635,3637,3639,3641,3643,3645,3647,3649,3651,3653,3655,3657,3659,3661,3663,3665,3667,3669,3671,3673,3675,3677,3679,3681,3683,3685,3687,3689],{"categories":107},[108],"Developer Productivity",{"categories":110},[111],"Business & SaaS",{"categories":113},[114],"AI & LLMs",{"categories":116},[117],"AI Automation",{"categories":119},[120],"Product Strategy",{"categories":122},[114],{"categories":124},[108],{"categories":126},[111],{"categories":128},[],{"categories":130},[114],{"categories":132},[],{"categories":134},[135],"AI News & Trends",{"categories":137},[117],{"categories":139},[135],{"categories":141},[117],{"categories":143},[117],{"categories":145},[114],{"categories":147},[114],{"categories":149},[135],{"categories":151},[114],{"categories":153},[],{"categories":155},[156],"Design & Frontend",{"categories":158},[159],"Data Science & Visualization",{"categories":161},[135],{"categories":163},[],{"categories":165},[166],"Software Engineering",{"categories":168},[114],{"categories":170},[117],{"categories":172},[173],"Marketing & 
Growth",{"categories":175},[114],{"categories":177},[117],{"categories":179},[],{"categories":181},[],{"categories":183},[156],{"categories":185},[117],{"categories":187},[108],{"categories":189},[156],{"categories":191},[114],{"categories":193},[117],{"categories":195},[135],{"categories":197},[],{"categories":199},[],{"categories":201},[117],{"categories":203},[166],{"categories":205},[],{"categories":207},[111],{"categories":209},[],{"categories":211},[],{"categories":213},[117],{"categories":215},[117],{"categories":217},[114],{"categories":219},[],{"categories":221},[166],{"categories":223},[],{"categories":225},[],{"categories":227},[],{"categories":229},[114],{"categories":231},[173],{"categories":233},[156],{"categories":235},[156],{"categories":237},[114],{"categories":239},[117],{"categories":241},[114],{"categories":243},[114],{"categories":245},[117],{"categories":247},[117],{"categories":249},[159],{"categories":251},[135],{"categories":253},[117],{"categories":255},[173],{"categories":257},[117],{"categories":259},[120],{"categories":261},[],{"categories":263},[117],{"categories":265},[],{"categories":267},[117],{"categories":269},[166],{"categories":271},[156],{"categories":273},[114],{"categories":275},[],{"categories":277},[],{"categories":279},[117],{"categories":281},[],{"categories":283},[114],{"categories":285},[],{"categories":287},[108],{"categories":289},[166],{"categories":291},[111],{"categories":293},[135],{"categories":295},[114],{"categories":297},[],{"categories":299},[114],{"categories":301},[],{"categories":303},[166],{"categories":305},[159],{"categories":307},[],{"categories":309},[114],{"categories":311},[156],{"categories":313},[],{"categories":315},[156],{"categories":317},[117],{"categories":319},[],{"categories":321},[117],{"categories":323},[135],{"categories":325},[111],{"categories":327},[114],{"categories":329},[],{"categories":331},[117],{"categories":333},[114],{"categories":335},[120],{"categories":337},[],{"categories":
339},[114],{"categories":341},[117],{"categories":343},[117],{"categories":345},[],{"categories":347},[159],{"categories":349},[114],{"categories":351},[],{"categories":353},[108],{"categories":355},[111],{"categories":357},[114],{"categories":359},[117],{"categories":361},[166],{"categories":363},[114],{"categories":365},[],{"categories":367},[],{"categories":369},[114],{"categories":371},[],{"categories":373},[156],{"categories":375},[],{"categories":377},[114],{"categories":379},[],{"categories":381},[117],{"categories":383},[114],{"categories":385},[156],{"categories":387},[],{"categories":389},[114],{"categories":391},[114],{"categories":393},[111],{"categories":395},[117],{"categories":397},[114],{"categories":399},[156],{"categories":401},[117],{"categories":403},[],{"categories":405},[],{"categories":407},[135],{"categories":409},[],{"categories":411},[114],{"categories":413},[111,173],{"categories":415},[],{"categories":417},[114],{"categories":419},[],{"categories":421},[],{"categories":423},[114],{"categories":425},[],{"categories":427},[114],{"categories":429},[430],"DevOps & 
Cloud",{"categories":432},[],{"categories":434},[135],{"categories":436},[156],{"categories":438},[],{"categories":440},[135],{"categories":442},[135],{"categories":444},[114],{"categories":446},[173],{"categories":448},[],{"categories":450},[111],{"categories":452},[],{"categories":454},[114,430],{"categories":456},[114],{"categories":458},[114],{"categories":460},[117],{"categories":462},[114,166],{"categories":464},[159],{"categories":466},[114],{"categories":468},[173],{"categories":470},[117],{"categories":472},[117],{"categories":474},[],{"categories":476},[117],{"categories":478},[114,111],{"categories":480},[],{"categories":482},[156],{"categories":484},[156],{"categories":486},[],{"categories":488},[],{"categories":490},[135],{"categories":492},[],{"categories":494},[108],{"categories":496},[166],{"categories":498},[114],{"categories":500},[156],{"categories":502},[117],{"categories":504},[166],{"categories":506},[135],{"categories":508},[156],{"categories":510},[],{"categories":512},[114],{"categories":514},[114],{"categories":516},[114],{"categories":518},[135],{"categories":520},[108],{"categories":522},[114],{"categories":524},[117],{"categories":526},[430],{"categories":528},[156],{"categories":530},[117],{"categories":532},[],{"categories":534},[],{"categories":536},[156],{"categories":538},[135],{"categories":540},[159],{"categories":542},[],{"categories":544},[114],{"categories":546},[114],{"categories":548},[111],{"categories":550},[114],{"categories":552},[114],{"categories":554},[135],{"categories":556},[],{"categories":558},[117],{"categories":560},[166],{"categories":562},[],{"categories":564},[114],{"categories":566},[114],{"categories":568},[117],{"categories":570},[],{"categories":572},[],{"categories":574},[114],{"categories":576},[],{"categories":578},[111],{"categories":580},[117],{"categories":582},[],{"categories":584},[108],{"categories":586},[114],{"categories":588},[111],{"categories":590},[135],{"categories":592},[],{"categories":59
4},[],{"categories":596},[],{"categories":598},[135],{"categories":600},[135],{"categories":602},[],{"categories":604},[],{"categories":606},[111],{"categories":608},[],{"categories":610},[],{"categories":612},[108],{"categories":614},[],{"categories":616},[173],{"categories":618},[117],{"categories":620},[111],{"categories":622},[117],{"categories":624},[166],{"categories":626},[],{"categories":628},[120],{"categories":630},[156],{"categories":632},[166],{"categories":634},[114],{"categories":636},[117],{"categories":638},[111],{"categories":640},[114],{"categories":642},[],{"categories":644},[],{"categories":646},[166],{"categories":648},[159],{"categories":650},[120],{"categories":652},[117],{"categories":654},[114],{"categories":656},[],{"categories":658},[430],{"categories":660},[],{"categories":662},[117],{"categories":664},[],{"categories":666},[],{"categories":668},[114],{"categories":670},[156],{"categories":672},[173],{"categories":674},[117],{"categories":676},[],{"categories":678},[108],{"categories":680},[],{"categories":682},[135],{"categories":684},[114,430],{"categories":686},[135],{"categories":688},[114],{"categories":690},[111],{"categories":692},[114],{"categories":694},[],{"categories":696},[111],{"categories":698},[],{"categories":700},[166],{"categories":702},[156],{"categories":704},[135],{"categories":706},[159],{"categories":708},[108],{"categories":710},[114],{"categories":712},[166],{"categories":714},[],{"categories":716},[],{"categories":718},[120],{"categories":720},[],{"categories":722},[114],{"categories":724},[],{"categories":726},[156],{"categories":728},[156],{"categories":730},[156],{"categories":732},[],{"categories":734},[],{"categories":736},[135],{"categories":738},[117],{"categories":740},[114],{"categories":742},[114],{"categories":744},[114],{"categories":746},[111],{"categories":748},[114],{"categories":750},[],{"categories":752},[166],{"categories":754},[166],{"categories":756},[111],{"categories":758},[],{"categories":7
60},[114],{"categories":762},[114],{"categories":764},[111],{"categories":766},[135],{"categories":768},[173],{"categories":770},[117],{"categories":772},[],{"categories":774},[156],{"categories":776},[],{"categories":778},[114],{"categories":780},[],{"categories":782},[111],{"categories":784},[117],{"categories":786},[],{"categories":788},[430],{"categories":790},[159],{"categories":792},[166],{"categories":794},[173],{"categories":796},[166],{"categories":798},[117],{"categories":800},[],{"categories":802},[],{"categories":804},[117],{"categories":806},[108],{"categories":808},[117],{"categories":810},[120],{"categories":812},[111],{"categories":814},[],{"categories":816},[114],{"categories":818},[120],{"categories":820},[114],{"categories":822},[114],{"categories":824},[173],{"categories":826},[156],{"categories":828},[117],{"categories":830},[],{"categories":832},[],{"categories":834},[430],{"categories":836},[166],{"categories":838},[],{"categories":840},[117],{"categories":842},[114],{"categories":844},[156,114],{"categories":846},[108],{"categories":848},[],{"categories":850},[114],{"categories":852},[108],{"categories":854},[156],{"categories":856},[117],{"categories":858},[166],{"categories":860},[],{"categories":862},[114],{"categories":864},[],{"categories":866},[108],{"categories":868},[],{"categories":870},[117],{"categories":872},[120],{"categories":874},[114],{"categories":876},[114],{"categories":878},[156],{"categories":880},[117],{"categories":882},[430],{"categories":884},[156],{"categories":886},[117],{"categories":888},[114],{"categories":890},[114],{"categories":892},[114],{"categories":894},[135],{"categories":896},[],{"categories":898},[120],{"categories":900},[117],{"categories":902},[156],{"categories":904},[117],{"categories":906},[166],{"categories":908},[156],{"categories":910},[117],{"categories":912},[135],{"categories":914},[],{"categories":916},[114],{"categories":918},[156],{"categories":920},[114],{"categories":922},[108],{"categor
ies":924},[135],{"categories":926},[114],{"categories":928},[173],{"categories":930},[114],{"categories":932},[114],{"categories":934},[117],{"categories":936},[117],{"categories":938},[114],{"categories":940},[117],{"categories":942},[156],{"categories":944},[114],{"categories":946},[],{"categories":948},[],{"categories":950},[166],{"categories":952},[],{"categories":954},[108],{"categories":956},[430],{"categories":958},[],{"categories":960},[108],{"categories":962},[111],{"categories":964},[173],{"categories":966},[],{"categories":968},[111],{"categories":970},[],{"categories":972},[],{"categories":974},[],{"categories":976},[],{"categories":978},[],{"categories":980},[114],{"categories":982},[117],{"categories":984},[430],{"categories":986},[108],{"categories":988},[114],{"categories":990},[166],{"categories":992},[120],{"categories":994},[114],{"categories":996},[173],{"categories":998},[114],{"categories":1000},[114],{"categories":1002},[114],{"categories":1004},[114,108],{"categories":1006},[166],{"categories":1008},[166],{"categories":1010},[156],{"categories":1012},[114],{"categories":1014},[],{"categories":1016},[],{"categories":1018},[],{"categories":1020},[166],{"categories":1022},[159],{"categories":1024},[135],{"categories":1026},[156],{"categories":1028},[],{"categories":1030},[114],{"categories":1032},[114],{"categories":1034},[],{"categories":1036},[],{"categories":1038},[117],{"categories":1040},[114],{"categories":1042},[111],{"categories":1044},[],{"categories":1046},[108],{"categories":1048},[114],{"categories":1050},[108],{"categories":1052},[114],{"categories":1054},[166],{"categories":1056},[173],{"categories":1058},[114,156],{"categories":1060},[135],{"categories":1062},[156],{"categories":1064},[],{"categories":1066},[430],{"categories":1068},[156],{"categories":1070},[117],{"categories":1072},[],{"categories":1074},[],{"categories":1076},[],{"categories":1078},[],{"categories":1080},[166],{"categories":1082},[117],{"categories":1084},[117]
,{"categories":1086},[430],{"categories":1088},[114],{"categories":1090},[114],{"categories":1092},[114],{"categories":1094},[],{"categories":1096},[156],{"categories":1098},[],{"categories":1100},[],{"categories":1102},[117],{"categories":1104},[],{"categories":1106},[],{"categories":1108},[173],{"categories":1110},[173],{"categories":1112},[117],{"categories":1114},[],{"categories":1116},[114],{"categories":1118},[114],{"categories":1120},[166],{"categories":1122},[156],{"categories":1124},[156],{"categories":1126},[117],{"categories":1128},[108],{"categories":1130},[114],{"categories":1132},[156],{"categories":1134},[156],{"categories":1136},[117],{"categories":1138},[117],{"categories":1140},[114],{"categories":1142},[],{"categories":1144},[],{"categories":1146},[114],{"categories":1148},[117],{"categories":1150},[135],{"categories":1152},[166],{"categories":1154},[108],{"categories":1156},[114],{"categories":1158},[],{"categories":1160},[117],{"categories":1162},[117],{"categories":1164},[],{"categories":1166},[108],{"categories":1168},[114],{"categories":1170},[108],{"categories":1172},[108],{"categories":1174},[],{"categories":1176},[],{"categories":1178},[117],{"categories":1180},[117],{"categories":1182},[114],{"categories":1184},[114],{"categories":1186},[135],{"categories":1188},[159],{"categories":1190},[120],{"categories":1192},[135],{"categories":1194},[156],{"categories":1196},[],{"categories":1198},[135],{"categories":1200},[],{"categories":1202},[],{"categories":1204},[],{"categories":1206},[],{"categories":1208},[166],{"categories":1210},[159],{"categories":1212},[],{"categories":1214},[114],{"categories":1216},[114],{"categories":1218},[159],{"categories":1220},[166],{"categories":1222},[],{"categories":1224},[],{"categories":1226},[117],{"categories":1228},[135],{"categories":1230},[135],{"categories":1232},[117],{"categories":1234},[108],{"categories":1236},[114,430],{"categories":1238},[],{"categories":1240},[156],{"categories":1242},[108],{"ca
tegories":1244},[117],{"categories":1246},[156],{"categories":1248},[],{"categories":1250},[117],{"categories":1252},[117],{"categories":1254},[114],{"categories":1256},[173],{"categories":1258},[166],{"categories":1260},[156],{"categories":1262},[],{"categories":1264},[117],{"categories":1266},[114],{"categories":1268},[117],{"categories":1270},[117],{"categories":1272},[117],{"categories":1274},[173],{"categories":1276},[117],{"categories":1278},[114],{"categories":1280},[],{"categories":1282},[173],{"categories":1284},[135],{"categories":1286},[117],{"categories":1288},[],{"categories":1290},[],{"categories":1292},[114],{"categories":1294},[117],{"categories":1296},[135],{"categories":1298},[117],{"categories":1300},[],{"categories":1302},[],{"categories":1304},[],{"categories":1306},[117],{"categories":1308},[],{"categories":1310},[],{"categories":1312},[159],{"categories":1314},[114],{"categories":1316},[159],{"categories":1318},[135],{"categories":1320},[114],{"categories":1322},[114],{"categories":1324},[117],{"categories":1326},[114],{"categories":1328},[],{"categories":1330},[],{"categories":1332},[430],{"categories":1334},[],{"categories":1336},[],{"categories":1338},[108],{"categories":1340},[],{"categories":1342},[],{"categories":1344},[],{"categories":1346},[],{"categories":1348},[166],{"categories":1350},[135],{"categories":1352},[173],{"categories":1354},[111],{"categories":1356},[114],{"categories":1358},[114],{"categories":1360},[111],{"categories":1362},[],{"categories":1364},[156],{"categories":1366},[117],{"categories":1368},[111],{"categories":1370},[114],{"categories":1372},[114],{"categories":1374},[108],{"categories":1376},[],{"categories":1378},[108],{"categories":1380},[114],{"categories":1382},[173],{"categories":1384},[117],{"categories":1386},[135],{"categories":1388},[111],{"categories":1390},[114],{"categories":1392},[117],{"categories":1394},[],{"categories":1396},[114],{"categories":1398},[108],{"categories":1400},[114],{"categories"
:1402},[],{"categories":1404},[135],{"categories":1406},[114],{"categories":1408},[],{"categories":1410},[111],{"categories":1412},[114],{"categories":1414},[],{"categories":1416},[],{"categories":1418},[],{"categories":1420},[114],{"categories":1422},[],{"categories":1424},[430],{"categories":1426},[114],{"categories":1428},[],{"categories":1430},[114],{"categories":1432},[114],{"categories":1434},[114],{"categories":1436},[114,430],{"categories":1438},[114],{"categories":1440},[114],{"categories":1442},[156],{"categories":1444},[117],{"categories":1446},[],{"categories":1448},[117],{"categories":1450},[114],{"categories":1452},[114],{"categories":1454},[114],{"categories":1456},[108],{"categories":1458},[108],{"categories":1460},[166],{"categories":1462},[156],{"categories":1464},[117],{"categories":1466},[],{"categories":1468},[114],{"categories":1470},[135],{"categories":1472},[114],{"categories":1474},[111],{"categories":1476},[],{"categories":1478},[430],{"categories":1480},[156],{"categories":1482},[156],{"categories":1484},[117],{"categories":1486},[135],{"categories":1488},[117],{"categories":1490},[114],{"categories":1492},[],{"categories":1494},[114],{"categories":1496},[],{"categories":1498},[],{"categories":1500},[114],{"categories":1502},[114],{"categories":1504},[114],{"categories":1506},[117],{"categories":1508},[114],{"categories":1510},[],{"categories":1512},[159],{"categories":1514},[117],{"categories":1516},[],{"categories":1518},[],{"categories":1520},[114],{"categories":1522},[135],{"categories":1524},[],{"categories":1526},[156],{"categories":1528},[430],{"categories":1530},[135],{"categories":1532},[166],{"categories":1534},[166],{"categories":1536},[135],{"categories":1538},[135],{"categories":1540},[430],{"categories":1542},[],{"categories":1544},[135],{"categories":1546},[114],{"categories":1548},[108],{"categories":1550},[135],{"categories":1552},[],{"categories":1554},[159],{"categories":1556},[135],{"categories":1558},[166],{"categories
":1560},[135],{"categories":1562},[430],{"categories":1564},[114],{"categories":1566},[114],{"categories":1568},[],{"categories":1570},[111],{"categories":1572},[],{"categories":1574},[],{"categories":1576},[114],{"categories":1578},[114],{"categories":1580},[114],{"categories":1582},[114],{"categories":1584},[],{"categories":1586},[159],{"categories":1588},[108],{"categories":1590},[],{"categories":1592},[114],{"categories":1594},[114],{"categories":1596},[430],{"categories":1598},[430],{"categories":1600},[],{"categories":1602},[117],{"categories":1604},[135],{"categories":1606},[135],{"categories":1608},[114],{"categories":1610},[117],{"categories":1612},[],{"categories":1614},[156],{"categories":1616},[114],{"categories":1618},[114],{"categories":1620},[],{"categories":1622},[],{"categories":1624},[430],{"categories":1626},[114],{"categories":1628},[166],{"categories":1630},[111],{"categories":1632},[114],{"categories":1634},[],{"categories":1636},[117],{"categories":1638},[108],{"categories":1640},[108],{"categories":1642},[],{"categories":1644},[114],{"categories":1646},[156],{"categories":1648},[117],{"categories":1650},[],{"categories":1652},[114],{"categories":1654},[114],{"categories":1656},[117],{"categories":1658},[],{"categories":1660},[117],{"categories":1662},[166],{"categories":1664},[],{"categories":1666},[114],{"categories":1668},[],{"categories":1670},[114],{"categories":1672},[],{"categories":1674},[114],{"categories":1676},[114],{"categories":1678},[],{"categories":1680},[114],{"categories":1682},[135],{"categories":1684},[114],{"categories":1686},[114],{"categories":1688},[108],{"categories":1690},[114],{"categories":1692},[135],{"categories":1694},[117],{"categories":1696},[],{"categories":1698},[114],{"categories":1700},[173],{"categories":1702},[],{"categories":1704},[],{"categories":1706},[],{"categories":1708},[108],{"categories":1710},[135],{"categories":1712},[117],{"categories":1714},[114],{"categories":1716},[156],{"categories":1718},[
117],{"categories":1720},[],{"categories":1722},[117],{"categories":1724},[],{"categories":1726},[114],{"categories":1728},[117],{"categories":1730},[114],{"categories":1732},[],{"categories":1734},[114],{"categories":1736},[114],{"categories":1738},[135],{"categories":1740},[156],{"categories":1742},[117],{"categories":1744},[156],{"categories":1746},[111],{"categories":1748},[],{"categories":1750},[],{"categories":1752},[114],{"categories":1754},[108],{"categories":1756},[135],{"categories":1758},[],{"categories":1760},[],{"categories":1762},[166],{"categories":1764},[156],{"categories":1766},[],{"categories":1768},[114],{"categories":1770},[],{"categories":1772},[173],{"categories":1774},[114],{"categories":1776},[430],{"categories":1778},[166],{"categories":1780},[],{"categories":1782},[117],{"categories":1784},[114],{"categories":1786},[117],{"categories":1788},[117],{"categories":1790},[114],{"categories":1792},[],{"categories":1794},[108],{"categories":1796},[114],{"categories":1798},[111],{"categories":1800},[166],{"categories":1802},[156],{"categories":1804},[],{"categories":1806},[],{"categories":1808},[],{"categories":1810},[117],{"categories":1812},[156],{"categories":1814},[135],{"categories":1816},[114],{"categories":1818},[135],{"categories":1820},[156],{"categories":1822},[],{"categories":1824},[156],{"categories":1826},[135],{"categories":1828},[111],{"categories":1830},[114],{"categories":1832},[135],{"categories":1834},[173],{"categories":1836},[],{"categories":1838},[],{"categories":1840},[159],{"categories":1842},[114,166],{"categories":1844},[135],{"categories":1846},[114],{"categories":1848},[117],{"categories":1850},[117],{"categories":1852},[114],{"categories":1854},[],{"categories":1856},[166],{"categories":1858},[114],{"categories":1860},[159],{"categories":1862},[117],{"categories":1864},[173],{"categories":1866},[430],{"categories":1868},[],{"categories":1870},[108],{"categories":1872},[117],{"categories":1874},[117],{"categories":1876},
[166],{"categories":1878},[114],{"categories":1880},[114],{"categories":1882},[],{"categories":1884},[],{"categories":1886},[],{"categories":1888},[430],{"categories":1890},[135],{"categories":1892},[114],{"categories":1894},[114],{"categories":1896},[114],{"categories":1898},[],{"categories":1900},[159],{"categories":1902},[111],{"categories":1904},[],{"categories":1906},[117],{"categories":1908},[430],{"categories":1910},[],{"categories":1912},[156],{"categories":1914},[156],{"categories":1916},[],{"categories":1918},[166],{"categories":1920},[156],{"categories":1922},[114],{"categories":1924},[],{"categories":1926},[135],{"categories":1928},[114],{"categories":1930},[156],{"categories":1932},[117],{"categories":1934},[135],{"categories":1936},[],{"categories":1938},[117],{"categories":1940},[156],{"categories":1942},[114],{"categories":1944},[],{"categories":1946},[114],{"categories":1948},[114],{"categories":1950},[430],{"categories":1952},[135],{"categories":1954},[159],{"categories":1956},[159],{"categories":1958},[],{"categories":1960},[],{"categories":1962},[],{"categories":1964},[117],{"categories":1966},[166],{"categories":1968},[166],{"categories":1970},[],{"categories":1972},[],{"categories":1974},[114],{"categories":1976},[],{"categories":1978},[117],{"categories":1980},[114],{"categories":1982},[],{"categories":1984},[114],{"categories":1986},[111],{"categories":1988},[114],{"categories":1990},[173],{"categories":1992},[117],{"categories":1994},[114],{"categories":1996},[166],{"categories":1998},[135],{"categories":2000},[117],{"categories":2002},[],{"categories":2004},[135],{"categories":2006},[117],{"categories":2008},[117],{"categories":2010},[],{"categories":2012},[111],{"categories":2014},[117],{"categories":2016},[],{"categories":2018},[114],{"categories":2020},[108],{"categories":2022},[135],{"categories":2024},[430],{"categories":2026},[117],{"categories":2028},[117],{"categories":2030},[108],{"categories":2032},[114],{"categories":2034},[],{"c
ategories":2036},[],{"categories":2038},[156],{"categories":2040},[114,111],{"categories":2042},[],{"categories":2044},[108],{"categories":2046},[159],{"categories":2048},[114],{"categories":2050},[166],{"categories":2052},[114],{"categories":2054},[117],{"categories":2056},[114],{"categories":2058},[114],{"categories":2060},[135],{"categories":2062},[117],{"categories":2064},[],{"categories":2066},[],{"categories":2068},[117],{"categories":2070},[114],{"categories":2072},[430],{"categories":2074},[],{"categories":2076},[114],{"categories":2078},[117],{"categories":2080},[],{"categories":2082},[114],{"categories":2084},[173],{"categories":2086},[159],{"categories":2088},[117],{"categories":2090},[114],{"categories":2092},[430],{"categories":2094},[],{"categories":2096},[114],{"categories":2098},[173],{"categories":2100},[156],{"categories":2102},[114],{"categories":2104},[],{"categories":2106},[173],{"categories":2108},[135],{"categories":2110},[114],{"categories":2112},[114],{"categories":2114},[108],{"categories":2116},[],{"categories":2118},[],{"categories":2120},[156],{"categories":2122},[114],{"categories":2124},[159],{"categories":2126},[173],{"categories":2128},[173],{"categories":2130},[135],{"categories":2132},[],{"categories":2134},[],{"categories":2136},[114],{"categories":2138},[],{"categories":2140},[114,166],{"categories":2142},[135],{"categories":2144},[117],{"categories":2146},[166],{"categories":2148},[114],{"categories":2150},[108],{"categories":2152},[],{"categories":2154},[],{"categories":2156},[108],{"categories":2158},[173],{"categories":2160},[114],{"categories":2162},[],{"categories":2164},[156,114],{"categories":2166},[430],{"categories":2168},[108],{"categories":2170},[],{"categories":2172},[111],{"categories":2174},[111],{"categories":2176},[114],{"categories":2178},[166],{"categories":2180},[117],{"categories":2182},[135],{"categories":2184},[173],{"categories":2186},[156],{"categories":2188},[114],{"categories":2190},[114],{"categories":
2192},[114],{"categories":2194},[108],{"categories":2196},[114],{"categories":2198},[117],{"categories":2200},[135],{"categories":2202},[],{"categories":2204},[],{"categories":2206},[159],{"categories":2208},[166],{"categories":2210},[114],{"categories":2212},[156],{"categories":2214},[159],{"categories":2216},[114],{"categories":2218},[114],{"categories":2220},[117],{"categories":2222},[117],{"categories":2224},[114,111],{"categories":2226},[],{"categories":2228},[156],{"categories":2230},[],{"categories":2232},[114],{"categories":2234},[135],{"categories":2236},[108],{"categories":2238},[108],{"categories":2240},[117],{"categories":2242},[114],{"categories":2244},[111],{"categories":2246},[166],{"categories":2248},[173],{"categories":2250},[],{"categories":2252},[135],{"categories":2254},[114],{"categories":2256},[114],{"categories":2258},[135],{"categories":2260},[166],{"categories":2262},[114],{"categories":2264},[117],{"categories":2266},[135],{"categories":2268},[114],{"categories":2270},[156],{"categories":2272},[114],{"categories":2274},[114],{"categories":2276},[430],{"categories":2278},[120],{"categories":2280},[117],{"categories":2282},[114],{"categories":2284},[135],{"categories":2286},[117],{"categories":2288},[173],{"categories":2290},[114],{"categories":2292},[],{"categories":2294},[114],{"categories":2296},[],{"categories":2298},[],{"categories":2300},[],{"categories":2302},[111],{"categories":2304},[114],{"categories":2306},[117],{"categories":2308},[135],{"categories":2310},[135],{"categories":2312},[135],{"categories":2314},[135],{"categories":2316},[],{"categories":2318},[108],{"categories":2320},[117],{"categories":2322},[135],{"categories":2324},[108],{"categories":2326},[117],{"categories":2328},[114],{"categories":2330},[114,117],{"categories":2332},[117],{"categories":2334},[430],{"categories":2336},[135],{"categories":2338},[135],{"categories":2340},[117],{"categories":2342},[114],{"categories":2344},[],{"categories":2346},[135],{"categorie
s":2348},[173],{"categories":2350},[108],{"categories":2352},[114],{"categories":2354},[114],{"categories":2356},[],{"categories":2358},[166],{"categories":2360},[],{"categories":2362},[108],{"categories":2364},[117],{"categories":2366},[135],{"categories":2368},[114],{"categories":2370},[135],{"categories":2372},[108],{"categories":2374},[135],{"categories":2376},[135],{"categories":2378},[],{"categories":2380},[111],{"categories":2382},[117],{"categories":2384},[135],{"categories":2386},[135],{"categories":2388},[135],{"categories":2390},[135],{"categories":2392},[135],{"categories":2394},[135],{"categories":2396},[135],{"categories":2398},[135],{"categories":2400},[135],{"categories":2402},[135],{"categories":2404},[159],{"categories":2406},[108],{"categories":2408},[114],{"categories":2410},[114],{"categories":2412},[],{"categories":2414},[114,108],{"categories":2416},[],{"categories":2418},[117],{"categories":2420},[135],{"categories":2422},[117],{"categories":2424},[114],{"categories":2426},[114],{"categories":2428},[114],{"categories":2430},[114],{"categories":2432},[114],{"categories":2434},[117],{"categories":2436},[111],{"categories":2438},[156],{"categories":2440},[135],{"categories":2442},[114],{"categories":2444},[],{"categories":2446},[],{"categories":2448},[117],{"categories":2450},[156],{"categories":2452},[114],{"categories":2454},[],{"categories":2456},[],{"categories":2458},[173],{"categories":2460},[114],{"categories":2462},[],{"categories":2464},[],{"categories":2466},[108],{"categories":2468},[111],{"categories":2470},[114],{"categories":2472},[111],{"categories":2474},[156],{"categories":2476},[],{"categories":2478},[135],{"categories":2480},[],{"categories":2482},[156],{"categories":2484},[114],{"categories":2486},[173],{"categories":2488},[],{"categories":2490},[173],{"categories":2492},[],{"categories":2494},[],{"categories":2496},[117],{"categories":2498},[],{"categories":2500},[111],{"categories":2502},[108],{"categories":2504},[156],{"ca
tegories":2506},[166],{"categories":2508},[],{"categories":2510},[],{"categories":2512},[114],{"categories":2514},[108],{"categories":2516},[173],{"categories":2518},[],{"categories":2520},[117],{"categories":2522},[117],{"categories":2524},[135],{"categories":2526},[114],{"categories":2528},[117],{"categories":2530},[114],{"categories":2532},[117],{"categories":2534},[114],{"categories":2536},[120],{"categories":2538},[135],{"categories":2540},[],{"categories":2542},[173],{"categories":2544},[166],{"categories":2546},[117],{"categories":2548},[],{"categories":2550},[114],{"categories":2552},[117],{"categories":2554},[111],{"categories":2556},[108],{"categories":2558},[114],{"categories":2560},[156],{"categories":2562},[166],{"categories":2564},[166],{"categories":2566},[114],{"categories":2568},[159],{"categories":2570},[114],{"categories":2572},[117],{"categories":2574},[111],{"categories":2576},[117],{"categories":2578},[114],{"categories":2580},[114],{"categories":2582},[117],{"categories":2584},[135],{"categories":2586},[],{"categories":2588},[108],{"categories":2590},[114],{"categories":2592},[117],{"categories":2594},[114],{"categories":2596},[114],{"categories":2598},[],{"categories":2600},[156],{"categories":2602},[111],{"categories":2604},[135],{"categories":2606},[114],{"categories":2608},[114],{"categories":2610},[156],{"categories":2612},[173],{"categories":2614},[159],{"categories":2616},[114],{"categories":2618},[135],{"categories":2620},[114],{"categories":2622},[117],{"categories":2624},[430],{"categories":2626},[114],{"categories":2628},[117],{"categories":2630},[159],{"categories":2632},[],{"categories":2634},[117],{"categories":2636},[166],{"categories":2638},[156],{"categories":2640},[114],{"categories":2642},[108],{"categories":2644},[111],{"categories":2646},[166],{"categories":2648},[],{"categories":2650},[117],{"categories":2652},[114],{"categories":2654},[],{"categories":2656},[135],{"categories":2658},[],{"categories":2660},[135],{"categor
ies":2662},[114],{"categories":2664},[117],{"categories":2666},[117],{"categories":2668},[117],{"categories":2670},[],{"categories":2672},[],{"categories":2674},[114],{"categories":2676},[114],{"categories":2678},[],{"categories":2680},[156],{"categories":2682},[117],{"categories":2684},[173],{"categories":2686},[108],{"categories":2688},[],{"categories":2690},[],{"categories":2692},[135],{"categories":2694},[166],{"categories":2696},[114],{"categories":2698},[114],{"categories":2700},[114],{"categories":2702},[166],{"categories":2704},[135],{"categories":2706},[156],{"categories":2708},[114],{"categories":2710},[114],{"categories":2712},[114],{"categories":2714},[135],{"categories":2716},[114],{"categories":2718},[135],{"categories":2720},[117],{"categories":2722},[117],{"categories":2724},[166],{"categories":2726},[117],{"categories":2728},[114],{"categories":2730},[166],{"categories":2732},[156],{"categories":2734},[],{"categories":2736},[117],{"categories":2738},[],{"categories":2740},[],{"categories":2742},[],{"categories":2744},[111],{"categories":2746},[114],{"categories":2748},[117],{"categories":2750},[108],{"categories":2752},[117],{"categories":2754},[173],{"categories":2756},[],{"categories":2758},[117],{"categories":2760},[],{"categories":2762},[108],{"categories":2764},[117],{"categories":2766},[],{"categories":2768},[117],{"categories":2770},[114],{"categories":2772},[135],{"categories":2774},[114],{"categories":2776},[117],{"categories":2778},[135],{"categories":2780},[117],{"categories":2782},[166],{"categories":2784},[156],{"categories":2786},[108],{"categories":2788},[],{"categories":2790},[117],{"categories":2792},[156],{"categories":2794},[430],{"categories":2796},[135],{"categories":2798},[114],{"categories":2800},[156],{"categories":2802},[108],{"categories":2804},[],{"categories":2806},[117],{"categories":2808},[117],{"categories":2810},[114],{"categories":2812},[],{"categories":2814},[117],{"categories":2816},[120],{"categories":2818},[135],
{"categories":2820},[117],{"categories":2822},[111],{"categories":2824},[],{"categories":2826},[114],{"categories":2828},[120],{"categories":2830},[114],{"categories":2832},[117],{"categories":2834},[135],{"categories":2836},[108],{"categories":2838},[430],{"categories":2840},[114],{"categories":2842},[114],{"categories":2844},[114],{"categories":2846},[135],{"categories":2848},[111],{"categories":2850},[114],{"categories":2852},[156],{"categories":2854},[135],{"categories":2856},[430],{"categories":2858},[114],{"categories":2860},[],{"categories":2862},[],{"categories":2864},[430],{"categories":2866},[159],{"categories":2868},[117],{"categories":2870},[117],{"categories":2872},[135],{"categories":2874},[114],{"categories":2876},[108],{"categories":2878},[156],{"categories":2880},[117],{"categories":2882},[114],{"categories":2884},[173],{"categories":2886},[114],{"categories":2888},[117],{"categories":2890},[],{"categories":2892},[114],{"categories":2894},[114],{"categories":2896},[135],{"categories":2898},[108],{"categories":2900},[],{"categories":2902},[114],{"categories":2904},[114],{"categories":2906},[166],{"categories":2908},[156],{"categories":2910},[114,117],{"categories":2912},[173,111],{"categories":2914},[114],{"categories":2916},[],{"categories":2918},[117],{"categories":2920},[],{"categories":2922},[166],{"categories":2924},[114],{"categories":2926},[135],{"categories":2928},[],{"categories":2930},[117],{"categories":2932},[],{"categories":2934},[156],{"categories":2936},[117],{"categories":2938},[108],{"categories":2940},[117],{"categories":2942},[114],{"categories":2944},[430],{"categories":2946},[173],{"categories":2948},[111],{"categories":2950},[111],{"categories":2952},[108],{"categories":2954},[108],{"categories":2956},[114],{"categories":2958},[117],{"categories":2960},[114],{"categories":2962},[114],{"categories":2964},[108],{"categories":2966},[114],{"categories":2968},[173],{"categories":2970},[135],{"categories":2972},[114],{"categories":297
4},[117],{"categories":2976},[114],{"categories":2978},[],{"categories":2980},[166],{"categories":2982},[],{"categories":2984},[117],{"categories":2986},[108],{"categories":2988},[],{"categories":2990},[430],{"categories":2992},[114],{"categories":2994},[],{"categories":2996},[135],{"categories":2998},[117],{"categories":3000},[166],{"categories":3002},[114],{"categories":3004},[117],{"categories":3006},[166],{"categories":3008},[117],{"categories":3010},[135],{"categories":3012},[108],{"categories":3014},[135],{"categories":3016},[166],{"categories":3018},[114],{"categories":3020},[156],{"categories":3022},[114],{"categories":3024},[114],{"categories":3026},[114],{"categories":3028},[114],{"categories":3030},[117],{"categories":3032},[114],{"categories":3034},[117],{"categories":3036},[114],{"categories":3038},[108],{"categories":3040},[114],{"categories":3042},[117],{"categories":3044},[156],{"categories":3046},[108],{"categories":3048},[117],{"categories":3050},[156],{"categories":3052},[],{"categories":3054},[114],{"categories":3056},[114],{"categories":3058},[166],{"categories":3060},[],{"categories":3062},[117],{"categories":3064},[173],{"categories":3066},[114],{"categories":3068},[135],{"categories":3070},[173],{"categories":3072},[117],{"categories":3074},[111],{"categories":3076},[111],{"categories":3078},[114],{"categories":3080},[108],{"categories":3082},[],{"categories":3084},[114],{"categories":3086},[],{"categories":3088},[108],{"categories":3090},[114],{"categories":3092},[117],{"categories":3094},[117],{"categories":3096},[],{"categories":3098},[166],{"categories":3100},[166],{"categories":3102},[173],{"categories":3104},[156],{"categories":3106},[],{"categories":3108},[114],{"categories":3110},[108],{"categories":3112},[114],{"categories":3114},[166],{"categories":3116},[108],{"categories":3118},[135],{"categories":3120},[135],{"categories":3122},[],{"categories":3124},[135],{"categories":3126},[117],{"categories":3128},[156],{"categories":3130},[1
59],{"categories":3132},[114],{"categories":3134},[],{"categories":3136},[135],{"categories":3138},[166],{"categories":3140},[111],{"categories":3142},[114],{"categories":3144},[108],{"categories":3146},[430],{"categories":3148},[108],{"categories":3150},[],{"categories":3152},[],{"categories":3154},[135],{"categories":3156},[],{"categories":3158},[117],{"categories":3160},[117],{"categories":3162},[117],{"categories":3164},[],{"categories":3166},[114],{"categories":3168},[],{"categories":3170},[135],{"categories":3172},[108],{"categories":3174},[156],{"categories":3176},[114],{"categories":3178},[135],{"categories":3180},[135],{"categories":3182},[],{"categories":3184},[135],{"categories":3186},[108],{"categories":3188},[114],{"categories":3190},[],{"categories":3192},[117],{"categories":3194},[117],{"categories":3196},[108],{"categories":3198},[],{"categories":3200},[],{"categories":3202},[],{"categories":3204},[156],{"categories":3206},[117],{"categories":3208},[114],{"categories":3210},[],{"categories":3212},[],{"categories":3214},[],{"categories":3216},[156],{"categories":3218},[],{"categories":3220},[108],{"categories":3222},[],{"categories":3224},[],{"categories":3226},[156],{"categories":3228},[114],{"categories":3230},[135],{"categories":3232},[],{"categories":3234},[173],{"categories":3236},[135],{"categories":3238},[173],{"categories":3240},[114],{"categories":3242},[],{"categories":3244},[],{"categories":3246},[117],{"categories":3248},[],{"categories":3250},[],{"categories":3252},[117],{"categories":3254},[114],{"categories":3256},[],{"categories":3258},[117],{"categories":3260},[135],{"categories":3262},[173],{"categories":3264},[159],{"categories":3266},[117],{"categories":3268},[117],{"categories":3270},[],{"categories":3272},[],{"categories":3274},[],{"categories":3276},[135],{"categories":3278},[],{"categories":3280},[],{"categories":3282},[156],{"categories":3284},[108],{"categories":3286},[],{"categories":3288},[111],{"categories":3290},[173],{"c
ategories":3292},[114],{"categories":3294},[166],{"categories":3296},[108],{"categories":3298},[159],{"categories":3300},[111],{"categories":3302},[166],{"categories":3304},[],{"categories":3306},[],{"categories":3308},[117],{"categories":3310},[108],{"categories":3312},[156],{"categories":3314},[108],{"categories":3316},[117],{"categories":3318},[430],{"categories":3320},[117],{"categories":3322},[],{"categories":3324},[114],{"categories":3326},[135],{"categories":3328},[166],{"categories":3330},[],{"categories":3332},[156],{"categories":3334},[135],{"categories":3336},[108],{"categories":3338},[117],{"categories":3340},[114],{"categories":3342},[111],{"categories":3344},[117,430],{"categories":3346},[117],{"categories":3348},[166],{"categories":3350},[114],{"categories":3352},[159],{"categories":3354},[173],{"categories":3356},[117],{"categories":3358},[],{"categories":3360},[117],{"categories":3362},[114],{"categories":3364},[111],{"categories":3366},[],{"categories":3368},[],{"categories":3370},[114],{"categories":3372},[159],{"categories":3374},[114],{"categories":3376},[],{"categories":3378},[135],{"categories":3380},[],{"categories":3382},[135],{"categories":3384},[166],{"categories":3386},[117],{"categories":3388},[114],{"categories":3390},[173],{"categories":3392},[166],{"categories":3394},[],{"categories":3396},[135],{"categories":3398},[114],{"categories":3400},[],{"categories":3402},[114],{"categories":3404},[117],{"categories":3406},[114],{"categories":3408},[117],{"categories":3410},[114],{"categories":3412},[114],{"categories":3414},[114],{"categories":3416},[114],{"categories":3418},[111],{"categories":3420},[],{"categories":3422},[120],{"categories":3424},[135],{"categories":3426},[114],{"categories":3428},[],{"categories":3430},[166],{"categories":3432},[114],{"categories":3434},[114],{"categories":3436},[117],{"categories":3438},[135],{"categories":3440},[114],{"categories":3442},[114],{"categories":3444},[111],{"categories":3446},[117],{"categori
es":3448},[156],{"categories":3450},[],{"categories":3452},[159],{"categories":3454},[114],{"categories":3456},[],{"categories":3458},[135],{"categories":3460},[173],{"categories":3462},[],{"categories":3464},[],{"categories":3466},[135],{"categories":3468},[135],{"categories":3470},[173],{"categories":3472},[108],{"categories":3474},[117],{"categories":3476},[117],{"categories":3478},[114],{"categories":3480},[111],{"categories":3482},[],{"categories":3484},[],{"categories":3486},[135],{"categories":3488},[159],{"categories":3490},[166],{"categories":3492},[117],{"categories":3494},[156],{"categories":3496},[159],{"categories":3498},[159],{"categories":3500},[],{"categories":3502},[135],{"categories":3504},[114],{"categories":3506},[114],{"categories":3508},[166],{"categories":3510},[],{"categories":3512},[135],{"categories":3514},[135],{"categories":3516},[135],{"categories":3518},[],{"categories":3520},[117],{"categories":3522},[114],{"categories":3524},[],{"categories":3526},[108],{"categories":3528},[111],{"categories":3530},[],{"categories":3532},[114],{"categories":3534},[114],{"categories":3536},[],{"categories":3538},[166],{"categories":3540},[],{"categories":3542},[],{"categories":3544},[],{"categories":3546},[],{"categories":3548},[114],{"categories":3550},[135],{"categories":3552},[],{"categories":3554},[],{"categories":3556},[114],{"categories":3558},[114],{"categories":3560},[114],{"categories":3562},[159],{"categories":3564},[114],{"categories":3566},[159],{"categories":3568},[],{"categories":3570},[159],{"categories":3572},[159],{"categories":3574},[430],{"categories":3576},[117],{"categories":3578},[166],{"categories":3580},[],{"categories":3582},[],{"categories":3584},[159],{"categories":3586},[166],{"categories":3588},[166],{"categories":3590},[166],{"categories":3592},[],{"categories":3594},[108],{"categories":3596},[166],{"categories":3598},[166],{"categories":3600},[108],{"categories":3602},[166],{"categories":3604},[111],{"categories":3606},[1
66],{"categories":3608},[166],{"categories":3610},[166],{"categories":3612},[159],{"categories":3614},[135],{"categories":3616},[135],{"categories":3618},[114],{"categories":3620},[166],{"categories":3622},[159],{"categories":3624},[430],{"categories":3626},[159],{"categories":3628},[159],{"categories":3630},[159],{"categories":3632},[],{"categories":3634},[111],{"categories":3636},[],{"categories":3638},[430],{"categories":3640},[166],{"categories":3642},[166],{"categories":3644},[166],{"categories":3646},[117],{"categories":3648},[135,111],{"categories":3650},[159],{"categories":3652},[],{"categories":3654},[],{"categories":3656},[159],{"categories":3658},[],{"categories":3660},[159],{"categories":3662},[135],{"categories":3664},[117],{"categories":3666},[],{"categories":3668},[166],{"categories":3670},[114],{"categories":3672},[156],{"categories":3674},[],{"categories":3676},[114],{"categories":3678},[],{"categories":3680},[135],{"categories":3682},[108],{"categories":3684},[159],{"categories":3686},[],{"categories":3688},[166],{"categories":3690},[135],[3692,3960,4037,4131],{"id":3693,"title":3694,"ai":3695,"body":3700,"categories":3924,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":3925,"navigation":86,"path":3948,"published_at":59,"question":59,"scraped_at":3949,"seo":3950,"sitemap":3951,"source_id":3952,"source_name":3953,"source_type":94,"source_url":3954,"stem":3955,"tags":3956,"thumbnail_url":59,"tldr":3957,"tweet":59,"unknown_tags":3958,"__hash__":3959},"summaries\u002Fsummaries\u002F2a9849ad35620d4f-turboquant-6-4x-kv-cache-compression-at-q8-0-speed-summary.md","TurboQuant+: 6.4x KV Cache Compression at q8_0 
Speed",{"provider":7,"model":8,"input_tokens":3696,"output_tokens":3697,"processing_time_ms":3698,"cost_usd":3699},11014,3209,20267,0.0037848,{"type":14,"value":3701,"toc":3917},[3702,3706,3709,3712,3718,3721,3725,3728,3731,3736,3739,3742,3746,3749,3860,3865,3869,3872,3877,3880,3884],[17,3703,3705],{"id":3704},"turboquant-formats-deliver-extreme-compression-with-minimal-quality-loss","TurboQuant Formats Deliver Extreme Compression with Minimal Quality Loss",[22,3707,3708],{},"TurboQuant+ ports Google's TurboQuant (ICLR 2026) to llama.cpp, compressing KV cache via PolarQuant (multi-centroid scalar quantization) + Walsh-Hadamard Transform (WHT) rotation, dropping the paper's 1-bit QJL error correction which amplified softmax variance. Formats: turbo2 (2.5 bits\u002Fval, 6.4x vs fp16), turbo3 (3.5 bits\u002Fval at block=32, 4.6x; 3.125 bits\u002Fval at block=128, 5.12x), turbo4 (4.25 bits\u002Fval, 3.8x). On M5 Max (Qwen3.5-27B\u002F35B-A3B), turbo4 PPL 6.125 (+0.23% vs q8_0 baseline 6.111 on wikitext-2 512 chunks); turbo3 6.176 (+1.06%). turbo4 outperforms q4_0 (6.142, +0.52%) in quality at similar compression.",[22,3710,3711],{},"Block size optimization (study: docs\u002Fpapers\u002Fblock-size-experiment.md) boosts turbo3 to 5.12x at block=128 with identical PPL across 512-32K contexts, 3 architectures (Qwen2.5-1.5B, Llama3.1-8B, Qwen3.5-27B), validated on M2 Pro\u002FM5 Max Metal. Larger blocks reduce overhead but risk cache thrashing on older hardware—default block=32 balances.",[3713,3714,3715],"blockquote",{},[22,3716,3717],{},"\"Compresses transformer KV cache 3.8-6.4x using PolarQuant + Walsh-Hadamard rotation. Near q8_0 prefill speed and ~0.9x decode throughput at long context (Apple Silicon).\"",[22,3719,3720],{},"Asymmetric K\u002FV caching preserves quality on Q4_K_M weights: keep K at q8_0 (attention routing), compress V (turbo3\u002F4). E.g., Qwen2.5-7B Q4_K_M: q8_0-K + turbo4-V PPL 6.64 (+1.0% vs q8_0); symmetric turbo3 catastrophic (3556 PPL). 
Bigger models tolerate symmetric better (104B Command-R+: turbo3 +3.6%). Config guide: docs\u002Fturboquant-recommendations.md.",[17,3722,3724],{"id":3723},"layer-aware-and-sparse-optimizations-maximize-speed-and-quality","Layer-Aware and Sparse Optimizations Maximize Speed and Quality",[22,3726,3727],{},"Boundary V (layer-aware): Protects first\u002Flast 2 layers at q8_0-V, turbo2-V elsewhere. Recovers 37-91% of quality gap to turbo3 (e.g., Qwen3.5-35B MoE: turbo2 5.257 → Boundary 5.148 vs turbo3 5.137). Scales with depth (91% on 64L MoE). Enabled via TURBO_LAYER_ADAPTIVE=7; no speed hit.",[22,3729,3730],{},"Sparse V dequant: Skips V dequant for softmax weights \u003C1e-6 (most at long context). +22.8% decode at 32K (turbo3: 0.76x → 0.93x q8_0), no PPL change (wikitext-103 50 chunks, CI±0.021). General opt: +5% on q8_0 KV. Validated 1.5B-104B; dense models gain less (1-2% as FFN dominates).",[3713,3732,3733],{},[22,3734,3735],{},"\"Sparse V: Attention-gated KV cache decoding that skips low-weight V positions during inference. Up to +22.8% decode speed at 32K context... no measurable PPL change.\"",[22,3737,3738],{},"Prefill scales 2K-32K: turbo3\u002F4 ≥ q8_0 (e.g., 32K: turbo3 1204 vs 1098 t\u002Fs). Decode (M5 Max Qwen3.5-35B-A3B Sparse V): turbo4 1060 t\u002Fs long ctx (0.90x q8_0); real 24K PDF: turbo4 63.7 t\u002Fs (0.93x). M1 Max 38K doc: turbo4 +33.9% decode vs q8_0.",[22,3740,3741],{},"Optimization path (4K prefill): fp32 WHT (739 t\u002Fs, 0.27x q8_0) → fp16 + vectorized butterfly + graph rotation + block-32 + dequant → 2524 t\u002Fs (0.98x). KL div vs f16: turbo4 0.009633 vs q4_0 0.008091, yet turbo4 shows better top-p agreement (95.98%).",[17,3743,3745],{"id":3744},"cross-hardware-benchmarks-confirm-production-readiness","Cross-Hardware Benchmarks Confirm Production Readiness",[22,3747,3748],{},"Apple Silicon (M5 Max 128GB): 104B@128K turbo3 (PPL 6.415, +3.6% per table; 74GB peak). Raise iogpu.wired_limit_mb=117964. 
M1 Max: turbo4 beats q8_0 long ctx. CUDA (RTX3090 Qwen3.5-9B Q4_K_M): turbo3\u002F4 decode 95-98 t\u002Fs (0.93-0.96x q8_0). AMD RX9070 XT (RDNA4 HIP): q8_0-K + turbo4-V +1.0% PPL, +2.5% decode.",[3750,3751,3752,3777],"table",{},[3753,3754,3755],"thead",{},[3756,3757,3758,3762,3765,3768,3771,3774],"tr",{},[3759,3760,3761],"th",{},"Hardware",[3759,3763,3764],{},"Model",[3759,3766,3767],{},"Config",[3759,3769,3770],{},"Decode t\u002Fs",[3759,3772,3773],{},"vs q8_0",[3759,3775,3776],{},"Notes",[3778,3779,3780,3801,3821,3840],"tbody",{},[3756,3781,3782,3786,3789,3792,3795,3798],{},[3783,3784,3785],"td",{},"M5 Max",[3783,3787,3788],{},"Qwen3.5-35B-A3B",[3783,3790,3791],{},"turbo4 + Sparse V",[3783,3793,3794],{},"1060 (32K)",[3783,3796,3797],{},"0.90x",[3783,3799,3800],{},"MoE",[3756,3802,3803,3806,3809,3812,3815,3818],{},[3783,3804,3805],{},"RTX3090",[3783,3807,3808],{},"Qwen3.5-9B Q4_K_M",[3783,3810,3811],{},"turbo4\u002Fturbo4",[3783,3813,3814],{},"95.87",[3783,3816,3817],{},"0.93x",[3783,3819,3820],{},"CUDA",[3756,3822,3823,3826,3828,3831,3834,3837],{},[3783,3824,3825],{},"M1 Max 64GB",[3783,3827,3788],{},[3783,3829,3830],{},"turbo4",[3783,3832,3833],{},"16.6 (38K)",[3783,3835,3836],{},"+33.9%",[3783,3838,3839],{},"Real doc",[3756,3841,3842,3845,3848,3851,3854,3857],{},[3783,3843,3844],{},"RX9070 XT",[3783,3846,3847],{},"Qwen2.5-7B Q4_K_M",[3783,3849,3850],{},"q8_0-K\u002Fturbo4-V",[3783,3852,3853],{},"86.8",[3783,3855,3856],{},"+2.5%",[3783,3858,3859],{},"HIP",[3713,3861,3862],{},[22,3863,3864],{},"\"104B at 128K context on a MacBook with turbo3 (PPL 4.024, 74 GB peak memory).\"",[17,3866,3868],{"id":3867},"retrieval-and-perplexity-validate-fidelity","Retrieval and Perplexity Validate Fidelity",[22,3870,3871],{},"NIAH (Kamradt\u002FRULER): turbo4 31\u002F33 (+3% vs q8_0 30\u002F33); turbo3 + Sparse V 9\u002F9. Multi-key 100% to 32K. Long ctx PPL (32K wikitext-103 50ch): turbo3 +1.64% vs q8_0, Sparse V delta=0. 
PPL stable: Llama3.1-70B turbo4 +6.3%, Command-R+104B +1.9%.",[3713,3873,3874],{},[22,3875,3876],{},"\"turbo4 beats q8_0 on retrieval (31\u002F33 vs 30\u002F33). Shared failure at 8K\u002F100% is a model weakness, not quantization.\"",[22,3878,3879],{},"Python prototype confirms: turbo4 cosine sim 0.96, MSE 0.0007. Gaussianization exact (kurtosis 900→2.9).",[17,3881,3883],{"id":3882},"key-takeaways","Key Takeaways",[3885,3886,3887,3891,3899,3902,3905,3908,3911,3914],"ul",{},[3888,3889,3890],"li",{},"Use turbo4 for best quality\u002Fcompression balance (3.8x, +0.23% PPL); turbo3 for max (5.12x block=128, +1% PPL).",[3888,3892,3893,3894,3898],{},"Asymmetric q8_0-K + turbo",[3895,3896,3897],"span",{},"3\u002F4","-V on Q4_K_M weights; symmetric on Q8_0+ or large models.",[3888,3900,3901],{},"Enable Sparse V always (+22% long decode, no PPL hit); Boundary V on deep models.",[3888,3903,3904],{},"Prefill ≥ q8_0 speed; validate decode on your hardware (M5+ best for turbo3).",[3888,3906,3907],{},"Build llama.cpp from fork; test PPL\u002FNIAH on your model before deploy.",[3888,3909,3910],{},"For Apple Silicon max ctx: sysctl iogpu.wired_limit_mb=90% RAM.",[3888,3912,3913],{},"Upstream path: Stable pieces as llama.cpp patches.",[3888,3915,3916],{},"MLX Swift fork for 2.5x faster Apple decode (144 t\u002Fs Qwen3.5-35B-A3B).",{"title":52,"searchDepth":53,"depth":53,"links":3918},[3919,3920,3921,3922,3923],{"id":3704,"depth":53,"text":3705},{"id":3723,"depth":53,"text":3724},{"id":3744,"depth":53,"text":3745},{"id":3867,"depth":53,"text":3868},{"id":3882,"depth":53,"text":3883},[],{"content_references":3926,"triage":3946},[3927,3930,3933,3938,3942],{"type":70,"title":3928,"url":3929,"context":68},"TurboQuant: Redefining AI Efficiency with Extreme 
Compression","https:\u002F\u002Fresearch.google\u002Fblog\u002Fturboquant-redefining-ai-efficiency-with-extreme-compression\u002F",{"type":65,"title":3931,"url":3932,"context":68},"llama-cpp-turboquant","https:\u002F\u002Fgithub.com\u002FTheTom\u002Fllama-cpp-turboquant",{"type":65,"title":3934,"author":3935,"url":3936,"context":3937},"mlx-swift-lm","ekryski","https:\u002F\u002Fgithub.com\u002Fekryski\u002Fmlx-swift-lm","recommended",{"type":65,"title":3939,"author":3940,"url":3941,"context":73},"LLMTest_NeedleInAHaystack","gkamradt","https:\u002F\u002Fgithub.com\u002Fgkamradt\u002FLLMTest_NeedleInAHaystack",{"type":65,"title":3943,"author":3944,"url":3945,"context":73},"RULER","NVIDIA","https:\u002F\u002Fgithub.com\u002FNVIDIA\u002FRULER",{"relevance":82,"novelty":82,"quality":83,"actionability":53,"composite":84,"reasoning":3947},"Category: AI & LLMs. The article discusses a specific implementation of TurboQuant for KV cache compression, which is relevant to AI engineering. However, it lacks practical application details that the target audience could act on immediately, focusing more on technical specifications and performance metrics.","\u002Fsummaries\u002F2a9849ad35620d4f-turboquant-6-4x-kv-cache-compression-at-q8-0-speed-summary","2026-04-16 03:08:34",{"title":3694,"description":52},{"loc":3948},"2a9849ad35620d4f","__oneoff__","https:\u002F\u002Fgithub.com\u002FTheTom\u002Fturboquant_plus.git","summaries\u002F2a9849ad35620d4f-turboquant-6-4x-kv-cache-compression-at-q8-0-speed-summary",[98,100,99,101],"Implements TurboQuant in llama.cpp for 3.8-6.4x KV cache compression (turbo2\u002F3\u002F4 formats) with PPL near q8_0, matching prefill speed, and 0.9x decode on Apple Silicon, CUDA, AMD—plus Sparse V for +22.8% 
decode.",[],"B-pb0MWnzaai4T1NyHjyPyKUvFNfyk7UQEPHQ791a_c",{"id":3961,"title":3962,"ai":3963,"body":3968,"categories":4001,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":4002,"navigation":86,"path":4025,"published_at":4026,"question":59,"scraped_at":4027,"seo":4028,"sitemap":4029,"source_id":4030,"source_name":93,"source_type":94,"source_url":4031,"stem":4032,"tags":4033,"thumbnail_url":59,"tldr":4034,"tweet":59,"unknown_tags":4035,"__hash__":4036},"summaries\u002Fsummaries\u002F07f85059ce2b1c55-antangelmed-103b-moe-medical-llm-matches-40b-dense-summary.md","AntAngelMed: 103B MoE Medical LLM Matches 40B Dense at 7x Speed",{"provider":7,"model":8,"input_tokens":3964,"output_tokens":3965,"processing_time_ms":3966,"cost_usd":3967},8023,3168,43093,0.00316595,{"type":14,"value":3969,"toc":3996},[3970,3974,3977,3981,3984,3988],[17,3971,3973],{"id":3972},"sparse-moe-delivers-massive-capacity-at-low-compute","Sparse MoE Delivers Massive Capacity at Low Compute",[22,3975,3976],{},"AntAngelMed packs 103B total parameters into a 1\u002F32 activation-ratio Mixture-of-Experts (MoE) architecture, activating just 6.1B params per inference to match performance of ~40B dense models while achieving up to 7x efficiency over equivalently sized dense setups—speed advantages grow further with longer outputs. MoE works by routing inputs to a subset of 'expert' sub-networks instead of using all params per token, scaling knowledge without proportional compute hikes. Builds on Ling-flash-2.0 base via Ling Scaling Laws, with refinements like finer expert granularity, optimized shared expert ratio, attention balancing, auxiliary-loss-free sigmoid routing, Multi-Token Prediction (MTP) layer, QK-Norm, and Partial-RoPE (subset of attention heads). On H20 GPUs, hits >200 tokens\u002Fsecond (3x a 36B dense model), extends to 128K context via YaRN for full clinical docs or multi-turn dialogues. 
FP8 quantization + EAGLE3 speculative decoding yields throughput uplifts of 71% on HumanEval, 45% on GSM8K, and 94% on Math-500 at 32-way concurrency, stabilizing serving speed on these coding\u002Fmath proxies.",[17,3978,3980],{"id":3979},"three-stage-training-infuses-medical-depth","Three-Stage Training Infuses Medical Depth",[22,3982,3983],{},"Layer general reasoning atop medical specialization through: (1) Continual pre-training on vast medical corpora—encyclopedias, web text, papers—from Ling-flash-2.0 checkpoint; (2) Supervised Fine-Tuning (SFT) on mixed instructions preserving chain-of-thought via math\u002Fcoding\u002Flogic tasks alongside doctor-patient Q&A, diagnostics, ethics\u002Fsafety; (3) GRPO Reinforcement Learning (lighter PPO variant estimating baselines from group scores, per DeepSeekMath paper) with rewards targeting empathy, structured clinical outputs, safety, evidence-based reasoning to slash hallucinations. This progression embeds domain expertise without eroding broad capabilities.",[17,3985,3987],{"id":3986},"leads-benchmarks-deploys-easily-open-source","Leads Benchmarks, Deploys Easily Open-Source",[22,3989,3990,3991,3995],{},"Tops HealthBench (OpenAI's multi-turn clinical dialogues): #1 open-source, beats proprietary models, widest margin on HealthBench-Hard. Dominates MedAIBench (China Nat’l AI Medical Facility): elite in knowledge Q&A\u002Fethics-safety. #1 overall MedBench (36 datasets, ~700K samples across knowledge QA, understanding, generation, complex reasoning, safety\u002Fethics). Apache 2.0 weights (HuggingFace: MedAIBase\u002FAntAngelMed), MIT code (GitHub: MedAIBase\u002FAntAngelMed). Transformers load: ",[3992,3993,3994],"code",{},"AutoModelForCausalLM.from_pretrained(\"MedAIBase\u002FAntAngelMed\", device_map=\"auto\", trust_remote_code=True)",". Runs on vLLM v0.11.0 (4-GPU tensor parallel), SGLang+FlashAttention-3, vLLM-Ascend (Huawei 910B NPUs). 
From Health Information Center of Zhejiang Province, Ant Healthcare, Zhejiang Anzhen’er Medical AI Technology Co., Ltd.",{"title":52,"searchDepth":53,"depth":53,"links":3997},[3998,3999,4000],{"id":3972,"depth":53,"text":3973},{"id":3979,"depth":53,"text":3980},{"id":3986,"depth":53,"text":3987},[],{"content_references":4003,"triage":4022},[4004,4007,4010,4013,4016,4020],{"type":70,"title":4005,"url":4006,"context":73},"DeepSeekMath","https:\u002F\u002Farxiv.org\u002Fabs\u002F2402.03300",{"type":65,"title":4008,"url":4009,"context":3937},"AntAngelMed","https:\u002F\u002Fhuggingface.co\u002FMedAIBase\u002FAntAngelMed",{"type":65,"title":4011,"url":4012,"context":3937},"AntAngelMed GitHub Repo","https:\u002F\u002Fgithub.com\u002FMedAIBase\u002FAntAngelMed",{"type":79,"title":4014,"author":4015,"context":68},"Ling-flash-2.0","inclusionAI",{"type":4017,"title":4018,"author":4019,"context":73},"dataset","HealthBench","OpenAI",{"type":4017,"title":4021,"context":73},"MedBench",{"relevance":82,"novelty":83,"quality":83,"actionability":53,"composite":4023,"reasoning":4024},3.25,"Category: AI & LLMs. The article discusses a new medical LLM that showcases innovative architecture and efficiency, which is relevant to AI product builders. 
However, it lacks specific actionable insights or frameworks that the audience could directly implement in their projects.","\u002Fsummaries\u002F07f85059ce2b1c55-antangelmed-103b-moe-medical-llm-matches-40b-dense-summary","2026-05-12 21:21:47","2026-05-13 12:00:59",{"title":3962,"description":52},{"loc":4025},"07f85059ce2b1c55","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F12\u002Fmeet-antangelmed-a-103b-parameter-open-source-medical-language-model-built-on-a-1-32-activation-ratio-moe-architecture\u002F","summaries\u002F07f85059ce2b1c55-antangelmed-103b-moe-medical-llm-matches-40b-dense-summary",[98,100,99],"103B-param open-source medical LLM activates only 6.1B params via 1\u002F32 MoE, rivals 40B dense models with 7x efficiency, tops HealthBench\u002FMedBench, runs 200+ tps on H20.",[],"BMkdtRqd6qJuSshJwJCoVJVxaHNukE4u3QyIRxxvstU",{"id":4038,"title":4039,"ai":4040,"body":4045,"categories":4100,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":4101,"navigation":86,"path":4119,"published_at":4120,"question":59,"scraped_at":4121,"seo":4122,"sitemap":4123,"source_id":4124,"source_name":93,"source_type":94,"source_url":4125,"stem":4126,"tags":4127,"thumbnail_url":59,"tldr":4128,"tweet":59,"unknown_tags":4129,"__hash__":4130},"summaries\u002Fsummaries\u002F79f82c07ea7441fe-trl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary.md","TRL Code Guide: SFT to GRPO LLM Alignment on T4 GPU",{"provider":7,"model":8,"input_tokens":4041,"output_tokens":4042,"processing_time_ms":4043,"cost_usd":4044},9458,2615,35753,0.00269195,{"type":14,"value":4046,"toc":4094},[4047,4051,4058,4062,4072,4076,4082,4086],[17,4048,4050],{"id":4049},"lora-and-trl-setup-enables-post-training-on-limited-hardware","LoRA and TRL Setup Enables Post-Training on Limited Hardware",[22,4052,4053,4054,4057],{},"Use LoRA (r=8, alpha=16, dropout=0.05, targets=",[3895,4055,4056],{},"'q_proj','k_proj','v_proj','o_proj'",") with 
TRL trainers to adapt Qwen\u002FQwen2.5-0.5B-Instruct on T4 GPU (16GB). Common args across stages: num_train_epochs=1, gradient_checkpointing=True, bf16 if supported else fp16, logging_steps=10, report_to=\"none\", save_strategy=\"no\". Install stack: torchao>=0.16, trl>=0.20, transformers>=4.45, peft>=0.13, bitsandbytes. Helpers like chat_generate apply chat template, generate with temp=0.7\u002Ftop_p=0.9. Cleanup VRAM with gc.collect() + torch.cuda.empty_cache() between stages to fit in Colab.",[17,4059,4061],{"id":4060},"sft-and-rm-build-imitation-and-reward-signals","SFT and RM Build Imitation and Reward Signals",[22,4063,4064,4065,4068,4069,4071],{},"For Supervised Fine-Tuning, load trl-lib\u002FCapybara (train",[3895,4066,4067],{},"[:300]","), use SFTConfig(per_device_train_batch_size=2, gradient_accumulation_steps=4, learning_rate=2e-4, max_length=768). Trainer imitates high-quality chat responses; post-train inference on \"Explain bias-variance tradeoff in two sentences\" yields coherent output. Reward Modeling on trl-lib\u002Fultrafeedback_binarized (train",[3895,4070,4067],{},") uses RewardConfig(batch_size=2, accum_steps=2, lr=1e-4, max_length=512), LoRA task_type=\"SEQ_CLS\". Trains to score chosen vs. rejected pairs, producing a preference-based reward without explicit RL.",[17,4073,4075],{"id":4074},"dpo-skips-rm-for-direct-preference-alignment","DPO Skips RM for Direct Preference Alignment",[22,4077,4078,4079,4081],{},"DPOTrainer on same ultrafeedback_binarized",[3895,4080,4067],{}," simplifies via implicit rewards: DPOConfig(batch_size=1, accum_steps=4, lr=5e-6, beta=0.1, max_length=512, max_prompt_length=256). Beta controls KL-divergence from reference policy, preventing mode collapse. Optimizes policy to prefer chosen over rejected responses directly, reducing steps vs. 
traditional RM+PPO.",[17,4083,4085],{"id":4084},"grpo-uses-custom-rewards-to-sharpen-reasoning","GRPO Uses Custom Rewards to Sharpen Reasoning",[22,4087,4088,4089,4093],{},"GRPOTrainer generates num_generations=4 completions per prompt (max_prompt_length=128, max_completion_length=96, max_steps=15), ranks via reward_funcs. Custom dataset: 200 synthetic math problems (e.g., \"Solve 17 + 28 =\", gold=eval). Rewards: correctness_reward (1.0 if last extracted number matches gold else 0), brevity_reward (max(0,1-len(c)\u002F200)*",[4090,4091,4092],"em",{},"0.2). GRPOConfig(lr=1e-5, batch=2, accum=2). Inference on \"17+28?\", \"9*","7?\", \"100-47?\" produces accurate, concise answers like final numbers, improving verifiable task performance over base.",{"title":52,"searchDepth":53,"depth":53,"links":4095},[4096,4097,4098,4099],{"id":4049,"depth":53,"text":4050},{"id":4060,"depth":53,"text":4061},{"id":4074,"depth":53,"text":4075},{"id":4084,"depth":53,"text":4085},[114],{"content_references":4102,"triage":4115},[4103,4106,4108,4110,4112],{"type":65,"title":4104,"url":4105,"context":68},"TRL","https:\u002F\u002Fgithub.com\u002Fhuggingface\u002Ftrl",{"type":4017,"title":4107,"context":68},"trl-lib\u002FCapybara",{"type":4017,"title":4109,"context":68},"trl-lib\u002Fultrafeedback_binarized",{"type":65,"title":4111,"context":68},"Qwen\u002FQwen2.5-0.5B-Instruct",{"type":79,"title":4113,"url":4114,"context":3937},"trl_llm_post_training_sft_dpo_grpo_marktechpost.py","https:\u002F\u002Fgithub.com\u002FMarktechpost\u002FAI-Agents-Projects-Tutorials\u002Fblob\u002Fmain\u002FLLM%20Projects\u002Ftrl_llm_post_training_sft_dpo_grpo_marktechpost.py",{"relevance":4116,"novelty":83,"quality":83,"actionability":4116,"composite":4117,"reasoning":4118},5,4.55,"Category: AI & LLMs. The article provides a detailed guide on using TRL and LoRA for LLM post-training, addressing practical applications for developers looking to implement AI features. 
It includes specific configurations and techniques that can be directly applied in production, making it highly actionable.","\u002Fsummaries\u002F79f82c07ea7441fe-trl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary","2026-05-01 20:52:08","2026-05-03 17:01:49",{"title":4039,"description":52},{"loc":4119},"79f82c07ea7441fe","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F01\u002Fa-coding-guide-on-llm-post-training-with-trl-from-supervised-fine-tuning-to-dpo-and-grpo-reasoning\u002F","summaries\u002F79f82c07ea7441fe-trl-code-guide-sft-to-grpo-llm-alignment-on-t4-gpu-summary",[98,101,99],"Train Qwen2.5-0.5B via SFT, RM, DPO, GRPO using TRL+LoRA on Colab T4: configs include r=8 LoRA, 300-sample datasets, epochs=1, small batches\u002Faccum for memory efficiency, custom math rewards boost reasoning.",[],"py8Fe1-Noi99CHywKy61Q363dqRBmUxl6tZ9TDJOp3E",{"id":4132,"title":4133,"ai":4134,"body":4139,"categories":4289,"created_at":59,"date_modified":59,"description":52,"extension":60,"faq":59,"featured":61,"kicker_label":59,"meta":4290,"navigation":86,"path":4319,"published_at":59,"question":59,"scraped_at":4320,"seo":4321,"sitemap":4322,"source_id":4323,"source_name":3953,"source_type":94,"source_url":4324,"stem":4325,"tags":4326,"thumbnail_url":59,"tldr":4327,"tweet":59,"unknown_tags":4328,"__hash__":4329},"summaries\u002Fsummaries\u002F5f72f336c67bc8d8-gemma-2-open-llms-trained-on-13t-tokens-top-benchm-summary.md","Gemma 2: Open LLMs Trained on 13T Tokens, Top Benchmarks",{"provider":7,"model":8,"input_tokens":4135,"output_tokens":4136,"processing_time_ms":4137,"cost_usd":4138},6087,2342,14579,0.0023659,{"type":14,"value":4140,"toc":4284},[4141,4145,4148,4151,4213,4217,4220,4223,4227,4230,4233,4281],[17,4142,4144],{"id":4143},"deploy-high-performance-llms-on-limited-hardware","Deploy High-Performance LLMs on Limited Hardware",[22,4146,4147],{},"Gemma 2 models (2B, 9B, 27B parameters) are text-to-text, decoder-only LLMs optimized for question answering, 
summarization, and reasoning. Their small size enables deployment on laptops, desktops, or personal cloud setups, unlike larger models needing massive clusters. The 27B was trained on 13T tokens, the 9B on 8T, and the 2B on 2T, drawn from diverse sources like web docs, code, math\u002Fscience, and multilingual text. Preprocessing filters duplicates, PII, low-quality content, and adult material using heuristics and classifiers, ensuring broad task coverage without common failure modes.",[22,4149,4150],{},"On benchmarks, larger variants excel: 27B PT hits 75.2 MMLU (5-shot), 86.4 HellaSwag (10-shot), 51.8 HumanEval pass@1, 74.0 GSM8K (5-shot maj@1); 9B PT at 71.3 MMLU, 40.2 HumanEval; 2B PT at 51.3 MMLU. They surpass comparably-sized open alternatives across reasoning (ARC-c 71.4 for 27B), QA (TriviaQA 83.7), and math (MATH 42.3), proving state-of-the-art efficiency.",[3750,4152,4153,4169],{},[3753,4154,4155],{},[3756,4156,4157,4160,4163,4166],{},[3759,4158,4159],{},"Benchmark",[3759,4161,4162],{},"2B PT",[3759,4164,4165],{},"9B PT",[3759,4167,4168],{},"27B PT",[3778,4170,4171,4185,4199],{},[3756,4172,4173,4176,4179,4182],{},[3783,4174,4175],{},"MMLU 5-shot",[3783,4177,4178],{},"51.3",[3783,4180,4181],{},"71.3",[3783,4183,4184],{},"75.2",[3756,4186,4187,4190,4193,4196],{},[3783,4188,4189],{},"HumanEval pass@1",[3783,4191,4192],{},"17.7",[3783,4194,4195],{},"40.2",[3783,4197,4198],{},"51.8",[3756,4200,4201,4204,4207,4210],{},[3783,4202,4203],{},"GSM8K 5-shot",[3783,4205,4206],{},"23.9",[3783,4208,4209],{},"68.6",[3783,4211,4212],{},"74.0",[17,4214,4216],{"id":4215},"train-efficiently-with-tpuv5p-jax-and-pathways","Train Efficiently with TPUv5p, JAX, and Pathways",[22,4218,4219],{},"Leverage TPUv5p hardware for matrix-heavy training, offering higher throughput than GPUs for LLMs. Use JAX for hardware acceleration and ML Pathways for multi-task orchestration in a single Python process, simplifying workflows as in Gemini papers. 
This combo scales to 13T tokens while cutting development overhead—ideal for replicating on custom infra.",[22,4221,4222],{},"Data mix includes web, code, math, and polyglot sources; dedupe at sentence\u002Fparagraph levels, filter via quality classifiers, and remove PII\u002Fadult content to boost generalization without memorization risks.",[17,4224,4226],{"id":4225},"pass-safety-and-dangerous-capability-thresholds","Pass Safety and Dangerous Capability Thresholds",[22,4228,4229],{},"Instruction-tuned (IT) variants score low toxicity (RealToxicity 8.84 avg for 27B IT) and bias (CrowS-Pairs 36.67 top-1), with strong BBQ (86.94 Disambig for 27B) and TruthfulQA (51.60). They meet Google's internal policies on child safety, harms, and memorization.",[22,4231,4232],{},"Dangerous evals cap risks: 27B IT solves 34\u002F76 InterCode-CTF cyber challenges (low success), 1\u002F13 internal CTF, 0\u002F13 HackTheBox; persuasion tests show 81% find it interesting but minimal harmful shifts (1% toward incorrect beliefs, £3.72 mean donation). 
Mitigate via preprocessing, post-training, and monitoring—users must add safeguards for production.",[3750,4234,4235,4251],{},[3753,4236,4237],{},[3756,4238,4239,4242,4245,4248],{},[3759,4240,4241],{},"Safety Benchmark",[3759,4243,4244],{},"2B IT",[3759,4246,4247],{},"9B IT",[3759,4249,4250],{},"27B IT",[3778,4252,4253,4267],{},[3756,4254,4255,4258,4261,4264],{},[3783,4256,4257],{},"RealToxicity avg",[3783,4259,4260],{},"8.16",[3783,4262,4263],{},"8.25",[3783,4265,4266],{},"8.84",[3756,4268,4269,4272,4275,4278],{},[3783,4270,4271],{},"TruthfulQA",[3783,4273,4274],{},"43.72",[3783,4276,4277],{},"50.27",[3783,4279,4280],{},"51.60",[22,4282,4283],{},"Limitations: May amplify biases, hallucinate, or violate policies without filters; not for high-risk uses like medical\u002Flegal advice.",{"title":52,"searchDepth":53,"depth":53,"links":4285},[4286,4287,4288],{"id":4143,"depth":53,"text":4144},{"id":4215,"depth":53,"text":4216},{"id":4225,"depth":53,"text":4226},[],{"content_references":4291,"triage":4316},[4292,4297,4300,4303,4307,4310,4313],{"type":70,"title":4293,"author":4294,"publisher":4295,"url":4296,"context":73},"Gemma","Gemma Team","Kaggle","https:\u002F\u002Fwww.kaggle.com\u002Fm\u002F3301",{"type":70,"title":4298,"url":4299,"context":73},"Gemma 2 technical report","https:\u002F\u002Fstorage.googleapis.com\u002Fdeepmind-media\u002Fgemma\u002Fgemma-2-report.pdf",{"type":70,"title":4301,"url":4302,"context":73},"Evaluating Frontier Models for Dangerous Capabilities","https:\u002F\u002Farxiv.org\u002Fabs\u002F2403.13793",{"type":4304,"title":4305,"url":4306,"context":73},"report","2023 Google AI Principles Progress Update","https:\u002F\u002Fstorage.googleapis.com\u002Fgweb-uniblog-publish-prod\u002Fdocuments\u002F2023_Google_AI_Principles_Progress_Update.pdf#page=11",{"type":65,"title":4308,"url":4309,"context":68},"Tensor Processing Unit 
(TPU)","https:\u002F\u002Fcloud.google.com\u002Ftpu\u002Fdocs\u002Fintro-to-tpu",{"type":65,"title":4311,"url":4312,"context":68},"JAX","https:\u002F\u002Fgithub.com\u002Fjax-ml\u002Fjax",{"type":79,"title":4314,"url":4315,"context":68},"ML Pathways","https:\u002F\u002Fblog.google\u002Ftechnology\u002Fai\u002Fintroducing-pathways-next-generation-ai-architecture\u002F",{"relevance":83,"novelty":82,"quality":83,"actionability":82,"composite":4317,"reasoning":4318},3.6,"Category: AI & LLMs. The article discusses the performance and deployment of the Gemma 2 LLMs, which addresses the audience's interest in practical AI applications. It provides insights into model efficiency and training techniques, but lacks detailed actionable steps for implementation.","\u002Fsummaries\u002F5f72f336c67bc8d8-gemma-2-open-llms-trained-on-13t-tokens-top-benchm-summary","2026-04-16 03:04:59",{"title":4133,"description":52},{"loc":4319},"5f72f336c67bc8d8","https:\u002F\u002Fai.google.dev\u002Fgemma\u002Fdocs\u002Fcore\u002Fmodel_card_2","summaries\u002F5f72f336c67bc8d8-gemma-2-open-llms-trained-on-13t-tokens-top-benchm-summary",[98,100,99],"Google's Gemma 2 family (2B, 9B, 27B params) are lightweight open decoder-only LLMs trained on 2-13T tokens, outperforming similar-sized open models on MMLU (75.2 for 27B), HumanEval (51.8), and safety benchmarks while running on laptops.",[],"FxYd5aKdUzJM6mZmlDOJJQimECfkFIoTj2FT-EqQlhs"]