[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-866e10e8d404e5bf-sagemaker-fine-tuning-lora-beats-qlora-on-cost-per-summary":3,"summaries-facets-categories":106,"summary-related-866e10e8d404e5bf-sagemaker-fine-tuning-lora-beats-qlora-on-cost-per-summary":3691},{"id":4,"title":5,"ai":6,"body":13,"categories":69,"created_at":71,"date_modified":71,"description":62,"extension":72,"faq":71,"featured":73,"kicker_label":71,"meta":74,"navigation":87,"path":88,"published_at":89,"question":71,"scraped_at":90,"seo":91,"sitemap":92,"source_id":93,"source_name":94,"source_type":95,"source_url":96,"stem":97,"tags":98,"thumbnail_url":71,"tldr":103,"tweet":71,"unknown_tags":104,"__hash__":105},"summaries\u002Fsummaries\u002F866e10e8d404e5bf-sagemaker-fine-tuning-lora-beats-qlora-on-cost-per-summary.md","SageMaker Fine-Tuning: LoRA Beats QLoRA on Cost-Perf Balance",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",8501,2110,17961,0.00273255,{"type":14,"value":15,"toc":61},"minimark",[16,21,25,28,31,35,38,41,44,48,51,54,58],[17,18,20],"h2",{"id":19},"fine-tuning-methods-trade-offs-in-params-memory-and-speed","Fine-Tuning Methods: Trade-Offs in Params, Memory, and Speed",[22,23,24],"p",{},"Full fine-tuning updates all 7B parameters of models like Llama2-7B, delivering top accuracy (e.g., highest Rouge1\u002F2\u002FL, Bert F1, Intent Accuracy on Banking77 dataset) but at highest cost and time—ideal only for unrestricted budgets or compliance needs where no accuracy compromise is allowed.",[22,26,27],{},"LoRA (PEFT) freezes original weights and trains low-rank matrices A\u002FB: for a 2048x2048 update matrix (4M params), it uses (2048x4) + (4x2048) = 16K params, a 96% reduction. 
The adapter update is applied on the fly during inference, preserving general knowledge while the model specializes on domain data such as finance intents; accuracy drops slightly vs. full fine-tuning, but the GPU and time savings are massive, with only a minor inference delay unless the adapter is merged.",[22,29,30],{},"QLoRA quantizes the frozen base weights to 4-bit NF4 (e.g., 0.117 → 0.12), yielding 8x memory savings because NF4 allocates more precision near zero and less to outliers. It enables fine-tuning large models on a single GPU but slows training by 25%+ due to gradient checkpointing (trading compute for 45% activation-memory savings), dequantization on every forward\u002Fbackward pass, and the paged_adam_8bit optimizer; use it for prototypes or under severe constraints where a slight accuracy loss is acceptable.",[17,32,34],{"id":33},"aws-sagemaker-implementation-universal-script-across-approaches","AWS SageMaker Implementation: Universal Script Across Approaches",[22,36,37],{},"Prepare the Banking77 dataset (HF: PolyAI\u002Fbanking77) as train\u002Ftest .jsonl files and upload them to an S3 bucket (e.g., finetuning-llm-blog-harshitdawar\u002FBanking77\u002F{train,test}). Bundle requirements.txt (key libs: torch, transformers, peft, bitsandbytes, trl, datasets, accelerate) and training_script.py into training-scripts.tar.gz; the script handles model_name (Llama2-7B, Mistral7B-v0.1, GPT-NeoX-20B), approach (full\u002Flora\u002Fqlora), epochs, batch_size=8, lr (auto-tuned), and hf_token for gated models.",[22,39,40],{},"Add an S3 bucket policy granting SageMaker access. In SageMaker Training Jobs, use the HuggingFace PyTorch container (e.g., 763104351884.dkr.ecr.ap-south-1.amazonaws.com\u002Fhuggingface-pytorch-training:2.1.0-...) on ml.g5.xlarge or larger GPU instances (scale per the table: e.g., Llama2 QLoRA on g5.xlarge at batch=8; GPT-NeoX-20B LoRA on p4d.24xlarge at batch=1). Hyperparameters reference the S3 code and output paths; data channels point at train\u002Ftest; output goes to S3\u002Fmodels\u002F{model}-{approach}. 
Spot instances are optional; ensure the IAM role has S3 permissions and request service quotas for the chosen instance types.",[22,42,43],{},"Run jobs for the 9 model-approach combinations (excluding GPT-NeoX full fine-tuning due to cost); evaluate on 500 test samples using Rouge, Bert F1, Intent Accuracy, Parse Rate, and Inference Seconds.",[17,45,47],{"id":46},"results-lora-wins-on-cost-per-performance-point","Results: LoRA Wins on Cost per Performance Point",[22,49,50],{},"On Banking77 intents, full fine-tuning tops every metric (e.g., Llama2 full posts the highest Intent Accuracy), LoRA comes close with a slight drop, and QLoRA scores lowest but remains a viable baseline. On training time and cost, QLoRA is cheapest upfront thanks to its memory savings yet totals higher due to runtime overheads; LoRA is optimal, running far cheaper than full fine-tuning while beating QLoRA on performance per dollar. At inference, full and LoRA models are faster per sample than QLoRA, so cost per performance point favors LoRA.",[22,52,53],{},"Resources: fine-tuned model sizes are roughly the original's (merging adapters restores full size); GPU utilization is high across the board (e.g., Llama2 QLoRA peaks at 100% GPU memory), and QLoRA maxes out smaller instances. The author spent over $200 across runs; get credits or cost estimates first.",[17,55,57],{"id":56},"recommendations-match-approach-to-constraints","Recommendations: Match Approach to Constraints",[22,59,60],{},"Full fine-tuning: maximum accuracy, no compromises (e.g., regulated finance). LoRA: the production sweet spot, with a 96% cut in trainable parameters, near-full performance, and preserved base knowledge. QLoRA: quick prototypes or tight resource constraints (it democratizes research). Scale instances per model (e.g., full fine-tuning of 7B on g5.12xlarge; 20B LoRA on p4d.24xlarge). 
Merge LoRA for inference speed; test baselines before scaling.",{"title":62,"searchDepth":63,"depth":63,"links":64},"",2,[65,66,67,68],{"id":19,"depth":63,"text":20},{"id":33,"depth":63,"text":34},{"id":46,"depth":63,"text":47},{"id":56,"depth":63,"text":57},[70],"AI & LLMs",null,"md",false,{"content_references":75,"triage":82},[76],{"type":77,"title":78,"author":79,"url":80,"context":81},"dataset","Banking77","PolyAI","https:\u002F\u002Fhuggingface.co\u002Fdatasets\u002FPolyAI\u002Fbanking77","mentioned",{"relevance":83,"novelty":84,"quality":84,"actionability":84,"composite":85,"reasoning":86},5,4,4.35,"Category: AI & LLMs. The article provides a detailed comparison of fine-tuning methods for large language models, specifically focusing on LoRA and QLoRA, which directly addresses the audience's need for practical AI engineering insights. It includes specific implementation steps for using AWS SageMaker, making it actionable for developers looking to integrate these techniques into their workflows.",true,"\u002Fsummaries\u002F866e10e8d404e5bf-sagemaker-fine-tuning-lora-beats-qlora-on-cost-per-summary","2026-05-03 07:33:04","2026-05-03 17:01:03",{"title":5,"description":62},{"loc":88},"866e10e8d404e5bf","Towards AI","article","https:\u002F\u002Fpub.towardsai.net\u002Fthe-ultimate-guide-to-fine-tuning-foundation-models-on-aws-sagemaker-efc673509bb2?source=rss----98111c9905da---4","summaries\u002F866e10e8d404e5bf-sagemaker-fine-tuning-lora-beats-qlora-on-cost-per-summary",[99,100,101,102],"llm","machine-learning","devops","cloud","LoRA cuts trainable params by 96% vs full fine-tuning, balancing cost savings and accuracy on Llama2-7B\u002FMistral7B; QLoRA saves 8x memory but trains slower due to dequantization 
overhead.",[],"zrXCCVv4m3PpFgLRs2NdWo10XbP8h3vRPQKkaW6c8mg",[107,110,113,115,118,121,123,125,127,129,131,133,136,138,140,142,144,146,148,150,152,154,157,160,162,164,167,169,171,174,176,178,180,182,184,186,188,190,192,194,196,198,200,202,204,206,208,210,212,214,216,218,220,222,224,226,228,230,232,234,236,238,240,242,244,246,248,250,252,254,256,258,260,262,264,266,268,270,272,274,276,278,280,282,284,286,288,290,292,294,296,298,300,302,304,306,308,310,312,314,316,318,320,322,324,326,328,330,332,334,336,338,340,342,344,346,348,350,352,354,356,358,360,362,364,366,368,370,372,374,376,378,380,382,384,386,388,390,392,394,396,398,400,402,404,406,408,410,412,414,416,418,420,422,424,426,428,431,433,435,437,439,441,443,445,447,449,451,453,455,457,459,461,463,465,467,469,471,473,475,477,479,481,483,485,487,489,491,493,495,497,499,501,503,505,507,509,511,513,515,517,519,521,523,525,527,529,531,533,535,537,539,541,543,545,547,549,551,553,555,557,559,561,563,565,567,569,571,573,575,577,579,581,583,585,587,589,591,593,595,597,599,601,603,605,607,609,611,613,615,617,619,621,623,625,627,629,631,633,635,637,639,641,643,645,647,649,651,653,655,657,659,661,663,665,667,669,671,673,675,677,679,681,683,685,687,689,691,693,695,697,699,701,703,705,707,709,711,713,715,717,719,721,723,725,727,729,731,733,735,737,739,741,743,745,747,749,751,753,755,757,759,761,763,765,767,769,771,773,775,777,779,781,783,785,787,789,791,793,795,797,799,801,803,805,807,809,811,813,815,817,819,821,823,825,827,829,831,833,835,837,839,841,843,845,847,849,851,853,855,857,859,861,863,865,867,869,871,873,875,877,879,881,883,885,887,889,891,893,895,897,899,901,903,905,907,909,911,913,915,917,919,921,923,925,927,929,931,933,935,937,939,941,943,945,947,949,951,953,955,957,959,961,963,965,967,969,971,973,975,977,979,981,983,985,987,989,991,993,995,997,999,1001,1003,1005,1007,1009,1011,1013,1015,1017,1019,1021,1023,1025,1027,1029,1031,1033,1035,1037,1039,1041,1043,1045,1047,1049,1051,1053,1055,1057,1059,1061,1063,1065,1067,1
069,1071,1073,1075,1077,1079,1081,1083,1085,1087,1089,1091,1093,1095,1097,1099,1101,1103,1105,1107,1109,1111,1113,1115,1117,1119,1121,1123,1125,1127,1129,1131,1133,1135,1137,1139,1141,1143,1145,1147,1149,1151,1153,1155,1157,1159,1161,1163,1165,1167,1169,1171,1173,1175,1177,1179,1181,1183,1185,1187,1189,1191,1193,1195,1197,1199,1201,1203,1205,1207,1209,1211,1213,1215,1217,1219,1221,1223,1225,1227,1229,1231,1233,1235,1237,1239,1241,1243,1245,1247,1249,1251,1253,1255,1257,1259,1261,1263,1265,1267,1269,1271,1273,1275,1277,1279,1281,1283,1285,1287,1289,1291,1293,1295,1297,1299,1301,1303,1305,1307,1309,1311,1313,1315,1317,1319,1321,1323,1325,1327,1329,1331,1333,1335,1337,1339,1341,1343,1345,1347,1349,1351,1353,1355,1357,1359,1361,1363,1365,1367,1369,1371,1373,1375,1377,1379,1381,1383,1385,1387,1389,1391,1393,1395,1397,1399,1401,1403,1405,1407,1409,1411,1413,1415,1417,1419,1421,1423,1425,1427,1429,1431,1433,1435,1437,1439,1441,1443,1445,1447,1449,1451,1453,1455,1457,1459,1461,1463,1465,1467,1469,1471,1473,1475,1477,1479,1481,1483,1485,1487,1489,1491,1493,1495,1497,1499,1501,1503,1505,1507,1509,1511,1513,1515,1517,1519,1521,1523,1525,1527,1529,1531,1533,1535,1537,1539,1541,1543,1545,1547,1549,1551,1553,1555,1557,1559,1561,1563,1565,1567,1569,1571,1573,1575,1577,1579,1581,1583,1585,1587,1589,1591,1593,1595,1597,1599,1601,1603,1605,1607,1609,1611,1613,1615,1617,1619,1621,1623,1625,1627,1629,1631,1633,1635,1637,1639,1641,1643,1645,1647,1649,1651,1653,1655,1657,1659,1661,1663,1665,1667,1669,1671,1673,1675,1677,1679,1681,1683,1685,1687,1689,1691,1693,1695,1697,1699,1701,1703,1705,1707,1709,1711,1713,1715,1717,1719,1721,1723,1725,1727,1729,1731,1733,1735,1737,1739,1741,1743,1745,1747,1749,1751,1753,1755,1757,1759,1761,1763,1765,1767,1769,1771,1773,1775,1777,1779,1781,1783,1785,1787,1789,1791,1793,1795,1797,1799,1801,1803,1805,1807,1809,1811,1813,1815,1817,1819,1821,1823,1825,1827,1829,1831,1833,1835,1837,1839,1841,1843,1845,1847,1849,1851,1853,1855,1857,1859,1861,1863,1865,1867,1
869,1871,1873,1875,1877,1879,1881,1883,1885,1887,1889,1891,1893,1895,1897,1899,1901,1903,1905,1907,1909,1911,1913,1915,1917,1919,1921,1923,1925,1927,1929,1931,1933,1935,1937,1939,1941,1943,1945,1947,1949,1951,1953,1955,1957,1959,1961,1963,1965,1967,1969,1971,1973,1975,1977,1979,1981,1983,1985,1987,1989,1991,1993,1995,1997,1999,2001,2003,2005,2007,2009,2011,2013,2015,2017,2019,2021,2023,2025,2027,2029,2031,2033,2035,2037,2039,2041,2043,2045,2047,2049,2051,2053,2055,2057,2059,2061,2063,2065,2067,2069,2071,2073,2075,2077,2079,2081,2083,2085,2087,2089,2091,2093,2095,2097,2099,2101,2103,2105,2107,2109,2111,2113,2115,2117,2119,2121,2123,2125,2127,2129,2131,2133,2135,2137,2139,2141,2143,2145,2147,2149,2151,2153,2155,2157,2159,2161,2163,2165,2167,2169,2171,2173,2175,2177,2179,2181,2183,2185,2187,2189,2191,2193,2195,2197,2199,2201,2203,2205,2207,2209,2211,2213,2215,2217,2219,2221,2223,2225,2227,2229,2231,2233,2235,2237,2239,2241,2243,2245,2247,2249,2251,2253,2255,2257,2259,2261,2263,2265,2267,2269,2271,2273,2275,2277,2279,2281,2283,2285,2287,2289,2291,2293,2295,2297,2299,2301,2303,2305,2307,2309,2311,2313,2315,2317,2319,2321,2323,2325,2327,2329,2331,2333,2335,2337,2339,2341,2343,2345,2347,2349,2351,2353,2355,2357,2359,2361,2363,2365,2367,2369,2371,2373,2375,2377,2379,2381,2383,2385,2387,2389,2391,2393,2395,2397,2399,2401,2403,2405,2407,2409,2411,2413,2415,2417,2419,2421,2423,2425,2427,2429,2431,2433,2435,2437,2439,2441,2443,2445,2447,2449,2451,2453,2455,2457,2459,2461,2463,2465,2467,2469,2471,2473,2475,2477,2479,2481,2483,2485,2487,2489,2491,2493,2495,2497,2499,2501,2503,2505,2507,2509,2511,2513,2515,2517,2519,2521,2523,2525,2527,2529,2531,2533,2535,2537,2539,2541,2543,2545,2547,2549,2551,2553,2555,2557,2559,2561,2563,2565,2567,2569,2571,2573,2575,2577,2579,2581,2583,2585,2587,2589,2591,2593,2595,2597,2599,2601,2603,2605,2607,2609,2611,2613,2615,2617,2619,2621,2623,2625,2627,2629,2631,2633,2635,2637,2639,2641,2643,2645,2647,2649,2651,2653,2655,2657,2659,2661,2663,2665,2667,2
669,2671,2673,2675,2677,2679,2681,2683,2685,2687,2689,2691,2693,2695,2697,2699,2701,2703,2705,2707,2709,2711,2713,2715,2717,2719,2721,2723,2725,2727,2729,2731,2733,2735,2737,2739,2741,2743,2745,2747,2749,2751,2753,2755,2757,2759,2761,2763,2765,2767,2769,2771,2773,2775,2777,2779,2781,2783,2785,2787,2789,2791,2793,2795,2797,2799,2801,2803,2805,2807,2809,2811,2813,2815,2817,2819,2821,2823,2825,2827,2829,2831,2833,2835,2837,2839,2841,2843,2845,2847,2849,2851,2853,2855,2857,2859,2861,2863,2865,2867,2869,2871,2873,2875,2877,2879,2881,2883,2885,2887,2889,2891,2893,2895,2897,2899,2901,2903,2905,2907,2909,2911,2913,2915,2917,2919,2921,2923,2925,2927,2929,2931,2933,2935,2937,2939,2941,2943,2945,2947,2949,2951,2953,2955,2957,2959,2961,2963,2965,2967,2969,2971,2973,2975,2977,2979,2981,2983,2985,2987,2989,2991,2993,2995,2997,2999,3001,3003,3005,3007,3009,3011,3013,3015,3017,3019,3021,3023,3025,3027,3029,3031,3033,3035,3037,3039,3041,3043,3045,3047,3049,3051,3053,3055,3057,3059,3061,3063,3065,3067,3069,3071,3073,3075,3077,3079,3081,3083,3085,3087,3089,3091,3093,3095,3097,3099,3101,3103,3105,3107,3109,3111,3113,3115,3117,3119,3121,3123,3125,3127,3129,3131,3133,3135,3137,3139,3141,3143,3145,3147,3149,3151,3153,3155,3157,3159,3161,3163,3165,3167,3169,3171,3173,3175,3177,3179,3181,3183,3185,3187,3189,3191,3193,3195,3197,3199,3201,3203,3205,3207,3209,3211,3213,3215,3217,3219,3221,3223,3225,3227,3229,3231,3233,3235,3237,3239,3241,3243,3245,3247,3249,3251,3253,3255,3257,3259,3261,3263,3265,3267,3269,3271,3273,3275,3277,3279,3281,3283,3285,3287,3289,3291,3293,3295,3297,3299,3301,3303,3305,3307,3309,3311,3313,3315,3317,3319,3321,3323,3325,3327,3329,3331,3333,3335,3337,3339,3341,3343,3345,3347,3349,3351,3353,3355,3357,3359,3361,3363,3365,3367,3369,3371,3373,3375,3377,3379,3381,3383,3385,3387,3389,3391,3393,3395,3397,3399,3401,3403,3405,3407,3409,3411,3413,3415,3417,3419,3421,3423,3425,3427,3429,3431,3433,3435,3437,3439,3441,3443,3445,3447,3449,3451,3453,3455,3457,3459,3461,3463,3465,3467,3
469,3471,3473,3475,3477,3479,3481,3483,3485,3487,3489,3491,3493,3495,3497,3499,3501,3503,3505,3507,3509,3511,3513,3515,3517,3519,3521,3523,3525,3527,3529,3531,3533,3535,3537,3539,3541,3543,3545,3547,3549,3551,3553,3555,3557,3559,3561,3563,3565,3567,3569,3571,3573,3575,3577,3579,3581,3583,3585,3587,3589,3591,3593,3595,3597,3599,3601,3603,3605,3607,3609,3611,3613,3615,3617,3619,3621,3623,3625,3627,3629,3631,3633,3635,3637,3639,3641,3643,3645,3647,3649,3651,3653,3655,3657,3659,3661,3663,3665,3667,3669,3671,3673,3675,3677,3679,3681,3683,3685,3687,3689],{"categories":108},[109],"Developer Productivity",{"categories":111},[112],"Business & SaaS",{"categories":114},[70],{"categories":116},[117],"AI Automation",{"categories":119},[120],"Product Strategy",{"categories":122},[70],{"categories":124},[109],{"categories":126},[112],{"categories":128},[],{"categories":130},[70],{"categories":132},[],{"categories":134},[135],"AI News & Trends",{"categories":137},[117],{"categories":139},[135],{"categories":141},[117],{"categories":143},[117],{"categories":145},[70],{"categories":147},[70],{"categories":149},[135],{"categories":151},[70],{"categories":153},[],{"categories":155},[156],"Design & Frontend",{"categories":158},[159],"Data Science & Visualization",{"categories":161},[135],{"categories":163},[],{"categories":165},[166],"Software Engineering",{"categories":168},[70],{"categories":170},[117],{"categories":172},[173],"Marketing & 
Growth",{"categories":175},[70],{"categories":177},[117],{"categories":179},[],{"categories":181},[],{"categories":183},[156],{"categories":185},[117],{"categories":187},[109],{"categories":189},[156],{"categories":191},[70],{"categories":193},[117],{"categories":195},[135],{"categories":197},[],{"categories":199},[],{"categories":201},[117],{"categories":203},[166],{"categories":205},[],{"categories":207},[112],{"categories":209},[],{"categories":211},[],{"categories":213},[117],{"categories":215},[117],{"categories":217},[70],{"categories":219},[],{"categories":221},[166],{"categories":223},[],{"categories":225},[],{"categories":227},[],{"categories":229},[70],{"categories":231},[173],{"categories":233},[156],{"categories":235},[156],{"categories":237},[70],{"categories":239},[117],{"categories":241},[70],{"categories":243},[70],{"categories":245},[117],{"categories":247},[117],{"categories":249},[159],{"categories":251},[135],{"categories":253},[117],{"categories":255},[173],{"categories":257},[117],{"categories":259},[120],{"categories":261},[],{"categories":263},[117],{"categories":265},[],{"categories":267},[117],{"categories":269},[166],{"categories":271},[156],{"categories":273},[70],{"categories":275},[],{"categories":277},[],{"categories":279},[117],{"categories":281},[],{"categories":283},[70],{"categories":285},[],{"categories":287},[109],{"categories":289},[166],{"categories":291},[112],{"categories":293},[135],{"categories":295},[70],{"categories":297},[],{"categories":299},[70],{"categories":301},[],{"categories":303},[166],{"categories":305},[159],{"categories":307},[],{"categories":309},[70],{"categories":311},[156],{"categories":313},[],{"categories":315},[156],{"categories":317},[117],{"categories":319},[],{"categories":321},[117],{"categories":323},[135],{"categories":325},[112],{"categories":327},[70],{"categories":329},[],{"categories":331},[117],{"categories":333},[70],{"categories":335},[120],{"categories":337},[],{"categories":339},[70],{"ca
tegories":341},[117],{"categories":343},[117],{"categories":345},[],{"categories":347},[159],{"categories":349},[70],{"categories":351},[],{"categories":353},[109],{"categories":355},[112],{"categories":357},[70],{"categories":359},[117],{"categories":361},[166],{"categories":363},[70],{"categories":365},[],{"categories":367},[],{"categories":369},[70],{"categories":371},[],{"categories":373},[156],{"categories":375},[],{"categories":377},[70],{"categories":379},[],{"categories":381},[117],{"categories":383},[70],{"categories":385},[156],{"categories":387},[],{"categories":389},[70],{"categories":391},[70],{"categories":393},[112],{"categories":395},[117],{"categories":397},[70],{"categories":399},[156],{"categories":401},[117],{"categories":403},[],{"categories":405},[],{"categories":407},[135],{"categories":409},[],{"categories":411},[70],{"categories":413},[112,173],{"categories":415},[],{"categories":417},[70],{"categories":419},[],{"categories":421},[],{"categories":423},[70],{"categories":425},[],{"categories":427},[70],{"categories":429},[430],"DevOps & 
Cloud",{"categories":432},[],{"categories":434},[135],{"categories":436},[156],{"categories":438},[],{"categories":440},[135],{"categories":442},[135],{"categories":444},[70],{"categories":446},[173],{"categories":448},[],{"categories":450},[112],{"categories":452},[],{"categories":454},[70,430],{"categories":456},[70],{"categories":458},[70],{"categories":460},[117],{"categories":462},[70,166],{"categories":464},[159],{"categories":466},[70],{"categories":468},[173],{"categories":470},[117],{"categories":472},[117],{"categories":474},[],{"categories":476},[117],{"categories":478},[70,112],{"categories":480},[],{"categories":482},[156],{"categories":484},[156],{"categories":486},[],{"categories":488},[],{"categories":490},[135],{"categories":492},[],{"categories":494},[109],{"categories":496},[166],{"categories":498},[70],{"categories":500},[156],{"categories":502},[117],{"categories":504},[166],{"categories":506},[135],{"categories":508},[156],{"categories":510},[],{"categories":512},[70],{"categories":514},[70],{"categories":516},[70],{"categories":518},[135],{"categories":520},[109],{"categories":522},[70],{"categories":524},[117],{"categories":526},[430],{"categories":528},[156],{"categories":530},[117],{"categories":532},[],{"categories":534},[],{"categories":536},[156],{"categories":538},[135],{"categories":540},[159],{"categories":542},[],{"categories":544},[70],{"categories":546},[70],{"categories":548},[112],{"categories":550},[70],{"categories":552},[70],{"categories":554},[135],{"categories":556},[],{"categories":558},[117],{"categories":560},[166],{"categories":562},[],{"categories":564},[70],{"categories":566},[70],{"categories":568},[117],{"categories":570},[],{"categories":572},[],{"categories":574},[70],{"categories":576},[],{"categories":578},[112],{"categories":580},[117],{"categories":582},[],{"categories":584},[109],{"categories":586},[70],{"categories":588},[112],{"categories":590},[135],{"categories":592},[],{"categories":594},[],{"categories":
596},[],{"categories":598},[135],{"categories":600},[135],{"categories":602},[],{"categories":604},[],{"categories":606},[112],{"categories":608},[],{"categories":610},[],{"categories":612},[109],{"categories":614},[],{"categories":616},[173],{"categories":618},[117],{"categories":620},[112],{"categories":622},[117],{"categories":624},[166],{"categories":626},[],{"categories":628},[120],{"categories":630},[156],{"categories":632},[166],{"categories":634},[70],{"categories":636},[117],{"categories":638},[112],{"categories":640},[70],{"categories":642},[],{"categories":644},[],{"categories":646},[166],{"categories":648},[159],{"categories":650},[120],{"categories":652},[117],{"categories":654},[70],{"categories":656},[],{"categories":658},[430],{"categories":660},[],{"categories":662},[117],{"categories":664},[],{"categories":666},[],{"categories":668},[70],{"categories":670},[156],{"categories":672},[173],{"categories":674},[117],{"categories":676},[],{"categories":678},[109],{"categories":680},[],{"categories":682},[135],{"categories":684},[70,430],{"categories":686},[135],{"categories":688},[70],{"categories":690},[112],{"categories":692},[70],{"categories":694},[],{"categories":696},[112],{"categories":698},[],{"categories":700},[166],{"categories":702},[156],{"categories":704},[135],{"categories":706},[159],{"categories":708},[109],{"categories":710},[70],{"categories":712},[166],{"categories":714},[],{"categories":716},[],{"categories":718},[120],{"categories":720},[],{"categories":722},[70],{"categories":724},[],{"categories":726},[156],{"categories":728},[156],{"categories":730},[156],{"categories":732},[],{"categories":734},[],{"categories":736},[135],{"categories":738},[117],{"categories":740},[70],{"categories":742},[70],{"categories":744},[70],{"categories":746},[112],{"categories":748},[70],{"categories":750},[],{"categories":752},[166],{"categories":754},[166],{"categories":756},[112],{"categories":758},[],{"categories":760},[70],{"categories":762},[70],
{"categories":764},[112],{"categories":766},[135],{"categories":768},[173],{"categories":770},[117],{"categories":772},[],{"categories":774},[156],{"categories":776},[],{"categories":778},[70],{"categories":780},[],{"categories":782},[112],{"categories":784},[117],{"categories":786},[],{"categories":788},[430],{"categories":790},[159],{"categories":792},[166],{"categories":794},[173],{"categories":796},[166],{"categories":798},[117],{"categories":800},[],{"categories":802},[],{"categories":804},[117],{"categories":806},[109],{"categories":808},[117],{"categories":810},[120],{"categories":812},[112],{"categories":814},[],{"categories":816},[70],{"categories":818},[120],{"categories":820},[70],{"categories":822},[70],{"categories":824},[173],{"categories":826},[156],{"categories":828},[117],{"categories":830},[],{"categories":832},[],{"categories":834},[430],{"categories":836},[166],{"categories":838},[],{"categories":840},[117],{"categories":842},[70],{"categories":844},[156,70],{"categories":846},[109],{"categories":848},[],{"categories":850},[70],{"categories":852},[109],{"categories":854},[156],{"categories":856},[117],{"categories":858},[166],{"categories":860},[],{"categories":862},[70],{"categories":864},[],{"categories":866},[109],{"categories":868},[],{"categories":870},[117],{"categories":872},[120],{"categories":874},[70],{"categories":876},[70],{"categories":878},[156],{"categories":880},[117],{"categories":882},[430],{"categories":884},[156],{"categories":886},[117],{"categories":888},[70],{"categories":890},[70],{"categories":892},[70],{"categories":894},[135],{"categories":896},[],{"categories":898},[120],{"categories":900},[117],{"categories":902},[156],{"categories":904},[117],{"categories":906},[166],{"categories":908},[156],{"categories":910},[117],{"categories":912},[135],{"categories":914},[],{"categories":916},[70],{"categories":918},[156],{"categories":920},[70],{"categories":922},[109],{"categories":924},[135],{"categories":926},[70],{"categori
es":928},[173],{"categories":930},[70],{"categories":932},[70],{"categories":934},[117],{"categories":936},[117],{"categories":938},[70],{"categories":940},[117],{"categories":942},[156],{"categories":944},[70],{"categories":946},[],{"categories":948},[],{"categories":950},[166],{"categories":952},[],{"categories":954},[109],{"categories":956},[430],{"categories":958},[],{"categories":960},[109],{"categories":962},[112],{"categories":964},[173],{"categories":966},[],{"categories":968},[112],{"categories":970},[],{"categories":972},[],{"categories":974},[],{"categories":976},[],{"categories":978},[],{"categories":980},[70],{"categories":982},[117],{"categories":984},[430],{"categories":986},[109],{"categories":988},[70],{"categories":990},[166],{"categories":992},[120],{"categories":994},[70],{"categories":996},[173],{"categories":998},[70],{"categories":1000},[70],{"categories":1002},[70],{"categories":1004},[70,109],{"categories":1006},[166],{"categories":1008},[166],{"categories":1010},[156],{"categories":1012},[70],{"categories":1014},[],{"categories":1016},[],{"categories":1018},[],{"categories":1020},[166],{"categories":1022},[159],{"categories":1024},[135],{"categories":1026},[156],{"categories":1028},[],{"categories":1030},[70],{"categories":1032},[70],{"categories":1034},[],{"categories":1036},[],{"categories":1038},[117],{"categories":1040},[70],{"categories":1042},[112],{"categories":1044},[],{"categories":1046},[109],{"categories":1048},[70],{"categories":1050},[109],{"categories":1052},[70],{"categories":1054},[166],{"categories":1056},[173],{"categories":1058},[70,156],{"categories":1060},[135],{"categories":1062},[156],{"categories":1064},[],{"categories":1066},[430],{"categories":1068},[156],{"categories":1070},[117],{"categories":1072},[],{"categories":1074},[],{"categories":1076},[],{"categories":1078},[],{"categories":1080},[166],{"categories":1082},[117],{"categories":1084},[117],{"categories":1086},[430],{"categories":1088},[70],{"categories":109
0},[70],{"categories":1092},[70],{"categories":1094},[],{"categories":1096},[156],{"categories":1098},[],{"categories":1100},[],{"categories":1102},[117],{"categories":1104},[],{"categories":1106},[],{"categories":1108},[173],{"categories":1110},[173],{"categories":1112},[117],{"categories":1114},[],{"categories":1116},[70],{"categories":1118},[70],{"categories":1120},[166],{"categories":1122},[156],{"categories":1124},[156],{"categories":1126},[117],{"categories":1128},[109],{"categories":1130},[70],{"categories":1132},[156],{"categories":1134},[156],{"categories":1136},[117],{"categories":1138},[117],{"categories":1140},[70],{"categories":1142},[],{"categories":1144},[],{"categories":1146},[70],{"categories":1148},[117],{"categories":1150},[135],{"categories":1152},[166],{"categories":1154},[109],{"categories":1156},[70],{"categories":1158},[],{"categories":1160},[117],{"categories":1162},[117],{"categories":1164},[],{"categories":1166},[109],{"categories":1168},[70],{"categories":1170},[109],{"categories":1172},[109],{"categories":1174},[],{"categories":1176},[],{"categories":1178},[117],{"categories":1180},[117],{"categories":1182},[70],{"categories":1184},[70],{"categories":1186},[135],{"categories":1188},[159],{"categories":1190},[120],{"categories":1192},[135],{"categories":1194},[156],{"categories":1196},[],{"categories":1198},[135],{"categories":1200},[],{"categories":1202},[],{"categories":1204},[],{"categories":1206},[],{"categories":1208},[166],{"categories":1210},[159],{"categories":1212},[],{"categories":1214},[70],{"categories":1216},[70],{"categories":1218},[159],{"categories":1220},[166],{"categories":1222},[],{"categories":1224},[],{"categories":1226},[117],{"categories":1228},[135],{"categories":1230},[135],{"categories":1232},[117],{"categories":1234},[109],{"categories":1236},[70,430],{"categories":1238},[],{"categories":1240},[156],{"categories":1242},[109],{"categories":1244},[117],{"categories":1246},[156],{"categories":1248},[],{"categories"
:1250},[117],{"categories":1252},[117],{"categories":1254},[70],{"categories":1256},[173],{"categories":1258},[166],{"categories":1260},[156],{"categories":1262},[],{"categories":1264},[117],{"categories":1266},[70],{"categories":1268},[117],{"categories":1270},[117],{"categories":1272},[117],{"categories":1274},[173],{"categories":1276},[117],{"categories":1278},[70],{"categories":1280},[],{"categories":1282},[173],{"categories":1284},[135],{"categories":1286},[117],{"categories":1288},[],{"categories":1290},[],{"categories":1292},[70],{"categories":1294},[117],{"categories":1296},[135],{"categories":1298},[117],{"categories":1300},[],{"categories":1302},[],{"categories":1304},[],{"categories":1306},[117],{"categories":1308},[],{"categories":1310},[],{"categories":1312},[159],{"categories":1314},[70],{"categories":1316},[159],{"categories":1318},[135],{"categories":1320},[70],{"categories":1322},[70],{"categories":1324},[117],{"categories":1326},[70],{"categories":1328},[],{"categories":1330},[],{"categories":1332},[430],{"categories":1334},[],{"categories":1336},[],{"categories":1338},[109],{"categories":1340},[],{"categories":1342},[],{"categories":1344},[],{"categories":1346},[],{"categories":1348},[166],{"categories":1350},[135],{"categories":1352},[173],{"categories":1354},[112],{"categories":1356},[70],{"categories":1358},[70],{"categories":1360},[112],{"categories":1362},[],{"categories":1364},[156],{"categories":1366},[117],{"categories":1368},[112],{"categories":1370},[70],{"categories":1372},[70],{"categories":1374},[109],{"categories":1376},[],{"categories":1378},[109],{"categories":1380},[70],{"categories":1382},[173],{"categories":1384},[117],{"categories":1386},[135],{"categories":1388},[112],{"categories":1390},[70],{"categories":1392},[117],{"categories":1394},[],{"categories":1396},[70],{"categories":1398},[109],{"categories":1400},[70],{"categories":1402},[],{"categories":1404},[135],{"categories":1406},[70],{"categories":1408},[],{"categories":14
10},[112],{"categories":1412},[70],{"categories":1414},[],{"categories":1416},[],{"categories":1418},[],{"categories":1420},[70],{"categories":1422},[],{"categories":1424},[430],{"categories":1426},[70],{"categories":1428},[],{"categories":1430},[70],{"categories":1432},[70],{"categories":1434},[70],{"categories":1436},[70,430],{"categories":1438},[70],{"categories":1440},[70],{"categories":1442},[156],{"categories":1444},[117],{"categories":1446},[],{"categories":1448},[117],{"categories":1450},[70],{"categories":1452},[70],{"categories":1454},[70],{"categories":1456},[109],{"categories":1458},[109],{"categories":1460},[166],{"categories":1462},[156],{"categories":1464},[117],{"categories":1466},[],{"categories":1468},[70],{"categories":1470},[135],{"categories":1472},[70],{"categories":1474},[112],{"categories":1476},[],{"categories":1478},[430],{"categories":1480},[156],{"categories":1482},[156],{"categories":1484},[117],{"categories":1486},[135],{"categories":1488},[117],{"categories":1490},[70],{"categories":1492},[],{"categories":1494},[70],{"categories":1496},[],{"categories":1498},[],{"categories":1500},[70],{"categories":1502},[70],{"categories":1504},[70],{"categories":1506},[117],{"categories":1508},[70],{"categories":1510},[],{"categories":1512},[159],{"categories":1514},[117],{"categories":1516},[],{"categories":1518},[],{"categories":1520},[70],{"categories":1522},[135],{"categories":1524},[],{"categories":1526},[156],{"categories":1528},[430],{"categories":1530},[135],{"categories":1532},[166],{"categories":1534},[166],{"categories":1536},[135],{"categories":1538},[135],{"categories":1540},[430],{"categories":1542},[],{"categories":1544},[135],{"categories":1546},[70],{"categories":1548},[109],{"categories":1550},[135],{"categories":1552},[],{"categories":1554},[159],{"categories":1556},[135],{"categories":1558},[166],{"categories":1560},[135],{"categories":1562},[430],{"categories":1564},[70],{"categories":1566},[70],{"categories":1568},[],{"categori
es":1570},[112],{"categories":1572},[],{"categories":1574},[],{"categories":1576},[70],{"categories":1578},[70],{"categories":1580},[70],{"categories":1582},[70],{"categories":1584},[],{"categories":1586},[159],{"categories":1588},[109],{"categories":1590},[],{"categories":1592},[70],{"categories":1594},[70],{"categories":1596},[430],{"categories":1598},[430],{"categories":1600},[],{"categories":1602},[117],{"categories":1604},[135],{"categories":1606},[135],{"categories":1608},[70],{"categories":1610},[117],{"categories":1612},[],{"categories":1614},[156],{"categories":1616},[70],{"categories":1618},[70],{"categories":1620},[],{"categories":1622},[],{"categories":1624},[430],{"categories":1626},[70],{"categories":1628},[166],{"categories":1630},[112],{"categories":1632},[70],{"categories":1634},[],{"categories":1636},[117],{"categories":1638},[109],{"categories":1640},[109],{"categories":1642},[],{"categories":1644},[70],{"categories":1646},[156],{"categories":1648},[117],{"categories":1650},[],{"categories":1652},[70],{"categories":1654},[70],{"categories":1656},[117],{"categories":1658},[],{"categories":1660},[117],{"categories":1662},[166],{"categories":1664},[],{"categories":1666},[70],{"categories":1668},[],{"categories":1670},[70],{"categories":1672},[],{"categories":1674},[70],{"categories":1676},[70],{"categories":1678},[],{"categories":1680},[70],{"categories":1682},[135],{"categories":1684},[70],{"categories":1686},[70],{"categories":1688},[109],{"categories":1690},[70],{"categories":1692},[135],{"categories":1694},[117],{"categories":1696},[],{"categories":1698},[70],{"categories":1700},[173],{"categories":1702},[],{"categories":1704},[],{"categories":1706},[],{"categories":1708},[109],{"categories":1710},[135],{"categories":1712},[117],{"categories":1714},[70],{"categories":1716},[156],{"categories":1718},[117],{"categories":1720},[],{"categories":1722},[117],{"categories":1724},[],{"categories":1726},[70],{"categories":1728},[117],{"categories":1730},[
70],{"categories":1732},[],{"categories":1734},[70],{"categories":1736},[70],{"categories":1738},[135],{"categories":1740},[156],{"categories":1742},[117],{"categories":1744},[156],{"categories":1746},[112],{"categories":1748},[],{"categories":1750},[],{"categories":1752},[70],{"categories":1754},[109],{"categories":1756},[135],{"categories":1758},[],{"categories":1760},[],{"categories":1762},[166],{"categories":1764},[156],{"categories":1766},[],{"categories":1768},[70],{"categories":1770},[],{"categories":1772},[173],{"categories":1774},[70],{"categories":1776},[430],{"categories":1778},[166],{"categories":1780},[],{"categories":1782},[117],{"categories":1784},[70],{"categories":1786},[117],{"categories":1788},[117],{"categories":1790},[70],{"categories":1792},[],{"categories":1794},[109],{"categories":1796},[70],{"categories":1798},[112],{"categories":1800},[166],{"categories":1802},[156],{"categories":1804},[],{"categories":1806},[],{"categories":1808},[],{"categories":1810},[117],{"categories":1812},[156],{"categories":1814},[135],{"categories":1816},[70],{"categories":1818},[135],{"categories":1820},[156],{"categories":1822},[],{"categories":1824},[156],{"categories":1826},[135],{"categories":1828},[112],{"categories":1830},[70],{"categories":1832},[135],{"categories":1834},[173],{"categories":1836},[],{"categories":1838},[],{"categories":1840},[159],{"categories":1842},[70,166],{"categories":1844},[135],{"categories":1846},[70],{"categories":1848},[117],{"categories":1850},[117],{"categories":1852},[70],{"categories":1854},[],{"categories":1856},[166],{"categories":1858},[70],{"categories":1860},[159],{"categories":1862},[117],{"categories":1864},[173],{"categories":1866},[430],{"categories":1868},[],{"categories":1870},[109],{"categories":1872},[117],{"categories":1874},[117],{"categories":1876},[166],{"categories":1878},[70],{"categories":1880},[70],{"categories":1882},[],{"categories":1884},[],{"categories":1886},[],{"categories":1888},[430],{"categories":
1890},[135],{"categories":1892},[70],{"categories":1894},[70],{"categories":1896},[70],{"categories":1898},[],{"categories":1900},[159],{"categories":1902},[112],{"categories":1904},[],{"categories":1906},[117],{"categories":1908},[430],{"categories":1910},[],{"categories":1912},[156],{"categories":1914},[156],{"categories":1916},[],{"categories":1918},[166],{"categories":1920},[156],{"categories":1922},[70],{"categories":1924},[],{"categories":1926},[135],{"categories":1928},[70],{"categories":1930},[156],{"categories":1932},[117],{"categories":1934},[135],{"categories":1936},[],{"categories":1938},[117],{"categories":1940},[156],{"categories":1942},[70],{"categories":1944},[],{"categories":1946},[70],{"categories":1948},[70],{"categories":1950},[430],{"categories":1952},[135],{"categories":1954},[159],{"categories":1956},[159],{"categories":1958},[],{"categories":1960},[],{"categories":1962},[],{"categories":1964},[117],{"categories":1966},[166],{"categories":1968},[166],{"categories":1970},[],{"categories":1972},[],{"categories":1974},[70],{"categories":1976},[],{"categories":1978},[117],{"categories":1980},[70],{"categories":1982},[],{"categories":1984},[70],{"categories":1986},[112],{"categories":1988},[70],{"categories":1990},[173],{"categories":1992},[117],{"categories":1994},[70],{"categories":1996},[166],{"categories":1998},[135],{"categories":2000},[117],{"categories":2002},[],{"categories":2004},[135],{"categories":2006},[117],{"categories":2008},[117],{"categories":2010},[],{"categories":2012},[112],{"categories":2014},[117],{"categories":2016},[],{"categories":2018},[70],{"categories":2020},[109],{"categories":2022},[135],{"categories":2024},[430],{"categories":2026},[117],{"categories":2028},[117],{"categories":2030},[109],{"categories":2032},[70],{"categories":2034},[],{"categories":2036},[],{"categories":2038},[156],{"categories":2040},[70,112],{"categories":2042},[],{"categories":2044},[109],{"categories":2046},[159],{"categories":2048},[70],{"categ
ories":2050},[166],{"categories":2052},[70],{"categories":2054},[117],{"categories":2056},[70],{"categories":2058},[70],{"categories":2060},[135],{"categories":2062},[117],{"categories":2064},[],{"categories":2066},[],{"categories":2068},[117],{"categories":2070},[70],{"categories":2072},[430],{"categories":2074},[],{"categories":2076},[70],{"categories":2078},[117],{"categories":2080},[],{"categories":2082},[70],{"categories":2084},[173],{"categories":2086},[159],{"categories":2088},[117],{"categories":2090},[70],{"categories":2092},[430],{"categories":2094},[],{"categories":2096},[70],{"categories":2098},[173],{"categories":2100},[156],{"categories":2102},[70],{"categories":2104},[],{"categories":2106},[173],{"categories":2108},[135],{"categories":2110},[70],{"categories":2112},[70],{"categories":2114},[109],{"categories":2116},[],{"categories":2118},[],{"categories":2120},[156],{"categories":2122},[70],{"categories":2124},[159],{"categories":2126},[173],{"categories":2128},[173],{"categories":2130},[135],{"categories":2132},[],{"categories":2134},[],{"categories":2136},[70],{"categories":2138},[],{"categories":2140},[70,166],{"categories":2142},[135],{"categories":2144},[117],{"categories":2146},[166],{"categories":2148},[70],{"categories":2150},[109],{"categories":2152},[],{"categories":2154},[],{"categories":2156},[109],{"categories":2158},[173],{"categories":2160},[70],{"categories":2162},[],{"categories":2164},[156,70],{"categories":2166},[430],{"categories":2168},[109],{"categories":2170},[],{"categories":2172},[112],{"categories":2174},[112],{"categories":2176},[70],{"categories":2178},[166],{"categories":2180},[117],{"categories":2182},[135],{"categories":2184},[173],{"categories":2186},[156],{"categories":2188},[70],{"categories":2190},[70],{"categories":2192},[70],{"categories":2194},[109],{"categories":2196},[70],{"categories":2198},[117],{"categories":2200},[135],{"categories":2202},[],{"categories":2204},[],{"categories":2206},[159],{"categories":2208
},[166],{"categories":2210},[70],{"categories":2212},[156],{"categories":2214},[159],{"categories":2216},[70],{"categories":2218},[70],{"categories":2220},[117],{"categories":2222},[117],{"categories":2224},[70,112],{"categories":2226},[],{"categories":2228},[156],{"categories":2230},[],{"categories":2232},[70],{"categories":2234},[135],{"categories":2236},[109],{"categories":2238},[109],{"categories":2240},[117],{"categories":2242},[70],{"categories":2244},[112],{"categories":2246},[166],{"categories":2248},[173],{"categories":2250},[],{"categories":2252},[135],{"categories":2254},[70],{"categories":2256},[70],{"categories":2258},[135],{"categories":2260},[166],{"categories":2262},[70],{"categories":2264},[117],{"categories":2266},[135],{"categories":2268},[70],{"categories":2270},[156],{"categories":2272},[70],{"categories":2274},[70],{"categories":2276},[430],{"categories":2278},[120],{"categories":2280},[117],{"categories":2282},[70],{"categories":2284},[135],{"categories":2286},[117],{"categories":2288},[173],{"categories":2290},[70],{"categories":2292},[],{"categories":2294},[70],{"categories":2296},[],{"categories":2298},[],{"categories":2300},[],{"categories":2302},[112],{"categories":2304},[70],{"categories":2306},[117],{"categories":2308},[135],{"categories":2310},[135],{"categories":2312},[135],{"categories":2314},[135],{"categories":2316},[],{"categories":2318},[109],{"categories":2320},[117],{"categories":2322},[135],{"categories":2324},[109],{"categories":2326},[117],{"categories":2328},[70],{"categories":2330},[70,117],{"categories":2332},[117],{"categories":2334},[430],{"categories":2336},[135],{"categories":2338},[135],{"categories":2340},[117],{"categories":2342},[70],{"categories":2344},[],{"categories":2346},[135],{"categories":2348},[173],{"categories":2350},[109],{"categories":2352},[70],{"categories":2354},[70],{"categories":2356},[],{"categories":2358},[166],{"categories":2360},[],{"categories":2362},[109],{"categories":2364},[117],{"categori
es":2366},[135],{"categories":2368},[70],{"categories":2370},[135],{"categories":2372},[109],{"categories":2374},[135],{"categories":2376},[135],{"categories":2378},[],{"categories":2380},[112],{"categories":2382},[117],{"categories":2384},[135],{"categories":2386},[135],{"categories":2388},[135],{"categories":2390},[135],{"categories":2392},[135],{"categories":2394},[135],{"categories":2396},[135],{"categories":2398},[135],{"categories":2400},[135],{"categories":2402},[135],{"categories":2404},[159],{"categories":2406},[109],{"categories":2408},[70],{"categories":2410},[70],{"categories":2412},[],{"categories":2414},[70,109],{"categories":2416},[],{"categories":2418},[117],{"categories":2420},[135],{"categories":2422},[117],{"categories":2424},[70],{"categories":2426},[70],{"categories":2428},[70],{"categories":2430},[70],{"categories":2432},[70],{"categories":2434},[117],{"categories":2436},[112],{"categories":2438},[156],{"categories":2440},[135],{"categories":2442},[70],{"categories":2444},[],{"categories":2446},[],{"categories":2448},[117],{"categories":2450},[156],{"categories":2452},[70],{"categories":2454},[],{"categories":2456},[],{"categories":2458},[173],{"categories":2460},[70],{"categories":2462},[],{"categories":2464},[],{"categories":2466},[109],{"categories":2468},[112],{"categories":2470},[70],{"categories":2472},[112],{"categories":2474},[156],{"categories":2476},[],{"categories":2478},[135],{"categories":2480},[],{"categories":2482},[156],{"categories":2484},[70],{"categories":2486},[173],{"categories":2488},[],{"categories":2490},[173],{"categories":2492},[],{"categories":2494},[],{"categories":2496},[117],{"categories":2498},[],{"categories":2500},[112],{"categories":2502},[109],{"categories":2504},[156],{"categories":2506},[166],{"categories":2508},[],{"categories":2510},[],{"categories":2512},[70],{"categories":2514},[109],{"categories":2516},[173],{"categories":2518},[],{"categories":2520},[117],{"categories":2522},[117],{"categories":2524},[
135],{"categories":2526},[70],{"categories":2528},[117],{"categories":2530},[70],{"categories":2532},[117],{"categories":2534},[70],{"categories":2536},[120],{"categories":2538},[135],{"categories":2540},[],{"categories":2542},[173],{"categories":2544},[166],{"categories":2546},[117],{"categories":2548},[],{"categories":2550},[70],{"categories":2552},[117],{"categories":2554},[112],{"categories":2556},[109],{"categories":2558},[70],{"categories":2560},[156],{"categories":2562},[166],{"categories":2564},[166],{"categories":2566},[70],{"categories":2568},[159],{"categories":2570},[70],{"categories":2572},[117],{"categories":2574},[112],{"categories":2576},[117],{"categories":2578},[70],{"categories":2580},[70],{"categories":2582},[117],{"categories":2584},[135],{"categories":2586},[],{"categories":2588},[109],{"categories":2590},[70],{"categories":2592},[117],{"categories":2594},[70],{"categories":2596},[70],{"categories":2598},[],{"categories":2600},[156],{"categories":2602},[112],{"categories":2604},[135],{"categories":2606},[70],{"categories":2608},[70],{"categories":2610},[156],{"categories":2612},[173],{"categories":2614},[159],{"categories":2616},[70],{"categories":2618},[135],{"categories":2620},[70],{"categories":2622},[117],{"categories":2624},[430],{"categories":2626},[70],{"categories":2628},[117],{"categories":2630},[159],{"categories":2632},[],{"categories":2634},[117],{"categories":2636},[166],{"categories":2638},[156],{"categories":2640},[70],{"categories":2642},[109],{"categories":2644},[112],{"categories":2646},[166],{"categories":2648},[],{"categories":2650},[117],{"categories":2652},[70],{"categories":2654},[],{"categories":2656},[135],{"categories":2658},[],{"categories":2660},[135],{"categories":2662},[70],{"categories":2664},[117],{"categories":2666},[117],{"categories":2668},[117],{"categories":2670},[],{"categories":2672},[],{"categories":2674},[70],{"categories":2676},[70],{"categories":2678},[],{"categories":2680},[156],{"categories":2682},[1
17],{"categories":2684},[173],{"categories":2686},[109],{"categories":2688},[],{"categories":2690},[],{"categories":2692},[135],{"categories":2694},[166],{"categories":2696},[70],{"categories":2698},[70],{"categories":2700},[70],{"categories":2702},[166],{"categories":2704},[135],{"categories":2706},[156],{"categories":2708},[70],{"categories":2710},[70],{"categories":2712},[70],{"categories":2714},[135],{"categories":2716},[70],{"categories":2718},[135],{"categories":2720},[117],{"categories":2722},[117],{"categories":2724},[166],{"categories":2726},[117],{"categories":2728},[70],{"categories":2730},[166],{"categories":2732},[156],{"categories":2734},[],{"categories":2736},[117],{"categories":2738},[],{"categories":2740},[],{"categories":2742},[],{"categories":2744},[112],{"categories":2746},[70],{"categories":2748},[117],{"categories":2750},[109],{"categories":2752},[117],{"categories":2754},[173],{"categories":2756},[],{"categories":2758},[117],{"categories":2760},[],{"categories":2762},[109],{"categories":2764},[117],{"categories":2766},[],{"categories":2768},[117],{"categories":2770},[70],{"categories":2772},[135],{"categories":2774},[70],{"categories":2776},[117],{"categories":2778},[135],{"categories":2780},[117],{"categories":2782},[166],{"categories":2784},[156],{"categories":2786},[109],{"categories":2788},[],{"categories":2790},[117],{"categories":2792},[156],{"categories":2794},[430],{"categories":2796},[135],{"categories":2798},[70],{"categories":2800},[156],{"categories":2802},[109],{"categories":2804},[],{"categories":2806},[117],{"categories":2808},[117],{"categories":2810},[70],{"categories":2812},[],{"categories":2814},[117],{"categories":2816},[120],{"categories":2818},[135],{"categories":2820},[117],{"categories":2822},[112],{"categories":2824},[],{"categories":2826},[70],{"categories":2828},[120],{"categories":2830},[70],{"categories":2832},[117],{"categories":2834},[135],{"categories":2836},[109],{"categories":2838},[430],{"categories":2840},[7
0],{"categories":2842},[70],{"categories":2844},[70],{"categories":2846},[135],{"categories":2848},[112],{"categories":2850},[70],{"categories":2852},[156],{"categories":2854},[135],{"categories":2856},[430],{"categories":2858},[70],{"categories":2860},[],{"categories":2862},[],{"categories":2864},[430],{"categories":2866},[159],{"categories":2868},[117],{"categories":2870},[117],{"categories":2872},[135],{"categories":2874},[70],{"categories":2876},[109],{"categories":2878},[156],{"categories":2880},[117],{"categories":2882},[70],{"categories":2884},[173],{"categories":2886},[70],{"categories":2888},[117],{"categories":2890},[],{"categories":2892},[70],{"categories":2894},[70],{"categories":2896},[135],{"categories":2898},[109],{"categories":2900},[],{"categories":2902},[70],{"categories":2904},[70],{"categories":2906},[166],{"categories":2908},[156],{"categories":2910},[70,117],{"categories":2912},[173,112],{"categories":2914},[70],{"categories":2916},[],{"categories":2918},[117],{"categories":2920},[],{"categories":2922},[166],{"categories":2924},[70],{"categories":2926},[135],{"categories":2928},[],{"categories":2930},[117],{"categories":2932},[],{"categories":2934},[156],{"categories":2936},[117],{"categories":2938},[109],{"categories":2940},[117],{"categories":2942},[70],{"categories":2944},[430],{"categories":2946},[173],{"categories":2948},[112],{"categories":2950},[112],{"categories":2952},[109],{"categories":2954},[109],{"categories":2956},[70],{"categories":2958},[117],{"categories":2960},[70],{"categories":2962},[70],{"categories":2964},[109],{"categories":2966},[70],{"categories":2968},[173],{"categories":2970},[135],{"categories":2972},[70],{"categories":2974},[117],{"categories":2976},[70],{"categories":2978},[],{"categories":2980},[166],{"categories":2982},[],{"categories":2984},[117],{"categories":2986},[109],{"categories":2988},[],{"categories":2990},[430],{"categories":2992},[70],{"categories":2994},[],{"categories":2996},[135],{"categories":2998}
,[117],{"categories":3000},[166],{"categories":3002},[70],{"categories":3004},[117],{"categories":3006},[166],{"categories":3008},[117],{"categories":3010},[135],{"categories":3012},[109],{"categories":3014},[135],{"categories":3016},[166],{"categories":3018},[70],{"categories":3020},[156],{"categories":3022},[70],{"categories":3024},[70],{"categories":3026},[70],{"categories":3028},[70],{"categories":3030},[117],{"categories":3032},[70],{"categories":3034},[117],{"categories":3036},[70],{"categories":3038},[109],{"categories":3040},[70],{"categories":3042},[117],{"categories":3044},[156],{"categories":3046},[109],{"categories":3048},[117],{"categories":3050},[156],{"categories":3052},[],{"categories":3054},[70],{"categories":3056},[70],{"categories":3058},[166],{"categories":3060},[],{"categories":3062},[117],{"categories":3064},[173],{"categories":3066},[70],{"categories":3068},[135],{"categories":3070},[173],{"categories":3072},[117],{"categories":3074},[112],{"categories":3076},[112],{"categories":3078},[70],{"categories":3080},[109],{"categories":3082},[],{"categories":3084},[70],{"categories":3086},[],{"categories":3088},[109],{"categories":3090},[70],{"categories":3092},[117],{"categories":3094},[117],{"categories":3096},[],{"categories":3098},[166],{"categories":3100},[166],{"categories":3102},[173],{"categories":3104},[156],{"categories":3106},[],{"categories":3108},[70],{"categories":3110},[109],{"categories":3112},[70],{"categories":3114},[166],{"categories":3116},[109],{"categories":3118},[135],{"categories":3120},[135],{"categories":3122},[],{"categories":3124},[135],{"categories":3126},[117],{"categories":3128},[156],{"categories":3130},[159],{"categories":3132},[70],{"categories":3134},[],{"categories":3136},[135],{"categories":3138},[166],{"categories":3140},[112],{"categories":3142},[70],{"categories":3144},[109],{"categories":3146},[430],{"categories":3148},[109],{"categories":3150},[],{"categories":3152},[],{"categories":3154},[135],{"categories":
3156},[],{"categories":3158},[117],{"categories":3160},[117],{"categories":3162},[117],{"categories":3164},[],{"categories":3166},[70],{"categories":3168},[],{"categories":3170},[135],{"categories":3172},[109],{"categories":3174},[156],{"categories":3176},[70],{"categories":3178},[135],{"categories":3180},[135],{"categories":3182},[],{"categories":3184},[135],{"categories":3186},[109],{"categories":3188},[70],{"categories":3190},[],{"categories":3192},[117],{"categories":3194},[117],{"categories":3196},[109],{"categories":3198},[],{"categories":3200},[],{"categories":3202},[],{"categories":3204},[156],{"categories":3206},[117],{"categories":3208},[70],{"categories":3210},[],{"categories":3212},[],{"categories":3214},[],{"categories":3216},[156],{"categories":3218},[],{"categories":3220},[109],{"categories":3222},[],{"categories":3224},[],{"categories":3226},[156],{"categories":3228},[70],{"categories":3230},[135],{"categories":3232},[],{"categories":3234},[173],{"categories":3236},[135],{"categories":3238},[173],{"categories":3240},[70],{"categories":3242},[],{"categories":3244},[],{"categories":3246},[117],{"categories":3248},[],{"categories":3250},[],{"categories":3252},[117],{"categories":3254},[70],{"categories":3256},[],{"categories":3258},[117],{"categories":3260},[135],{"categories":3262},[173],{"categories":3264},[159],{"categories":3266},[117],{"categories":3268},[117],{"categories":3270},[],{"categories":3272},[],{"categories":3274},[],{"categories":3276},[135],{"categories":3278},[],{"categories":3280},[],{"categories":3282},[156],{"categories":3284},[109],{"categories":3286},[],{"categories":3288},[112],{"categories":3290},[173],{"categories":3292},[70],{"categories":3294},[166],{"categories":3296},[109],{"categories":3298},[159],{"categories":3300},[112],{"categories":3302},[166],{"categories":3304},[],{"categories":3306},[],{"categories":3308},[117],{"categories":3310},[109],{"categories":3312},[156],{"categories":3314},[109],{"categories":3316},[117],
{"categories":3318},[430],{"categories":3320},[117],{"categories":3322},[],{"categories":3324},[70],{"categories":3326},[135],{"categories":3328},[166],{"categories":3330},[],{"categories":3332},[156],{"categories":3334},[135],{"categories":3336},[109],{"categories":3338},[117],{"categories":3340},[70],{"categories":3342},[112],{"categories":3344},[117,430],{"categories":3346},[117],{"categories":3348},[166],{"categories":3350},[70],{"categories":3352},[159],{"categories":3354},[173],{"categories":3356},[117],{"categories":3358},[],{"categories":3360},[117],{"categories":3362},[70],{"categories":3364},[112],{"categories":3366},[],{"categories":3368},[],{"categories":3370},[70],{"categories":3372},[159],{"categories":3374},[70],{"categories":3376},[],{"categories":3378},[135],{"categories":3380},[],{"categories":3382},[135],{"categories":3384},[166],{"categories":3386},[117],{"categories":3388},[70],{"categories":3390},[173],{"categories":3392},[166],{"categories":3394},[],{"categories":3396},[135],{"categories":3398},[70],{"categories":3400},[],{"categories":3402},[70],{"categories":3404},[117],{"categories":3406},[70],{"categories":3408},[117],{"categories":3410},[70],{"categories":3412},[70],{"categories":3414},[70],{"categories":3416},[70],{"categories":3418},[112],{"categories":3420},[],{"categories":3422},[120],{"categories":3424},[135],{"categories":3426},[70],{"categories":3428},[],{"categories":3430},[166],{"categories":3432},[70],{"categories":3434},[70],{"categories":3436},[117],{"categories":3438},[135],{"categories":3440},[70],{"categories":3442},[70],{"categories":3444},[112],{"categories":3446},[117],{"categories":3448},[156],{"categories":3450},[],{"categories":3452},[159],{"categories":3454},[70],{"categories":3456},[],{"categories":3458},[135],{"categories":3460},[173],{"categories":3462},[],{"categories":3464},[],{"categories":3466},[135],{"categories":3468},[135],{"categories":3470},[173],{"categories":3472},[109],{"categories":3474},[117],{"categ
ories":3476},[117],{"categories":3478},[70],{"categories":3480},[112],{"categories":3482},[],{"categories":3484},[],{"categories":3486},[135],{"categories":3488},[159],{"categories":3490},[166],{"categories":3492},[117],{"categories":3494},[156],{"categories":3496},[159],{"categories":3498},[159],{"categories":3500},[],{"categories":3502},[135],{"categories":3504},[70],{"categories":3506},[70],{"categories":3508},[166],{"categories":3510},[],{"categories":3512},[135],{"categories":3514},[135],{"categories":3516},[135],{"categories":3518},[],{"categories":3520},[117],{"categories":3522},[70],{"categories":3524},[],{"categories":3526},[109],{"categories":3528},[112],{"categories":3530},[],{"categories":3532},[70],{"categories":3534},[70],{"categories":3536},[],{"categories":3538},[166],{"categories":3540},[],{"categories":3542},[],{"categories":3544},[],{"categories":3546},[],{"categories":3548},[70],{"categories":3550},[135],{"categories":3552},[],{"categories":3554},[],{"categories":3556},[70],{"categories":3558},[70],{"categories":3560},[70],{"categories":3562},[159],{"categories":3564},[70],{"categories":3566},[159],{"categories":3568},[],{"categories":3570},[159],{"categories":3572},[159],{"categories":3574},[430],{"categories":3576},[117],{"categories":3578},[166],{"categories":3580},[],{"categories":3582},[],{"categories":3584},[159],{"categories":3586},[166],{"categories":3588},[166],{"categories":3590},[166],{"categories":3592},[],{"categories":3594},[109],{"categories":3596},[166],{"categories":3598},[166],{"categories":3600},[109],{"categories":3602},[166],{"categories":3604},[112],{"categories":3606},[166],{"categories":3608},[166],{"categories":3610},[166],{"categories":3612},[159],{"categories":3614},[135],{"categories":3616},[135],{"categories":3618},[70],{"categories":3620},[166],{"categories":3622},[159],{"categories":3624},[430],{"categories":3626},[159],{"categories":3628},[159],{"categories":3630},[159],{"categories":3632},[],{"categories":3634},[1
12],{"categories":3636},[],{"categories":3638},[430],{"categories":3640},[166],{"categories":3642},[166],{"categories":3644},[166],{"categories":3646},[117],{"categories":3648},[135,112],{"categories":3650},[159],{"categories":3652},[],{"categories":3654},[],{"categories":3656},[159],{"categories":3658},[],{"categories":3660},[159],{"categories":3662},[135],{"categories":3664},[117],{"categories":3666},[],{"categories":3668},[166],{"categories":3670},[70],{"categories":3672},[156],{"categories":3674},[],{"categories":3676},[70],{"categories":3678},[],{"categories":3680},[135],{"categories":3682},[109],{"categories":3684},[159],{"categories":3686},[],{"categories":3688},[166],{"categories":3690},[135],[3692,3886,3956,4024],{"id":3693,"title":3694,"ai":3695,"body":3700,"categories":3859,"created_at":71,"date_modified":71,"description":62,"extension":72,"faq":71,"featured":73,"kicker_label":71,"meta":3860,"navigation":87,"path":3873,"published_at":3874,"question":71,"scraped_at":3875,"seo":3876,"sitemap":3877,"source_id":3878,"source_name":3879,"source_type":95,"source_url":3880,"stem":3881,"tags":3882,"thumbnail_url":71,"tldr":3883,"tweet":71,"unknown_tags":3884,"__hash__":3885},"summaries\u002Fsummaries\u002F333109d80f15bbdf-batch-size-unlocks-1000x-llm-inference-efficiency-summary.md","Batch Size Unlocks 1000x LLM Inference Efficiency",{"provider":7,"model":8,"input_tokens":3696,"output_tokens":3697,"processing_time_ms":3698,"cost_usd":3699},8770,2537,24557,0.003,{"type":14,"value":3701,"toc":3852},[3702,3706,3709,3712,3745,3748,3759,3762,3765,3769,3772,3775,3778,3781,3785,3788,3791,3794,3797,3800,3804,3807,3810,3813,3816,3820],[17,3703,3705],{"id":3704},"batch-size-dominates-latency-and-cost-tradeoffs","Batch Size Dominates Latency and Cost Tradeoffs",[22,3707,3708],{},"Reiner Pope breaks down autoregressive inference in transformers, where generating one new token requires a full forward pass attending to the entire KV cache of prior tokens. 
The KV cache—internal representations from past tokens—dominates memory fetches during attention, while weight matrix multiplies handle compute.",[22,3710,3711],{},"Using roofline analysis on a Blackwell NVL72 rack (72 GPUs), Pope models inference time as the maximum of compute time and memory time:",[3713,3714,3715,3728],"ul",{},[3716,3717,3718,3722,3723,3727],"li",{},[3719,3720,3721],"strong",{},"Compute time",": ",[3724,3725,3726],"code",{},"t_compute = (batch_size * active_params) \u002F FLOPs_per_chip",". Linear in batch size (B), as each sequence element processes active parameters (e.g., 37B for DeepSeek V3's MoE with 700B total).",[3716,3729,3730,3722,3733,3736,3737,3740,3741,3744],{},[3719,3731,3732],{},"Memory time",[3724,3734,3735],{},"t_memory = max(weight_fetch, KV_fetch)",", where ",[3724,3738,3739],{},"weight_fetch = total_params \u002F memory_bandwidth"," (constant, ~all 700B params) and ",[3724,3742,3743],{},"KV_fetch = (B * context_length * bytes_per_token) \u002F memory_bandwidth"," (linear in B and context).",[22,3746,3747],{},"Latency plot vs. B shows an initial flat region (memory-bound by weight fetches) transitioning to a steep compute-limited slope. At low B (e.g., 1), latency floors at weight fetch time (~15-20ms on HBM, capacity\u002Fbandwidth), but cost skyrockets.",[22,3749,3750,3751,3754,3755,3758],{},"Cost per token is ",[3724,3752,3753],{},"latency \u002F B",", transforming curves: compute and KV become constant, weight fetch hyperbolic (1\u002FB). Without batching, weight fetches aren't amortized, yielding \"a thousand times worse\" economics. Optimal B equates memory and compute: ",[3724,3756,3757],{},"B ≈ 300 * (total_params \u002F active_params)"," or ~300 * sparsity (e.g., 2400 for DeepSeek's 1\u002F8 sparsity). 
Practitioners use 2-3x larger for real-world inefficiencies, yielding ~2000 sequences or 128k tokens\u002Fsecond per rack (60\u002FB batches\u002Fsec).",[22,3760,3761],{},"\"If you do not batch together many users, the cost and the economics you get can be a thousand times worse than if you do batch many users together.\"",[22,3763,3764],{},"This explains \"Fast Mode\" (6x price for 2.5x speed): smaller B reduces queue wait but raises per-token cost via poor amortization. No viable \"Slow Mode\"—beyond optimal B, you're compute-bound with no further savings. Global scale (e.g., Gemini's millions tokens\u002Fsec) shards across thousands of racks.",[17,3766,3768],{"id":3767},"roofline-insights-into-hardware-and-context-limits","Roofline Insights into Hardware and Context Limits",[22,3770,3771],{},"Hardware ratio FLOPs\u002F(2 * memory_bandwidth) ~300 holds across A100-H100-B100, tying optimal B to sparsity alone, not scale. HBM capacity\u002Fbandwidth sets ~20ms cycle: racks process one full memory turnover per batch, reading weights\u002FKV mostly once (reads >> writes).",[22,3773,3774],{},"Context length shifts balance: KV slope matches compute at Goldilocks ~100k tokens; doubling to 200k halves MFU (memory-bound). Dense attention scales linearly with context; sparse (e.g., DeepSeek's sqrt scaling) resists this.",[22,3776,3777],{},"\"For the particular context length where the slopes match, that says I am equally memory-bound and compute-bound, which is a really desirable place to be.\"",[22,3779,3780],{},"Batching adds queue latency: fixed 20ms \"train departures\" mean worst-case 40ms wait + process. 
Centralization push mild—2000 concurrent users\u002Frack isn't huge, but tokens\u002Fsec scales to global traffic.",[17,3782,3784],{"id":3783},"scaling-to-clusters-moe-pipeline-and-training-overkill","Scaling to Clusters: MoE, Pipeline, and Training Overkill",[22,3786,3787],{},"Timestamps hint at cluster layouts: MoE spreads experts across GPU racks (e.g., 37B active\u002F700B total). Pipeline parallelism shards layers across racks, but Ilya Sutskever's quip \"pipelining is not wise\" stems from bubble inefficiencies.",[22,3789,3790],{},"RL drives 100x overtraining beyond Chinchilla-optimal pretrain, bloating params for post-training gains. Pope deduces long-context costs from API pricing: KV memory linear in context explains premiums.",[22,3792,3793],{},"Convergent evolution: nets and crypto both optimize sparse, high-dim ops.",[22,3795,3796],{},"\"Why Ilya said, 'As we now know, pipelining is not wise.'\"",[22,3798,3799],{},"Dwarkesh probes naively: sparse adoption uncertain, but DeepSeek publishes it. Jane Street tangent (sponsor): FPGAs for ns-latency trading vs. GPU batching.",[17,3801,3803],{"id":3802},"pricing-and-architecture-reverse-engineering","Pricing and Architecture Reverse-Engineering",[22,3805,3806],{},"API prices encode stack: fast modes shrink B, long-context hikes KV. Optimal B insensitive to size\u002Fsparsity ties progress to hardware stability.",[22,3808,3809],{},"Flashcards\u002Fpractice problems (reiner-flashcards.vercel.app) aid retention; full transcript markdown for LLM chat.",[22,3811,3812],{},"\"The cost initially starts very high at a batch size of one. 
It almost goes to infinity because we've got so many weight fetches that are not amortized over a large batch size.\"",[22,3814,3815],{},"Pope's full-stack view (chips to models) demystifies why AI evolves thus: batch economics favor dense clusters, sparse MoE, balanced compute\u002Fmemory.",[17,3817,3819],{"id":3818},"key-takeaways","Key Takeaways",[3713,3821,3822,3825,3828,3831,3834,3837,3840,3843,3846,3849],{},[3716,3823,3824],{},"Model inference time ≥ max( (B * active_params)\u002FFLOPs , total_params\u002Fbandwidth , (B * ctx * bytes\u002Ftoken)\u002Fbandwidth )—use roofline for predictions.",[3716,3826,3827],{},"Optimal batch ~300 * sparsity (e.g., 2400 tokens for 1\u002F8 MoE); run every 20ms for 128k tokens\u002Fsec\u002Frack.",[3716,3829,3830],{},"Cost\u002Ftoken = latency\u002FB: batching amortizes weights 1000x; fast modes use small B, no cheap slow mode possible.",[3716,3832,3833],{},"Context ~100k balances compute\u002Fmemory; sparse attention (DeepSeek) scales better via sqrt(ctx).",[3716,3835,3836],{},"Hardware FLOPs\u002F(2*BW) ~300 stable; pick B 2-3x optimal for real MFU.",[3716,3838,3839],{},"Queue latency ≤ 2 * batch_time (e.g., 40ms worst-case).",[3716,3841,3842],{},"RL overtrains 100x past Chinchilla; API prices reveal KV costs.",[3716,3844,3845],{},"Avoid pipeline parallelism bubbles; MoE shards experts across racks.",[3716,3847,3848],{},"Test your setup: equate weight_fetch = B * active_compute for balance.",[3716,3850,3851],{},"Build intuition: flashcards at reiner-flashcards.vercel.app.",{"title":62,"searchDepth":63,"depth":63,"links":3853},[3854,3855,3856,3857,3858],{"id":3704,"depth":63,"text":3705},{"id":3767,"depth":63,"text":3768},{"id":3783,"depth":63,"text":3784},{"id":3802,"depth":63,"text":3803},{"id":3818,"depth":63,"text":3819},[],{"content_references":3861,"triage":3871},[3862,3867],{"type":3863,"title":3864,"url":3865,"context":3866},"tool","Reiner flashcards and practice 
problems","https:\u002F\u002Freiner-flashcards.vercel.app\u002F","recommended",{"type":3868,"title":3869,"url":3870,"context":3866},"other","Markdown transcript of Reiner Pope lecture","https:\u002F\u002Fgist.github.com\u002Fdwarkeshsp\u002F79100f0fdeed69d76241903bb0604dbe",{"relevance":83,"novelty":84,"quality":84,"actionability":84,"composite":85,"reasoning":3872},"Category: AI & LLMs. The article provides in-depth analysis on how batch size impacts latency and cost in LLM inference, addressing a critical aspect of AI engineering that product builders need to consider. It offers actionable insights on optimizing batch sizes for efficiency, which is directly applicable to developers working with LLMs.","\u002Fsummaries\u002F333109d80f15bbdf-batch-size-unlocks-1000x-llm-inference-efficiency-summary","2026-04-29 17:20:27","2026-05-03 16:58:43",{"title":3694,"description":62},{"loc":3873},"4a9b4f0f4e55eb4e","Dwarkesh Patel","https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=xmkSf5IS-zw","summaries\u002F333109d80f15bbdf-batch-size-unlocks-1000x-llm-inference-efficiency-summary",[99,100,101,102],"Reiner Pope deduces frontier LLM training and serving mechanics from roofline analysis, revealing batch size as the core driver of latency-cost tradeoffs, with optimal batches of ~2000 tokens amortizing weights for massive gains.",[],"qeSPy0ZxQcYxrXRD8vDDE3TXXiiSijULELBTTzq62BE",{"id":3887,"title":3888,"ai":3889,"body":3894,"categories":3930,"created_at":71,"date_modified":71,"description":62,"extension":72,"faq":71,"featured":73,"kicker_label":71,"meta":3931,"navigation":87,"path":3944,"published_at":3945,"question":71,"scraped_at":3945,"seo":3946,"sitemap":3947,"source_id":3948,"source_name":3949,"source_type":95,"source_url":3950,"stem":3951,"tags":3952,"thumbnail_url":71,"tldr":3953,"tweet":71,"unknown_tags":3954,"__hash__":3955},"summaries\u002Fsummaries\u002Fa2a811b50a4c64f5-mrc-resilient-networking-for-100k-gpu-ai-training-summary.md","MRC: Resilient Networking for 100K+ 
GPU AI Training",{"provider":7,"model":8,"input_tokens":3890,"output_tokens":3891,"processing_time_ms":3892,"cost_usd":3893},9014,2044,25377,0.0028023,{"type":14,"value":3895,"toc":3924},[3896,3900,3903,3907,3910,3914,3917,3921],[17,3897,3899],{"id":3898},"multi-plane-topologies-slash-switch-tiers-and-power-for-massive-clusters","Multi-Plane Topologies Slash Switch Tiers and Power for Massive Clusters",[22,3901,3902],{},"Traditional 800Gb\u002Fs networks require three or four tiers of switches to connect over 100,000 GPUs, increasing power use, failure points, and cost. MRC splits each 800Gb\u002Fs interface into eight 100Gb\u002Fs links, creating eight parallel 'planes' that connect to separate switches. A 64-port 800Gb\u002Fs switch now handles 512 ports at 100Gb\u002Fs, enabling full connectivity for 131,000 GPUs using only two tiers. This design boosts path diversity—keeping more traffic local to Tier 0 switches—while cutting components, power, and cost compared to single-plane setups. Without changes, single-path flows (like classic RoCE) still congest links as flows collide, especially in AI's collective communications where worst-case latency stalls synchronous training.",[17,3904,3906],{"id":3905},"packet-spraying-and-srv6-eliminate-congestion-and-dynamic-routing","Packet Spraying and SRv6 Eliminate Congestion and Dynamic Routing",[22,3908,3909],{},"MRC sprays packets from a single transfer across hundreds of paths spanning all planes, using final memory addresses for out-of-order reassembly at the destination. Adaptive load-balancing monitors paths: congestion triggers path swaps, packet loss retires the path (with probes for recovery), and 'packet trimming' at switches forwards headers only during destination congestion to prompt retransmits without false failure alarms. This achieves microsecond failure detection and rerouting, versus seconds for traditional fabrics. 
MRC replaces BGP dynamic routing with static SRv6 source routing: senders embed full switch ID sequences in IPv6 addresses. Switches shift addresses and follow pre-configured static tables, blindly forwarding without recomputing routes. Failures simply retire paths at endpoints, simplifying control planes and eliminating routing bugs from switch software.",[17,3911,3913],{"id":3912},"production-impact-zero-measurable-downtime-amid-constant-failures","Production Impact: Zero-Measurable Downtime Amid Constant Failures",[22,3915,3916],{},"In OpenAI's NVIDIA GB200 supercomputers (including OCI's Abilene Stargate site and Microsoft's Fairwater), MRC handles millions of links with frequent flaps—multiple per minute between tiers—yet synchronous pretraining jobs show no measurable impact, allowing deferred repairs. Rebooting four Tier-1 switches or repairing links during jobs requires no coordination; MRC avoids bad paths automatically. Real training data shows quick recovery from full T1 switch loss with temporary slowdowns far less than physical capacity loss (e.g., one failed port on an 8-port interface reduces max rate by 1\u002F8th but sustains better effective throughput via path recalculation). Multi-job clusters avoid inter-job interference due to core-wide congestion elimination, maximizing GPU utilization for frontier models like those powering ChatGPT (900M weekly users).",[17,3918,3920],{"id":3919},"strategic-wins-simpler-stacks-for-stargate-scale-compute","Strategic Wins: Simpler Stacks for Stargate-Scale Compute",[22,3922,3923],{},"MRC delivers three edges: two-tier multi-plane redundancy with lower power; zero core congestion for consistent flow throughput in sync training; and SRv6 for instant failure bypass via static planes. 
Deployed with AMD, Broadcom, Intel, Microsoft, NVIDIA hardware, it's released via Open Compute Project for industry adoption, supporting OpenAI's compute strategy of shared standards to scale AI infrastructure efficiently.",{"title":62,"searchDepth":63,"depth":63,"links":3925},[3926,3927,3928,3929],{"id":3898,"depth":63,"text":3899},{"id":3905,"depth":63,"text":3906},{"id":3912,"depth":63,"text":3913},{"id":3919,"depth":63,"text":3920},[430],{"content_references":3932,"triage":3940},[3933,3936],{"type":3868,"title":3934,"url":3935,"context":81},"OCP MRC 1.0","https:\u002F\u002Fwww.opencompute.org\u002Fdocuments\u002Focp-mrc-1-0-pdf",{"type":3937,"title":3938,"url":3939,"context":81},"paper","Resilient AI Supercomputer Networking using MRC and SRv6","https:\u002F\u002Fcdn.openai.com\u002Fpdf\u002Fresilient-ai-supercomputer-networking-using-mrc-and-srv6.pdf",{"relevance":3941,"novelty":3941,"quality":84,"actionability":63,"composite":3942,"reasoning":3943},3,3.05,"Category: DevOps & Cloud. The article discusses the MRC protocol's innovative networking solutions for AI training, which could be relevant for those building AI-powered products. 
However, it lacks direct actionable insights for the audience, focusing more on technical specifications than practical applications.","\u002Fsummaries\u002Fa2a811b50a4c64f5-mrc-resilient-networking-for-100k-gpu-ai-training-summary","2026-05-11 15:04:27",{"title":3888,"description":62},{"loc":3944},"a2a811b50a4c64f5","OpenAI News","https:\u002F\u002Fopenai.com\u002Findex\u002Fmrc-supercomputer-networking","summaries\u002Fa2a811b50a4c64f5-mrc-resilient-networking-for-100k-gpu-ai-training-summary",[100,101,102],"OpenAI's MRC protocol uses multi-plane topologies and packet spraying across hundreds of paths with SRv6 source routing to eliminate congestion, route around failures in microseconds, and connect 131k GPUs with just two switch tiers, enabling non-stop frontier model training.",[],"BYXvfLzxxajQIir95xuUTVdTfvID4wPt3TOVHNxrCSU",{"id":3957,"title":3958,"ai":3959,"body":3964,"categories":4001,"created_at":71,"date_modified":71,"description":62,"extension":72,"faq":71,"featured":73,"kicker_label":71,"meta":4002,"navigation":87,"path":4011,"published_at":4012,"question":71,"scraped_at":4013,"seo":4014,"sitemap":4015,"source_id":4016,"source_name":4017,"source_type":95,"source_url":4018,"stem":4019,"tags":4020,"thumbnail_url":71,"tldr":4021,"tweet":71,"unknown_tags":4022,"__hash__":4023},"summaries\u002Fsummaries\u002F30072e6e8b386729-mrc-openai-s-protocol-for-resilient-ai-training-ne-summary.md","MRC: OpenAI's Protocol for Resilient AI Training Networks",{"provider":7,"model":8,"input_tokens":3960,"output_tokens":3961,"processing_time_ms":3962,"cost_usd":3963},8465,1915,20569,0.00214365,{"type":14,"value":3965,"toc":3996},[3966,3970,3973,3976,3979,3983,3986,3989,3993],[17,3967,3969],{"id":3968},"multipath-mechanisms-eliminate-congestion-and-enable-fast-recovery","Multipath Mechanisms Eliminate Congestion and Enable Fast Recovery",[22,3971,3972],{},"In large AI training clusters, network congestion, link failures, and jitter cause GPU idle time, amplifying costs as 
clusters scale to millions of data transfers per step. MRC builds on RoCEv2 for hardware-accelerated RDMA over Ethernet and SRv6 for static source routing, shifting intelligence to NICs while switches follow pre-configured paths blindly. This avoids interference from dynamic routing.",[22,3974,3975],{},"Adaptive packet spraying distributes packets across hundreds of paths at the NIC level, achieving higher bandwidth, reduced tail latency, and packet-level load balancing—unlike single-path RoCEv2. For failures, MRC detects issues in microseconds and reroutes: if an 8-port 800Gb\u002Fs NIC loses one port, it drops to 7\u002F8 capacity but recalculates paths instantly, notifies peers to avoid the failed plane, and restores it within a minute upon recovery. Conventional fabrics take seconds to tens of seconds, often crashing jobs; MRC keeps training alive with minimal performance hit.",[22,3977,3978],{},"AMD's NSCC congestion control integrates via UEC specs, preserving RDMA semantics while adding multipath support.",[17,3980,3982],{"id":3981},"multi-plane-architecture-cuts-tiers-costs-and-latency","Multi-Plane Architecture Cuts Tiers, Costs, and Latency",[22,3984,3985],{},"MRC reimagines NICs as multiple smaller links (e.g., one 800Gb\u002Fs interface split into eight 100Gb\u002Fs to eight switches), enabling a two-tier Clos network for 131,000 GPUs versus three-to-four tiers in 800Gb\u002Fs designs. Longest paths cross three switches instead of five-to-seven, slashing latency.",[22,3987,3988],{},"For full bisection bandwidth, this uses 2\u002F3 the optics and 3\u002F5 the switches of three-tier networks, reducing power, cost, and failure blast radius. 
A tier-1 switch failure (e.g., rebooting four during training) no longer halts jobs.",[17,3990,3992],{"id":3991},"production-on-named-hardware-across-openai-clusters","Production on Named Hardware Across OpenAI Clusters",[22,3994,3995],{},"Deployed on 400\u002F800Gb\u002Fs RDMA NICs like NVIDIA ConnectX-8, AMD Pollara\u002FVulcano, Broadcom Thor Ultra; SRv6 switches include NVIDIA Spectrum-4\u002F5 (Cumulus\u002FSONiC) and Broadcom Tomahawk 5 (Arista EOS). Powers NVIDIA GB200 supercomputers in OpenAI's Stargate (OCI Abilene, TX) and Microsoft's Fairwater (Atlanta\u002FWisconsin), training ChatGPT and Codex models without job interruptions from failures.",{"title":62,"searchDepth":63,"depth":63,"links":3997},[3998,3999,4000],{"id":3968,"depth":63,"text":3969},{"id":3981,"depth":63,"text":3982},{"id":3991,"depth":63,"text":3992},[430],{"content_references":4003,"triage":4009},[4004,4006],{"type":3937,"title":3938,"url":3939,"context":4005},"cited",{"type":3868,"title":4007,"url":4008,"context":3866},"MRC Supercomputer Networking Technical Details","https:\u002F\u002Fopenai.com\u002Findex\u002Fmrc-supercomputer-networking\u002F",{"relevance":3941,"novelty":3941,"quality":84,"actionability":63,"composite":3942,"reasoning":4010},"Category: AI & LLMs. The article discusses OpenAI's MRC protocol, which is relevant to AI infrastructure but lacks direct applicability for product builders looking for actionable insights. 
While it presents some new technical details about network optimization for AI training, it does not provide practical steps or frameworks that the audience can implement.","\u002Fsummaries\u002F30072e6e8b386729-mrc-openai-s-protocol-for-resilient-ai-training-ne-summary","2026-05-07 07:50:02","2026-05-07 11:24:11",{"title":3958,"description":62},{"loc":4011},"30072e6e8b386729","MarkTechPost","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F07\u002Fopenai-introduces-mrc-multipath-reliable-connection-a-new-open-networking-protocol-for-large-scale-ai-supercomputer-training-clusters\u002F","summaries\u002F30072e6e8b386729-mrc-openai-s-protocol-for-resilient-ai-training-ne-summary",[100,101,102],"OpenAI's MRC extends RoCE with multipath spraying, microsecond failure recovery via SRv6, and multi-plane designs to deliver predictable performance in 131k-GPU clusters, using 2\u002F3 fewer optics and 3\u002F5 fewer switches than traditional setups.",[],"XbDsma4E_5cuB3WLtPi6GgqSNlQtb2CdSK-eHkIrlrc",{"id":4025,"title":4026,"ai":4027,"body":4032,"categories":4060,"created_at":71,"date_modified":71,"description":62,"extension":72,"faq":71,"featured":73,"kicker_label":71,"meta":4061,"navigation":87,"path":4072,"published_at":4073,"question":71,"scraped_at":4074,"seo":4075,"sitemap":4076,"source_id":4077,"source_name":4078,"source_type":95,"source_url":4079,"stem":4080,"tags":4081,"thumbnail_url":71,"tldr":4082,"tweet":71,"unknown_tags":4083,"__hash__":4084},"summaries\u002Fsummaries\u002Ff78d6045a31221d2-mrc-enables-100k-gpu-clusters-with-resilient-multi-summary.md","MRC Enables 100k+ GPU Clusters with Resilient Multipath Networking",{"provider":7,"model":8,"input_tokens":4028,"output_tokens":4029,"processing_time_ms":4030,"cost_usd":4031},4244,1621,21683,0.00163665,{"type":14,"value":4033,"toc":4055},[4034,4038,4041,4045,4048,4052],[17,4035,4037],{"id":4036},"multipath-routing-fixes-core-bottlenecks-in-ai-training","Multipath Routing Fixes Core Bottlenecks in AI 
Training",[22,4039,4040],{},"MRC (Multipath Reliable Connection) eliminates congestion in AI supercomputers by distributing packets across hundreds of network paths simultaneously, rather than single paths. This delivers faster, more predictable GPU-to-GPU data transfers critical for training massive models. On failures—links, switches, or paths—MRC reroutes in microseconds, versus seconds or tens of seconds for standard 800 Gb\u002Fs fabrics. Result: Training jobs survive reboots and maintenance without stalls. OpenAI's multi-plane design connects over 100,000 GPUs using only two Ethernet switch tiers, slashing component count, power use, and costs compared to conventional three- or four-tier setups.",[17,4042,4044],{"id":4043},"proven-at-scale-on-frontier-supercomputers","Proven at Scale on Frontier Supercomputers",[22,4046,4047],{},"Deployed across OpenAI's largest NVIDIA GB200 clusters—including Oracle Cloud in Abilene, Texas, and Microsoft's Fairwater—MRC handled a real-world test during frontier model training for ChatGPT and Codex. Four tier-1 switches rebooted without coordinating with running jobs, proving zero-disruption resilience. This lets operators maintain networks mid-training, boosting uptime for trillion-parameter models where network stalls previously cost hours or days.",[17,4049,4051],{"id":4050},"open-standards-accelerate-adoption","Open Standards Accelerate Adoption",[22,4053,4054],{},"Specification released via Open Compute Project (OCP MRC 1.0), with contributions from AMD, Broadcom, Intel, Microsoft, and NVIDIA. 
Builders can implement now for Ethernet-based AI fabrics, avoiding proprietary lock-in while hitting supercomputer-scale performance.",{"title":62,"searchDepth":63,"depth":63,"links":4056},[4057,4058,4059],{"id":4036,"depth":63,"text":4037},{"id":4043,"depth":63,"text":4044},{"id":4050,"depth":63,"text":4051},[135],{"content_references":4062,"triage":4070},[4063,4065,4067],{"type":3937,"title":4064,"url":3939,"context":81},"Resilient AI Supercomputer Networking Using MRC and SRv6",{"type":3868,"title":3934,"publisher":4066,"url":3935,"context":81},"Open Compute Project",{"type":3868,"title":4068,"author":4069,"url":4008,"context":4005},"MRC Supercomputer Networking","OpenAI",{"relevance":3941,"novelty":3941,"quality":84,"actionability":63,"composite":3942,"reasoning":4071},"Category: AI & LLMs. The article discusses a new networking protocol that addresses bottlenecks in AI supercomputing, which is relevant to AI engineering. However, it lacks direct actionable insights for product builders on how to implement or leverage this technology in their own projects.","\u002Fsummaries\u002Ff78d6045a31221d2-mrc-enables-100k-gpu-clusters-with-resilient-multi-summary","2026-05-06 19:13:21","2026-05-07 11:24:04",{"title":4026,"description":62},{"loc":4072},"f78d6045a31221d2","The Decoder","https:\u002F\u002Fthe-decoder.com\u002Fopenai-built-a-networking-protocol-with-amd-broadcom-intel-microsoft-and-nvidia-to-fix-ai-supercomputer-bottlenecks\u002F","summaries\u002Ff78d6045a31221d2-mrc-enables-100k-gpu-clusters-with-resilient-multi-summary",[101,102,100],"OpenAI's MRC protocol spreads packets across hundreds of paths for microsecond failure recovery, connecting 100,000+ GPUs via just 2 switch tiers—cutting power, cost, and downtime in AI training supercomputers.",[],"LvMASfYTesYX0l3RENkA3FOBQpD3T6H-0KnDqYX6HvU"]