[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-df29e9b47ffb4ae6-financebench-llm-eval-dataset-for-sec-filing-qa-summary":3,"summaries-facets-categories":76,"summary-related-df29e9b47ffb4ae6-financebench-llm-eval-dataset-for-sec-filing-qa-summary":3661},{"id":4,"title":5,"ai":6,"body":13,"categories":46,"created_at":48,"date_modified":48,"description":40,"extension":49,"faq":48,"featured":50,"kicker_label":48,"meta":51,"navigation":58,"path":59,"published_at":48,"question":48,"scraped_at":60,"seo":61,"sitemap":62,"source_id":63,"source_name":64,"source_type":65,"source_url":66,"stem":67,"tags":68,"thumbnail_url":48,"tldr":73,"tweet":48,"unknown_tags":74,"__hash__":75},"summaries\u002Fsummaries\u002Fdf29e9b47ffb4ae6-financebench-llm-eval-dataset-for-sec-filing-qa-summary.md","FinanceBench: LLM Eval Dataset for SEC Filing QA",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",10599,1737,10323,0.00296565,{"type":14,"value":15,"toc":39},"minimark",[16,21,25,29,32,36],[17,18,20],"h2",{"id":19},"core-structure-enables-llm-financial-reasoning-benchmarks","Core Structure Enables LLM Financial Reasoning Benchmarks",[22,23,24],"p",{},"FinanceBench structures QA pairs from public company SEC filings (10K, 10Q, 8K) across sectors like Industrials (3M), IT (Adobe), Utilities (AES). Key columns include financebench_id, company, doc_name (e.g., 3M_2018_10K), question_type (metrics-generated, domain-relevant, novel-generated), question_reasoning (information extraction, numerical\u002Flogical reasoning), question, answer, justification, evidence (text snippets\u002Fpages), gics_sector, doc_type, doc_period (e.g., 2018-2023), doc_link. All subsets labeled OPEN_SOURCE. 
This enables testing LLMs on production-grade tasks: direct extraction (e.g., 3M FY2018 CAPEX $1577M from 'Purchases of PP&E'), calculated metrics (e.g., Adobe FY2015 operating cash flow ratio 0.66 = cash from ops \u002F current liabilities), multi-year averages (Activision Blizzard FY2017-19 capex\u002Frevenue 1.9%).",[17,26,28],{"id":27},"numerical-reasoning-tasks-build-real-world-ratios","Numerical Reasoning Tasks Build Real-World Ratios",[22,30,31],{},"The dataset stresses formula-based computations from balance sheets, income\u002Fcash flow statements. Examples: fixed asset turnover (Activision Blizzard FY2019: 24.26 = revenue \u002F avg PP&E); DPO (Amazon FY2017: 93.86 = 365 * avg payables \u002F (COGS + Δinventory)); inventory turnover (AES FY2022: 9.5 = cost of sales \u002F inventory); ROA (AES FY2022: -0.02 = net income \u002F avg total assets); FCF conversion (Adobe FY2022: improved from 143% to 156% = (ops cash - CAPEX) \u002F net income); YoY changes (Amazon revenue FY16-17: 30.8%; Adobe op income FY15-16: 65.4%). Justifications detail line items (e.g., 'Net cash provided by operating activities') and math steps, with evidence texts\u002Fpages for verifiability.",[17,33,35],{"id":34},"domain-relevant-and-novel-questions-test-analyst-insights","Domain-Relevant and Novel Questions Test Analyst Insights",[22,37,38],{},"Beyond extraction, the dataset probes qualitative\u002Fquantitative judgment: capital intensity (3M FY2022: no, via 5.1% CAPEX\u002Frevenue, 20% fixed assets\u002Ftotal assets, 12.4% ROA); liquidity (3M Q2 FY2023 quick ratio 0.96 = (current assets - inventory) \u002F current liabilities, needs improvement); operating margin drivers (3M FY2022 decline of 1.7% from litigation\u002FPFAS exit); segment growth (3M consumer -0.9% organic excluding M&A); dividend stability (3M: 65 consecutive years of increases); debt securities (3M Q2 2023: MMM26\u002F30\u002F31 on NYSE); restructuring costs (AES FY2022: 0, not outlined). 
Novel tasks like 'which segment is dragging growth' or 8K agendas (Amcor 2022: debt substitution) mimic analyst workflows, grounding LLMs in evidence-based reasoning over filings.",{"title":40,"searchDepth":41,"depth":41,"links":42},"",2,[43,44,45],{"id":19,"depth":41,"text":20},{"id":27,"depth":41,"text":28},{"id":34,"depth":41,"text":35},[47],"AI & LLMs",null,"md",false,{"content_references":52,"triage":53},[],{"relevance":54,"novelty":55,"quality":55,"actionability":41,"composite":56,"reasoning":57},3,4,3.25,"Category: AI & LLMs. The article provides a dataset for evaluating LLMs on financial QA tasks, which is relevant for AI developers looking to integrate financial reasoning into their products. However, while it presents novel insights into the dataset's structure and applications, it lacks actionable steps for implementation.",true,"\u002Fsummaries\u002Fdf29e9b47ffb4ae6-financebench-llm-eval-dataset-for-sec-filing-qa-summary","2026-04-16 02:57:08",{"title":5,"description":40},{"loc":59},"df29e9b47ffb4ae6","__oneoff__","article","https:\u002F\u002Fhuggingface.co\u002Fdatasets\u002FPatronusAI\u002Ffinancebench","summaries\u002Fdf29e9b47ffb4ae6-financebench-llm-eval-dataset-for-sec-filing-qa-summary",[69,70,71,72],"llm","data-science","machine-learning","research","FinanceBench benchmarks LLMs on 10,000+ financial QA tasks from real 10K\u002F10Q filings, covering metric extraction, numerical ratios like ROA (-0.02 for AES), and domain reasoning like liquidity via quick ratio (0.96 for 
3M).",[],"PVbgs9cbbO3dtOWaj0J_mTAqpv6rBJ4-p_CvQdpOaSc",[77,80,83,85,88,91,93,95,97,99,101,103,106,108,110,112,114,116,118,120,122,124,127,130,132,134,137,139,141,144,146,148,150,152,154,156,158,160,162,164,166,168,170,172,174,176,178,180,182,184,186,188,190,192,194,196,198,200,202,204,206,208,210,212,214,216,218,220,222,224,226,228,230,232,234,236,238,240,242,244,246,248,250,252,254,256,258,260,262,264,266,268,270,272,274,276,278,280,282,284,286,288,290,292,294,296,298,300,302,304,306,308,310,312,314,316,318,320,322,324,326,328,330,332,334,336,338,340,342,344,346,348,350,352,354,356,358,360,362,364,366,368,370,372,374,376,378,380,382,384,386,388,390,392,394,396,398,401,403,405,407,409,411,413,415,417,419,421,423,425,427,429,431,433,435,437,439,441,443,445,447,449,451,453,455,457,459,461,463,465,467,469,471,473,475,477,479,481,483,485,487,489,491,493,495,497,499,501,503,505,507,509,511,513,515,517,519,521,523,525,527,529,531,533,535,537,539,541,543,545,547,549,551,553,555,557,559,561,563,565,567,569,571,573,575,577,579,581,583,585,587,589,591,593,595,597,599,601,603,605,607,609,611,613,615,617,619,621,623,625,627,629,631,633,635,637,639,641,643,645,647,649,651,653,655,657,659,661,663,665,667,669,671,673,675,677,679,681,683,685,687,689,691,693,695,697,699,701,703,705,707,709,711,713,715,717,719,721,723,725,727,729,731,733,735,737,739,741,743,745,747,749,751,753,755,757,759,761,763,765,767,769,771,773,775,777,779,781,783,785,787,789,791,793,795,797,799,801,803,805,807,809,811,813,815,817,819,821,823,825,827,829,831,833,835,837,839,841,843,845,847,849,851,853,855,857,859,861,863,865,867,869,871,873,875,877,879,881,883,885,887,889,891,893,895,897,899,901,903,905,907,909,911,913,915,917,919,921,923,925,927,929,931,933,935,937,939,941,943,945,947,949,951,953,955,957,959,961,963,965,967,969,971,973,975,977,979,981,983,985,987,989,991,993,995,997,999,1001,1003,1005,1007,1009,1011,1013,1015,1017,1019,1021,1023,1025,1027,1029,1031,1033,1035,1037,1039,1041,1043,1045,1047,1049,1
051,1053,1055,1057,1059,1061,1063,1065,1067,1069,1071,1073,1075,1077,1079,1081,1083,1085,1087,1089,1091,1093,1095,1097,1099,1101,1103,1105,1107,1109,1111,1113,1115,1117,1119,1121,1123,1125,1127,1129,1131,1133,1135,1137,1139,1141,1143,1145,1147,1149,1151,1153,1155,1157,1159,1161,1163,1165,1167,1169,1171,1173,1175,1177,1179,1181,1183,1185,1187,1189,1191,1193,1195,1197,1199,1201,1203,1205,1207,1209,1211,1213,1215,1217,1219,1221,1223,1225,1227,1229,1231,1233,1235,1237,1239,1241,1243,1245,1247,1249,1251,1253,1255,1257,1259,1261,1263,1265,1267,1269,1271,1273,1275,1277,1279,1281,1283,1285,1287,1289,1291,1293,1295,1297,1299,1301,1303,1305,1307,1309,1311,1313,1315,1317,1319,1321,1323,1325,1327,1329,1331,1333,1335,1337,1339,1341,1343,1345,1347,1349,1351,1353,1355,1357,1359,1361,1363,1365,1367,1369,1371,1373,1375,1377,1379,1381,1383,1385,1387,1389,1391,1393,1395,1397,1399,1401,1403,1405,1407,1409,1411,1413,1415,1417,1419,1421,1423,1425,1427,1429,1431,1433,1435,1437,1439,1441,1443,1445,1447,1449,1451,1453,1455,1457,1459,1461,1463,1465,1467,1469,1471,1473,1475,1477,1479,1481,1483,1485,1487,1489,1491,1493,1495,1497,1499,1501,1503,1505,1507,1509,1511,1513,1515,1517,1519,1521,1523,1525,1527,1529,1531,1533,1535,1537,1539,1541,1543,1545,1547,1549,1551,1553,1555,1557,1559,1561,1563,1565,1567,1569,1571,1573,1575,1577,1579,1581,1583,1585,1587,1589,1591,1593,1595,1597,1599,1601,1603,1605,1607,1609,1611,1613,1615,1617,1619,1621,1623,1625,1627,1629,1631,1633,1635,1637,1639,1641,1643,1645,1647,1649,1651,1653,1655,1657,1659,1661,1663,1665,1667,1669,1671,1673,1675,1677,1679,1681,1683,1685,1687,1689,1691,1693,1695,1697,1699,1701,1703,1705,1707,1709,1711,1713,1715,1717,1719,1721,1723,1725,1727,1729,1731,1733,1735,1737,1739,1741,1743,1745,1747,1749,1751,1753,1755,1757,1759,1761,1763,1765,1767,1769,1771,1773,1775,1777,1779,1781,1783,1785,1787,1789,1791,1793,1795,1797,1799,1801,1803,1805,1807,1809,1811,1813,1815,1817,1819,1821,1823,1825,1827,1829,1831,1833,1835,1837,1839,1841,1843,1845,1847,1849,1
851,1853,1855,1857,1859,1861,1863,1865,1867,1869,1871,1873,1875,1877,1879,1881,1883,1885,1887,1889,1891,1893,1895,1897,1899,1901,1903,1905,1907,1909,1911,1913,1915,1917,1919,1921,1923,1925,1927,1929,1931,1933,1935,1937,1939,1941,1943,1945,1947,1949,1951,1953,1955,1957,1959,1961,1963,1965,1967,1969,1971,1973,1975,1977,1979,1981,1983,1985,1987,1989,1991,1993,1995,1997,1999,2001,2003,2005,2007,2009,2011,2013,2015,2017,2019,2021,2023,2025,2027,2029,2031,2033,2035,2037,2039,2041,2043,2045,2047,2049,2051,2053,2055,2057,2059,2061,2063,2065,2067,2069,2071,2073,2075,2077,2079,2081,2083,2085,2087,2089,2091,2093,2095,2097,2099,2101,2103,2105,2107,2109,2111,2113,2115,2117,2119,2121,2123,2125,2127,2129,2131,2133,2135,2137,2139,2141,2143,2145,2147,2149,2151,2153,2155,2157,2159,2161,2163,2165,2167,2169,2171,2173,2175,2177,2179,2181,2183,2185,2187,2189,2191,2193,2195,2197,2199,2201,2203,2205,2207,2209,2211,2213,2215,2217,2219,2221,2223,2225,2227,2229,2231,2233,2235,2237,2239,2241,2243,2245,2247,2249,2251,2253,2255,2257,2259,2261,2263,2265,2267,2269,2271,2273,2275,2277,2279,2281,2283,2285,2287,2289,2291,2293,2295,2297,2299,2301,2303,2305,2307,2309,2311,2313,2315,2317,2319,2321,2323,2325,2327,2329,2331,2333,2335,2337,2339,2341,2343,2345,2347,2349,2351,2353,2355,2357,2359,2361,2363,2365,2367,2369,2371,2373,2375,2377,2379,2381,2383,2385,2387,2389,2391,2393,2395,2397,2399,2401,2403,2405,2407,2409,2411,2413,2415,2417,2419,2421,2423,2425,2427,2429,2431,2433,2435,2437,2439,2441,2443,2445,2447,2449,2451,2453,2455,2457,2459,2461,2463,2465,2467,2469,2471,2473,2475,2477,2479,2481,2483,2485,2487,2489,2491,2493,2495,2497,2499,2501,2503,2505,2507,2509,2511,2513,2515,2517,2519,2521,2523,2525,2527,2529,2531,2533,2535,2537,2539,2541,2543,2545,2547,2549,2551,2553,2555,2557,2559,2561,2563,2565,2567,2569,2571,2573,2575,2577,2579,2581,2583,2585,2587,2589,2591,2593,2595,2597,2599,2601,2603,2605,2607,2609,2611,2613,2615,2617,2619,2621,2623,2625,2627,2629,2631,2633,2635,2637,2639,2641,2643,2645,2647,2649,2
651,2653,2655,2657,2659,2661,2663,2665,2667,2669,2671,2673,2675,2677,2679,2681,2683,2685,2687,2689,2691,2693,2695,2697,2699,2701,2703,2705,2707,2709,2711,2713,2715,2717,2719,2721,2723,2725,2727,2729,2731,2733,2735,2737,2739,2741,2743,2745,2747,2749,2751,2753,2755,2757,2759,2761,2763,2765,2767,2769,2771,2773,2775,2777,2779,2781,2783,2785,2787,2789,2791,2793,2795,2797,2799,2801,2803,2805,2807,2809,2811,2813,2815,2817,2819,2821,2823,2825,2827,2829,2831,2833,2835,2837,2839,2841,2843,2845,2847,2849,2851,2853,2855,2857,2859,2861,2863,2865,2867,2869,2871,2873,2875,2877,2879,2881,2883,2885,2887,2889,2891,2893,2895,2897,2899,2901,2903,2905,2907,2909,2911,2913,2915,2917,2919,2921,2923,2925,2927,2929,2931,2933,2935,2937,2939,2941,2943,2945,2947,2949,2951,2953,2955,2957,2959,2961,2963,2965,2967,2969,2971,2973,2975,2977,2979,2981,2983,2985,2987,2989,2991,2993,2995,2997,2999,3001,3003,3005,3007,3009,3011,3013,3015,3017,3019,3021,3023,3025,3027,3029,3031,3033,3035,3037,3039,3041,3043,3045,3047,3049,3051,3053,3055,3057,3059,3061,3063,3065,3067,3069,3071,3073,3075,3077,3079,3081,3083,3085,3087,3089,3091,3093,3095,3097,3099,3101,3103,3105,3107,3109,3111,3113,3115,3117,3119,3121,3123,3125,3127,3129,3131,3133,3135,3137,3139,3141,3143,3145,3147,3149,3151,3153,3155,3157,3159,3161,3163,3165,3167,3169,3171,3173,3175,3177,3179,3181,3183,3185,3187,3189,3191,3193,3195,3197,3199,3201,3203,3205,3207,3209,3211,3213,3215,3217,3219,3221,3223,3225,3227,3229,3231,3233,3235,3237,3239,3241,3243,3245,3247,3249,3251,3253,3255,3257,3259,3261,3263,3265,3267,3269,3271,3273,3275,3277,3279,3281,3283,3285,3287,3289,3291,3293,3295,3297,3299,3301,3303,3305,3307,3309,3311,3313,3315,3317,3319,3321,3323,3325,3327,3329,3331,3333,3335,3337,3339,3341,3343,3345,3347,3349,3351,3353,3355,3357,3359,3361,3363,3365,3367,3369,3371,3373,3375,3377,3379,3381,3383,3385,3387,3389,3391,3393,3395,3397,3399,3401,3403,3405,3407,3409,3411,3413,3415,3417,3419,3421,3423,3425,3427,3429,3431,3433,3435,3437,3439,3441,3443,3445,3447,3449,3
451,3453,3455,3457,3459,3461,3463,3465,3467,3469,3471,3473,3475,3477,3479,3481,3483,3485,3487,3489,3491,3493,3495,3497,3499,3501,3503,3505,3507,3509,3511,3513,3515,3517,3519,3521,3523,3525,3527,3529,3531,3533,3535,3537,3539,3541,3543,3545,3547,3549,3551,3553,3555,3557,3559,3561,3563,3565,3567,3569,3571,3573,3575,3577,3579,3581,3583,3585,3587,3589,3591,3593,3595,3597,3599,3601,3603,3605,3607,3609,3611,3613,3615,3617,3619,3621,3623,3625,3627,3629,3631,3633,3635,3637,3639,3641,3643,3645,3647,3649,3651,3653,3655,3657,3659],{"categories":78},[79],"Developer Productivity",{"categories":81},[82],"Business & SaaS",{"categories":84},[47],{"categories":86},[87],"AI Automation",{"categories":89},[90],"Product Strategy",{"categories":92},[47],{"categories":94},[79],{"categories":96},[82],{"categories":98},[],{"categories":100},[47],{"categories":102},[],{"categories":104},[105],"AI News & Trends",{"categories":107},[87],{"categories":109},[105],{"categories":111},[87],{"categories":113},[87],{"categories":115},[47],{"categories":117},[47],{"categories":119},[105],{"categories":121},[47],{"categories":123},[],{"categories":125},[126],"Design & Frontend",{"categories":128},[129],"Data Science & Visualization",{"categories":131},[105],{"categories":133},[],{"categories":135},[136],"Software Engineering",{"categories":138},[47],{"categories":140},[87],{"categories":142},[143],"Marketing & 
Growth",{"categories":145},[47],{"categories":147},[87],{"categories":149},[],{"categories":151},[],{"categories":153},[126],{"categories":155},[87],{"categories":157},[79],{"categories":159},[126],{"categories":161},[47],{"categories":163},[87],{"categories":165},[105],{"categories":167},[],{"categories":169},[],{"categories":171},[87],{"categories":173},[136],{"categories":175},[],{"categories":177},[82],{"categories":179},[],{"categories":181},[],{"categories":183},[87],{"categories":185},[87],{"categories":187},[47],{"categories":189},[],{"categories":191},[136],{"categories":193},[],{"categories":195},[],{"categories":197},[],{"categories":199},[47],{"categories":201},[143],{"categories":203},[126],{"categories":205},[126],{"categories":207},[47],{"categories":209},[87],{"categories":211},[47],{"categories":213},[47],{"categories":215},[87],{"categories":217},[87],{"categories":219},[129],{"categories":221},[105],{"categories":223},[87],{"categories":225},[143],{"categories":227},[87],{"categories":229},[90],{"categories":231},[],{"categories":233},[87],{"categories":235},[],{"categories":237},[87],{"categories":239},[136],{"categories":241},[126],{"categories":243},[47],{"categories":245},[],{"categories":247},[],{"categories":249},[87],{"categories":251},[],{"categories":253},[47],{"categories":255},[],{"categories":257},[79],{"categories":259},[136],{"categories":261},[82],{"categories":263},[105],{"categories":265},[47],{"categories":267},[],{"categories":269},[47],{"categories":271},[],{"categories":273},[136],{"categories":275},[129],{"categories":277},[],{"categories":279},[47],{"categories":281},[126],{"categories":283},[],{"categories":285},[126],{"categories":287},[87],{"categories":289},[],{"categories":291},[87],{"categories":293},[105],{"categories":295},[82],{"categories":297},[47],{"categories":299},[],{"categories":301},[87],{"categories":303},[47],{"categories":305},[90],{"categories":307},[],{"categories":309},[47],{"categories":311},[87],{"ca
tegories":313},[87],{"categories":315},[],{"categories":317},[129],{"categories":319},[47],{"categories":321},[],{"categories":323},[79],{"categories":325},[82],{"categories":327},[47],{"categories":329},[87],{"categories":331},[136],{"categories":333},[47],{"categories":335},[],{"categories":337},[],{"categories":339},[47],{"categories":341},[],{"categories":343},[126],{"categories":345},[],{"categories":347},[47],{"categories":349},[],{"categories":351},[87],{"categories":353},[47],{"categories":355},[126],{"categories":357},[],{"categories":359},[47],{"categories":361},[47],{"categories":363},[82],{"categories":365},[87],{"categories":367},[47],{"categories":369},[126],{"categories":371},[87],{"categories":373},[],{"categories":375},[],{"categories":377},[105],{"categories":379},[],{"categories":381},[47],{"categories":383},[82,143],{"categories":385},[],{"categories":387},[47],{"categories":389},[],{"categories":391},[],{"categories":393},[47],{"categories":395},[],{"categories":397},[47],{"categories":399},[400],"DevOps & 
Cloud",{"categories":402},[],{"categories":404},[105],{"categories":406},[126],{"categories":408},[],{"categories":410},[105],{"categories":412},[105],{"categories":414},[47],{"categories":416},[143],{"categories":418},[],{"categories":420},[82],{"categories":422},[],{"categories":424},[47,400],{"categories":426},[47],{"categories":428},[47],{"categories":430},[87],{"categories":432},[47,136],{"categories":434},[129],{"categories":436},[47],{"categories":438},[143],{"categories":440},[87],{"categories":442},[87],{"categories":444},[],{"categories":446},[87],{"categories":448},[47,82],{"categories":450},[],{"categories":452},[126],{"categories":454},[126],{"categories":456},[],{"categories":458},[],{"categories":460},[105],{"categories":462},[],{"categories":464},[79],{"categories":466},[136],{"categories":468},[47],{"categories":470},[126],{"categories":472},[87],{"categories":474},[136],{"categories":476},[105],{"categories":478},[126],{"categories":480},[],{"categories":482},[47],{"categories":484},[47],{"categories":486},[47],{"categories":488},[105],{"categories":490},[79],{"categories":492},[47],{"categories":494},[87],{"categories":496},[400],{"categories":498},[126],{"categories":500},[87],{"categories":502},[],{"categories":504},[],{"categories":506},[126],{"categories":508},[105],{"categories":510},[129],{"categories":512},[],{"categories":514},[47],{"categories":516},[47],{"categories":518},[82],{"categories":520},[47],{"categories":522},[47],{"categories":524},[105],{"categories":526},[],{"categories":528},[87],{"categories":530},[136],{"categories":532},[],{"categories":534},[47],{"categories":536},[47],{"categories":538},[87],{"categories":540},[],{"categories":542},[],{"categories":544},[47],{"categories":546},[],{"categories":548},[82],{"categories":550},[87],{"categories":552},[],{"categories":554},[79],{"categories":556},[47],{"categories":558},[82],{"categories":560},[105],{"categories":562},[],{"categories":564},[],{"categories":566},[],{"categori
es":568},[105],{"categories":570},[105],{"categories":572},[],{"categories":574},[],{"categories":576},[82],{"categories":578},[],{"categories":580},[],{"categories":582},[79],{"categories":584},[],{"categories":586},[143],{"categories":588},[87],{"categories":590},[82],{"categories":592},[87],{"categories":594},[136],{"categories":596},[],{"categories":598},[90],{"categories":600},[126],{"categories":602},[136],{"categories":604},[47],{"categories":606},[87],{"categories":608},[82],{"categories":610},[47],{"categories":612},[],{"categories":614},[],{"categories":616},[136],{"categories":618},[129],{"categories":620},[90],{"categories":622},[87],{"categories":624},[47],{"categories":626},[],{"categories":628},[400],{"categories":630},[],{"categories":632},[87],{"categories":634},[],{"categories":636},[],{"categories":638},[47],{"categories":640},[126],{"categories":642},[143],{"categories":644},[87],{"categories":646},[],{"categories":648},[79],{"categories":650},[],{"categories":652},[105],{"categories":654},[47,400],{"categories":656},[105],{"categories":658},[47],{"categories":660},[82],{"categories":662},[47],{"categories":664},[],{"categories":666},[82],{"categories":668},[],{"categories":670},[136],{"categories":672},[126],{"categories":674},[105],{"categories":676},[129],{"categories":678},[79],{"categories":680},[47],{"categories":682},[136],{"categories":684},[],{"categories":686},[],{"categories":688},[90],{"categories":690},[],{"categories":692},[47],{"categories":694},[],{"categories":696},[126],{"categories":698},[126],{"categories":700},[126],{"categories":702},[],{"categories":704},[],{"categories":706},[105],{"categories":708},[87],{"categories":710},[47],{"categories":712},[47],{"categories":714},[47],{"categories":716},[82],{"categories":718},[47],{"categories":720},[],{"categories":722},[136],{"categories":724},[136],{"categories":726},[82],{"categories":728},[],{"categories":730},[47],{"categories":732},[47],{"categories":734},[82],{"categories":
736},[105],{"categories":738},[143],{"categories":740},[87],{"categories":742},[],{"categories":744},[126],{"categories":746},[],{"categories":748},[47],{"categories":750},[],{"categories":752},[82],{"categories":754},[87],{"categories":756},[],{"categories":758},[400],{"categories":760},[129],{"categories":762},[136],{"categories":764},[143],{"categories":766},[136],{"categories":768},[87],{"categories":770},[],{"categories":772},[],{"categories":774},[87],{"categories":776},[79],{"categories":778},[87],{"categories":780},[90],{"categories":782},[82],{"categories":784},[],{"categories":786},[47],{"categories":788},[90],{"categories":790},[47],{"categories":792},[47],{"categories":794},[143],{"categories":796},[126],{"categories":798},[87],{"categories":800},[],{"categories":802},[],{"categories":804},[400],{"categories":806},[136],{"categories":808},[],{"categories":810},[87],{"categories":812},[47],{"categories":814},[126,47],{"categories":816},[79],{"categories":818},[],{"categories":820},[47],{"categories":822},[79],{"categories":824},[126],{"categories":826},[87],{"categories":828},[136],{"categories":830},[],{"categories":832},[47],{"categories":834},[],{"categories":836},[79],{"categories":838},[],{"categories":840},[87],{"categories":842},[90],{"categories":844},[47],{"categories":846},[47],{"categories":848},[126],{"categories":850},[87],{"categories":852},[400],{"categories":854},[126],{"categories":856},[87],{"categories":858},[47],{"categories":860},[47],{"categories":862},[47],{"categories":864},[105],{"categories":866},[],{"categories":868},[90],{"categories":870},[87],{"categories":872},[126],{"categories":874},[87],{"categories":876},[136],{"categories":878},[126],{"categories":880},[87],{"categories":882},[105],{"categories":884},[],{"categories":886},[47],{"categories":888},[126],{"categories":890},[47],{"categories":892},[79],{"categories":894},[105],{"categories":896},[47],{"categories":898},[143],{"categories":900},[47],{"categories":902},[47],{
"categories":904},[87],{"categories":906},[87],{"categories":908},[47],{"categories":910},[87],{"categories":912},[126],{"categories":914},[47],{"categories":916},[],{"categories":918},[],{"categories":920},[136],{"categories":922},[],{"categories":924},[79],{"categories":926},[400],{"categories":928},[],{"categories":930},[79],{"categories":932},[82],{"categories":934},[143],{"categories":936},[],{"categories":938},[82],{"categories":940},[],{"categories":942},[],{"categories":944},[],{"categories":946},[],{"categories":948},[],{"categories":950},[47],{"categories":952},[87],{"categories":954},[400],{"categories":956},[79],{"categories":958},[47],{"categories":960},[136],{"categories":962},[90],{"categories":964},[47],{"categories":966},[143],{"categories":968},[47],{"categories":970},[47],{"categories":972},[47],{"categories":974},[47,79],{"categories":976},[136],{"categories":978},[136],{"categories":980},[126],{"categories":982},[47],{"categories":984},[],{"categories":986},[],{"categories":988},[],{"categories":990},[136],{"categories":992},[129],{"categories":994},[105],{"categories":996},[126],{"categories":998},[],{"categories":1000},[47],{"categories":1002},[47],{"categories":1004},[],{"categories":1006},[],{"categories":1008},[87],{"categories":1010},[47],{"categories":1012},[82],{"categories":1014},[],{"categories":1016},[79],{"categories":1018},[47],{"categories":1020},[79],{"categories":1022},[47],{"categories":1024},[136],{"categories":1026},[143],{"categories":1028},[47,126],{"categories":1030},[105],{"categories":1032},[126],{"categories":1034},[],{"categories":1036},[400],{"categories":1038},[126],{"categories":1040},[87],{"categories":1042},[],{"categories":1044},[],{"categories":1046},[],{"categories":1048},[],{"categories":1050},[136],{"categories":1052},[87],{"categories":1054},[87],{"categories":1056},[400],{"categories":1058},[47],{"categories":1060},[47],{"categories":1062},[47],{"categories":1064},[],{"categories":1066},[126],{"categories":1
068},[],{"categories":1070},[],{"categories":1072},[87],{"categories":1074},[],{"categories":1076},[],{"categories":1078},[143],{"categories":1080},[143],{"categories":1082},[87],{"categories":1084},[],{"categories":1086},[47],{"categories":1088},[47],{"categories":1090},[136],{"categories":1092},[126],{"categories":1094},[126],{"categories":1096},[87],{"categories":1098},[79],{"categories":1100},[47],{"categories":1102},[126],{"categories":1104},[126],{"categories":1106},[87],{"categories":1108},[87],{"categories":1110},[47],{"categories":1112},[],{"categories":1114},[],{"categories":1116},[47],{"categories":1118},[87],{"categories":1120},[105],{"categories":1122},[136],{"categories":1124},[79],{"categories":1126},[47],{"categories":1128},[],{"categories":1130},[87],{"categories":1132},[87],{"categories":1134},[],{"categories":1136},[79],{"categories":1138},[47],{"categories":1140},[79],{"categories":1142},[79],{"categories":1144},[],{"categories":1146},[],{"categories":1148},[87],{"categories":1150},[87],{"categories":1152},[47],{"categories":1154},[47],{"categories":1156},[105],{"categories":1158},[129],{"categories":1160},[90],{"categories":1162},[105],{"categories":1164},[126],{"categories":1166},[],{"categories":1168},[105],{"categories":1170},[],{"categories":1172},[],{"categories":1174},[],{"categories":1176},[],{"categories":1178},[136],{"categories":1180},[129],{"categories":1182},[],{"categories":1184},[47],{"categories":1186},[47],{"categories":1188},[129],{"categories":1190},[136],{"categories":1192},[],{"categories":1194},[],{"categories":1196},[87],{"categories":1198},[105],{"categories":1200},[105],{"categories":1202},[87],{"categories":1204},[79],{"categories":1206},[47,400],{"categories":1208},[],{"categories":1210},[126],{"categories":1212},[79],{"categories":1214},[87],{"categories":1216},[126],{"categories":1218},[],{"categories":1220},[87],{"categories":1222},[87],{"categories":1224},[47],{"categories":1226},[143],{"categories":1228},[136],{"ca
tegories":1230},[126],{"categories":1232},[],{"categories":1234},[87],{"categories":1236},[47],{"categories":1238},[87],{"categories":1240},[87],{"categories":1242},[87],{"categories":1244},[143],{"categories":1246},[87],{"categories":1248},[47],{"categories":1250},[],{"categories":1252},[143],{"categories":1254},[105],{"categories":1256},[87],{"categories":1258},[],{"categories":1260},[],{"categories":1262},[47],{"categories":1264},[87],{"categories":1266},[105],{"categories":1268},[87],{"categories":1270},[],{"categories":1272},[],{"categories":1274},[],{"categories":1276},[87],{"categories":1278},[],{"categories":1280},[],{"categories":1282},[129],{"categories":1284},[47],{"categories":1286},[129],{"categories":1288},[105],{"categories":1290},[47],{"categories":1292},[47],{"categories":1294},[87],{"categories":1296},[47],{"categories":1298},[],{"categories":1300},[],{"categories":1302},[400],{"categories":1304},[],{"categories":1306},[],{"categories":1308},[79],{"categories":1310},[],{"categories":1312},[],{"categories":1314},[],{"categories":1316},[],{"categories":1318},[136],{"categories":1320},[105],{"categories":1322},[143],{"categories":1324},[82],{"categories":1326},[47],{"categories":1328},[47],{"categories":1330},[82],{"categories":1332},[],{"categories":1334},[126],{"categories":1336},[87],{"categories":1338},[82],{"categories":1340},[47],{"categories":1342},[47],{"categories":1344},[79],{"categories":1346},[],{"categories":1348},[79],{"categories":1350},[47],{"categories":1352},[143],{"categories":1354},[87],{"categories":1356},[105],{"categories":1358},[82],{"categories":1360},[47],{"categories":1362},[87],{"categories":1364},[],{"categories":1366},[47],{"categories":1368},[79],{"categories":1370},[47],{"categories":1372},[],{"categories":1374},[105],{"categories":1376},[47],{"categories":1378},[],{"categories":1380},[82],{"categories":1382},[47],{"categories":1384},[],{"categories":1386},[],{"categories":1388},[],{"categories":1390},[47],{"categories"
:1392},[],{"categories":1394},[400],{"categories":1396},[47],{"categories":1398},[],{"categories":1400},[47],{"categories":1402},[47],{"categories":1404},[47],{"categories":1406},[47,400],{"categories":1408},[47],{"categories":1410},[47],{"categories":1412},[126],{"categories":1414},[87],{"categories":1416},[],{"categories":1418},[87],{"categories":1420},[47],{"categories":1422},[47],{"categories":1424},[47],{"categories":1426},[79],{"categories":1428},[79],{"categories":1430},[136],{"categories":1432},[126],{"categories":1434},[87],{"categories":1436},[],{"categories":1438},[47],{"categories":1440},[105],{"categories":1442},[47],{"categories":1444},[82],{"categories":1446},[],{"categories":1448},[400],{"categories":1450},[126],{"categories":1452},[126],{"categories":1454},[87],{"categories":1456},[105],{"categories":1458},[87],{"categories":1460},[47],{"categories":1462},[],{"categories":1464},[47],{"categories":1466},[],{"categories":1468},[],{"categories":1470},[47],{"categories":1472},[47],{"categories":1474},[47],{"categories":1476},[87],{"categories":1478},[47],{"categories":1480},[],{"categories":1482},[129],{"categories":1484},[87],{"categories":1486},[],{"categories":1488},[],{"categories":1490},[47],{"categories":1492},[105],{"categories":1494},[],{"categories":1496},[126],{"categories":1498},[400],{"categories":1500},[105],{"categories":1502},[136],{"categories":1504},[136],{"categories":1506},[105],{"categories":1508},[105],{"categories":1510},[400],{"categories":1512},[],{"categories":1514},[105],{"categories":1516},[47],{"categories":1518},[79],{"categories":1520},[105],{"categories":1522},[],{"categories":1524},[129],{"categories":1526},[105],{"categories":1528},[136],{"categories":1530},[105],{"categories":1532},[400],{"categories":1534},[47],{"categories":1536},[47],{"categories":1538},[],{"categories":1540},[82],{"categories":1542},[],{"categories":1544},[],{"categories":1546},[47],{"categories":1548},[47],{"categories":1550},[47],{"categories":155
2},[47],{"categories":1554},[],{"categories":1556},[129],{"categories":1558},[79],{"categories":1560},[],{"categories":1562},[47],{"categories":1564},[47],{"categories":1566},[400],{"categories":1568},[400],{"categories":1570},[],{"categories":1572},[87],{"categories":1574},[105],{"categories":1576},[105],{"categories":1578},[47],{"categories":1580},[87],{"categories":1582},[],{"categories":1584},[126],{"categories":1586},[47],{"categories":1588},[47],{"categories":1590},[],{"categories":1592},[],{"categories":1594},[400],{"categories":1596},[47],{"categories":1598},[136],{"categories":1600},[82],{"categories":1602},[47],{"categories":1604},[],{"categories":1606},[87],{"categories":1608},[79],{"categories":1610},[79],{"categories":1612},[],{"categories":1614},[47],{"categories":1616},[126],{"categories":1618},[87],{"categories":1620},[],{"categories":1622},[47],{"categories":1624},[47],{"categories":1626},[87],{"categories":1628},[],{"categories":1630},[87],{"categories":1632},[136],{"categories":1634},[],{"categories":1636},[47],{"categories":1638},[],{"categories":1640},[47],{"categories":1642},[],{"categories":1644},[47],{"categories":1646},[47],{"categories":1648},[],{"categories":1650},[47],{"categories":1652},[105],{"categories":1654},[47],{"categories":1656},[47],{"categories":1658},[79],{"categories":1660},[47],{"categories":1662},[105],{"categories":1664},[87],{"categories":1666},[],{"categories":1668},[47],{"categories":1670},[143],{"categories":1672},[],{"categories":1674},[],{"categories":1676},[],{"categories":1678},[79],{"categories":1680},[105],{"categories":1682},[87],{"categories":1684},[47],{"categories":1686},[126],{"categories":1688},[87],{"categories":1690},[],{"categories":1692},[87],{"categories":1694},[],{"categories":1696},[47],{"categories":1698},[87],{"categories":1700},[47],{"categories":1702},[],{"categories":1704},[47],{"categories":1706},[47],{"categories":1708},[105],{"categories":1710},[126],{"categories":1712},[87],{"categories":171
4},[126],{"categories":1716},[82],{"categories":1718},[],{"categories":1720},[],{"categories":1722},[47],{"categories":1724},[79],{"categories":1726},[105],{"categories":1728},[],{"categories":1730},[],{"categories":1732},[136],{"categories":1734},[126],{"categories":1736},[],{"categories":1738},[47],{"categories":1740},[],{"categories":1742},[143],{"categories":1744},[47],{"categories":1746},[400],{"categories":1748},[136],{"categories":1750},[],{"categories":1752},[87],{"categories":1754},[47],{"categories":1756},[87],{"categories":1758},[87],{"categories":1760},[47],{"categories":1762},[],{"categories":1764},[79],{"categories":1766},[47],{"categories":1768},[82],{"categories":1770},[136],{"categories":1772},[126],{"categories":1774},[],{"categories":1776},[],{"categories":1778},[],{"categories":1780},[87],{"categories":1782},[126],{"categories":1784},[105],{"categories":1786},[47],{"categories":1788},[105],{"categories":1790},[126],{"categories":1792},[],{"categories":1794},[126],{"categories":1796},[105],{"categories":1798},[82],{"categories":1800},[47],{"categories":1802},[105],{"categories":1804},[143],{"categories":1806},[],{"categories":1808},[],{"categories":1810},[129],{"categories":1812},[47,136],{"categories":1814},[105],{"categories":1816},[47],{"categories":1818},[87],{"categories":1820},[87],{"categories":1822},[47],{"categories":1824},[],{"categories":1826},[136],{"categories":1828},[47],{"categories":1830},[129],{"categories":1832},[87],{"categories":1834},[143],{"categories":1836},[400],{"categories":1838},[],{"categories":1840},[79],{"categories":1842},[87],{"categories":1844},[87],{"categories":1846},[136],{"categories":1848},[47],{"categories":1850},[47],{"categories":1852},[],{"categories":1854},[],{"categories":1856},[],{"categories":1858},[400],{"categories":1860},[105],{"categories":1862},[47],{"categories":1864},[47],{"categories":1866},[47],{"categories":1868},[],{"categories":1870},[129],{"categories":1872},[82],{"categories":1874},[],{"c
ategories":1876},[87],{"categories":1878},[400],{"categories":1880},[],{"categories":1882},[126],{"categories":1884},[126],{"categories":1886},[],{"categories":1888},[136],{"categories":1890},[126],{"categories":1892},[47],{"categories":1894},[],{"categories":1896},[105],{"categories":1898},[47],{"categories":1900},[126],{"categories":1902},[87],{"categories":1904},[105],{"categories":1906},[],{"categories":1908},[87],{"categories":1910},[126],{"categories":1912},[47],{"categories":1914},[],{"categories":1916},[47],{"categories":1918},[47],{"categories":1920},[400],{"categories":1922},[105],{"categories":1924},[129],{"categories":1926},[129],{"categories":1928},[],{"categories":1930},[],{"categories":1932},[],{"categories":1934},[87],{"categories":1936},[136],{"categories":1938},[136],{"categories":1940},[],{"categories":1942},[],{"categories":1944},[47],{"categories":1946},[],{"categories":1948},[87],{"categories":1950},[47],{"categories":1952},[],{"categories":1954},[47],{"categories":1956},[82],{"categories":1958},[47],{"categories":1960},[143],{"categories":1962},[87],{"categories":1964},[47],{"categories":1966},[136],{"categories":1968},[105],{"categories":1970},[87],{"categories":1972},[],{"categories":1974},[105],{"categories":1976},[87],{"categories":1978},[87],{"categories":1980},[],{"categories":1982},[82],{"categories":1984},[87],{"categories":1986},[],{"categories":1988},[47],{"categories":1990},[79],{"categories":1992},[105],{"categories":1994},[400],{"categories":1996},[87],{"categories":1998},[87],{"categories":2000},[79],{"categories":2002},[47],{"categories":2004},[],{"categories":2006},[],{"categories":2008},[126],{"categories":2010},[47,82],{"categories":2012},[],{"categories":2014},[79],{"categories":2016},[129],{"categories":2018},[47],{"categories":2020},[136],{"categories":2022},[47],{"categories":2024},[87],{"categories":2026},[47],{"categories":2028},[47],{"categories":2030},[105],{"categories":2032},[87],{"categories":2034},[],{"categories"
:2036},[],{"categories":2038},[87],{"categories":2040},[47],{"categories":2042},[400],{"categories":2044},[],{"categories":2046},[47],{"categories":2048},[87],{"categories":2050},[],{"categories":2052},[47],{"categories":2054},[143],{"categories":2056},[129],{"categories":2058},[87],{"categories":2060},[47],{"categories":2062},[400],{"categories":2064},[],{"categories":2066},[47],{"categories":2068},[143],{"categories":2070},[126],{"categories":2072},[47],{"categories":2074},[],{"categories":2076},[143],{"categories":2078},[105],{"categories":2080},[47],{"categories":2082},[47],{"categories":2084},[79],{"categories":2086},[],{"categories":2088},[],{"categories":2090},[126],{"categories":2092},[47],{"categories":2094},[129],{"categories":2096},[143],{"categories":2098},[143],{"categories":2100},[105],{"categories":2102},[],{"categories":2104},[],{"categories":2106},[47],{"categories":2108},[],{"categories":2110},[47,136],{"categories":2112},[105],{"categories":2114},[87],{"categories":2116},[136],{"categories":2118},[47],{"categories":2120},[79],{"categories":2122},[],{"categories":2124},[],{"categories":2126},[79],{"categories":2128},[143],{"categories":2130},[47],{"categories":2132},[],{"categories":2134},[126,47],{"categories":2136},[400],{"categories":2138},[79],{"categories":2140},[],{"categories":2142},[82],{"categories":2144},[82],{"categories":2146},[47],{"categories":2148},[136],{"categories":2150},[87],{"categories":2152},[105],{"categories":2154},[143],{"categories":2156},[126],{"categories":2158},[47],{"categories":2160},[47],{"categories":2162},[47],{"categories":2164},[79],{"categories":2166},[47],{"categories":2168},[87],{"categories":2170},[105],{"categories":2172},[],{"categories":2174},[],{"categories":2176},[129],{"categories":2178},[136],{"categories":2180},[47],{"categories":2182},[126],{"categories":2184},[129],{"categories":2186},[47],{"categories":2188},[47],{"categories":2190},[87],{"categories":2192},[87],{"categories":2194},[47,82],{"catego
ries":2196},[],{"categories":2198},[126],{"categories":2200},[],{"categories":2202},[47],{"categories":2204},[105],{"categories":2206},[79],{"categories":2208},[79],{"categories":2210},[87],{"categories":2212},[47],{"categories":2214},[82],{"categories":2216},[136],{"categories":2218},[143],{"categories":2220},[],{"categories":2222},[105],{"categories":2224},[47],{"categories":2226},[47],{"categories":2228},[105],{"categories":2230},[136],{"categories":2232},[47],{"categories":2234},[87],{"categories":2236},[105],{"categories":2238},[47],{"categories":2240},[126],{"categories":2242},[47],{"categories":2244},[47],{"categories":2246},[400],{"categories":2248},[90],{"categories":2250},[87],{"categories":2252},[47],{"categories":2254},[105],{"categories":2256},[87],{"categories":2258},[143],{"categories":2260},[47],{"categories":2262},[],{"categories":2264},[47],{"categories":2266},[],{"categories":2268},[],{"categories":2270},[],{"categories":2272},[82],{"categories":2274},[47],{"categories":2276},[87],{"categories":2278},[105],{"categories":2280},[105],{"categories":2282},[105],{"categories":2284},[105],{"categories":2286},[],{"categories":2288},[79],{"categories":2290},[87],{"categories":2292},[105],{"categories":2294},[79],{"categories":2296},[87],{"categories":2298},[47],{"categories":2300},[47,87],{"categories":2302},[87],{"categories":2304},[400],{"categories":2306},[105],{"categories":2308},[105],{"categories":2310},[87],{"categories":2312},[47],{"categories":2314},[],{"categories":2316},[105],{"categories":2318},[143],{"categories":2320},[79],{"categories":2322},[47],{"categories":2324},[47],{"categories":2326},[],{"categories":2328},[136],{"categories":2330},[],{"categories":2332},[79],{"categories":2334},[87],{"categories":2336},[105],{"categories":2338},[47],{"categories":2340},[105],{"categories":2342},[79],{"categories":2344},[105],{"categories":2346},[105],{"categories":2348},[],{"categories":2350},[82],{"categories":2352},[87],{"categories":2354},[105],{
"categories":2356},[105],{"categories":2358},[105],{"categories":2360},[105],{"categories":2362},[105],{"categories":2364},[105],{"categories":2366},[105],{"categories":2368},[105],{"categories":2370},[105],{"categories":2372},[105],{"categories":2374},[129],{"categories":2376},[79],{"categories":2378},[47],{"categories":2380},[47],{"categories":2382},[],{"categories":2384},[47,79],{"categories":2386},[],{"categories":2388},[87],{"categories":2390},[105],{"categories":2392},[87],{"categories":2394},[47],{"categories":2396},[47],{"categories":2398},[47],{"categories":2400},[47],{"categories":2402},[47],{"categories":2404},[87],{"categories":2406},[82],{"categories":2408},[126],{"categories":2410},[105],{"categories":2412},[47],{"categories":2414},[],{"categories":2416},[],{"categories":2418},[87],{"categories":2420},[126],{"categories":2422},[47],{"categories":2424},[],{"categories":2426},[],{"categories":2428},[143],{"categories":2430},[47],{"categories":2432},[],{"categories":2434},[],{"categories":2436},[79],{"categories":2438},[82],{"categories":2440},[47],{"categories":2442},[82],{"categories":2444},[126],{"categories":2446},[],{"categories":2448},[105],{"categories":2450},[],{"categories":2452},[126],{"categories":2454},[47],{"categories":2456},[143],{"categories":2458},[],{"categories":2460},[143],{"categories":2462},[],{"categories":2464},[],{"categories":2466},[87],{"categories":2468},[],{"categories":2470},[82],{"categories":2472},[79],{"categories":2474},[126],{"categories":2476},[136],{"categories":2478},[],{"categories":2480},[],{"categories":2482},[47],{"categories":2484},[79],{"categories":2486},[143],{"categories":2488},[],{"categories":2490},[87],{"categories":2492},[87],{"categories":2494},[105],{"categories":2496},[47],{"categories":2498},[87],{"categories":2500},[47],{"categories":2502},[87],{"categories":2504},[47],{"categories":2506},[90],{"categories":2508},[105],{"categories":2510},[],{"categories":2512},[143],{"categories":2514},[136],{"categ
ories":2516},[87],{"categories":2518},[],{"categories":2520},[47],{"categories":2522},[87],{"categories":2524},[82],{"categories":2526},[79],{"categories":2528},[47],{"categories":2530},[126],{"categories":2532},[136],{"categories":2534},[136],{"categories":2536},[47],{"categories":2538},[129],{"categories":2540},[47],{"categories":2542},[87],{"categories":2544},[82],{"categories":2546},[87],{"categories":2548},[47],{"categories":2550},[47],{"categories":2552},[87],{"categories":2554},[105],{"categories":2556},[],{"categories":2558},[79],{"categories":2560},[47],{"categories":2562},[87],{"categories":2564},[47],{"categories":2566},[47],{"categories":2568},[],{"categories":2570},[126],{"categories":2572},[82],{"categories":2574},[105],{"categories":2576},[47],{"categories":2578},[47],{"categories":2580},[126],{"categories":2582},[143],{"categories":2584},[129],{"categories":2586},[47],{"categories":2588},[105],{"categories":2590},[47],{"categories":2592},[87],{"categories":2594},[400],{"categories":2596},[47],{"categories":2598},[87],{"categories":2600},[129],{"categories":2602},[],{"categories":2604},[87],{"categories":2606},[136],{"categories":2608},[126],{"categories":2610},[47],{"categories":2612},[79],{"categories":2614},[82],{"categories":2616},[136],{"categories":2618},[],{"categories":2620},[87],{"categories":2622},[47],{"categories":2624},[],{"categories":2626},[105],{"categories":2628},[],{"categories":2630},[105],{"categories":2632},[47],{"categories":2634},[87],{"categories":2636},[87],{"categories":2638},[87],{"categories":2640},[],{"categories":2642},[],{"categories":2644},[47],{"categories":2646},[47],{"categories":2648},[],{"categories":2650},[126],{"categories":2652},[87],{"categories":2654},[143],{"categories":2656},[79],{"categories":2658},[],{"categories":2660},[],{"categories":2662},[105],{"categories":2664},[136],{"categories":2666},[47],{"categories":2668},[47],{"categories":2670},[47],{"categories":2672},[136],{"categories":2674},[105],{"categ
ories":2676},[126],{"categories":2678},[47],{"categories":2680},[47],{"categories":2682},[47],{"categories":2684},[105],{"categories":2686},[47],{"categories":2688},[105],{"categories":2690},[87],{"categories":2692},[87],{"categories":2694},[136],{"categories":2696},[87],{"categories":2698},[47],{"categories":2700},[136],{"categories":2702},[126],{"categories":2704},[],{"categories":2706},[87],{"categories":2708},[],{"categories":2710},[],{"categories":2712},[],{"categories":2714},[82],{"categories":2716},[47],{"categories":2718},[87],{"categories":2720},[79],{"categories":2722},[87],{"categories":2724},[143],{"categories":2726},[],{"categories":2728},[87],{"categories":2730},[],{"categories":2732},[79],{"categories":2734},[87],{"categories":2736},[],{"categories":2738},[87],{"categories":2740},[47],{"categories":2742},[105],{"categories":2744},[47],{"categories":2746},[87],{"categories":2748},[105],{"categories":2750},[87],{"categories":2752},[136],{"categories":2754},[126],{"categories":2756},[79],{"categories":2758},[],{"categories":2760},[87],{"categories":2762},[126],{"categories":2764},[400],{"categories":2766},[105],{"categories":2768},[47],{"categories":2770},[126],{"categories":2772},[79],{"categories":2774},[],{"categories":2776},[87],{"categories":2778},[87],{"categories":2780},[47],{"categories":2782},[],{"categories":2784},[87],{"categories":2786},[90],{"categories":2788},[105],{"categories":2790},[87],{"categories":2792},[82],{"categories":2794},[],{"categories":2796},[47],{"categories":2798},[90],{"categories":2800},[47],{"categories":2802},[87],{"categories":2804},[105],{"categories":2806},[79],{"categories":2808},[400],{"categories":2810},[47],{"categories":2812},[47],{"categories":2814},[47],{"categories":2816},[105],{"categories":2818},[82],{"categories":2820},[47],{"categories":2822},[126],{"categories":2824},[105],{"categories":2826},[400],{"categories":2828},[47],{"categories":2830},[],{"categories":2832},[],{"categories":2834},[400],{"categori
es":2836},[129],{"categories":2838},[87],{"categories":2840},[87],{"categories":2842},[105],{"categories":2844},[47],{"categories":2846},[79],{"categories":2848},[126],{"categories":2850},[87],{"categories":2852},[47],{"categories":2854},[143],{"categories":2856},[47],{"categories":2858},[87],{"categories":2860},[],{"categories":2862},[47],{"categories":2864},[47],{"categories":2866},[105],{"categories":2868},[79],{"categories":2870},[],{"categories":2872},[47],{"categories":2874},[47],{"categories":2876},[136],{"categories":2878},[126],{"categories":2880},[47,87],{"categories":2882},[143,82],{"categories":2884},[47],{"categories":2886},[],{"categories":2888},[87],{"categories":2890},[],{"categories":2892},[136],{"categories":2894},[47],{"categories":2896},[105],{"categories":2898},[],{"categories":2900},[87],{"categories":2902},[],{"categories":2904},[126],{"categories":2906},[87],{"categories":2908},[79],{"categories":2910},[87],{"categories":2912},[47],{"categories":2914},[400],{"categories":2916},[143],{"categories":2918},[82],{"categories":2920},[82],{"categories":2922},[79],{"categories":2924},[79],{"categories":2926},[47],{"categories":2928},[87],{"categories":2930},[47],{"categories":2932},[47],{"categories":2934},[79],{"categories":2936},[47],{"categories":2938},[143],{"categories":2940},[105],{"categories":2942},[47],{"categories":2944},[87],{"categories":2946},[47],{"categories":2948},[],{"categories":2950},[136],{"categories":2952},[],{"categories":2954},[87],{"categories":2956},[79],{"categories":2958},[],{"categories":2960},[400],{"categories":2962},[47],{"categories":2964},[],{"categories":2966},[105],{"categories":2968},[87],{"categories":2970},[136],{"categories":2972},[47],{"categories":2974},[87],{"categories":2976},[136],{"categories":2978},[87],{"categories":2980},[105],{"categories":2982},[79],{"categories":2984},[105],{"categories":2986},[136],{"categories":2988},[47],{"categories":2990},[126],{"categories":2992},[47],{"categories":2994},[47],
{"categories":2996},[47],{"categories":2998},[47],{"categories":3000},[87],{"categories":3002},[47],{"categories":3004},[87],{"categories":3006},[47],{"categories":3008},[79],{"categories":3010},[47],{"categories":3012},[87],{"categories":3014},[126],{"categories":3016},[79],{"categories":3018},[87],{"categories":3020},[126],{"categories":3022},[],{"categories":3024},[47],{"categories":3026},[47],{"categories":3028},[136],{"categories":3030},[],{"categories":3032},[87],{"categories":3034},[143],{"categories":3036},[47],{"categories":3038},[105],{"categories":3040},[143],{"categories":3042},[87],{"categories":3044},[82],{"categories":3046},[82],{"categories":3048},[47],{"categories":3050},[79],{"categories":3052},[],{"categories":3054},[47],{"categories":3056},[],{"categories":3058},[79],{"categories":3060},[47],{"categories":3062},[87],{"categories":3064},[87],{"categories":3066},[],{"categories":3068},[136],{"categories":3070},[136],{"categories":3072},[143],{"categories":3074},[126],{"categories":3076},[],{"categories":3078},[47],{"categories":3080},[79],{"categories":3082},[47],{"categories":3084},[136],{"categories":3086},[79],{"categories":3088},[105],{"categories":3090},[105],{"categories":3092},[],{"categories":3094},[105],{"categories":3096},[87],{"categories":3098},[126],{"categories":3100},[129],{"categories":3102},[47],{"categories":3104},[],{"categories":3106},[105],{"categories":3108},[136],{"categories":3110},[82],{"categories":3112},[47],{"categories":3114},[79],{"categories":3116},[400],{"categories":3118},[79],{"categories":3120},[],{"categories":3122},[],{"categories":3124},[105],{"categories":3126},[],{"categories":3128},[87],{"categories":3130},[87],{"categories":3132},[87],{"categories":3134},[],{"categories":3136},[47],{"categories":3138},[],{"categories":3140},[105],{"categories":3142},[79],{"categories":3144},[126],{"categories":3146},[47],{"categories":3148},[105],{"categories":3150},[105],{"categories":3152},[],{"categories":3154},[105],{"c
ategories":3156},[79],{"categories":3158},[47],{"categories":3160},[],{"categories":3162},[87],{"categories":3164},[87],{"categories":3166},[79],{"categories":3168},[],{"categories":3170},[],{"categories":3172},[],{"categories":3174},[126],{"categories":3176},[87],{"categories":3178},[47],{"categories":3180},[],{"categories":3182},[],{"categories":3184},[],{"categories":3186},[126],{"categories":3188},[],{"categories":3190},[79],{"categories":3192},[],{"categories":3194},[],{"categories":3196},[126],{"categories":3198},[47],{"categories":3200},[105],{"categories":3202},[],{"categories":3204},[143],{"categories":3206},[105],{"categories":3208},[143],{"categories":3210},[47],{"categories":3212},[],{"categories":3214},[],{"categories":3216},[87],{"categories":3218},[],{"categories":3220},[],{"categories":3222},[87],{"categories":3224},[47],{"categories":3226},[],{"categories":3228},[87],{"categories":3230},[105],{"categories":3232},[143],{"categories":3234},[129],{"categories":3236},[87],{"categories":3238},[87],{"categories":3240},[],{"categories":3242},[],{"categories":3244},[],{"categories":3246},[105],{"categories":3248},[],{"categories":3250},[],{"categories":3252},[126],{"categories":3254},[79],{"categories":3256},[],{"categories":3258},[82],{"categories":3260},[143],{"categories":3262},[47],{"categories":3264},[136],{"categories":3266},[79],{"categories":3268},[129],{"categories":3270},[82],{"categories":3272},[136],{"categories":3274},[],{"categories":3276},[],{"categories":3278},[87],{"categories":3280},[79],{"categories":3282},[126],{"categories":3284},[79],{"categories":3286},[87],{"categories":3288},[400],{"categories":3290},[87],{"categories":3292},[],{"categories":3294},[47],{"categories":3296},[105],{"categories":3298},[136],{"categories":3300},[],{"categories":3302},[126],{"categories":3304},[105],{"categories":3306},[79],{"categories":3308},[87],{"categories":3310},[47],{"categories":3312},[82],{"categories":3314},[87,400],{"categories":3316},[87],{"ca
tegories":3318},[136],{"categories":3320},[47],{"categories":3322},[129],{"categories":3324},[143],{"categories":3326},[87],{"categories":3328},[],{"categories":3330},[87],{"categories":3332},[47],{"categories":3334},[82],{"categories":3336},[],{"categories":3338},[],{"categories":3340},[47],{"categories":3342},[129],{"categories":3344},[47],{"categories":3346},[],{"categories":3348},[105],{"categories":3350},[],{"categories":3352},[105],{"categories":3354},[136],{"categories":3356},[87],{"categories":3358},[47],{"categories":3360},[143],{"categories":3362},[136],{"categories":3364},[],{"categories":3366},[105],{"categories":3368},[47],{"categories":3370},[],{"categories":3372},[47],{"categories":3374},[87],{"categories":3376},[47],{"categories":3378},[87],{"categories":3380},[47],{"categories":3382},[47],{"categories":3384},[47],{"categories":3386},[47],{"categories":3388},[82],{"categories":3390},[],{"categories":3392},[90],{"categories":3394},[105],{"categories":3396},[47],{"categories":3398},[],{"categories":3400},[136],{"categories":3402},[47],{"categories":3404},[47],{"categories":3406},[87],{"categories":3408},[105],{"categories":3410},[47],{"categories":3412},[47],{"categories":3414},[82],{"categories":3416},[87],{"categories":3418},[126],{"categories":3420},[],{"categories":3422},[129],{"categories":3424},[47],{"categories":3426},[],{"categories":3428},[105],{"categories":3430},[143],{"categories":3432},[],{"categories":3434},[],{"categories":3436},[105],{"categories":3438},[105],{"categories":3440},[143],{"categories":3442},[79],{"categories":3444},[87],{"categories":3446},[87],{"categories":3448},[47],{"categories":3450},[82],{"categories":3452},[],{"categories":3454},[],{"categories":3456},[105],{"categories":3458},[129],{"categories":3460},[136],{"categories":3462},[87],{"categories":3464},[126],{"categories":3466},[129],{"categories":3468},[129],{"categories":3470},[],{"categories":3472},[105],{"categories":3474},[47],{"categories":3476},[47],{"categor
ies":3478},[136],{"categories":3480},[],{"categories":3482},[105],{"categories":3484},[105],{"categories":3486},[105],{"categories":3488},[],{"categories":3490},[87],{"categories":3492},[47],{"categories":3494},[],{"categories":3496},[79],{"categories":3498},[82],{"categories":3500},[],{"categories":3502},[47],{"categories":3504},[47],{"categories":3506},[],{"categories":3508},[136],{"categories":3510},[],{"categories":3512},[],{"categories":3514},[],{"categories":3516},[],{"categories":3518},[47],{"categories":3520},[105],{"categories":3522},[],{"categories":3524},[],{"categories":3526},[47],{"categories":3528},[47],{"categories":3530},[47],{"categories":3532},[129],{"categories":3534},[47],{"categories":3536},[129],{"categories":3538},[],{"categories":3540},[129],{"categories":3542},[129],{"categories":3544},[400],{"categories":3546},[87],{"categories":3548},[136],{"categories":3550},[],{"categories":3552},[],{"categories":3554},[129],{"categories":3556},[136],{"categories":3558},[136],{"categories":3560},[136],{"categories":3562},[],{"categories":3564},[79],{"categories":3566},[136],{"categories":3568},[136],{"categories":3570},[79],{"categories":3572},[136],{"categories":3574},[82],{"categories":3576},[136],{"categories":3578},[136],{"categories":3580},[136],{"categories":3582},[129],{"categories":3584},[105],{"categories":3586},[105],{"categories":3588},[47],{"categories":3590},[136],{"categories":3592},[129],{"categories":3594},[400],{"categories":3596},[129],{"categories":3598},[129],{"categories":3600},[129],{"categories":3602},[],{"categories":3604},[82],{"categories":3606},[],{"categories":3608},[400],{"categories":3610},[136],{"categories":3612},[136],{"categories":3614},[136],{"categories":3616},[87],{"categories":3618},[105,82],{"categories":3620},[129],{"categories":3622},[],{"categories":3624},[],{"categories":3626},[129],{"categories":3628},[],{"categories":3630},[129],{"categories":3632},[105],{"categories":3634},[87],{"categories":3636},[],{"catego
ries":3638},[136],{"categories":3640},[47],{"categories":3642},[126],{"categories":3644},[],{"categories":3646},[47],{"categories":3648},[],{"categories":3650},[105],{"categories":3652},[79],{"categories":3654},[129],{"categories":3656},[],{"categories":3658},[136],{"categories":3660},[105],[3662,3747,3817,3897],{"id":3663,"title":3664,"ai":3665,"body":3670,"categories":3720,"created_at":48,"date_modified":48,"description":40,"extension":49,"faq":48,"featured":50,"kicker_label":48,"meta":3721,"navigation":58,"path":3734,"published_at":3735,"question":48,"scraped_at":3736,"seo":3737,"sitemap":3738,"source_id":3739,"source_name":3740,"source_type":65,"source_url":3741,"stem":3742,"tags":3743,"thumbnail_url":48,"tldr":3744,"tweet":48,"unknown_tags":3745,"__hash__":3746},"summaries\u002Fsummaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary.md","BLT Cuts Inference Bandwidth 50-92% via Diffusion & Speculation",{"provider":7,"model":8,"input_tokens":3666,"output_tokens":3667,"processing_time_ms":3668,"cost_usd":3669},8589,2722,30748,0.00305615,{"type":14,"value":3671,"toc":3714},[3672,3676,3679,3683,3691,3694,3698,3701,3704,3707,3711],[17,3673,3675],{"id":3674},"blts-memory-bandwidth-bottleneck-in-byte-level-generation","BLT's Memory Bandwidth Bottleneck in Byte-Level Generation",[22,3677,3678],{},"Byte-level models like BLT avoid tokenization pitfalls—noise sensitivity, poor multilingual support, weak character\u002Fcode handling—by processing raw bytes via entropy-based patches (avg 4 bytes, max 8). Computation uses local encoder, global Transformer, local decoder on latent tokens. Inference slows because autoregressive decoder generates one byte\u002Fstep, vs. tokens covering multiple bytes. This multiplies memory loads for weights\u002FKV caches, the key serving bottleneck. 
BLT needs 4x more decoder passes than token models for equivalent text, hiking bandwidth costs.",[17,3680,3682],{"id":3681},"block-diffusion-enables-multi-byte-decoding-per-pass-blt-d","Block Diffusion Enables Multi-Byte Decoding per Pass (BLT-D)",[22,3684,3685,3686,3690],{},"BLT-D replaces byte-by-byte autoregression with discrete diffusion in fixed blocks (B=4\u002F8\u002F16 bytes). Training: corrupt blocks by masking bytes independently with prob t~U(0,1); loss combines next-byte prediction on clean seq + masked prediction on corrupted. Inference: start with ",[3687,3688,3689],"span",{},"MASK"," block, iteratively unmask multiple bytes\u002Fpass via confidence (prob>α) or entropy-bounded (cumulative entropy\u003Cγ) sampling. Encoder\u002Fglobal called once\u002Fblock, not per-patch; supports KV caching.",[22,3692,3693],{},"At 3B params on BLT-1T (1T tokens), BLT-D-4 matches BLT scores on FLORES-101 translation (French\u002FEnglish, German\u002FEnglish; 4-shot BLEU), nears on HumanEval\u002FMBPP coding (0\u002F3-shot pass@1). BLT-D-16 cuts bandwidth 87-92% but drops coding pass@1. Likelihoods (ARC-Easy\u002FChallenge, PIQA, HellaSwag, MMLU) near baseline via causal-masked decoder. Translation gains most; coding sensitive to block size. Entropy-bounded + top-p boosts diversity (higher type-token ratio) as NFEs rise.",[17,3695,3697],{"id":3696},"no-training-speculation-recycles-existing-decoder-blt-s-blt-dv","No-Training Speculation Recycles Existing Decoder (BLT-S, BLT-DV)",[22,3699,3700],{},"BLT-S uses lightweight decoder as self-drafter: generate k=8\u002F16 bytes ignoring patch boundaries, conditioning on last latent; verify via full encode\u002Fglobal\u002Fdecode, accept to first mismatch. Greedy decoding guarantees identical output to BLT (no quality loss); reduces encoder\u002Fglobal calls despite more decoder passes. 
At 3B\u002Fk=16, 77% bandwidth cut.",[22,3702,3703],{},"BLT-DV (on BLT-D weights): one-step diffusion drafts block, autoregressive verify accepts to mismatch. Single-step diffusion degrades alone but verification fixes it. At 3B, up to 81% bandwidth reduction.",[22,3705,3706],{},"All trained 1B:240k steps, 3B:480k on BLT-1T (public + Datacomp-LM subset). Efficiency proxies: decoder\u002Fencoder NFEs, GB bandwidth (16-bit, param\u002Fforward counts). Wall-clock needs optimized serving.",[17,3708,3710],{"id":3709},"practical-tradeoffs-for-production-deployment","Practical Tradeoffs for Production Deployment",[22,3712,3713],{},"BLT-D fastest (esp B=16) but coding tradeoffs; BLT-S zero-loss safest. All preserve autoregressive likelihoods\u002Freasoning. Bandwidth proxies predict real gains in memory-bound serving. Future: optimized inference impl. Byte-level now viable for production-scale speed without tokenizer fragility.",{"title":40,"searchDepth":41,"depth":41,"links":3715},[3716,3717,3718,3719],{"id":3674,"depth":41,"text":3675},{"id":3681,"depth":41,"text":3682},{"id":3696,"depth":41,"text":3697},{"id":3709,"depth":41,"text":3710},[47],{"content_references":3722,"triage":3732},[3723,3728],{"type":3724,"title":3725,"url":3726,"context":3727},"paper","Fast Byte Latent Transformer That Reduces Inference Memory Bandwidth by Over 50% Without Tokenization","https:\u002F\u002Farxiv.org\u002Fpdf\u002F2605.08044","recommended",{"type":3724,"title":3729,"url":3730,"context":3731},"Byte Latent Transformer (BLT): A Tokenizer-Free Model That Scales Efficiently","https:\u002F\u002Fwww.marktechpost.com\u002F2024\u002F12\u002F13\u002Fmeta-ai-introduces-byte-latent-transformer-blt-a-tokenizer-free-model-that-scales-efficiently\u002F","cited",{"relevance":54,"novelty":55,"quality":55,"actionability":41,"composite":56,"reasoning":3733},"Category: AI & LLMs. The article discusses a new approach to improving inference bandwidth in AI models, which is relevant to AI engineering. 
However, it lacks practical applications or frameworks that the audience can directly implement, focusing instead on theoretical advancements.","\u002Fsummaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary","2026-05-11 17:52:15","2026-05-12 15:01:28",{"title":3664,"description":40},{"loc":3734},"1dcaa9cf36eee656","MarkTechPost","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F11\u002Fmeta-and-stanford-researchers-propose-fast-byte-latent-transformer-that-reduces-inference-memory-bandwidth-by-over-50-without-tokenization\u002F","summaries\u002F1dcaa9cf36eee656-blt-cuts-inference-bandwidth-50-92-via-diffusion-s-summary",[69,71,72],"Meta\u002FStanford researchers accelerate Byte Latent Transformer (BLT) inference with BLT-D (diffusion decoding), BLT-S (self-speculation), and BLT-DV (diffusion+verification), reducing memory bandwidth 50-92% at 3B params while nearing baseline performance on translation\u002Fcoding tasks.",[],"xMZyx1diuvh2XXZUy_NPhOgWy_XqDJeXjel738dmvjs",{"id":3748,"title":3749,"ai":3750,"body":3755,"categories":3786,"created_at":48,"date_modified":48,"description":40,"extension":49,"faq":48,"featured":50,"kicker_label":48,"meta":3787,"navigation":58,"path":3804,"published_at":3805,"question":48,"scraped_at":3806,"seo":3807,"sitemap":3808,"source_id":3809,"source_name":3810,"source_type":65,"source_url":3811,"stem":3812,"tags":3813,"thumbnail_url":48,"tldr":3814,"tweet":48,"unknown_tags":3815,"__hash__":3816},"summaries\u002Fsummaries\u002F5c8a61f1aa3cea08-llm-scaling-works-via-strong-superposition-summary.md","LLM Scaling Works via Strong Superposition",{"provider":7,"model":8,"input_tokens":3751,"output_tokens":3752,"processing_time_ms":3753,"cost_usd":3754},4549,1921,23559,0.00136345,{"type":14,"value":3756,"toc":3781},[3757,3761,3764,3767,3771,3774,3778],[17,3758,3760],{"id":3759},"superposition-drives-predictable-error-reduction","Superposition Drives Predictable Error 
Reduction",[22,3762,3763],{},"Language models represent tens of thousands of tokens in spaces with only thousands of dimensions by using superposition: squeezing multiple concepts into the same dimensions with slight overlaps. In the dominant 'strong superposition' regime, every token gets represented, and error stems from overlap noise, not dropped rare tokens. Doubling model width (m) halves error via the geometric 1\u002Fm relationship, yielding power-law scaling (exponent ~1) regardless of data distribution. Weak superposition, where only common tokens are stored cleanly, requires power-law token frequencies for scaling—less reliable for natural language's flatter distributions.",[22,3765,3766],{},"This mechanistic view outperforms prior assumptions: real LLMs don't discard rare tokens but overlap everything, matching theory with measured overlap strength shrinking at 1\u002Fm.",[17,3768,3770],{"id":3769},"validation-across-real-models-matches-theory","Validation Across Real Models Matches Theory",[22,3772,3773],{},"Analysis of output layers in OPT, GPT-2, Qwen2.5, and Pythia (100M to 70B parameters) confirms strong superposition: all tokens represented with overlaps scaling at 1\u002Fm. Observed exponent of 0.91 aligns with theory's 1; DeepMind's Chinchilla data hits 0.88. Simplified models toggling overlap regimes prove scaling emerges directly from geometry, not just data power laws ('power law in, power law out').",[17,3775,3777],{"id":3776},"limits-and-optimization-opportunities","Limits and Optimization Opportunities",[22,3779,3780],{},"Scaling halts when width equals vocabulary size—no more overlaps needed, error from superposition vanishes, breaking power laws. Natural language's even frequencies limit speedup, but uneven domains (e.g., specialized vocab) enable steeper curves. Architectures promoting denser packing, like Nvidia's nGPT (vectors on unit sphere), boost performance at fixed size. 
Trade-off: denser overlaps hinder mechanistic interpretability, complicating AI safety.",{"title":40,"searchDepth":41,"depth":41,"links":3782},[3783,3784,3785],{"id":3759,"depth":41,"text":3760},{"id":3769,"depth":41,"text":3770},{"id":3776,"depth":41,"text":3777},[],{"content_references":3788,"triage":3802},[3789,3793,3797],{"type":3724,"title":3790,"author":3791,"url":3792,"context":3731},"Toy Model of Superposition","Anthropic","https:\u002F\u002Ftransformer-circuits.pub\u002F2022\u002Ftoy_model\u002Findex.html",{"type":3724,"title":3794,"author":3795,"url":3796,"context":3731},"Chinchilla","DeepMind","https:\u002F\u002Fthe-decoder.com\u002Fdeepmind-artificial-intelligence-is-far-from-being-fed-up\u002F",{"type":3724,"title":3798,"author":3799,"url":3800,"context":3801},"nGPT","Nvidia","https:\u002F\u002Farxiv.org\u002Fabs\u002F2410.01131","mentioned",{"relevance":54,"novelty":55,"quality":55,"actionability":41,"composite":56,"reasoning":3803},"Category: AI & LLMs. The article discusses the mechanics of LLM scaling through strong superposition, which is relevant to AI engineering. 
It presents new insights into how model width affects prediction error, but lacks practical applications or frameworks that the audience can directly implement.","\u002Fsummaries\u002F5c8a61f1aa3cea08-llm-scaling-works-via-strong-superposition-summary","2026-05-03 08:42:45","2026-05-03 17:01:29",{"title":3749,"description":40},{"loc":3804},"5c8a61f1aa3cea08","The Decoder","https:\u002F\u002Fthe-decoder.com\u002Fmit-study-explains-why-scaling-language-models-works-so-reliably\u002F","summaries\u002F5c8a61f1aa3cea08-llm-scaling-works-via-strong-superposition-summary",[69,71,72],"LLMs pack all tokens into limited dimensions via overlapping vectors (strong superposition), causing prediction error to halve when model width doubles—explaining reliable power-law scaling.",[],"TxCrmsO7g860jqMKD8Z7LhJqkiaTNkcDx-Z3AQT2GA0",{"id":3818,"title":3819,"ai":3820,"body":3825,"categories":3867,"created_at":48,"date_modified":48,"description":40,"extension":49,"faq":48,"featured":50,"kicker_label":48,"meta":3868,"navigation":58,"path":3885,"published_at":48,"question":48,"scraped_at":3886,"seo":3887,"sitemap":3888,"source_id":3889,"source_name":3890,"source_type":65,"source_url":3891,"stem":3892,"tags":3893,"thumbnail_url":48,"tldr":3894,"tweet":48,"unknown_tags":3895,"__hash__":3896},"summaries\u002Fsummaries\u002Fd445780e74d7b6ed-llm-pretraining-scaling-fsdp-wins-until-comms-crat-summary.md","LLM Pretraining Scaling: FSDP Wins Until Comms Crater",{"provider":7,"model":8,"input_tokens":3821,"output_tokens":3822,"processing_time_ms":3823,"cost_usd":3824},8296,2378,19998,0.00282555,{"type":14,"value":3826,"toc":3861},[3827,3831,3834,3837,3841,3844,3848,3851,3855,3858],[17,3828,3830],{"id":3829},"fsdp-dominates-parallelism-until-scale-forces-pipeline-trade-offs","FSDP Dominates Parallelism Until Scale Forces Pipeline Trade-offs",[22,3832,3833],{},"Pretraining FLOPs = 6ND (2 forward + 4 backward per param-token). 
Data parallel (DP) copies weights across GPUs but hits HBM limits (B300: 288GB). Fully Sharded Data Parallel (FSDP) shards params per layer across GPUs, all-gathering full weights per layer (forward\u002Fbackward) while overlapping comms with compute since weights are layer-independent. FSDP comms: params×3 (all-gather forward\u002Fback + reduce-scatter backward), 50% more than DP's params×2 all-reduce—achievable because all-gather is half an all-reduce. Use hierarchical collectives across NVLink domains: reduce-scatter intra-domain, all-reduce shards inter-domain, all-gather intra-domain to saturate IB bandwidth.
Products atop APIs distill better: reward 'gold diffs' (final user-accepted code) over rejected intermediates from 10+ turn sessions.",[17,3845,3847],{"id":3846},"agentic-ai-shifts-cybersecurity-toward-defense","Agentic AI Shifts Cybersecurity Toward Defense",[22,3849,3850],{},"Mythos chains 5+ vulns into exploits (vs. prior single-vuln finds), but software is securer now despite human probing—sudden AI intelligence influx likely strengthens defense via industry patching (e.g., Glasswing reveals zero-days). AI excels at vuln finding over patching (XKCD: fixes break edge cases\u002Ffeatures). Solutions: LLM-port C to Rust; formal verification (e.g., seL4 proofs); patching mirrors LLM bug-finding in others' repos. Hoarding Mythos risky—build\u002Frelease classifiers rejecting cyberattack intents (Anthropic plans for 4.7). Evade classifiers by subproblems (harmless vulns). Patching own code routine for coding LLMs.",[17,3852,3854],{"id":3853},"pipeline-rl-fixes-stragglers-causalitybias-dooms-runs","Pipeline RL Fixes Stragglers; Causality\u002FBias Dooms Runs",[22,3856,3857],{},"RL responses grow in mean\u002Fvariance length, straggling GPU utilization. Pipeline RL does 'in-flight weight updates': swap generating model mid-trajectory post-training step, ensuring recent-model rollouts without full offline RL off-policyness.",[22,3859,3860],{},"Pretraining fails via causality breaks (MoE expert-choice routes token n+k affecting n; token-dropping ignores early for later matches—rumored Llama 4\u002FGemini 2 flops) or bias (FP16 collectives round large sums wrong, e.g., post-1024 granularity skips +1; GPT-4 initial bug). Bias compounds > variance. New scale unveils bespoke issues (numerics, kernels)—not 5 fixable failure modes. RL inference needs training-engine fidelity (numerical drift biases); enforce disciplined compute multipliers to avoid bug stacks. 
Kernel optimization AGI-hard (Nvidia took ages for Blackwell).",{"title":40,"searchDepth":41,"depth":41,"links":3862},[3863,3864,3865,3866],{"id":3829,"depth":41,"text":3830},{"id":3839,"depth":41,"text":3840},{"id":3846,"depth":41,"text":3847},{"id":3853,"depth":41,"text":3854},[],{"content_references":3869,"triage":3882},[3870,3874,3877],{"type":3871,"title":3872,"url":3873,"context":3801},"podcast","Conversation with Michael Nielsen","https:\u002F\u002Fwww.dwarkesh.com\u002Fp\u002Fmichael-nielsen",{"type":3724,"title":3875,"url":3876,"context":3731},"Pipeline RL","https:\u002F\u002Farxiv.org\u002Fpdf\u002F2509.19128",{"type":3878,"title":3879,"author":3880,"url":3881,"context":3801},"other","Pretraining parallelisms lecture","Horace He","https:\u002F\u002Fhorace.io\u002F",{"relevance":55,"novelty":54,"quality":55,"actionability":41,"composite":3883,"reasoning":3884},3.4,"Category: AI & LLMs. The article discusses the practical application of Fully Sharded Data Parallel (FSDP) for scaling pretraining in LLMs, which addresses a specific pain point for AI developers regarding efficient model training. 
However, while it provides technical insights, it lacks concrete actionable steps that the audience could directly implement.","\u002Fsummaries\u002Fd445780e74d7b6ed-llm-pretraining-scaling-fsdp-wins-until-comms-crat-summary","2026-04-19 01:22:25",{"title":3819,"description":40},{"loc":3885},"d445780e74d7b6ed","Dwarkesh Patel","https:\u002F\u002Fwww.dwarkesh.com\u002Fp\u002Fwhat-i-learned-april-15","summaries\u002Fd445780e74d7b6ed-llm-pretraining-scaling-fsdp-wins-until-comms-crat-summary",[69,71,72],"Use FSDP as default for scaling pretraining (params×3 comms overhead) until GPU count hits comms crossover; distillation costs $25M\u002FT from frontier models, unstoppable via tool use; training fails from causality breaks and FP16 bias.",[],"UCftWL3lVDs_ij_juNq8mtYfE_yqIH5SLhHL1KTHG3s",{"id":3898,"title":3899,"ai":3900,"body":3905,"categories":3936,"created_at":48,"date_modified":48,"description":40,"extension":49,"faq":48,"featured":50,"kicker_label":48,"meta":3937,"navigation":58,"path":3950,"published_at":3951,"question":48,"scraped_at":3952,"seo":3953,"sitemap":3954,"source_id":3955,"source_name":3740,"source_type":65,"source_url":3956,"stem":3957,"tags":3958,"thumbnail_url":48,"tldr":3960,"tweet":48,"unknown_tags":3961,"__hash__":3962},"summaries\u002Fsummaries\u002F70d68e2e9ac01aa6-autodata-agents-create-superior-synthetic-training-summary.md","Autodata: Agents Create Superior Synthetic Training Data",{"provider":7,"model":8,"input_tokens":3901,"output_tokens":3902,"processing_time_ms":3903,"cost_usd":3904},8968,1596,12976,0.0025691,{"type":14,"value":3906,"toc":3931},[3907,3911,3914,3917,3921,3924,3928],[17,3908,3910],{"id":3909},"agentic-pipeline-generates-challenging-filtered-data","Agentic Pipeline Generates Challenging, Filtered Data",[22,3912,3913],{},"Autodata runs a closed-loop process where an orchestrator LLM coordinates four subagents—Challenger (generates input-response pairs grounded in source documents like CS papers), Weak Solver (smaller model 
expected to fail), Strong Solver (capable model expected to succeed), and Verifier (rubric-based judge)—to produce training\u002Fevaluation data. Examples pass only if all criteria hold: quality verifier approval; weak solver averages ≤65% with max ≤75% and no zeros; strong averages ≥60% but \u003C95%; and gap ≥20%. This rejects trivial or unsolvable questions, running a median of 3-5 iterations per paper until acceptance or budget exhaustion. From 10,000+ S2ORC (2022+) CS papers, it yields 2,117 QA pairs that specifically reward stronger capabilities, trading inference compute for data quality.
Auto-discovered fixes: enforce paper-specific questions via self-tests; ban solution leaks in context; use positive-only rubrics with weights capped at 7; enforce strict JSON rubric format. This eliminates manual tuning, scaling data scientist effectiveness as compute increases.",{"title":40,"searchDepth":41,"depth":41,"links":3932},[3933,3934,3935],{"id":3909,"depth":41,"text":3910},{"id":3919,"depth":41,"text":3920},{"id":3926,"depth":41,"text":3927},[47],{"content_references":3938,"triage":3946},[3939,3943],{"type":3878,"title":3940,"author":3941,"url":3942,"context":3727},"Autodata Blog","Meta AI RAM Team","https:\u002F\u002Ffacebookresearch.github.io\u002FRAM\u002Fblogs\u002Fautodata\u002F",{"type":3944,"title":3945,"context":3801},"dataset","S2ORC Corpus",{"relevance":3947,"novelty":55,"quality":55,"actionability":54,"composite":3948,"reasoning":3949},5,4.15,"Category: AI & LLMs. The article discusses a novel framework, Autodata, that utilizes AI agents to create high-quality synthetic training data, addressing a specific pain point in AI model training. 
It provides insights into the agentic pipeline and its performance improvements, making it relevant for developers looking to implement similar strategies.","\u002Fsummaries\u002F70d68e2e9ac01aa6-autodata-agents-create-superior-synthetic-training-summary","2026-05-01 22:24:02","2026-05-03 17:01:49",{"title":3899,"description":40},{"loc":3950},"70d68e2e9ac01aa6","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F05\u002F01\u002Fmeta-introduces-autodata-an-agentic-framework-that-turns-ai-models-into-autonomous-data-scientists-for-high-quality-training-data-creation\u002F","summaries\u002F70d68e2e9ac01aa6-autodata-agents-create-superior-synthetic-training-summary",[3959,69,71,70],"agents","Meta's Autodata deploys AI agents as data scientists to iteratively generate high-quality QA pairs from CS papers, outperforming CoT Self-Instruct by expanding weak-strong solver gaps from 1.9 to 34 points and boosting downstream model training.",[],"6E7fy1EJIZVboGc1nOTx7_oBFzgtfhR7XeTwtWIvfC4"]