[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"summary-7d871a9968ec8d6b-train-gpt-2-for-48-in-2-hours-on-8xh100-with-nanoc-summary":3,"summaries-facets-categories":231,"summary-related-7d871a9968ec8d6b-train-gpt-2-for-48-in-2-hours-on-8xh100-with-nanoc-summary":3816},{"id":4,"title":5,"ai":6,"body":13,"categories":178,"created_at":180,"date_modified":180,"description":171,"extension":181,"faq":180,"featured":182,"kicker_label":180,"meta":183,"navigation":214,"path":215,"published_at":180,"question":180,"scraped_at":216,"seo":217,"sitemap":218,"source_id":219,"source_name":220,"source_type":221,"source_url":222,"stem":223,"tags":224,"thumbnail_url":180,"tldr":228,"tweet":180,"unknown_tags":229,"__hash__":230},"summaries\u002Fsummaries\u002F7d871a9968ec8d6b-train-gpt-2-for-48-in-2-hours-on-8xh100-with-nanoc-summary.md","Train GPT-2 for $48 in 2 Hours on 8xH100 with nanochat",{"provider":7,"model":8,"input_tokens":9,"output_tokens":10,"processing_time_ms":11,"cost_usd":12},"openrouter","x-ai\u002Fgrok-4.1-fast",9517,2234,206553,0.001593,{"type":14,"value":15,"toc":170},"minimark",[16,21,42,65,69,84,88,155,159],[17,18,20],"h2",{"id":19},"achieve-gpt-2-performance-at-fraction-of-original-cost","Achieve GPT-2 Performance at Fraction of Original Cost",[22,23,24,25,29,30,33,34,37,38],"p",{},"nanochat trains full GPT-2 equivalent models (1.6B params, CORE score 0.2565+) for $15-48 on spot\u002Fregular 8xH100 nodes (~$3\u002FGPU\u002Fhr, ~$24\u002Fhr\u002Fnode), versus GPT-2's 2019 $43k cost. Use single ",[26,27,28],"code",{},"--depth"," dial (e.g., d24-d26 for GPT-2) to auto-set all hyperparameters: transformer width, heads, LR schedule, horizons, weight decay for compute-optimal scaling. Pretraining dominates compute; full pipeline (pretrain, SFT, RL, eval, inference, ChatGPT-like UI) runs end-to-end. Reproduce via ",[26,31,32],{},"bash runs\u002Fspeedrun.sh"," on Lambda.ai 8xH100: ~2-3 hours to 4e19 FLOPs model. Serve with ",[26,35,36],{},"python -m scripts.chat_web"," for web UI at http:\u002F\u002F",[39,40,41],"public-ip",{},":8000. Model behaves like \"kindergartener\": hallucinates identity, explains sky color simply.",[22,43,44,45,48,49,52,53,56,57,60,61,64],{},"Trade-offs: Single GPU works (gradient accumulation, 8x slower); \u003C80GB VRAM needs ",[26,46,47],{},"--device-batch-size"," reduction (32→16\u002F8\u002F4\u002F2\u002F1). CPU\u002FMPS via ",[26,50,51],{},"runs\u002Fruncpu.sh"," (tiny model, weak results). Precision auto: bf16 on A100\u002FH100 (native tensor cores), fp32 on V100\u002FT4\u002FCPU\u002FMPS; override via ",[26,54,55],{},"NANOCHAT_DTYPE=bfloat16\u002Ffloat16\u002Ffloat32",". Weights fp32 (optimizer), compute in ",[26,58,59],{},"COMPUTE_DTYPE",", embeddings in reduced prec—no ",[26,62,63],{},"torch.amp.autocast",".",[17,66,68],{"id":67},"leaderboard-drives-community-optimization","Leaderboard Drives Community Optimization",[22,70,71,72,75,76,79,80,83],{},"\"Time-to-GPT-2\" leaderboard ranks wall-clock on 8xH100 to beat GPT-2 CORE 0.256525 via DCLM CORE eval (",[26,73,74],{},"scripts.base_eval.py","). Current best: 1.65 hours (0.2626 CORE, ClimbMix dataset, autoresearch). Progress: 168hr (2019 GPT-2) → 3.04hr baseline → 2.91hr (fp8) → 2.76hr (1M token batch) → 2.02hr (ClimbMix) → 1.80hr (autoresearch r1) → 1.65hr (r2). Submit via ",[26,77,78],{},"runs\u002Fspeedrun.sh","; see dev\u002FLEADERBOARD.md. Monitor wandb: val_bpb vs step\u002FFLOPs\u002Ftime, CORE, VRAM\u002FMFU\u002Ftok\u002Fsec. Quick expts: d12 (",[26,81,82],{},"--depth=12",", ~5min pretrain) tests changes across depths.",[17,85,87],{"id":86},"minimal-hackable-code-for-full-llm-pipeline","Minimal, Hackable Code for Full LLM Pipeline",[22,89,90,91,94,95,98,99,102,103,106,107,110,111,114,115,118,119,122,123,126,127,130,131,134,135,138,139,142,143,146,147,150,151,154],{},"~1k LoC PyTorch: ",[26,92,93],{},"nanochat\u002Fgpt.py"," (transformer), ",[26,96,97],{},"dataloader.py"," (distributed tokenizing), ",[26,100,101],{},"optim.py"," (AdamW\u002FMuon), ",[26,104,105],{},"tokenizer.py"," (BPE GPT-4 style), ",[26,108,109],{},"engine.py"," (KV-cache inference), ",[26,112,113],{},"execution.py"," (Python tool exec), ",[26,116,117],{},"core_eval.py"," (DCLM CORE). Stages: ",[26,120,121],{},"base_train.py"," (pretrain), ",[26,124,125],{},"chat_sft.py"," (SFT), ",[26,128,129],{},"chat_rl.py"," (RL), ",[26,132,133],{},"chat_eval.py"," (tasks: ARC\u002FGSM8K\u002FMMLU\u002FHumanEval\u002Fspellingbee\u002FSmolTalk), ",[26,136,137],{},"chat_cli\u002Fweb",". Tasks in ",[26,140,141],{},"tasks\u002F",": mixtures\u002Fsequences. Data: FineWeb (HF), ClimbMix (NVIDIA). Setup: ",[26,144,145],{},"uv sync --extra gpu --group dev"," (uv dep mgr). Scripts: ",[26,148,149],{},"scaling_laws.sh","\u002F",[26,152,153],{},"miniseries.sh"," sweep depths. No config monsters—depth drives all.",[17,156,158],{"id":157},"research-and-customization-workflow","Research and Customization Workflow",[22,160,161,162,165,166,169],{},"Forkable baseline for \u003C$1k micro-models. Improve pretrain (e.g., dataset, fp8, batch=1M). Guides: Infuse personality via synthetic data (",[26,163,164],{},"dev\u002Fgen_synthetic_data.py",") + SFT mix; add abilities (e.g., strawberry 'r' count) via tasks\u002Fcustomjson. Ex: ",[26,167,168],{},"torchrun -m scripts.base_train --depth=12 --run=d12"," (wandb, no intermediates). PRs: Declare LLM contributions. Inspired by nanoGPT\u002Fmodded-nanoGPT. Cite as @misc{nanochat...}.",{"title":171,"searchDepth":172,"depth":172,"links":173},"",2,[174,175,176,177],{"id":19,"depth":172,"text":20},{"id":67,"depth":172,"text":68},{"id":86,"depth":172,"text":87},{"id":157,"depth":172,"text":158},[179],"AI & LLMs",null,"md",false,{"content_references":184,"triage":209},[185,190,193,197,200,204,206],{"type":186,"title":187,"url":188,"context":189},"tool","uv","https:\u002F\u002Fdocs.astral.sh\u002Fuv\u002F","mentioned",{"type":186,"title":191,"url":192,"context":189},"Lambda GPU Cloud","https:\u002F\u002Flambda.ai\u002Fservice\u002Fgpu-cloud",{"type":194,"title":195,"url":196,"context":189},"other","nanoGPT","https:\u002F\u002Fgithub.com\u002Fkarpathy\u002FnanoGPT",{"type":194,"title":198,"url":199,"context":189},"modded-nanoGPT","https:\u002F\u002Fgithub.com\u002FKellerJordan\u002Fmodded-nanogpt",{"type":201,"title":202,"author":203,"context":189},"dataset","FineWeb","HuggingFace",{"type":201,"title":205,"author":203,"context":189},"SmolTalk",{"type":186,"title":207,"url":208,"context":189},"DeepWiki","https:\u002F\u002Fdeepwiki.com\u002Fkarpathy\u002Fnanochat",{"relevance":210,"novelty":211,"quality":211,"actionability":210,"composite":212,"reasoning":213},5,4,4.55,"Category: AI & LLMs. The article provides a detailed guide on training GPT-2 models efficiently and cost-effectively, addressing the audience's need for practical applications in AI product development. It includes specific commands and parameters for implementation, making it immediately actionable.",true,"\u002Fsummaries\u002F7d871a9968ec8d6b-train-gpt-2-for-48-in-2-hours-on-8xh100-with-nanoc-summary","2026-04-16 03:01:10",{"title":5,"description":171},{"loc":215},"7d871a9968ec8d6b","__oneoff__","article","https:\u002F\u002Fgithub.com\u002Fkarpathy\u002Fnanochat","summaries\u002F7d871a9968ec8d6b-train-gpt-2-for-48-in-2-hours-on-8xh100-with-nanoc-summary",[225,226,227],"llm","python","open-source","nanochat trains GPT-2 capability LLMs (CORE score >0.2565) on a single 8xH100 GPU node for ~$48 (~2-3 hours wall-clock), with auto-optimal hyperparameters via single --depth dial, plus chat UI.",[],"ORueeVh-F7dwE9iVUlW3Gu6vJa1iZvPgawLZSA5XKkE",[232,235,238,240,243,246,248,250,252,254,256,258,261,263,265,267,269,271,273,275,277,279,282,285,287,289,292,294,296,299,301,303,305,307,309,311,313,315,317,319,321,323,325,327,329,331,333,335,337,339,341,343,345,347,349,351,353,355,357,359,361,363,365,367,369,371,373,375,377,379,381,383,385,387,389,391,393,395,397,399,401,403,405,407,409,411,413,415,417,419,421,423,425,427,429,431,433,435,437,439,441,443,445,447,449,451,453,455,457,459,461,463,465,467,469,471,473,475,477,479,481,483,485,487,489,491,493,495,497,499,501,503,505,507,509,511,513,515,517,519,521,523,525,527,529,531,533,535,537,539,541,543,545,547,549,551,553,556,558,560,562,564,566,568,570,572,574,576,578,580,582,584,586,588,590,592,594,596,598,600,602,604,606,608,610,612,614,616,618,620,622,624,626,628,630,632,634,636,638,640,642,644,646,648,650,652,654,656,658,660,662,664,666,668,670,672,674,676,678,680,682,684,686,688,690,692,694,696,698,700,702,704,706,708,710,712,714,716,718,720,722,724,726,728,730,732,734,736,738,740,742,744,746,748,750,752,754,756,758,760,762,764,766,768,770,772,774,776,778,780,782,784,786,788,790,792,794,796,798,800,802,804,806,808,810,812,814,816,818,820,822,824,826,828,830,832,834,836,838,840,842,844,846,848,850,852,854,856,858,860,862,864,866,868,870,872,874,876,878,880,882,884,886,888,890,892,894,896,898,900,902,904,906,908,910,912,914,916,918,920,922,924,926,928,930,932,934,936,938,940,942,944,946,948,950,952,954,956,958,960,962,964,966,968,970,972,974,976,978,980,982,984,986,988,990,992,994,996,998,1000,1002,1004,1006,1008,1010,1012,1014,1016,1018,1020,1022,1024,1026,1028,1030,1032,1034,1036,1038,1040,1042,1044,1046,1048,1050,1052,1054,1056,1058,1060,1062,1064,1066,1068,1070,1072,1074,1076,1078,1080,1082,1084,1086,1088,1090,1092,1094,1096,1098,1100,1102,1104,1106,1108,1110,1112,1114,1116,1118,1120,1122,1124,1126,1128,1130,1132,1134,1136,1138,1140,1142,1144,1146,1148,1150,1152,1154,1156,1158,1160,1162,1164,1166,1168,1170,1172,1174,1176,1178,1180,1182,1184,1186,1188,1190,1192,1194,1196,1198,1200,1202,1204,1206,1208,1210,1212,1214,1216,1218,1220,1222,1224,1226,1228,1230,1232,1234,1236,1238,1240,1242,1244,1246,1248,1250,1252,1254,1256,1258,1260,1262,1264,1266,1268,1270,1272,1274,1276,1278,1280,1282,1284,1286,1288,1290,1292,1294,1296,1298,1300,1302,1304,1306,1308,1310,1312,1314,1316,1318,1320,1322,1324,1326,1328,1330,1332,1334,1336,1338,1340,1342,1344,1346,1348,1350,1352,1354,1356,1358,1360,1362,1364,1366,1368,1370,1372,1374,1376,1378,1380,1382,1384,1386,1388,1390,1392,1394,1396,1398,1400,1402,1404,1406,1408,1410,1412,1414,1416,1418,1420,1422,1424,1426,1428,1430,1432,1434,1436,1438,1440,1442,1444,1446,1448,1450,1452,1454,1456,1458,1460,1462,1464,1466,1468,1470,1472,1474,1476,1478,1480,1482,1484,1486,1488,1490,1492,1494,1496,1498,1500,1502,1504,1506,1508,1510,1512,1514,1516,1518,1520,1522,1524,1526,1528,1530,1532,1534,1536,1538,1540,1542,1544,1546,1548,1550,1552,1554,1556,1558,1560,1562,1564,1566,1568,1570,1572,1574,1576,1578,1580,1582,1584,1586,1588,1590,1592,1594,1596,1598,1600,1602,1604,1606,1608,1610,1612,1614,1616,1618,1620,1622,1624,1626,1628,1630,1632,1634,1636,1638,1640,1642,1644,1646,1648,1650,1652,1654,1656,1658,1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682,1684,1686,1688,1690,1692,1694,1696,1698,1700,1702,1704,1706,1708,1710,1712,1714,1716,1718,1720,1722,1724,1726,1728,1730,1732,1734,1736,1738,1740,1742,1744,1746,1748,1750,1752,1754,1756,1758,1760,1762,1764,1766,1768,1770,1772,1774,1776,1778,1780,1782,1784,1786,1788,1790,1792,1794,1796,1798,1800,1802,1804,1806,1808,1810,1812,1814,1816,1818,1820,1822,1824,1826,1828,1830,1832,1834,1836,1838,1840,1842,1844,1846,1848,1850,1852,1854,1856,1858,1860,1862,1864,1866,1868,1870,1872,1874,1876,1878,1880,1882,1884,1886,1888,1890,1892,1894,1896,1898,1900,1902,1904,1906,1908,1910,1912,1914,1916,1918,1920,1922,1924,1926,1928,1930,1932,1934,1936,1938,1940,1942,1944,1946,1948,1950,1952,1954,1956,1958,1960,1962,1964,1966,1968,1970,1972,1974,1976,1978,1980,1982,1984,1986,1988,1990,1992,1994,1996,1998,2000,2002,2004,2006,2008,2010,2012,2014,2016,2018,2020,2022,2024,2026,2028,2030,2032,2034,2036,2038,2040,2042,2044,2046,2048,2050,2052,2054,2056,2058,2060,2062,2064,2066,2068,2070,2072,2074,2076,2078,2080,2082,2084,2086,2088,2090,2092,2094,2096,2098,2100,2102,2104,2106,2108,2110,2112,2114,2116,2118,2120,2122,2124,2126,2128,2130,2132,2134,2136,2138,2140,2142,2144,2146,2148,2150,2152,2154,2156,2158,2160,2162,2164,2166,2168,2170,2172,2174,2176,2178,2180,2182,2184,2186,2188,2190,2192,2194,2196,2198,2200,2202,2204,2206,2208,2210,2212,2214,2216,2218,2220,2222,2224,2226,2228,2230,2232,2234,2236,2238,2240,2242,2244,2246,2248,2250,2252,2254,2256,2258,2260,2262,2264,2266,2268,2270,2272,2274,2276,2278,2280,2282,2284,2286,2288,2290,2292,2294,2296,2298,2300,2302,2304,2306,2308,2310,2312,2314,2316,2318,2320,2322,2324,2326,2328,2330,2332,2334,2336,2338,2340,2342,2344,2346,2348,2350,2352,2354,2356,2358,2360,2362,2364,2366,2368,2370,2372,2374,2376,2378,2380,2382,2384,2386,2388,2390,2392,2394,2396,2398,2400,2402,2404,2406,2408,2410,2412,2414,2416,2418,2420,2422,2424,2426,2428,2430,2432,2434,2436,2438,2440,2442,2444,2446,2448,2450,2452,2454,2456,2458,2460,2462,2464,2466,2468,2470,2472,2474,2476,2478,2480,2482,2484,2486,2488,2490,2492,2494,2496,2498,2500,2502,2504,2506,2508,2510,2512,2514,2516,2518,2520,2522,2524,2526,2528,2530,2532,2534,2536,2538,2540,2542,2544,2546,2548,2550,2552,2554,2556,2558,2560,2562,2564,2566,2568,2570,2572,2574,2576,2578,2580,2582,2584,2586,2588,2590,2592,2594,2596,2598,2600,2602,2604,2606,2608,2610,2612,2614,2616,2618,2620,2622,2624,2626,2628,2630,2632,2634,2636,2638,2640,2642,2644,2646,2648,2650,2652,2654,2656,2658,2660,2662,2664,2666,2668,2670,2672,2674,2676,2678,2680,2682,2684,2686,2688,2690,2692,2694,2696,2698,2700,2702,2704,2706,2708,2710,2712,2714,2716,2718,2720,2722,2724,2726,2728,2730,2732,2734,2736,2738,2740,2742,2744,2746,2748,2750,2752,2754,2756,2758,2760,2762,2764,2766,2768,2770,2772,2774,2776,2778,2780,2782,2784,2786,2788,2790,2792,2794,2796,2798,2800,2802,2804,2806,2808,2810,2812,2814,2816,2818,2820,2822,2824,2826,2828,2830,2832,2834,2836,2838,2840,2842,2844,2846,2848,2850,2852,2854,2856,2858,2860,2862,2864,2866,2868,2870,2872,2874,2876,2878,2880,2882,2884,2886,2888,2890,2892,2894,2896,2898,2900,2902,2904,2906,2908,2910,2912,2914,2916,2918,2920,2922,2924,2926,2928,2930,2932,2934,2936,2938,2940,2942,2944,2946,2948,2950,2952,2954,2956,2958,2960,2962,2964,2966,2968,2970,2972,2974,2976,2978,2980,2982,2984,2986,2988,2990,2992,2994,2996,2998,3000,3002,3004,3006,3008,3010,3012,3014,3016,3018,3020,3022,3024,3026,3028,3030,3032,3034,3036,3038,3040,3042,3044,3046,3048,3050,3052,3054,3056,3058,3060,3062,3064,3066,3068,3070,3072,3074,3076,3078,3080,3082,3084,3086,3088,3090,3092,3094,3096,3098,3100,3102,3104,3106,3108,3110,3112,3114,3116,3118,3120,3122,3124,3126,3128,3130,3132,3134,3136,3138,3140,3142,3144,3146,3148,3150,3152,3154,3156,3158,3160,3162,3164,3166,3168,3170,3172,3174,3176,3178,3180,3182,3184,3186,3188,3190,3192,3194,3196,3198,3200,3202,3204,3206,3208,3210,3212,3214,3216,3218,3220,3222,3224,3226,3228,3230,3232,3234,3236,3238,3240,3242,3244,3246,3248,3250,3252,3254,3256,3258,3260,3262,3264,3266,3268,3270,3272,3274,3276,3278,3280,3282,3284,3286,3288,3290,3292,3294,3296,3298,3300,3302,3304,3306,3308,3310,3312,3314,3316,3318,3320,3322,3324,3326,3328,3330,3332,3334,3336,3338,3340,3342,3344,3346,3348,3350,3352,3354,3356,3358,3360,3362,3364,3366,3368,3370,3372,3374,3376,3378,3380,3382,3384,3386,3388,3390,3392,3394,3396,3398,3400,3402,3404,3406,3408,3410,3412,3414,3416,3418,3420,3422,3424,3426,3428,3430,3432,3434,3436,3438,3440,3442,3444,3446,3448,3450,3452,3454,3456,3458,3460,3462,3464,3466,3468,3470,3472,3474,3476,3478,3480,3482,3484,3486,3488,3490,3492,3494,3496,3498,3500,3502,3504,3506,3508,3510,3512,3514,3516,3518,3520,3522,3524,3526,3528,3530,3532,3534,3536,3538,3540,3542,3544,3546,3548,3550,3552,3554,3556,3558,3560,3562,3564,3566,3568,3570,3572,3574,3576,3578,3580,3582,3584,3586,3588,3590,3592,3594,3596,3598,3600,3602,3604,3606,3608,3610,3612,3614,3616,3618,3620,3622,3624,3626,3628,3630,3632,3634,3636,3638,3640,3642,3644,3646,3648,3650,3652,3654,3656,3658,3660,3662,3664,3666,3668,3670,3672,3674,3676,3678,3680,3682,3684,3686,3688,3690,3692,3694,3696,3698,3700,3702,3704,3706,3708,3710,3712,3714,3716,3718,3720,3722,3724,3726,3728,3730,3732,3734,3736,3738,3740,3742,3744,3746,3748,3750,3752,3754,3756,3758,3760,3762,3764,3766,3768,3770,3772,3774,3776,3778,3780,3782,3784,3786,3788,3790,3792,3794,3796,3798,3800,3802,3804,3806,3808,3810,3812,3814],{"categories":233},[234],"Developer Productivity",{"categories":236},[237],"Business & SaaS",{"categories":239},[179],{"categories":241},[242],"AI Automation",{"categories":244},[245],"Product Strategy",{"categories":247},[179],{"categories":249},[234],{"categories":251},[237],{"categories":253},[],{"categories":255},[179],{"categories":257},[],{"categories":259},[260],"AI News & Trends",{"categories":262},[242],{"categories":264},[260],{"categories":266},[242],{"categories":268},[242],{"categories":270},[179],{"categories":272},[179],{"categories":274},[260],{"categories":276},[179],{"categories":278},[],{"categories":280},[281],"Design & Frontend",{"categories":283},[284],"Data Science & Visualization",{"categories":286},[260],{"categories":288},[],{"categories":290},[291],"Software Engineering",{"categories":293},[179],{"categories":295},[242],{"categories":297},[298],"Marketing & Growth",{"categories":300},[179],{"categories":302},[242],{"categories":304},[],{"categories":306},[],{"categories":308},[281],{"categories":310},[242],{"categories":312},[234],{"categories":314},[281],{"categories":316},[179],{"categories":318},[242],{"categories":320},[260],{"categories":322},[],{"categories":324},[],{"categories":326},[242],{"categories":328},[291],{"categories":330},[],{"categories":332},[237],{"categories":334},[],{"categories":336},[],{"categories":338},[242],{"categories":340},[242],{"categories":342},[179],{"categories":344},[],{"categories":346},[291],{"categories":348},[],{"categories":350},[],{"categories":352},[],{"categories":354},[179],{"categories":356},[298],{"categories":358},[281],{"categories":360},[281],{"categories":362},[179],{"categories":364},[242],{"categories":366},[179],{"categories":368},[179],{"categories":370},[242],{"categories":372},[242],{"categories":374},[284],{"categories":376},[260],{"categories":378},[242],{"categories":380},[298],{"categories":382},[242],{"categories":384},[245],{"categories":386},[],{"categories":388},[242],{"categories":390},[],{"categories":392},[242],{"categories":394},[291],{"categories":396},[281],{"categories":398},[179],{"categories":400},[],{"categories":402},[],{"categories":404},[242],{"categories":406},[],{"categories":408},[179],{"categories":410},[],{"categories":412},[234],{"categories":414},[291],{"categories":416},[237],{"categories":418},[260],{"categories":420},[179],{"categories":422},[],{"categories":424},[179],{"categories":426},[],{"categories":428},[291],{"categories":430},[284],{"categories":432},[],{"categories":434},[179],{"categories":436},[281],{"categories":438},[],{"categories":440},[281],{"categories":442},[242],{"categories":444},[],{"categories":446},[242],{"categories":448},[260],{"categories":450},[237],{"categories":452},[179],{"categories":454},[],{"categories":456},[242],{"categories":458},[179],{"categories":460},[245],{"categories":462},[],{"categories":464},[179],{"categories":466},[242],{"categories":468},[242],{"categories":470},[],{"categories":472},[284],{"categories":474},[179],{"categories":476},[],{"categories":478},[234],{"categories":480},[237],{"categories":482},[179],{"categories":484},[242],{"categories":486},[291],{"categories":488},[179],{"categories":490},[],{"categories":492},[],{"categories":494},[179],{"categories":496},[],{"categories":498},[281],{"categories":500},[],{"categories":502},[179],{"categories":504},[],{"categories":506},[242],{"categories":508},[179],{"categories":510},[281],{"categories":512},[],{"categories":514},[179],{"categories":516},[179],{"categories":518},[237],{"categories":520},[242],{"categories":522},[179],{"categories":524},[281],{"categories":526},[242],{"categories":528},[],{"categories":530},[],{"categories":532},[260],{"categories":534},[],{"categories":536},[179],{"categories":538},[237,298],{"categories":540},[],{"categories":542},[179],{"categories":544},[],{"categories":546},[],{"categories":548},[179],{"categories":550},[],{"categories":552},[179],{"categories":554},[555],"DevOps & Cloud",{"categories":557},[],{"categories":559},[260],{"categories":561},[281],{"categories":563},[],{"categories":565},[260],{"categories":567},[260],{"categories":569},[179],{"categories":571},[298],{"categories":573},[],{"categories":575},[237],{"categories":577},[],{"categories":579},[179,555],{"categories":581},[179],{"categories":583},[179],{"categories":585},[242],{"categories":587},[179,291],{"categories":589},[284],{"categories":591},[179],{"categories":593},[298],{"categories":595},[242],{"categories":597},[242],{"categories":599},[],{"categories":601},[242],{"categories":603},[179,237],{"categories":605},[],{"categories":607},[281],{"categories":609},[281],{"categories":611},[],{"categories":613},[],{"categories":615},[260],{"categories":617},[],{"categories":619},[234],{"categories":621},[291],{"categories":623},[179],{"categories":625},[281],{"categories":627},[242],{"categories":629},[291],{"categories":631},[260],{"categories":633},[281],{"categories":635},[],{"categories":637},[179],{"categories":639},[179],{"categories":641},[179],{"categories":643},[260],{"categories":645},[234],{"categories":647},[179],{"categories":649},[242],{"categories":651},[555],{"categories":653},[281],{"categories":655},[242],{"categories":657},[],{"categories":659},[],{"categories":661},[281],{"categories":663},[260],{"categories":665},[284],{"categories":667},[],{"categories":669},[179],{"categories":671},[179],{"categories":673},[237],{"categories":675},[179],{"categories":677},[179],{"categories":679},[260],{"categories":681},[],{"categories":683},[242],{"categories":685},[291],{"categories":687},[],{"categories":689},[179],{"categories":691},[179],{"categories":693},[242],{"categories":695},[],{"categories":697},[],{"categories":699},[179],{"categories":701},[],{"categories":703},[237],{"categories":705},[242],{"categories":707},[],{"categories":709},[234],{"categories":711},[179],{"categories":713},[237],{"categories":715},[260],{"categories":717},[],{"categories":719},[],{"categories":721},[],{"categories":723},[260],{"categories":725},[260],{"categories":727},[],{"categories":729},[],{"categories":731},[237],{"categories":733},[],{"categories":735},[],{"categories":737},[234],{"categories":739},[],{"categories":741},[298],{"categories":743},[242],{"categories":745},[237],{"categories":747},[242],{"categories":749},[291],{"categories":751},[],{"categories":753},[245],{"categories":755},[281],{"categories":757},[291],{"categories":759},[179],{"categories":761},[242],{"categories":763},[237],{"categories":765},[179],{"categories":767},[],{"categories":769},[],{"categories":771},[291],{"categories":773},[284],{"categories":775},[245],{"categories":777},[242],{"categories":779},[179],{"categories":781},[],{"categories":783},[555],{"categories":785},[],{"categories":787},[242],{"categories":789},[],{"categories":791},[],{"categories":793},[179],{"categories":795},[281],{"categories":797},[298],{"categories":799},[242],{"categories":801},[],{"categories":803},[234],{"categories":805},[],{"categories":807},[260],{"categories":809},[179,555],{"categories":811},[260],{"categories":813},[179],{"categories":815},[237],{"categories":817},[179],{"categories":819},[],{"categories":821},[237],{"categories":823},[],{"categories":825},[291],{"categories":827},[281],{"categories":829},[260],{"categories":831},[284],{"categories":833},[234],{"categories":835},[179],{"categories":837},[291],{"categories":839},[],{"categories":841},[],{"categories":843},[245],{"categories":845},[],{"categories":847},[179],{"categories":849},[],{"categories":851},[281],{"categories":853},[281],{"categories":855},[281],{"categories":857},[],{"categories":859},[],{"categories":861},[260],{"categories":863},[242],{"categories":865},[179],{"categories":867},[179],{"categories":869},[179],{"categories":871},[237],{"categories":873},[179],{"categories":875},[],{"categories":877},[291],{"categories":879},[291],{"categories":881},[237],{"categories":883},[],{"categories":885},[179],{"categories":887},[179],{"categories":889},[237],{"categories":891},[260],{"categories":893},[298],{"categories":895},[242],{"categories":897},[],{"categories":899},[281],{"categories":901},[],{"categories":903},[179],{"categories":905},[],{"categories":907},[237],{"categories":909},[242],{"categories":911},[],{"categories":913},[555],{"categories":915},[284],{"categories":917},[291],{"categories":919},[298],{"categories":921},[291],{"categories":923},[242],{"categories":925},[],{"categories":927},[],{"categories":929},[242],{"categories":931},[234],{"categories":933},[242],{"categories":935},[245],{"categories":937},[237],{"categories":939},[],{"categories":941},[179],{"categories":943},[245],{"categories":945},[179],{"categories":947},[179],{"categories":949},[298],{"categories":951},[281],{"categories":953},[242],{"categories":955},[],{"categories":957},[],{"categories":959},[555],{"categories":961},[291],{"categories":963},[],{"categories":965},[242],{"categories":967},[179],{"categories":969},[281,179],{"categories":971},[234],{"categories":973},[],{"categories":975},[179],{"categories":977},[234],{"categories":979},[281],{"categories":981},[242],{"categories":983},[291],{"categories":985},[],{"categories":987},[179],{"categories":989},[],{"categories":991},[234],{"categories":993},[],{"categories":995},[242],{"categories":997},[245],{"categories":999},[179],{"categories":1001},[179],{"categories":1003},[281],{"categories":1005},[242],{"categories":1007},[555],{"categories":1009},[281],{"categories":1011},[242],{"categories":1013},[179],{"categories":1015},[179],{"categories":1017},[179],{"categories":1019},[260],{"categories":1021},[],{"categories":1023},[245],{"categories":1025},[242],{"categories":1027},[281],{"categories":1029},[242],{"categories":1031},[291],{"categories":1033},[281],{"categories":1035},[242],{"categories":1037},[260],{"categories":1039},[],{"categories":1041},[179],{"categories":1043},[281],{"categories":1045},[179],{"categories":1047},[234],{"categories":1049},[260],{"categories":1051},[179],{"categories":1053},[298],{"categories":1055},[179],{"categories":1057},[179],{"categories":1059},[242],{"categories":1061},[242],{"categories":1063},[179],{"categories":1065},[242],{"categories":1067},[281],{"categories":1069},[179],{"categories":1071},[],{"categories":1073},[],{"categories":1075},[291],{"categories":1077},[],{"categories":1079},[234],{"categories":1081},[555],{"categories":1083},[],{"categories":1085},[234],{"categories":1087},[237],{"categories":1089},[298],{"categories":1091},[],{"categories":1093},[237],{"categories":1095},[],{"categories":1097},[],{"categories":1099},[],{"categories":1101},[],{"categories":1103},[],{"categories":1105},[179],{"categories":1107},[242],{"categories":1109},[555],{"categories":1111},[234],{"categories":1113},[179],{"categories":1115},[291],{"categories":1117},[245],{"categories":1119},[179],{"categories":1121},[298],{"categories":1123},[179],{"categories":1125},[179],{"categories":1127},[179],{"categories":1129},[179,234],{"categories":1131},[291],{"categories":1133},[291],{"categories":1135},[281],{"categories":1137},[179],{"categories":1139},[],{"categories":1141},[],{"categories":1143},[],{"categories":1145},[291],{"categories":1147},[284],{"categories":1149},[260],{"categories":1151},[281],{"categories":1153},[],{"categories":1155},[179],{"categories":1157},[179],{"categories":1159},[],{"categories":1161},[],{"categories":1163},[242],{"categories":1165},[179],{"categories":1167},[237],{"categories":1169},[],{"categories":1171},[234],{"categories":1173},[179],{"categories":1175},[234],{"categories":1177},[179],{"categories":1179},[291],{"categories":1181},[298],{"categories":1183},[179,281],{"categories":1185},[260],{"categories":1187},[281],{"categories":1189},[],{"categories":1191},[555],{"categories":1193},[281],{"categories":1195},[242],{"categories":1197},[],{"categories":1199},[],{"categories":1201},[],{"categories":1203},[],{"categories":1205},[291],{"categories":1207},[242],{"categories":1209},[242],{"categories":1211},[555],{"categories":1213},[179],{"categories":1215},[179],{"categories":1217},[179],{"categories":1219},[],{"categories":1221},[281],{"categories":1223},[],{"categories":1225},[],{"categories":1227},[242],{"categories":1229},[],{"categories":1231},[],{"categories":1233},[298],{"categories":1235},[298],{"categories":1237},[242],{"categories":1239},[],{"categories":1241},[179],{"categories":1243},[179],{"categories":1245},[291],{"categories":1247},[281],{"categories":1249},[281],{"categories":1251},[242],{"categories":1253},[234],{"categories":1255},[179],{"categories":1257},[281],{"categories":1259},[281],{"categories":1261},[242],{"categories":1263},[242],{"categories":1265},[179],{"categories":1267},[],{"categories":1269},[],{"categories":1271},[179],{"categories":1273},[242],{"categories":1275},[260],{"categories":1277},[291],{"categories":1279},[234],{"categories":1281},[179],{"categories":1283},[],{"categories":1285},[242],{"categories":1287},[242],{"categories":1289},[],{"categories":1291},[234],{"categories":1293},[179],{"categories":1295},[234],{"categories":1297},[234],{"categories":1299},[],{"categories":1301},[],{"categories":1303},[242],{"categories":1305},[242],{"categories":1307},[179],{"categories":1309},[179],{"categories":1311},[260],{"categories":1313},[284],{"categories":1315},[245],{"categories":1317},[260],{"categories":1319},[281],{"categories":1321},[],{"categories":1323},[260],{"categories":1325},[],{"categories":1327},[],{"categories":1329},[],{"categories":1331},[],{"categories":1333},[291],{"categories":1335},[284],{"categories":1337},[],{"categories":1339},[179],{"categories":1341},[179],{"categories":1343},[284],{"categories":1345},[291],{"categories":1347},[],{"categories":1349},[],{"categories":1351},[242],{"categories":1353},[260],{"categories":1355},[260],{"categories":1357},[242],{"categories":1359},[234],{"categories":1361},[179,555],{"categories":1363},[],{"categories":1365},[281],{"categories":1367},[234],{"categories":1369},[242],{"categories":1371},[281],{"categories":1373},[],{"categories":1375},[242],{"categories":1377},[242],{"categories":1379},[179],{"categories":1381},[298],{"categories":1383},[291],{"categories":1385},[281],{"categories":1387},[],{"categories":1389},[242],{"categories":1391},[179],{"categories":1393},[242],{"categories":1395},[242],{"categories":1397},[242],{"categories":1399},[298],{"categories":1401},[242],{"categories":1403},[179],{"categories":1405},[],{"categories":1407},[298],{"categories":1409},[260],{"categories":1411},[242],{"categories":1413},[],{"categories":1415},[],{"categories":1417},[179],{"categories":1419},[242],{"categories":1421},[260],{"categories":1423},[242],{"categories":1425},[],{"categories":1427},[],{"categories":1429},[],{"categories":1431},[242],{"categories":1433},[],{"categories":1435},[],{"categories":1437},[284],{"categories":1439},[179],{"categories":1441},[284],{"categories":1443},[260],{"categories":1445},[179],{"categories":1447},[179],{"categories":1449},[242],{"categories":1451},[179],{"categories":1453},[],{"categories":1455},[],{"categories":1457},[555],{"categories":1459},[],{"categories":1461},[],{"categories":1463},[234],{"categories":1465},[],{"categories":1467},[],{"categories":1469},[],{"categories":1471},[],{"categories":1473},[291],{"categories":1475},[260],{"categories":1477},[298],{"categories":1479},[237],{"categories":1481},[179],{"categories":1483},[179],{"categories":1485},[237],{"categories":1487},[],{"categories":1489},[281],{"categories":1491},[242],{"categories":1493},[237],{"categories":1495},[179],{"categories":1497},[179],{"categories":1499},[234],{"categories":1501},[],{"categories":1503},[234],{"categories":1505},[179],{"categories":1507},[298],{"categories":1509},[242],{"categories":1511},[260],{"categories":1513},[237],{"categories":1515},[179],{"categories":1517},[242],{"categories":1519},[],{"categories":1521},[179],{"categories":1523},[234],{"categories":1525},[179],{"categories":1527},[],{"categories":1529},[260],{"categories":1531},[179],{"categories":1533},[],{"categories":1535},[237],{"categories":1537},[179],{"categories":1539},[],{"categories":1541},[],{"categories":1543},[],{"categories":1545},[179],{"categories":1547},[],{"categories":1549},[555],{"categories":1551},[179],{"categories":1553},[],{"categories":1555},[179],{"categories":1557},[179],{"categories":1559},[179],{"categories":1561},[179,555],{"categories":1563},[179],{"categories":1565},[179],{"categories":1567},[281],{"categories":1569},[242],{"categories":1571},[],{"categories":1573},[242],{"categories":1575},[179],{"categories":1577},[179],{"categories":1579},[179],{"categories":1581},[234],{"categories":1583},[234],{"categories":1585},[291],{"categories":1587},[281],{"categories":1589},[242],{"categories":1591},[],{"categories":1593},[179],{"categories":1595},[260],{"categories":1597},[179],{"categories":1599},[237],{"categories":1601},[],{"categories":1603},[555],{"categories":1605},[281],{"categories":1607},[281],{"categories":1609},[242],{"categories":1611},[260],{"categories":1613},[242],{"categories":1615},[179],{"categories":1617},[],{"categories":1619},[179],{"categories":1621},[],{"categories":1623},[],{"categories":1625},[179],{"categories":1627},[179],{"categories":1629},[179],{"categories":1631},[242],{"categories":1633},[179],{"categories":1635},[],{"categories":1637},[284],{"categories":1639},[242],{"categories":1641},[],{"categories":1643},[],{"categories":1645},[179],{"categories":1647},[260],{"categories":1649},[],{"categories":1651},[281],{"categories":1653},[555],{"categories":1655},[260],{"categories":1657},[291],{"categories":1659},[291],{"categories":1661},[260],{"categories":1663},[260],{"categories":1665},[555],{"categories":1667},[],{"categories":1669},[260],{"categories":1671},[179],{"categories":1673},[234],{"categories":1675},[260],{"categories":1677},[],{"categories":1679},[284],{"categories":1681},[260],{"categories":1683},[291],{"categories":1685},[260],{"categories":1687},[555],{"categories":1689},[179],{"categories":1691},[179],{"categories":1693},[],{"categories":1695},[237],{"categories":1697},[],{"categories":1699},[],{"categories":1701},[179],{"categories":1703},[179],{"categories":1705},[179],{"categories":1707},[179],{"categories":1709},[],{"categories":1711},[284],{"categories":1713},[234],{"categories":1715},[],{"categories":1717},[179],{"categories":1719},[179],{"categories":1721},[555],{"categories":1723},[555],{"categories":1725},[],{"categories":1727},[242],{"categories":1729},[260],{"categories":1731},[260],{"categories":1733},[179],{"categories":1735},[242],{"categories":1737},[],{"categories":1739},[281],{"categories":1741},[179],{"categories":1743},[179],{"categories":1745},[],{"categories":1747},[],{"categories":1749},[555],{"categories":1751},[179],{"categories":1753},[291],{"categories":1755},[237],{"categories":1757},[179],{"categories":1759},[],{"categories":1761},[242],{"categories":1763},[234],{"categories":1765},[234],{"categories":1767},[],{"categories":1769},[179],{"categories":1771},[281],{"categories":1773},[242],{"categories":1775},[],{"categories":1777},[179],{"categories":1779},[179],{"categories":1781},[242],{"categories":1783},[],{"categories":1785},[242],{"categories":1787},[291],{"categories":1789},[],{"categories":1791},[179],{"categories":1793},[],{"categories":1795},[179],{"categories":1797},[],{"categories":1799},[179],{"categories":1801},[179],{"categories":1803},[],{"categories":1805},[179],{"categories":1807},[260],{"categories":1809},[179],{"categories":1811},[179],{"categories":1813},[234],{"categories":1815},[179],{"categories":1817},[260],{"categories":1819},[242],{"categories":1821},[],{"categories":1823},[179],{"categories":1825},[298],{"categories":1827},[],{"categories":1829},[],{"categories":1831},[],{"categories":1833},[234],{"categories":1835},[260],{"categories":1837},[242],{"categories":1839},[179],{"categories":1841},[281],{"categories":1843},[242],{"categories":1845},[],{"categories":1847},[242],{"categories":1849},[],{"categories":1851},[179],{"categories":1853},[242],{"categories":1855},[179],{"categories":1857},[],{"categories":1859},[179],{"categories":1861},[179],{"categories":1863},[260],{"categories":1865},[281],{"categories":1867},[242],{"categories":1869},[281],{"categories":1871},[237],{"categories":1873},[],{"categories":1875},[],{"categories":1877},[179],{"categories":1879},[234],{"categories":1881},[260],{"categories":1883},[],{"categories":1885},[],{"categories":1887},[291],{"categories":1889},[281],{"categories":1891},[],{"categories":1893},[179],{"categories":1895},[],{"categories":1897},[298],{"categories":1899},[179],{"categories":1901},[555],{"categories":1903},[291],{"categories":1905},[],{"categories":1907},[242],{"categories":1909},[179],{"categories":1911},[242],{"categories":1913},[242],{"categories":1915},[179],{"categories":1917},[],{"categories":1919},[234],{"categories":1921},[179],{"categories":1923},[237],{"categories":1925},[291],{"categories":1927},[281],{"categories":1929},[],{"categories":1931},[],{"categories":1933},[],{"categories":1935},[242],{"categories":1937},[281],{"categories":1939},[260],{"categories":1941},[179],{"categories":1943},[260],{"categories":1945},[281],{"categories":1947},[],{"categories":1949},[281],{"categories":1951},[260],{"categories":1953},[237],{"categories":1955},[179],{"categories":1957},[260],{"categories":1959},[298],{"categories":1961},[],{"categories":1963},[],{"categories":1965},[284],{"categories":1967},[179,291],{"categories":1969},[260],{"categories":1971},[179],{"categories":1973},[242],{"categories":1975},[242],{"categories":1977},[179],{"categories":1979},[],{"categories":1981},[291],{"categories":1983},[179],{"categories":1985},[284],{"categories":1987},[242],{"categories":1989},[298],{"categories":1991},[555],{"categories":1993},[],{"categories":1995},[234],{"categories":1997},[242],{"categories":1999},[242],{"categories":2001},[291],{"categories":2003},[179],{"categories":2005},[179],{"categories":2007},[],{"categories":2009},[],{"categories":2011},[],{"categories":2013},[555],{"categories":2015},[260],{"categories":2017},[179],{"categories":2019},[179],{"categories":2021},[179],{"categories":2023},[],{"categories":2025},[284],{"categories":2027},[237],{"categories":2029},[],{"categories":2031},[242],{"categories":2033},[555],{"categories":2035},[],{"categories":2037},[281],{"categories":2039},[281],{"categories":2041},[],{"categories":2043},[291],{"categories":2045},[281],{"categories":2047},[179],{"categories":2049},[],{"categories":2051},[260],{"categories":2053},[179],{"categories":2055},[281],{"categories":2057},[242],{"categories":2059},[260],{"categories":2061},[],{"categories":2063},[242],{"categories":2065},[281],{"categories":2067},[179],{"categories":2069},[],{"categories":2071},[179],{"categories":2073},[179],{"categories":2075},[555],{"categories":2077},[260],{"categories":2079},[284],{"categories":2081},[284],{"categories":2083},[],{"categories":2085},[],{"categories":2087},[],{"categories":2089},[242],{"categories":2091},[291],{"categories":2093},[291],{"categories":2095},[],{"categories":2097},[],{"categories":2099},[179],{"categories":2101},[],{"categories":2103},[242],{"categories":2105},[179],{"categories":2107},[],{"categories":2109},[179],{"categories":2111},[237],{"categories":2113},[179],{"categories":2115},[298],{"categories":2117},[242],{"categories":2119},[179],{"categories":2121},[291],{"categories":2123},[260],{"categories":2125},[242],{"categories":2127},[],{"categories":2129},[260],{"categories":2131},[242],{"categories":2133},[242],{"categories":2135},[],{"categories":2137},[237],{"categories":2139},[242],{"categories":2141},[],{"categories":2143},[179],{"categories":2145},[234],{"categories":2147},[260],{"categories":2149},[555],{"categories":2151},[242],{"categories":2153},[242],{"categories":2155},[234],{"categories":2157},[179],{"categories":2159},[],{"categories":2161},[],{"categories":2163},[281],{"categories":2165},[179,237],{"categories":2167},[],{"categories":2169},[234],{"categories":2171},[284],{"categories":2173},[179],{"categories":2175},[291],{"categories":2177},[179],{"categories":2179},[242],{"categories":2181},[179],{"categories":2183},[179],{"categories":2185},[260],{"categories":2187},[242],{"categories":2189},[],{"categories":2191},[],{"categories":2193},[242],{"categories":2195},[179],{"categories":2197},[555],{"categories":2199},[],{"categories":2201},[179],{"categories":2203},[242],{"categories":2205},[],{"categories":2207},[179],{"categories":2209},[298],{"categories":2211},[284],{"categories":2213},[242],{"categories":2215},[179],{"categories":2217},[555],{"categories":2219},[],{"categories":2221},[179],{"categories":2223},[298],{"categories":2225},[281],{"categories":2227},[179],{"categories":2229},[],{"categories":2231},[298],{"categories":2233},[260],{"categories":2235},[179],{"categories":2237},[179],{"categories":2239},[234],{"categories":2241},[],{"categories":2243},[],{"categories":2245},[281],{"categories":2247},[179],{"categories":2249},[284],{"categories":2251},[298],{"categories":2253},[298],{"categories":2255},[260],{"categories":2257},[],{"categories":2259},[],{"categories":2261},[179],{"categories":2263},[],{"categories":2265},[179,291],{"categories":2267},[260],{"categories":2269},[242],{"categories":2271},[291],{"categories":2273},[179],{"categories":2275},[234],{"categories":2277},[],{"categories":2279},[],{"categories":2281},[234],{"categories":2283},[298],{"categories":2285},[179],{"categories":2287},[],{"categories":2289},[281,179],{"categories":2291},[555],{"categories":2293},[234],{"categories":2295},[],{"categories":2297},[237],{"categories":2299},[237],{"categories":2301},[179],{"categories":2303},[291],{"categories":2305},[242],{"categories":2307},[260],{"categories":2309},[298],{"categories":2311},[281],{"categories":2313},[179],{"categories":2315},[179],{"categories":2317},[179],{"categories":2319},[234],{"categories":2321},[179],{"categories":2323},[242],{"categories":2325},[260],{"categories":2327},[],{"categories":2329},[],{"categories":2331},[284],{"categories":2333},[291],{"categories":2335},[179],{"categories":2337},[281],{"categories":2339},[284],{"categories":2341},[179],{"categories":2343},[179],{"categories":2345},[242],{"categories":2347},[242],{"categories":2349},[179,237],{"categories":2351},[],{"categories":2353},[281],{"categories":2355},[],{"categories":2357},[179],{"categories":2359},[260],{"categories":2361},[234],{"categories":2363},[234],{"categories":2365},[242],{"categories":2367},[179],{"categories":2369},[237],{"categories":2371},[291],{"categories":2373},[298],{"categories":2375},[],{"categories":2377},[260],{"categories":2379},[179],{"categories":2381},[179],{"categories":2383},[260],{"categories":2385},[291],{"categories":2387},[179],{"categories":2389},[242],{"categories":2391},[260],{"categories":2393},[179],{"categories":2395},[281],{"categories":2397},[179],{"categories":2399},[179],{"categories":2401},[555],{"categories":2403},[245],{"categories":2405},[242],{"categories":2407},[179],{"categories":2409},[260],{"categories":2411},[242],{"categories":2413},[298],{"categories":2415},[179],{"categories":2417},[],{"categories":2419},[179],{"categories":2421},[],{"categories":2423},[],{"categories":2425},[],{"categories":2427},[237],{"categories":2429},[179],{"categories":2431},[242],{"categories":2433},[260],{"categories":2435},[260],{"categories":2437},[260],{"categories":2439},[260],{"categories":2441},[],{"categories":2443},[234],{"categories":2445},[242],{"categories":2447},[260],{"categories":2449},[234],{"categories":2451},[242],{"categories":2453},[179],{"categories":2455},[179,242],{"categories":2457},[242],{"categories":2459},[555],{"categories":2461},[260],{"categories":2463},[260],{"categories":2465},[242],{"categories":2467},[179],{"categories":2469},[],{"categories":2471},[260],{"categories":2473},[298],{"categories":2475},[234],{"categories":2477},[179],{"categories":2479},[179],{"categories":2481},[],{"categories":2483},[291],{"categories":2485},[],{"categories":2487},[234],{"categories":2489},[242],{"categories":2491},[260],{"categories":2493},[179],{"categories":2495},[260],{"categories":2497},[234],{"categories":2499},[260],{"categories":2501},[260],{"categories":2503},[],{"categories":2505},[237],{"categories":2507},[242],{"categories":2509},[260],{"categories":2511},[260],{"categories":2513},[260],{"categories":2515},[260],{"categories":2517},[260],{"categories":2519},[260],{"categories":2521},[260],{"categories":2523},[260],{"categories":2525},[260],{"categories":2527},[260],{"categories":2529},[284],{"categories":2531},[234],{"categories":2533},[179],{"categories":2535},[179],{"categories":2537},[],{"categories":2539},[179,234],{"categories":2541},[],{"categories":2543},[242],{"categories":2545},[260],{"categories":2547},[242],{"categories":2549},[179],{"categories":2551},[179],{"categories":2553},[179],{"categories":2555},[179],{"categories":2557},[179],{"categories":2559},[242],{"categories":2561},[237],{"categories":2563},[281],{"categories":2565},[260],{"categories":2567},[179],{"categories":2569},[],{"categories":2571},[],{"categories":2573},[242],{"categories":2575},[281],{"categories":2577},[179],{"categories":2579},[],{"categories":2581},[],{"categories":2583},[298],{"categories":2585},[179],{"categories":2587},[],{"categories":2589},[],{"categories":2591},[234],{"categories":2593},[237],{"categories":2595},[179],{"categories":2597},[237],{"categories":2599},[281],{"categories":2601},[],{"categories":2603},[260],{"categories":2605},[],{"categories":2607},[281],{"categories":2609},[179],{"categories":2611},[298],{"categories":2613},[],{"categories":2615},[298],{"categories":2617},[],{"categories":2619},[],{"categories":2621},[242],{"categories":2623},[],{"categories":2625},[237],{"categories":2627},[234],{"categories":2629},[281],{"categories":2631},[291],{"categories":2633},[],{"categories":2635},[],{"categories":2637},[179],{"categories":2639},[234],{"categories":2641},[298],{"categories":2643},[],{"categories":2645},[242],{"categories":2647},[242],{"categories":2649},[260],{"categories":2651},[179],{"categories":2653},[242],{"categories":2655},[179],{"categories":2657},[242],{"categories":2659},[179],{"categories":2661},[245],{"categories":2663},[260],{"categories":2665},[],{"categories":2667},[298],{"categories":2669},[291],{"categories":2671},[242],{"categories":2673},[],{"categories":2675},[179],{"categories":2677},[242],{"categories":2679},[237],{"categories":2681},[234],{"categories":2683},[179],{"categories":2685},[281],{"categories":2687},[291],{"categories":2689},[291],{"categories":2691},[179],{"categories":2693},[284],{"categories":2695},[179],{"categories":2697},[242],{"categories":2699},[237],{"categories":2701},[242],{"categories":2703},[179],{"categories":2705},[179],{"categories":2707},[242],{"categories":2709},[260],{"categories":2711},[],{"categories":2713},[234],{"categories":2715},[179],{"categories":2717},[242],{"categories":2719},[179],{"categories":2721},[179],{"categories":2723},[],{"categories":2725},[281],{"categories":2727},[237],{"categories":2729},[260],{"categories":2731},[179],{"categories":2733},[179],{"categories":2735},[281],{"categories":2737},[298],{"categories":2739},[284],{"categories":2741},[179],{"categories":2743},[260],{"categories":2745},[179],{"categories":2747},[242],{"categories":2749},[555],{"categories":2751},[179],{"categories":2753},[242],{"categories":2755},[284],{"categories":2757},[],{"categories":2759},[242],{"categories":2761},[291],{"categories":2763},[281],{"categories":2765},[179],{"categories":2767},[234],{"categories":2769},[237],{"categories":2771},[291],{"categories":2773},[],{"categories":2775},[242],{"categories":2777},[179],{"categories":2779},[],{"categories":2781},[260],{"categories":2783},[],{"categories":2785},[260],{"categories":2787},[179],{"categories":2789},[242],{"categories":2791},[242],{"categories":2793},[242],{"categories":2795},[],{"categories":2797},[],{"categories":2799},[179],{"categories":2801},[179],{"categories":2803},[],{"categories":2805},[281],{"categories":2807},[242],{"categories":2809},[298],{"categories":2811},[234],{"categories":2813},[],{"categories":2815},[],{"categories":2817},[260],{"categories":2819},[291],{"categories":2821},[179],{"categories":2823},[179],{"categories":2825},[179],{"categories":2827},[291],{"categories":2829},[260],{"categories":2831},[281],{"categories":2833},[179],{"categories":2835},[179],{"categories":2837},[179],{"categories":2839},[260],{"categories":2841},[179],{"categories":2843},[260],{"categories":2845},[242],{"categories":2847},[242],{"categories":2849},[291],{"categories":2851},[242],{"categories":2853},[179],{"categories":2855},[291],{"categories":2857},[281],{"categories":2859},[],{"categories":2861},[242],{"categories":2863},[],{"categories":2865},[],{"categories":2867},[],{"categories":2869},[237],{"categories":2871},[179],{"categories":2873},[242],{"categories":2875},[234],{"categories":2877},[242],{"categories":2879},[298],{"categories":2881},[],{"categories":2883},[242],{"categories":2885},[],{"categories":2887},[234],{"categories":2889},[242],{"categories":2891},[],{"categories":2893},[242],{"categories":2895},[179],{"categories":2897},[260],{"categories":2899},[179],{"categories":2901},[242],{"categories":2903},[260],{"categories":2905},[242],{"categories":2907},[291],{"categories":2909},[281],{"categories":2911},[234],{"categories":2913},[],{"categories":2915},[242],{"categories":2917},[281],{"categories":2919},[555],{"categories":2921},[260],{"categories":2923},[179],{"categories":2925},[281],{"categories":2927},[234],{"categories":2929},[],{"categories":2931},[242],{"categories":2933},[242],{"categories":2935},[179],{"categories":2937},[],{"categories":2939},[242],{"categories":2941},[245],{"categories":2943},[260],{"categories":2945},[242],{"categories":2947},[237],{"categories":2949},[],{"categories":2951},[179],{"categories":2953},[245],{"categories":2955},[179],{"categories":2957},[242],{"categories":2959},[260],{"categories":2961},[234],{"categories":2963},[555],{"categories":2965},[179],{"categories":2967},[179],{"categories":2969},[179],{"categories":2971},[260],{"categories":2973},[237],{"categories":2975},[179],{"categories":2977},[281],{"categories":2979},[260],{"categories":2981},[555],{"categories":2983},[179],{"categories":2985},[],{"categories":2987},[],{"categories":2989},[555],{"categories":2991},[284],{"categories":2993},[242],{"categories":2995},[242],{"categories":2997},[260],{"categories":2999},[179],{"categories":3001},[234],{"categories":3003},[281],{"categories":3005},[242],{"categories":3007},[179],{"categories":3009},[298],{"categories":3011},[179],{"categories":3013},[242],{"categories":3015},[],{"categories":3017},[179],{"categories":3019},[179],{"categories":3021},[260],{"categories":3023},[234],{"categories":3025},[],{"categories":3027},[179],{"categories":3029},[179],{"categories":3031},[291],{"categories":3033},[281],{"categories":3035},[179,242],{"categories":3037},[298,237],{"categories":3039},[179],{"categories":3041},[],{"categories":3043},[242],{"categories":3045},[],{"categories":3047},[291],{"categories":3049},[179],{"categories":3051},[260],{"categories":3053},[],{"categories":3055},[242],{"categories":3057},[],{"categories":3059},[281],{"categories":3061},[242],{"categories":3063},[234],{"categories":3065},[242],{"categories":3067},[179],{"categories":3069},[555],{"categories":3071},[298],{"categories":3073},[237],{"categories":3075},[237],{"categories":3077},[234],{"categories":3079},[234],{"categories":3081},[179],{"categories":3083},[242],{"categories":3085},[179],{"categories":3087},[179],{"categories":3089},[234],{"categories":3091},[179],{"categories":3093},[298],{"categories":3095},[260],{"categories":3097},[179],{"categories":3099},[242],{"categories":3101},[179],{"categories":3103},[],{"categories":3105},[291],{"categories":3107},[],{"categories":3109},[242],{"categories":3111},[234],{"categories":3113},[],{"categories":3115},[555],{"categories":3117},[179],{"categories":3119},[],{"categories":3121},[260],{"categories":3123},[242],{"categories":3125},[291],{"categories":3127},[179],{"categories":3129},[242],{"categories":3131},[291],{"categories":3133},[242],{"categories":3135},[260],{"categories":3137},[234],{"categories":3139},[260],{"categories":3141},[291],{"categories":3143},[179],{"categories":3145},[281],{"categories":3147},[179],{"categories":3149},[179],{"categories":3151},[179],{"categories":3153},[179],{"categories":3155},[242],{"categories":3157},[179],{"categories":3159},[242],{"categories":3161},[179],{"categories":3163},[234],{"categories":3165},[179],{"categories":3167},[242],{"categories":3169},[281],{"categories":3171},[234],{"categories":3173},[242],{"categories":3175},[281],{"categories":3177},[],{"categories":3179},[179],{"categories":3181},[179],{"categories":3183},[291],{"categories":3185},[],{"categories":3187},[242],{"categories":3189},[298],{"categories":3191},[179],{"categories":3193},[260],{"categories":3195},[298],{"categories":3197},[242],{"categories":3199},[237],{"categories":3201},[237],{"categories":3203},[179],{"categories":3205},[234],{"categories":3207},[],{"categories":3209},[179],{"categories":3211},[],{"categories":3213},[234],{"categories":3215},[179],{"categories":3217},[242],{"categories":3219},[242],{"categories":3221},[],{"categories":3223},[291],{"categories":3225},[291],{"categories":3227},[298],{"categories":3229},[281],{"categories":3231},[],{"categories":3233},[179],{"categories":3235},[234],{"categories":3237},[179],{"categories":3239},[291],{"categories":3241},[234],{"categories":3243},[260],{"categories":3245},[260],{"categories":3247},[],{"categories":3249},[260],{"categories":3251},[242],{"categories":3253},[281],{"categories":3255},[284],{"categories":3257},[179],{"categories":3259},[],{"categories":3261},[260],{"categories":3263},[291],{"categories":3265},[237],{"categories":3267},[179],{"categories":3269},[234],{"categories":3271},[555],{"categories":3273},[234],{"categories":3275},[],{"categories":3277},[],{"categories":3279},[260],{"categories":3281},[],{"categories":3283},[242],{"categories":3285},[242],{"categories":3287},[242],{"categories":3289},[],{"categories":3291},[179],{"categories":3293},[],{"categories":3295},[260],{"categories":3297},[234],{"categories":3299},[281],{"categories":3301},[179],{"categories":3303},[260],{"categories":3305},[260],{"categories":3307},[],{"categories":3309},[260],{"categories":3311},[234],{"categories":3313},[179],{"categories":3315},[],{"categories":3317},[242],{"categories":3319},[242],{"categories":3321},[234],{"categories":3323},[],{"categories":3325},[],{"categories":3327},[],{"categories":3329},[281],{"categories":3331},[242],{"categories":3333},[179],{"categories":3335},[],{"categories":3337},[],{"categories":3339},[],{"categories":3341},[281],{"categories":3343},[],{"categories":3345},[234],{"categories":3347},[],{"categories":3349},[],{"categories":3351},[281],{"categories":3353},[179],{"categories":3355},[260],{"categories":3357},[],{"categories":3359},[298],{"categories":3361},[260],{"categories":3363},[298],{"categories":3365},[179],{"categories":3367},[],{"categories":3369},[],{"categories":3371},[242],{"categories":3373},[],{"categories":3375},[],{"categories":3377},[242],{"categories":3379},[179],{"categories":3381},[],{"categories":3383},[242],{"categories":3385},[260],{"categories":3387},[298],{"categories":3389},[284],{"categories":3391},[242],{"categories":3393},[242],{"categories":3395},[],{"categories":3397},[],{"categories":3399},[],{"categories":3401},[260],{"categories":3403},[],{"categories":3405},[],{"categories":3407},[281],{"categories":3409},[234],{"categories":3411},[],{"categories":3413},[237],{"categories":3415},[298],{"categories":3417},[179],{"categories":3419},[291],{"categories":3421},[234],{"categories":3423},[284],{"categories":3425},[237],{"categories":3427},[291],{"categories":3429},[],{"categories":3431},[],{"categories":3433},[242],{"categories":3435},[234],{"categories":3437},[281],{"categories":3439},[234],{"categories":3441},[242],{"categories":3443},[555],{"categories":3445},[242],{"categories":3447},[],{"categories":3449},[179],{"categories":3451},[260],{"categories":3453},[291],{"categories":3455},[],{"categories":3457},[281],{"categories":3459},[260],{"categories":3461},[234],{"categories":3463},[242],{"categories":3465},[179],{"categories":3467},[237],{"categories":3469},[242,555],{"categories":3471},[242],{"categories":3473},[291],{"categories":3475},[179],{"categories":3477},[284],{"categories":3479},[298],{"categories":3481},[242],{"categories":3483},[],{"categories":3485},[242],{"categories":3487},[179],{"categories":3489},[237],{"categories":3491},[],{"categories":3493},[],{"categories":3495},[179],{"categories":3497},[284],{"categories":3499},[179],{"categories":3501},[],{"categories":3503},[260],{"categories":3505},[],{"categories":3507},[260],{"categories":3509},[291],{"categories":3511},[242],{"categories":3513},[179],{"categories":3515},[298],{"categories":3517},[291],{"categories":3519},[],{"categories":3521},[260],{"categories":3523},[179],{"categories":3525},[],{"categories":3527},[179],{"categories":3529},[242],{"categories":3531},[179],{"categories":3533},[242],{"categories":3535},[179],{"categories":3537},[179],{"categories":3539},[179],{"categories":3541},[179],{"categories":3543},[237],{"categories":3545},[],{"categories":3547},[245],{"categories":3549},[260],{"categories":3551},[179],{"categories":3553},[],{"categories":3555},[291],{"categories":3557},[179],{"categories":3559},[179],{"categories":3561},[242],{"categories":3563},[260],{"categories":3565},[179],{"categories":3567},[179],{"categories":3569},[237],{"categories":3571},[242],{"categories":3573},[281],{"categories":3575},[],{"categories":3577},[284],{"categories":3579},[179],{"categories":3581},[],{"categories":3583},[260],{"categories":3585},[298],{"categories":3587},[],{"categories":3589},[],{"categories":3591},[260],{"categories":3593},[260],{"categories":3595},[298],{"categories":3597},[234],{"categories":3599},[242],{"categories":3601},[242],{"categories":3603},[179],{"categories":3605},[237],{"categories":3607},[],{"categories":3609},[],{"categories":3611},[260],{"categories":3613},[284],{"categories":3615},[291],{"categories":3617},[242],{"categories":3619},[281],{"categories":3621},[284],{"categories":3623},[284],{"categories":3625},[],{"categories":3627},[260],{"categories":3629},[179],{"categories":3631},[179],{"categories":3633},[291],{"categories":3635},[],{"categories":3637},[260],{"categories":3639},[260],{"categories":3641},[260],{"categories":3643},[],{"categories":3645},[242],{"categories":3647},[179],{"categories":3649},[],{"categories":3651},[234],{"categories":3653},[237],{"categories":3655},[],{"categories":3657},[179],{"categories":3659},[179],{"categories":3661},[],{"categories":3663},[291],{"categories":3665},[],{"categories":3667},[],{"categories":3669},[],{"categories":3671},[],{"categories":3673},[179],{"categories":3675},[260],{"categories":3677},[],{"categories":3679},[],{"categories":3681},[179],{"categories":3683},[179],{"categories":3685},[179],{"categories":3687},[284],{"categories":3689},[179],{"categories":3691},[284],{"categories":3693},[],{"categories":3695},[284],{"categories":3697},[284],{"categories":3699},[555],{"categories":3701},[242],{"categories":3703},[291],{"categories":3705},[],{"categories":3707},[],{"categories":3709},[284],{"categories":3711},[291],{"categories":3713},[291],{"categories":3715},[291],{"categories":3717},[],{"categories":3719},[234],{"categories":3721},[291],{"categories":3723},[291],{"categories":3725},[234],{"categories":3727},[291],{"categories":3729},[237],{"categories":3731},[291],{"categories":3733},[291],{"categories":3735},[291],{"categories":3737},[284],{"categories":3739},[260],{"categories":3741},[260],{"categories":3743},[179],{"categories":3745},[291],{"categories":3747},[284],{"categories":3749},[555],{"categories":3751},[284],{"categories":3753},[284],{"categories":3755},[284],{"categories":3757},[],{"categories":3759},[237],{"categories":3761},[],{"categories":3763},[555],{"categories":3765},[291],{"categories":3767},[291],{"categories":3769},[291],{"categories":3771},[242],{"categories":3773},[260,237],{"categories":3775},[284],{"categories":3777},[],{"categories":3779},[],{"categories":3781},[284],{"categories":3783},[],{"categories":3785},[284],{"categories":3787},[260],{"categories":3789},[242],{"categories":3791},[],{"categories":3793},[291],{"categories":3795},[179],{"categories":3797},[281],{"categories":3799},[],{"categories":3801},[179],{"categories":3803},[],{"categories":3805},[260],{"categories":3807},[234],{"categories":3809},[284],{"categories":3811},[],{"categories":3813},[291],{"categories":3815},[260],[3817,3899,3980,4246],{"id":3818,"title":3819,"ai":3820,"body":3825,"categories":3867,"created_at":180,"date_modified":180,"description":171,"extension":181,"faq":180,"featured":182,"kicker_label":180,"meta":3868,"navigation":214,"path":3885,"published_at":3886,"question":180,"scraped_at":3887,"seo":3888,"sitemap":3889,"source_id":3890,"source_name":3891,"source_type":221,"source_url":3892,"stem":3893,"tags":3894,"thumbnail_url":180,"tldr":3896,"tweet":180,"unknown_tags":3897,"__hash__":3898},"summaries\u002Fsummaries\u002F2c114f7483e1445f-tradingagents-llm-hedge-fund-sim-w-debating-teams-summary.md","TradingAgents: LLM Hedge Fund Sim w\u002F Debating Teams",{"provider":7,"model":8,"input_tokens":3821,"output_tokens":3822,"processing_time_ms":3823,"cost_usd":3824},5460,1737,17784,0.001938,{"type":14,"value":3826,"toc":3861},[3827,3831,3834,3837,3841,3844,3848,3851,3854,3858],[17,3828,3830],{"id":3829},"multi-agent-structure-mirrors-real-trading-firms","Multi-Agent Structure Mirrors Real Trading Firms",[22,3832,3833],{},"TradingAgents breaks a hedge fund into specialized LLM agents: four parallel analysts (fundamentals pulls filings for ratio analysis and intrinsic value; sentiment scores Reddit\u002FX mood; news tracks macro events; technical runs MACD\u002FRSI\u002FBollinger Bands), producing independent reports without vector collapse to preserve disagreement as signal. Bullish and bearish researchers debate analyst outputs over configurable rounds, citing specifics. Trader proposes timing\u002Fposition size, risk team checks volatility\u002Fliquidity, and portfolio manager approves\u002Frejects with explanation. Built on LangGraph for node-based orchestration with checkpoint resume on crashes and persistent markdown decision log that injects past trade reflections (alpha vs. SPY benchmark) into future prompts, enabling learning from realized returns.",[22,3835,3836],{},"This traceable design outperforms mechanical rule-based systems (e.g., moving averages) or opaque ML black boxes by logging full transcripts—analyst reports, debates, rejection reasons—for auditability absent in traditional quants.",[17,3838,3840],{"id":3839},"bull-bear-debate-drives-defensible-positions","Bull-Bear Debate Drives Defensible Positions",[22,3842,3843],{},"Hedge funds succeed via team arguments, not solo picks; TradingAgents replicates this with structurally opposing researchers who argue multiple rounds on analyst data. Bull pushes open positions, bear counters, trader synthesizes transcript for trade proposal. This preserves diverse signals from parallel analysts, turning conflict into robust reasoning. Portfolio uses 5-tier scale (buy\u002Foverweight\u002Fhold\u002Funderweight\u002Fsell) consistently with research\u002Ftrader outputs and log.",[17,3845,3847],{"id":3846},"painless-setup-and-v024-production-upgrades","Painless Setup and v0.2.4 Production Upgrades",[22,3849,3850],{},"Clone repo (53k stars, 9.7k forks, Apache 2.0), pip install, set LLM API key (supports OpenAI GPT, Gemini, Claude, Grok, DeepSeek, Qwen, Ollama\u002Flocal). CLI picks ticker, date, provider, debate rounds; runs simulated exchange backtest. v0.2.4 (Apr 25) adds Pydantic-structured outputs for research\u002Ftrader\u002Fportfolio (cuts failures), DeepSeek\u002FQwen\u002FGLM\u002FAzure support, Docker multi-stage builds—drops setup to ~10min for hobbyists.",[22,3852,3853],{},"Quant researchers get LangGraph reference for multi-agent graphs; fintech founders fork for retail tools; indie hackers study practical agent wiring.",[17,3855,3857],{"id":3856},"key-trade-offs-research-tool-not-live-trading","Key Trade-offs: Research Tool, Not Live Trading",[22,3859,3860],{},"Token-intensive (4 analysts + debates\u002Ftrader\u002Fmanager per ticker) burns LLM costs; simulated backtest lacks live broker—build your own. Not financial advice; don't bet retirement. Yet weekly releases, multi-lang docs, UCLA arXiv paper (2412.20138) validate as clean 2026 agent benchmark—clone to dissect wiring.",{"title":171,"searchDepth":172,"depth":172,"links":3862},[3863,3864,3865,3866],{"id":3829,"depth":172,"text":3830},{"id":3839,"depth":172,"text":3840},{"id":3846,"depth":172,"text":3847},{"id":3856,"depth":172,"text":3857},[179],{"content_references":3869,"triage":3882},[3870,3874,3880],{"type":186,"title":3871,"url":3872,"context":3873},"TradingAgents","https:\u002F\u002Fgithub.com\u002FTauricResearch\u002FTradingAgents","recommended",{"type":3875,"title":3876,"author":3877,"url":3878,"context":3879},"paper","arXiv:2412.20138","UCLA","https:\u002F\u002Farxiv.org\u002Fabs\u002F2412.20138","cited",{"type":186,"title":3881,"context":189},"LangGraph",{"relevance":210,"novelty":211,"quality":211,"actionability":211,"composite":3883,"reasoning":3884},4.35,"Category: AI & LLMs. The article provides a detailed overview of TradingAgents, a simulation that uses LLM agents to replicate hedge fund decision-making, addressing practical applications of AI in finance. It offers insights into the multi-agent structure and how it enhances trading strategies, which is relevant for builders looking to implement AI in financial products.","\u002Fsummaries\u002F2c114f7483e1445f-tradingagents-llm-hedge-fund-sim-w-debating-teams-summary","2026-04-28 19:30:04","2026-05-03 16:59:26",{"title":3819,"description":171},{"loc":3885},"2c114f7483e1445f","AI Summaries (evaluation playlist)","https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=9FoEsXNGLwI","summaries\u002F2c114f7483e1445f-tradingagents-llm-hedge-fund-sim-w-debating-teams-summary",[3895,225,226,227],"agents","TradingAgents simulates a Wall Street firm using LLM agents—4 parallel analysts, bull\u002Fbear debaters, trader, risk, and portfolio manager—for fully traceable stock decisions that learn from past trades.",[],"i-lNfymBC9gXappbkG8AamDG5SvodMAVK8XU-5YpfEk",{"id":3900,"title":3901,"ai":3902,"body":3907,"categories":3947,"created_at":180,"date_modified":180,"description":171,"extension":181,"faq":180,"featured":182,"kicker_label":180,"meta":3948,"navigation":214,"path":3966,"published_at":3967,"question":180,"scraped_at":3968,"seo":3969,"sitemap":3970,"source_id":3971,"source_name":3972,"source_type":221,"source_url":3973,"stem":3974,"tags":3975,"thumbnail_url":180,"tldr":3977,"tweet":180,"unknown_tags":3978,"__hash__":3979},"summaries\u002Fsummaries\u002Fd64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary.md","OpenMythos: 770M RDT Matches 1.3B Transformer Power",{"provider":7,"model":8,"input_tokens":3903,"output_tokens":3904,"processing_time_ms":3905,"cost_usd":3906},5480,2000,15694,0.0020735,{"type":14,"value":3908,"toc":3942},[3909,3913,3916,3919,3922,3926,3929,3932,3936,3939],[17,3910,3912],{"id":3911},"recurrent-depth-transformers-scale-reasoning-via-inference-loops","Recurrent-Depth Transformers Scale Reasoning via Inference Loops",[22,3914,3915],{},"Recurrent-Depth Transformers (RDTs), or Looped Transformers, differ from standard transformers by reusing a fixed set of weights iteratively across T loop steps (up to 16 in OpenMythos) in a single forward pass. This decouples reasoning depth from parameter count: deeper reasoning comes from more loops at inference, not more layers or params. The structure follows Prelude → Recurrent Block → Coda, where Prelude and Coda are one-time standard transformer layers.",[22,3917,3918],{},"In the Recurrent Block, update hidden state ht+1 = A·ht + B·e + Transformer(ht, e), with encoded input e re-injected each step to prevent drift. This mimics draft refinement, enabling continuous latent-space reasoning without mid-loop token emissions—equivalent to chain-of-thought over vectors, per Saunshi et al. (2025). Unlike standard transformers failing on unseen depths (e.g., 5-hop trained model flops on 10-hop), RDTs extend depth at inference without retraining: allocate more loops to hard problems.",[22,3920,3921],{},"Replace standard FFN with Mixture-of-Experts (MoE) from DeepSeekMoE: sparse top-K experts per token plus shared experts, routed differently per loop for distinct computation despite tied weights. Use Multi-Latent Attention from DeepSeek-V2, caching compressed low-rank KV latents for 10–20× KV memory savings.",[17,3923,3925],{"id":3924},"stability-and-adaptive-depth-prevent-explosion-or-overthinking","Stability and Adaptive Depth Prevent Explosion or Overthinking",[22,3927,3928],{},"Looping risks residual explosion (unbounded ht growth) or overthinking (drift past solutions). Enforce Linear Time-Invariant (LTI) constraint from Parcae: spectral radius ρ(A) \u003C 1 by construction, ensuring stability independent of learning rate. Add Adaptive Computation Time (ACT) halting: learned scalar per position dynamically stops loops when converged—harder tokens get more compute.",[22,3930,3931],{},"Depth-Wise LoRA adapters apply small rank-r matrices per iteration, differentiating behavior without bloating params, blending pure tying and unique layers.",[17,3933,3935],{"id":3934},"half-the-params-equivalent-performance-via-predictable-scaling","Half the Params, Equivalent Performance via Predictable Scaling",[22,3937,3938],{},"At 770M params, OpenMythos RDT matches 1.3B standard transformer on identical data, per Parcae (Prairie et al., 2026) scaling laws: optimal recurrence and token count follow power laws. This shifts scaling focus from training params to inference loops, challenging bigger-is-better assumptions.",[22,3940,3941],{},"OpenMythos delivers PyTorch code for RDT with MoE, LTI training, LoRA adapters, and baselines—falsifiable hypothesis for Claude Mythos, runnable for experimenting with looped dynamics.",{"title":171,"searchDepth":172,"depth":172,"links":3943},[3944,3945,3946],{"id":3911,"depth":172,"text":3912},{"id":3924,"depth":172,"text":3925},{"id":3934,"depth":172,"text":3935},[],{"content_references":3949,"triage":3962},[3950,3953,3956,3960],{"type":186,"title":3951,"url":3952,"context":189},"OpenMythos","https:\u002F\u002Fgithub.com\u002Fkyegomez\u002FOpenMythos",{"type":3875,"title":3954,"url":3955,"context":3879},"Saunshi et al. (2025)","https:\u002F\u002Farxiv.org\u002Fabs\u002F2502.17416",{"type":3875,"title":3957,"author":3958,"url":3959,"context":3879},"Parcae","Prairie et al.","https:\u002F\u002Farxiv.org\u002Fabs\u002F2604.12946",{"type":194,"title":3961,"context":189},"COCONUT (2024)",{"relevance":3963,"novelty":3963,"quality":211,"actionability":172,"composite":3964,"reasoning":3965},3,3.05,"Category: AI & LLMs. The article discusses a new architecture for transformers, which is relevant to AI engineering, but it lacks practical applications or examples for product builders to implement this technology. While it presents some novel insights into the structure and functioning of Recurrent-Depth Transformers, it does not provide actionable steps or frameworks that the audience can directly apply.","\u002Fsummaries\u002Fd64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary","2026-04-19 19:47:49","2026-04-21 15:26:59",{"title":3901,"description":171},{"loc":3966},"d64cbc961f981052","MarkTechPost","https:\u002F\u002Fwww.marktechpost.com\u002F2026\u002F04\u002F19\u002Fmeet-openmythos-an-open-source-pytorch-reconstruction-of-claude-mythos-where-770m-parameters-match-a-1-3b-transformer\u002F","summaries\u002Fd64cbc961f981052-openmythos-770m-rdt-matches-1-3b-transformer-power-summary",[225,3976,227,226],"machine-learning","OpenMythos reconstructs Claude Mythos as a Recurrent-Depth Transformer (RDT) in PyTorch: loop the same weights T=16 times for reasoning depth, achieving 1.3B transformer performance at 770M params via MoE, stability fixes, and inference-time scaling.",[],"catU0v9NcZQXj7dgnu-iH80ub7d_pZ-fh6mDqyuTN3c",{"id":3981,"title":3982,"ai":3983,"body":3988,"categories":4212,"created_at":180,"date_modified":180,"description":171,"extension":181,"faq":180,"featured":182,"kicker_label":180,"meta":4213,"navigation":214,"path":4235,"published_at":180,"question":180,"scraped_at":4236,"seo":4237,"sitemap":4238,"source_id":4239,"source_name":220,"source_type":221,"source_url":4240,"stem":4241,"tags":4242,"thumbnail_url":180,"tldr":4243,"tweet":180,"unknown_tags":4244,"__hash__":4245},"summaries\u002Fsummaries\u002F2a9849ad35620d4f-turboquant-6-4x-kv-cache-compression-at-q8-0-speed-summary.md","TurboQuant+: 6.4x KV Cache Compression at q8_0 Speed",{"provider":7,"model":8,"input_tokens":3984,"output_tokens":3985,"processing_time_ms":3986,"cost_usd":3987},11014,3209,20267,0.0037848,{"type":14,"value":3989,"toc":4205},[3990,3994,3997,4000,4006,4009,4013,4016,4019,4024,4027,4030,4034,4037,4148,4153,4157,4160,4165,4168,4172],[17,3991,3993],{"id":3992},"turboquant-formats-deliver-extreme-compression-with-minimal-quality-loss","TurboQuant Formats Deliver Extreme Compression with Minimal Quality Loss",[22,3995,3996],{},"TurboQuant+ ports Google's TurboQuant (ICLR 2026) to llama.cpp, compressing KV cache via PolarQuant (multi-centroid scalar quantization) + Walsh-Hadamard Transform (WHT) rotation, dropping the paper's 1-bit QJL error correction which amplified softmax variance. Formats: turbo2 (2.5 bits\u002Fval, 6.4x vs fp16), turbo3 (3.5 bits\u002Fval at block=32, 4.6x; 3.125 bits\u002Fval at block=128, 5.12x), turbo4 (4.25 bits\u002Fval, 3.8x). On M5 Max (Qwen3.5-27B\u002F35B-A3B), turbo4 PPL 6.125 (+0.23% vs q8_0 baseline 6.111 on wikitext-2 512 chunks); turbo3 6.176 (+1.06%). turbo4 outperforms q4_0 (6.142, +0.52%) in quality at similar compression.",[22,3998,3999],{},"Block size optimization (study: docs\u002Fpapers\u002Fblock-size-experiment.md) boosts turbo3 to 5.12x at block=128 with identical PPL across 512-32K contexts, 3 architectures (Qwen2.5-1.5B, Llama3.1-8B, Qwen3.5-27B), validated on M2 Pro\u002FM5 Max Metal. Larger blocks reduce overhead but risk cache thrashing on older hardware—default block=32 balances.",[4001,4002,4003],"blockquote",{},[22,4004,4005],{},"\"Compresses transformer KV cache 3.8-6.4x using PolarQuant + Walsh-Hadamard rotation. Near q8_0 prefill speed and ~0.9x decode throughput at long context (Apple Silicon).\"",[22,4007,4008],{},"Asymmetric K\u002FV caching preserves quality on Q4_K_M weights: keep K at q8_0 (attention routing), compress V (turbo3\u002F4). E.g., Qwen2.5-7B Q4_K_M: q8_0-K + turbo4-V PPL 6.64 (+1.0% vs q8_0); symmetric turbo3 catastrophic (3556 PPL). Bigger models tolerate symmetric better (104B Command-R+: turbo3 +3.6%). Config guide: docs\u002Fturboquant-recommendations.md.",[17,4010,4012],{"id":4011},"layer-aware-and-sparse-optimizations-maximize-speed-and-quality","Layer-Aware and Sparse Optimizations Maximize Speed and Quality",[22,4014,4015],{},"Boundary V (layer-aware): Protects first\u002Flast 2 layers at q8_0-V, turbo2-V elsewhere. Recovers 37-91% of quality gap to turbo3 (e.g., Qwen3.5-35B MoE: turbo2 5.257 → Boundary 5.148 vs turbo3 5.137). Scales with depth (91% on 64L MoE). Enabled via TURBO_LAYER_ADAPTIVE=7; no speed hit.",[22,4017,4018],{},"Sparse V dequant: Skips V dequant for softmax weights \u003C1e-6 (most at long context). +22.8% decode at 32K (turbo3: 0.76x → 0.93x q8_0), no PPL change (wikitext-103 50 chunks, CI±0.021). General opt: +5% on q8_0 KV. Validated 1.5B-104B; dense models gain less (1-2% as FFN dominates).",[4001,4020,4021],{},[22,4022,4023],{},"\"Sparse V: Attention-gated KV cache decoding that skips low-weight V positions during inference. Up to +22.8% decode speed at 32K context... no measurable PPL change.\"",[22,4025,4026],{},"Prefill scales 2K-32K: turbo3\u002F4 ≥ q8_0 (e.g., 32K: turbo3 1204 vs 1098 t\u002Fs). Decode (M5 Max Qwen3.5-35B-A3B Sparse V): turbo4 1060 t\u002Fs long ctx (0.90x q8_0); real 24K PDF: turbo4 63.7 t\u002Fs (0.93x). M1 Max 38K doc: turbo4 +33.9% decode vs q8_0.",[22,4028,4029],{},"Optimization path (4K prefill): fp32 WHT (739 t\u002Fs, 0.27x q8_0) → fp16 + vectorized butterfly + graph rotation + block-32 + dequant → 2524 t\u002Fs (0.98x). KL div vs f16: turbo4 0.009633 (lower than q4_0 0.008091? Wait, table shows turbo4 better top-p agreement 95.98%).",[17,4031,4033],{"id":4032},"cross-hardware-benchmarks-confirm-production-readiness","Cross-Hardware Benchmarks Confirm Production Readiness",[22,4035,4036],{},"Apple Silicon (M5 Max 128GB): 104B@128K turbo3 (PPL 4.024? Wait, table 6.415 +3.6%; 74GB peak). Raise iogpu.wired_limit_mb=117964. M1 Max: turbo4 beats q8_0 long ctx. CUDA (RTX3090 Qwen3.5-9B Q4_K_M): turbo3\u002F4 decode 95-98 t\u002Fs (0.93-0.96x q8_0). AMD RX9070 XT (RDNA4 HIP): q8_0-K + turbo4-V +1.0% PPL, +2.5% decode.",[4038,4039,4040,4065],"table",{},[4041,4042,4043],"thead",{},[4044,4045,4046,4050,4053,4056,4059,4062],"tr",{},[4047,4048,4049],"th",{},"Hardware",[4047,4051,4052],{},"Model",[4047,4054,4055],{},"Config",[4047,4057,4058],{},"Decode t\u002Fs",[4047,4060,4061],{},"vs q8_0",[4047,4063,4064],{},"Notes",[4066,4067,4068,4089,4109,4128],"tbody",{},[4044,4069,4070,4074,4077,4080,4083,4086],{},[4071,4072,4073],"td",{},"M5 Max",[4071,4075,4076],{},"Qwen3.5-35B-A3B",[4071,4078,4079],{},"turbo4 + Sparse V",[4071,4081,4082],{},"1060 (32K)",[4071,4084,4085],{},"0.90x",[4071,4087,4088],{},"MoE",[4044,4090,4091,4094,4097,4100,4103,4106],{},[4071,4092,4093],{},"RTX3090",[4071,4095,4096],{},"Qwen3.5-9B Q4_K_M",[4071,4098,4099],{},"turbo4\u002Fturbo4",[4071,4101,4102],{},"95.87",[4071,4104,4105],{},"0.93x",[4071,4107,4108],{},"CUDA",[4044,4110,4111,4114,4116,4119,4122,4125],{},[4071,4112,4113],{},"M1 Max 64GB",[4071,4115,4076],{},[4071,4117,4118],{},"turbo4",[4071,4120,4121],{},"16.6 (38K)",[4071,4123,4124],{},"+33.9%",[4071,4126,4127],{},"Real doc",[4044,4129,4130,4133,4136,4139,4142,4145],{},[4071,4131,4132],{},"RX9070 XT",[4071,4134,4135],{},"Qwen2.5-7B Q4_K_M",[4071,4137,4138],{},"q8_0-K\u002Fturbo4-V",[4071,4140,4141],{},"86.8",[4071,4143,4144],{},"+2.5%",[4071,4146,4147],{},"HIP",[4001,4149,4150],{},[22,4151,4152],{},"\"104B at 128K context on a MacBook with turbo3 (PPL 4.024, 74 GB peak memory).\"",[17,4154,4156],{"id":4155},"retrieval-and-perplexity-validate-fidelity","Retrieval and Perplexity Validate Fidelity",[22,4158,4159],{},"NIAH (Kamradt\u002FRULER): turbo4 31\u002F33 (+3% vs q8_0 30\u002F33); turbo3 + Sparse V 9\u002F9. Multi-key 100% to 32K. Long ctx PPL (32K wikitext-103 50ch): turbo3 +1.64% vs q8_0, Sparse V delta=0. PPL stable: Llama3.1-70B turbo4 +6.3%, Command-R+104B +1.9%.",[4001,4161,4162],{},[22,4163,4164],{},"\"turbo4 beats q8_0 on retrieval (31\u002F33 vs 30\u002F33). Shared failure at 8K\u002F100% is a model weakness, not quantization.\"",[22,4166,4167],{},"Python prototype confirms: turbo4 cosine sim 0.96, MSE 0.0007. Gaussianization exact (kurtosis 900→2.9).",[17,4169,4171],{"id":4170},"key-takeaways","Key Takeaways",[4173,4174,4175,4179,4187,4190,4193,4196,4199,4202],"ul",{},[4176,4177,4178],"li",{},"Use turbo4 for best quality\u002Fcompression balance (3.8x, +0.23% PPL); turbo3 for max (5.12x block=128, +1% PPL).",[4176,4180,4181,4182,4186],{},"Asymmetric q8_0-K + turbo",[4183,4184,4185],"span",{},"3\u002F4","-V on Q4_K_M weights; symmetric on Q8_0+ or large models.",[4176,4188,4189],{},"Enable Sparse V always (+22% long decode, no PPL hit); Boundary V on deep models.",[4176,4191,4192],{},"Prefill ≥ q8_0 speed; validate decode on your hardware (M5+ best for turbo3).",[4176,4194,4195],{},"Build llama.cpp from fork; test PPL\u002FNIAH on your model before deploy.",[4176,4197,4198],{},"For Apple Silicon max ctx: sysctl iogpu.wired_limit_mb=90% RAM.",[4176,4200,4201],{},"Upstream path: Stable pieces as llama.cpp patches.",[4176,4203,4204],{},"MLX Swift fork for 2.5x faster Apple decode (144 t\u002Fs Qwen3.5-35B-A3B).",{"title":171,"searchDepth":172,"depth":172,"links":4206},[4207,4208,4209,4210,4211],{"id":3992,"depth":172,"text":3993},{"id":4011,"depth":172,"text":4012},{"id":4032,"depth":172,"text":4033},{"id":4155,"depth":172,"text":4156},{"id":4170,"depth":172,"text":4171},[],{"content_references":4214,"triage":4233},[4215,4218,4221,4225,4229],{"type":3875,"title":4216,"url":4217,"context":189},"TurboQuant: Redefining AI Efficiency with Extreme Compression","https:\u002F\u002Fresearch.google\u002Fblog\u002Fturboquant-redefining-ai-efficiency-with-extreme-compression\u002F",{"type":186,"title":4219,"url":4220,"context":189},"llama-cpp-turboquant","https:\u002F\u002Fgithub.com\u002FTheTom\u002Fllama-cpp-turboquant",{"type":186,"title":4222,"author":4223,"url":4224,"context":3873},"mlx-swift-lm","ekryski","https:\u002F\u002Fgithub.com\u002Fekryski\u002Fmlx-swift-lm",{"type":186,"title":4226,"author":4227,"url":4228,"context":3879},"LLMTest_NeedleInAHaystack","gkamradt","https:\u002F\u002Fgithub.com\u002Fgkamradt\u002FLLMTest_NeedleInAHaystack",{"type":186,"title":4230,"author":4231,"url":4232,"context":3879},"RULER","NVIDIA","https:\u002F\u002Fgithub.com\u002FNVIDIA\u002FRULER",{"relevance":3963,"novelty":3963,"quality":211,"actionability":172,"composite":3964,"reasoning":4234},"Category: AI & LLMs. The article discusses a specific implementation of TurboQuant for KV cache compression, which is relevant to AI engineering. However, it lacks practical application details that the target audience could act on immediately, focusing more on technical specifications and performance metrics.","\u002Fsummaries\u002F2a9849ad35620d4f-turboquant-6-4x-kv-cache-compression-at-q8-0-speed-summary","2026-04-16 03:08:34",{"title":3982,"description":171},{"loc":4235},"2a9849ad35620d4f","https:\u002F\u002Fgithub.com\u002FTheTom\u002Fturboquant_plus.git","summaries\u002F2a9849ad35620d4f-turboquant-6-4x-kv-cache-compression-at-q8-0-speed-summary",[225,227,3976,226],"Implements TurboQuant in llama.cpp for 3.8-6.4x KV cache compression (turbo2\u002F3\u002F4 formats) with PPL near q8_0, matching prefill speed, and 0.9x decode on Apple Silicon, CUDA, AMD—plus Sparse V for +22.8% decode.",[],"B-pb0MWnzaai4T1NyHjyPyKUvFNfyk7UQEPHQ791a_c",{"id":4247,"title":4248,"ai":4249,"body":4254,"categories":4282,"created_at":180,"date_modified":180,"description":171,"extension":181,"faq":180,"featured":182,"kicker_label":180,"meta":4283,"navigation":214,"path":4297,"published_at":180,"question":180,"scraped_at":4298,"seo":4299,"sitemap":4300,"source_id":4301,"source_name":220,"source_type":221,"source_url":4302,"stem":4303,"tags":4304,"thumbnail_url":180,"tldr":4305,"tweet":180,"unknown_tags":4306,"__hash__":4307},"summaries\u002Fsummaries\u002F63a0d80738ea0494-apache-2-0-for-gemma-build-modify-sell-freely-summary.md","Apache 2.0 for Gemma: Build, Modify, Sell Freely",{"provider":7,"model":8,"input_tokens":4250,"output_tokens":4251,"processing_time_ms":4252,"cost_usd":4253},5498,1736,14876,0.00146075,{"type":14,"value":4255,"toc":4277},[4256,4260,4263,4267,4270,4274],[17,4257,4259],{"id":4258},"usage-rights-unlock-commercial-ai-builds","Usage Rights Unlock Commercial AI Builds",[22,4261,4262],{},"Apache 2.0 provides each contributor's perpetual, worldwide, non-exclusive, royalty-free, irrevocable copyright license to reproduce, prepare derivative works, publicly display\u002Fperform, sublicense, and distribute Gemma models in source or object form. Patent licenses cover making, using, selling, or importing the work, but only for claims necessarily infringed by the contributor's additions—terminate if you sue over the work. This setup lets you integrate Gemma into SaaS products, fine-tune for custom agents, or bundle in apps without royalty payments, as long as you comply with redistribution rules.",[17,4264,4266],{"id":4265},"redistribution-four-conditions-to-follow","Redistribution: Four Conditions to Follow",[22,4268,4269],{},"Distribute unmodified or modified Gemma copies in any medium by: (a) including the full Apache 2.0 license; (b) adding prominent notices to changed files; (c) retaining all original copyright, patent, trademark, and attribution notices in source forms (omit irrelevant ones); (d) carrying over any NOTICE file contents in your derivatives via NOTICE file, docs, or UI displays. Add your own copyright or stricter terms to modifications, but never alter the original work's license. Contributions you submit default to Apache 2.0 unless specified otherwise. Trademarks like 'Gemma' can't be used beyond describing origin or NOTICE reproduction. These steps ensure legal forks, like RAG pipelines or hosted inference services, while protecting upstream contributors.",[17,4271,4273],{"id":4272},"no-warranties-your-risks","No Warranties, Your Risks",[22,4275,4276],{},"Gemma comes 'AS IS' without warranties of title, non-infringement, merchantability, or fitness—test thoroughly for production. Contributors limit liability for all damages (direct, indirect, etc.), even if warned. Offer paid support or warranties on your derivatives, but indemnify contributors. For AI builders, this means validate model outputs, handle hallucinations, and monitor costs yourself; the license shields Google and contributors from your app's failures.",{"title":171,"searchDepth":172,"depth":172,"links":4278},[4279,4280,4281],{"id":4258,"depth":172,"text":4259},{"id":4265,"depth":172,"text":4266},{"id":4272,"depth":172,"text":4273},[179],{"content_references":4284,"triage":4294},[4285,4288,4291],{"type":194,"title":4286,"url":4287,"context":189},"Creative Commons Attribution 4.0 License","https:\u002F\u002Fcreativecommons.org\u002Flicenses\u002Fby\u002F4.0\u002F",{"type":194,"title":4289,"url":4290,"context":189},"Apache 2.0 License","https:\u002F\u002Fwww.apache.org\u002Flicenses\u002FLICENSE-2.0",{"type":194,"title":4292,"url":4293,"context":189},"Google Developers Site Policies","https:\u002F\u002Fdevelopers.google.com\u002Fsite-policies",{"relevance":211,"novelty":3963,"quality":211,"actionability":211,"composite":4295,"reasoning":4296},3.8,"Category: Business & SaaS. The article provides a detailed overview of the Apache 2.0 licensing for Gemma models, which is crucial for AI builders looking to integrate these models into commercial applications. It outlines specific conditions for redistribution and legal considerations, making it actionable for developers and founders who need to navigate licensing in their AI product development.","\u002Fsummaries\u002F63a0d80738ea0494-apache-2-0-for-gemma-build-modify-sell-freely-summary","2026-04-15 15:33:14",{"title":4248,"description":171},{"loc":4297},"63a0d80738ea0494","https:\u002F\u002Fai.google.dev\u002Fgemma\u002Fdocs\u002Fgemma_4_license","summaries\u002F63a0d80738ea0494-apache-2-0-for-gemma-build-modify-sell-freely-summary",[225,227],"Gemma models grant perpetual, royalty-free copyright and patent licenses to reproduce, modify, distribute, and commercialize under Apache 2.0, requiring attribution retention, change notices, and license inclusion—ideal for production AI apps.",[],"SDmGg1GY44qs6hT8xihrZmtoku0r_92N4_cHS9lVNcg"]