<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Shravan Reddy]]></title><description><![CDATA[The intersection of startups, fintech, AI and B2B.]]></description><link>https://writing.shravanreddy.com</link><image><url>https://substackcdn.com/image/fetch/$s_!8RwZ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa193ba84-264f-4cf5-b4c1-5905bc0ff1fb_835x835.png</url><title>Shravan Reddy</title><link>https://writing.shravanreddy.com</link></image><generator>Substack</generator><lastBuildDate>Mon, 27 Apr 2026 12:12:38 GMT</lastBuildDate><atom:link href="https://writing.shravanreddy.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Shravan Reddy]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[shravangreddy@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[shravangreddy@substack.com]]></itunes:email><itunes:name><![CDATA[Shravan Reddy]]></itunes:name></itunes:owner><itunes:author><![CDATA[Shravan Reddy]]></itunes:author><googleplay:owner><![CDATA[shravangreddy@substack.com]]></googleplay:owner><googleplay:email><![CDATA[shravangreddy@substack.com]]></googleplay:email><googleplay:author><![CDATA[Shravan Reddy]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The First Law of Complexodynamics]]></title><description><![CDATA[Ilya Sutskever recommended 30 papers to read with the tantalizing statement that "If you really learn all of these, you'll know 90% of what matters today".]]></description><link>https://writing.shravanreddy.com/p/the-first-law-of-complexodynamics</link><guid 
isPermaLink="false">https://writing.shravanreddy.com/p/the-first-law-of-complexodynamics</guid><dc:creator><![CDATA[Shravan Reddy]]></dc:creator><pubDate>Sat, 02 Aug 2025 03:09:06 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!cQTy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cQTy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!cQTy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png 424w, https://substackcdn.com/image/fetch/$s_!cQTy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png 848w, https://substackcdn.com/image/fetch/$s_!cQTy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png 1272w, https://substackcdn.com/image/fetch/$s_!cQTy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!cQTy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png" width="728" height="961.1875" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/02c19672-301d-423c-8da4-217428123c42_1024x1352.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:1352,&quot;width&quot;:1024,&quot;resizeWidth&quot;:728,&quot;bytes&quot;:2768201,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://shravangreddy.substack.com/i/169902158?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ebfed6b-d291-438c-9580-b2f66b3dd502_1024x1536.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:&quot;center&quot;,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!cQTy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png 424w, https://substackcdn.com/image/fetch/$s_!cQTy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png 848w, https://substackcdn.com/image/fetch/$s_!cQTy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png 1272w, 
https://substackcdn.com/image/fetch/$s_!cQTy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02c19672-301d-423c-8da4-217428123c42_1024x1352.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Ilya Sutskever recommended <a href="https://aman.ai/primers/ai/top-30-papers/">30 papers to read with the tantalizing statement</a> that "<em>If you really learn all of these, you'll know 90% of what matters today</em>".
This series is meant to chronicle my effort to understand each of these papers by writing out my takeaways.</p><h3>Part 1/30: <strong><a href="https://scottaaronson.blog/?p=762">The First Law of Complexodynamics</a></strong></h3><p>The question posed at the beginning of this article (not really a paper) is <em>"why does &#8220;complexity&#8221; or &#8220;interestingness&#8221; of physical systems seem to increase with time and then hit a maximum and decrease, in contrast to the entropy, which of course increases monotonically?"</em></p><p>It&#8217;s easy enough to understand intuitively why the <em><strong>entropy</strong></em> in a system increases. But how <em><strong>interesting</strong></em> that system looks follows a different arc: it first rises, then peaks, and finally drops. The author uses milk being poured into espresso as an analogy: perfectly separate layers are tidy but dull, the swirling marbled phase is mesmerizing, and the final homogeneous mocha is&#8230; boring again. An alternative example would be ice crystals (low entropy) vs. a jar of mixed sand (high entropy). Both would still be considered simple. In between the two is something like snow (medium entropy), which is composed of snowflakes with impossibly complex patterns.</p><p>To formalize the term <em><strong>interesting</strong></em>, the author introduces <em><strong>Kolmogorov complexity</strong></em>. Sidenote: it&#8217;s amusing how academic authors introduce new concepts. The exact statement in the article is "<em><strong>Recall</strong> that the Kolmogorov complexity of a string x is the length of the shortest computer program that outputs x</em>&#8221;. I recall no such thing...</p><p>Anyway, KC is defined as the number of bits in the shortest computer program that reproduces a string. Ordered patterns like AAAA&#8230; compress into a tiny loop, so they have low KC. A truly random 1000-bit string has no shorter representation, so its KC is the full 1000 bits.
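</p><p>The true KC of a string is uncomputable, but an off-the-shelf compressor gives a crude upper bound on it, which is enough to see both extremes. A minimal sketch of my own (not from the article), using Python&#8217;s zlib:</p>

```python
import random
import zlib

random.seed(0)  # fixed seed so the "random" bytes are reproducible

ordered = b"A" * 1000                                      # pure pattern
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # pure noise

# Compressed length crudely upper-bounds Kolmogorov complexity.
print(len(zlib.compress(ordered)))  # tiny: collapses to "repeat 'A' 1000 times"
print(len(zlib.compress(noisy)))    # roughly 1000: no shorter representation
```

<p>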
My main takeaway from this section was to think of KC as a measure of lossless compressibility: a low-KC string can be compressed dramatically without any data loss, while a high-KC string barely compresses at all.</p><p>But the paradox the author points out is that high-KC random strings don&#8217;t feel <em><strong>sophisticated</strong></em>. You can summarize everything about the random string in six words: <em>&#8220;It came from fair coin flips.&#8221;</em></p><p><strong>Sophistication</strong> (or &#8220;<strong>complextropy</strong>&#8221;) captures a second measure &#8211; <em>how many bits does it take to specify a model that captures the regularities in the data?</em> Pure order takes barely any description. Pure noise doesn&#8217;t need much either, since the only rule is basically "just randomness all the way down." The sweet spot is in between, where you need a non&#8209;trivial model <em>and</em> some extra bits to pick out one particular instance. That middle zone, home to snowflakes, marbled coffee, and Shakespeare&#8217;s prose, matches our gut feeling of maximum <strong>interestingness</strong>.</p><p>My main takeaway from this article was that training a neural network is glorified compression. The optimal weights capture the structure of a vast dataset in as few bits as possible. Too small a model and you underfit like an ice crystal. Too large and you memorize noise like that random string.
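</p><p>That underfitting/overfitting framing can be made concrete with a toy two-part code: total description length = bits to state the model plus bits to encode the residual noise it leaves unexplained. The sketch below is my own illustration, with an assumed flat 32-bit cost per parameter and a crude Gaussian code for residuals (neither comes from the article):</p>

```python
import math
import random

random.seed(1)

# Data with genuine structure plus noise: y = 3x + 2 + gaussian noise
xs = list(range(50))
ys = [3 * x + 2 + random.gauss(0, 1.0) for x in xs]

BITS_PER_PARAM = 32  # assumed flat cost of storing one real-valued parameter

def description_length(n_params, residuals):
    """Two-part code: bits to state the model, plus bits for what it leaves
    unexplained (a crude Gaussian code length based on residual variance)."""
    var = sum(r * r for r in residuals) / len(residuals)
    resid_bits = len(residuals) * 0.5 * math.log2(2 * math.pi * math.e * max(var, 1e-12))
    return n_params * BITS_PER_PARAM + max(0.0, resid_bits)

# Too simple: model everything as one mean (1 param, huge residuals)
mean = sum(ys) / len(ys)
dl_underfit = description_length(1, [y - mean for y in ys])

# Right-sized: a line (2 params, only the irreducible noise left over)
n, sx, sy = len(xs), sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
dl_line = description_length(2, [y - (slope * x + intercept) for x, y in zip(xs, ys)])

# Too big: memorize every point (50 "params", zero residuals)
dl_memorize = description_length(len(ys), [0.0] * len(ys))

print(dl_underfit, dl_line, dl_memorize)
```

<p>Under these assumptions, the two-parameter line gives the shortest total description; both the one-number summary and full memorization cost more bits.</p><p>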
Compression is insight; randomness isn&#8217;t inherently interesting; and peak complexity is fleeting.</p><p>Next up on the list is Andrej Karpathy&#8217;s <em>&#8220;<strong><a href="https://karpathy.github.io/2015/05/21/rnn-effectiveness/">The Unreasonable Effectiveness of Recurrent Neural Networks</a></strong>.&#8221;</em> Wish me luck.</p>]]></content:encoded></item></channel></rss>