My apologies if this is a joke I’m not understanding, but as far as I can tell with the wayback machine, this animation predates not just coding/generative AI, but the Attention paper and the founding of OpenAI too.
It makes sense if you imagine the real motivation is “make sure the AI contracts go to my good friend Sam”, and all the red line stuff is just a way to pick a fight with Anthropic.
“Think about what this means … the original SimCity ran on a Commodore 64. An empty Chrome tab takes more memory than that entire machine had. We’re not constrained by hardware anymore. We’re not even constrained by understanding what the code does … codebases will 10-100x in size because AI … endless bugs … the question is whether you’re building with it or explaining why you’re not.”
Looking through the eyes of an AI champion, I see a world where the first execution of any given idea, the first product to hit the market for any given need, is guaranteed to be AI-generated - with the “10-100x size” codebase, the corresponding (and often superlinear) decrease in performance, and the attendant “endless bugs”.
There are costs to doing ads (e.g. it burns social/political capital that could be used to defuse scandals or slow down hostile legislation, it consumes some fraction of your employees’ work hours, it may discourage some new talent from joining).
Yes. Infinite low cost intelligence labor to replace those pesky humans!
Really reminds me of the economics of slavery. Best way for line to go up is the ultimate suppression and subjugation of labor!
Hypothetically can lead to society free to not waste their life on work, but pursue their passions. Most likely it’ll lead to plantation-style hungry-hungry-hippo ruling class taking the economy away from the rest of us
If you had told me in 2011, when I first started discussing artificial intelligence, that in 2026 a trillion dollar company would earnestly publish the statement “Our mission is to ensure AGI benefits all of humanity; our pursuit of advertising is always in support of that mission”, I would have tossed my laptop into the sea and taken up farming instead.
I thought your quote was hyperbole or an exaggerated summary of the post. Nope, it's literally taken verbatim. I can't believe someone wrote that down with a straight face... although, to be honest, it was probably written with AI.
In 2011 I would've had trouble believing there could be a trillion-dollar AI company, but if such a company did exist, I could almost have expected it to make exactly this kind of asinine statement.
15MB of JavaScript is 15MB of code that your browser is trying to execute. It’s the same principle as “compiling a million lines of code takes a lot longer than compiling a thousand lines”.
It's a lot more complicated than that. If I have a 15MB .js file and it's just a collection of functions that get called on-demand (later), that's going to have a very, very low overhead because modern JS engines JIT compile on-the-fly (as functions get used) with optimization happening for "hot" stuff (even later).
If there's 15MB of JS that gets run immediately after page load, that's a different story. Especially if there's lots of nested calls. Ever drill down deep into a series of function calls inside the performance report for the JS on a web page? The more layers of nesting you have, the greater the overhead.
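The lazy-versus-eager distinction above can be sketched in a few lines. This is a minimal illustration, not a benchmark; the function name and loop count are hypothetical, but the principle matches how modern engines work: function bodies are parsed/compiled lazily, while top-level code runs before the page can become interactive.

```javascript
// Cheap at load time: the engine only records that `formatPrice` exists.
// Its body isn't fully compiled until the first call, and it gets
// JIT-optimized only if it becomes "hot".
function formatPrice(cents) {
  return (cents / 100).toFixed(2);
}

// Expensive at load time: this runs immediately, during startup,
// blocking everything else on the main thread.
const table = [];
for (let i = 0; i < 1_000_000; i++) {
  table.push(formatPrice(i)); // a million calls before first paint
}
```

So a 15MB bundle that is mostly function definitions costs far less up front than 15MB of code that actually executes on load.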
DRY as a concept is great from a code readability standpoint, but it's not ideal for performance when it comes to things like JS execution (haha). I'm actually disappointed that modern bundlers don't normally inline calls at the JS layer. IMHO, they rely too much on the JIT to optimize hot call sites when that could've been done by the bundler. Instead, bundlers tend to optimize for file size, which is becoming less and less of a concern as bandwidth has far outpaced JS bundle sizes.
The entire JS ecosystem is a giant mess of "tiny package does one thing well" that is dependent on n layers of "other tiny package does one thing well." This results in LOADS of unnecessary nesting, when the author of the "tiny package that does one thing well" could've just written their own implementation of the simple thing it relies on.
Don't think of it from the perspective of, "tree shaking is supposed to take care of that." Think of it from the perspective of, "tree shaking is only going to remove dead/duplicated code to save file size." It's not going to take that 10-line function that handles <whatever> and put that logic right where it's used (in order to shorten the call tree).
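Concretely, here's the kind of helper chain the "n layers of tiny packages" pattern produces. The helper names are made up for illustration. Tree shaking would only remove these functions if they were never imported anywhere; it does not flatten the chain into the call site, which is what inlining would do:

```javascript
// Three tiny "do one thing well" helpers, each calling the next.
// Every hasContent() call goes three stack frames deep until (and
// unless) the JIT decides to inline the chain at runtime.
const isNil = (x) => x === null || x === undefined;
const isEmpty = (s) => isNil(s) || s.length === 0;
const hasContent = (s) => !isEmpty(s);

// What a bundler could, in principle, emit instead by inlining at
// build time: one flat function, zero extra frames.
const hasContentInlined = (s) =>
  s !== null && s !== undefined && s.length !== 0;
```

Both versions behave identically; the difference is only in who pays for the flattening, the bundler once at build time or the JIT on every user's device.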
That 15MB still needs to be parsed on every page load, even if it runs in interpreted mode. And on low-end devices there's very little cache, so the working set is likely to be far bigger than the available cache, which causes performance to crater.
Ah, that's the thing: "on page load". A one-time expense! If you're using modern page routing, "loading a new URL" isn't actually loading a new page... The client is just simulating it via your router/framework by updating the page URL and adding an entry to the history.
Also, 15MB of JS is nothing on modern "low end devices". Even an old, $5 Raspberry Pi 2 won't flinch at that and anything slower than that... isn't my problem! Haha =)
There comes a point where supporting 10yo devices isn't worth it when what you're offering/"selling" is the latest & greatest technology.
It shouldn't be, "this is why we can't have nice things!" It should be, "this is why YOU can't have nice things!"
When you write code with this mentality it makes my modern CPU with 16 cores at 4GHz and 64GB of RAM feel like a Pentium 3 running at 900MHz with 512MB of RAM.
This really is a very wrong take. My iPhone 11 isn't that old, but it struggles to render some websites that are Chrome-optimised. Heck, even my M1 Air has a hard time sometimes. It's almost 2026; we can certainly stop blaming the client for our shitty web development practices.
>There comes a point where supporting 10yo devices isn't worth it
Ten years isn't what it used to be in terms of hardware performance. Hell, even back in 2015 you could probably still make do with a computer from 2005 (although it might have been on its last legs). If your software doesn't run properly (or at all) on ten-year-old hardware, it's likely people on five-year-old hardware, or with a lower budget, are getting a pretty shitty experience.
I'll agree that resources are finite and there's a point beyond which further optimizations are not worthwhile from a business sense, but where that point lies should be considered carefully, not picked arbitrarily and the consequences casually handwaved with an "eh, not my problem".
Tangentially related: one of my favourite things about JavaScript is that it has so many different ways for the computer to “say no” (in the sense of “computer says no”): false, null, undefined, NaN, boolean coercion of 0/“”, throwing errors, ...
It’s common to see groaning about double-equals vs triple-equals comparison, and eye-rolling directed at absurdly large tables like the one in https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guid..., but I think it’s genuinely great that we have the ability to distinguish between concepts like “explicitly not present” and “absent”.
https://web.archive.org/web/20150314221334/http://acko.net/