You're viewing the social.afront.org public feed.
  • Apr 23, 2026, 6:56 PM

    I need a graph of context window growth vs. resources like inference compute / RAM over time. Has anyone already made this?

    And does the model's size itself put limits on context windows, or is that purely a feature of the hardware you're doing inference on?

    💬 2🔄 0⭐ 0

Replies

  • Apr 23, 2026, 7:26 PM

    @glyph that's a good question, but the answer definitely won't fit in toots or graphs. It's very, very complicated because it's an oranges-to-lead-slugs comparison. It also gets into the weeds on real capability vs. incompetent children writing bad math.

    💬 1🔄 0⭐ 0
  • Apr 23, 2026, 7:38 PM

    @rootwyrm I have heard the claim "context grows linearly while compute grows exponentially" and I mainly just want to know if that is roughly correct, and if so, why.

    💬 1🔄 0⭐ 0
  • Apr 23, 2026, 8:41 PM

    @glyph oh, LOL. EL OH EL OH EL OH EL. They are actually genuinely saying that shit still? That is beyond hilarious. Moore's Law died an ignoble death years and years ago. NV claiming exponential is even more hilariously bullshit, as in fact, their parts are getting exponentially *WORSE*. Never mind that it's just a fact that they are slamming into a memory wall because NV's parts are trash, running trash code, chasing something we already know will never work.

    💬 1🔄 0⭐ 0
  • Apr 23, 2026, 9:17 PM

    @rootwyrm sorry, what I meant was, "if you put in an exponential increase of inference compute, you get a linear increase in context", i.e. we are probably going to hit some hard limits on context size soon. This sounds mostly plausible to me (and it sounds like you're saying something similar?) but I would like a more detailed citation.

    💬 0🔄 0⭐ 0
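    One way to sanity-check the scaling claim in the posts above: under standard dense self-attention, full-sequence attention compute grows roughly quadratically with context length n (each token attends to every prior token), while KV-cache memory grows linearly in n. That is a polynomial, not exponential, relationship, though the constants can still be brutal. The sketch below is a back-of-envelope only; the model dimensions (d_model=4096, 32 layers, fp16 cache) are hypothetical placeholders, not any particular model's specs.

    ```python
    # Rough scaling of dense self-attention cost with context length n.
    # All dims are assumed toy values, not a real model's configuration.

    def attention_flops(n, d_model=4096, n_layers=32):
        """Approximate FLOPs for the attention score (QK^T) and weighted-sum
        (scores @ V) steps over a full n-token sequence: about 2 * n^2 * d_model
        each, per layer. Ignores projections, MLPs, and softmax."""
        return n_layers * 4 * n * n * d_model

    def kv_cache_bytes(n, d_model=4096, n_layers=32, bytes_per_val=2):
        """KV-cache memory: keys + values for n tokens per layer, fp16."""
        return n_layers * 2 * n * d_model * bytes_per_val

    for n in (4096, 8192, 16384):
        print(f"n={n:6d}  attn FLOPs={attention_flops(n):.3e}  "
              f"KV cache={kv_cache_bytes(n) / 2**30:.1f} GiB")
    ```

    Doubling n quadruples the attention FLOPs but only doubles the KV cache, which is why long-context work tends to hit compute and memory-bandwidth walls well before anything "exponential" enters the picture.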
  • Apr 23, 2026, 7:50 PM

    @nielsa 3 factors make me want to keep this a blog post:

    1. I want to be done with it so bad, I am trying to make it shorter, not longer
    2. I have overpromised to my patrons at this point and the "book" form-factor, even self-published ebook, adds complexity and thus further delay
    3. the endless footnotes are really part of the whole… deal… so it doesn't lend itself to an offline "book" style reading experience

    💬 1🔄 0⭐ 0
  • Apr 23, 2026, 7:51 PM

    @nielsa I also, so badly, do not want to be an expert in this field. the whole thing is about how I want to ignore it and can't

    💬 0🔄 0⭐ 0