• BetaDoggo_@lemmy.world
    2 years ago

    I can’t see local models or hardware needing to scale much past the sizes we already have. Recent models like Mistral have shown that we’re still far from saturation at current model sizes.

    • NotMyOldRedditName@lemmy.world
      2 years ago

      And we only ever needed 64 KB of RAM.

      Even if we have a lot of room to optimize and grow within what we have, there’s still so much more to do.

      Fully coherent audio and video synthesis for a whole scene, for example.

      And these models are being trained on server farms, but that’s just because video memory is so expensive to come by.

      We’re just starting to crawl; we haven’t even started walking yet on where this is going.