Wondering if modern LLMs like GPT-4, Claude Sonnet, and Llama 3 are closer to human intelligence or to next-word predictors. Also not sure if this graph is the right way to visualize it.

  • Binette@lemmy.ml · 4 hours ago

    Hell no. Yeah sure, next-word prediction is one of our functions, but human intelligence also allows for stuff like abstraction and problem solving. There are things you can do in your head without using words.

    • CanadaPlus@lemmy.sdf.org · 3 hours ago (edited)

      I mean, I know that about my mind. Not anybody else’s.

      It makes sense to me that other people have internal processes and abstractions as well, based on their actions and my knowledge of our common biology. Based on my similar knowledge of LLMs, they must have some, but not all, of those same internal processes.