• okwhateverdude@lemmy.world
    15 days ago

    Ackually 🤓, Gemini Pro and other similar models are basically a loop over some metaprompts with tool usage, including search. It will actually reference/cite documentation if given explicit instructions. You’re right that the anthropomorphization is troubling. That said, the simulacrum presented DOES follow directions, and its behavior (meaning the complete system of LLM + looped prompts) can be interpreted as having some kind of agency. We’re on the same side, but you’re sorely misinformed, friend.
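
    Roughly, the loop looks something like this (a minimal sketch; the names `call_model` and `run_search` and the message format are made up for illustration, not Google’s or anyone’s actual API):

    ```python
    # Sketch of the "LLM + looped metaprompts + tools" pattern described above.
    # call_model and run_search are stubs; a real system would hit an LLM API
    # and a search backend here.

    def call_model(messages):
        """Stub chat-completion call. Returns either a tool request
        like {"tool": "search", "query": "..."} or {"answer": "..."}."""
        return {"answer": "stubbed final answer"}

    def run_search(query):
        """Stub search tool."""
        return f"search results for: {query!r}"

    def agent_loop(user_prompt, max_steps=5):
        # Metaprompt instructing the model to use tools and cite documentation.
        messages = [
            {"role": "system",
             "content": "Use the search tool when needed and cite the documentation you find."},
            {"role": "user", "content": user_prompt},
        ]
        for _ in range(max_steps):
            reply = call_model(messages)
            if reply.get("tool") == "search":
                # Model asked for a tool call: run it, feed the result back, loop again.
                messages.append({"role": "tool", "content": run_search(reply["query"])})
                continue
            # No tool call: treat this as the final, citation-bearing answer.
            return reply["answer"]
        return "step limit reached"

    if __name__ == "__main__":
        print(agent_loop("What do the docs say about tool use?"))
    ```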

    • MotoAsh@lemmy.world
      15 days ago

      I’m not misinformed. You’re still trying to call a groomed LLM something that reasons when it literally is not doing that in any meaningful capacity.