By 2030, AI will greatly outperform humans in some complex intellectual tasks. Discover how LLMs are doubling their capabilities every seven months.

  • snooggums@lemmy.world · 102 points · 23 hours ago

    This is like measuring the increasing speeds of cars in the early years and extrapolating that they would be supersonic by now by ignoring the exponential impact that air resistance has.
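The analogy can be made concrete with a toy model (all numbers below are made up for illustration, not real automotive data): a naive doubling trend grows without bound, while a drag-limited model saturates, because aerodynamic drag power scales roughly with the cube of speed.

```python
# Illustrative only: why extrapolating a doubling trend fails once a
# limiting factor (here, aerodynamic drag) dominates. Numbers are assumptions.

def naive_extrapolation(v0, doublings):
    """Project top speed by assuming it simply keeps doubling."""
    return v0 * 2 ** doublings

def drag_limited_speed(power_w, drag_coeff):
    """Steady-state top speed when engine power balances drag.

    Drag power grows ~v**3, so v_max = (P / k) ** (1/3): a 100x power
    increase only buys ~4.6x more speed.
    """
    return (power_w / drag_coeff) ** (1 / 3)

v0 = 20.0  # hypothetical early top speed, mph
trend = [naive_extrapolation(v0, n) for n in range(6)]
print(trend)  # unbounded doubling: 20, 40, 80, ...

# Doubling engine power raises the drag-limited speed by only 2**(1/3) ~ 1.26x.
print(drag_limited_speed(100_000, 0.5))
print(drag_limited_speed(200_000, 0.5))
```

The point is only the shape of the curves: the exponential fit looks fine over the early data and then diverges wildly from the physically bounded one.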

    • Voroxpete@sh.itjust.works · 13 points · 9 hours ago

      My son has doubled in size every month for the last few months. At this rate he’ll be fifty foot tall by the time he’s seven years old.

      Yeah, it’s a stupid claim to make on the face of it. It also ignores practical realities: the first of those is training data, and the second is context windows. For an AI to successfully write a novel or code a large-scale piece of software like a video game, it would need to hold that entire work in its context window at once. Context window size is strongly tied to hardware usage, so scaling windows to the point where they’re big enough for an entire novel may never be feasible (at least from a cost/benefit perspective).
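Some rough arithmetic behind the hardware claim (token counts and sizes below are assumptions, not measurements): naive self-attention builds an n×n score matrix per head, so memory for that matrix grows quadratically with context length.

```python
# Back-of-envelope illustration of quadratic attention cost.
# Assumptions: ~1.5 tokens per word, fp16 (2-byte) attention scores.
# Optimized kernels (e.g. FlashAttention) avoid materializing this full
# matrix, but compute still scales quadratically in sequence length.

def attention_matrix_bytes(n_tokens, bytes_per_score=2):
    """Bytes to materialize one n x n attention score matrix."""
    return n_tokens ** 2 * bytes_per_score

novel_tokens = 150_000  # ~100k-word novel at ~1.5 tokens/word (assumed)
gib = attention_matrix_bytes(novel_tokens) / 2 ** 30
print(f"{gib:.1f} GiB per head, per layer")
```

Even as a loose estimate, it shows why "just make the window novel-sized" is a hardware question, not a software toggle.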

      I think there’s also the issue of how you define “success” for the purpose of a study like this. The article claims that AI may one day write a novel, but how do you define “successfully” writing a novel? Is the goal here that one day we’ll have a machine that can produce algorithmically mediocre works of art? What’s the value in that?

    • Lovable Sidekick@lemmy.world · 20 points · edited · 14 hours ago

      Very good analogy. They’re also ignoring that getting faster and faster at reaching a 50% success rate (a totally unacceptable rate for meaningful tasks) doesn’t imply ever achieving consistently acceptable success.