• criss_cross@lemmy.world

    Not that I have any skin in the game regarding what you’re saying, but his entire pitch is to build something that’s not an LLM.

    LeCun argues that most human reasoning is grounded in the physical world, not language, and that AI world models are necessary to develop true human-level intelligence. “The idea that you’re going to extend the capabilities of LLMs [large language models] to the point that they’re going to have human-level intelligence is complete nonsense,” he said in an interview with WIRED.

    I forget the name of the technique he’s going after, but it’s supposed to be different from a regular LLM.

    EDIT: if I’d paid more attention I’d have seen it’s in the article. World models.