• ikt@aussie.zone (OP)

    tbf they are getting significantly better; one of the best things that hasn’t really filtered through to the mainstream is MoE / mixture of experts

    the tldr is that back in the ChatGPT-4 days, wayyy back in ye olden times of 2024, AI models would essentially go through the entire library for every single question to find an answer

    Now the libraries are getting massive, but queries are getting faster, because instead of going through the entire library for every question, the model only activates the part it needs. It’s like a real library: instead of querying all of human knowledge for what 8x12 is, it goes straight to the maths section, saving a lot of power and time.
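    To make the "only visit one section" idea concrete, here’s a toy sketch of top-k MoE routing. This isn’t any real model’s code: the expert count, the top-k value, and treating each "expert" as a single matrix are all simplifications for illustration.

    ```python
    # Toy sketch of top-k mixture-of-experts routing (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    NUM_EXPERTS = 8   # 8 "sections of the library" (made-up number)
    TOP_K = 2         # only 2 sections get consulted per token
    DIM = 16

    # Each expert here is just a small weight matrix, standing in for a full FFN block.
    experts = [rng.standard_normal((DIM, DIM)) / np.sqrt(DIM) for _ in range(NUM_EXPERTS)]
    router = rng.standard_normal((DIM, NUM_EXPERTS)) / np.sqrt(DIM)

    def moe_forward(x: np.ndarray) -> np.ndarray:
        """Route one token vector through only its top-k experts."""
        logits = x @ router                   # router scores every expert
        top = np.argsort(logits)[-TOP_K:]     # pick the k best-scoring experts
        weights = np.exp(logits[top])
        weights /= weights.sum()              # softmax over just the chosen experts
        # Only TOP_K of NUM_EXPERTS experts actually run; the rest stay idle,
        # which is where the compute saving comes from.
        return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

    token = rng.standard_normal(DIM)
    print(moe_forward(token).shape)  # (16,)
    ```

    The total parameter count (the "library") can keep growing, but per-token compute stays proportional to TOP_K experts, not all of them.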

    In the case of chat.mistral.ai it doesn’t even walk the library to the maths section; it just writes a quick Python script and outputs the answer that way:
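    Something like this (hypothetical; the actual generated script will vary):

    ```python
    # The kind of throwaway script a tool-using model might emit for
    # "what is 8x12", rather than computing the answer with the model itself.
    result = 8 * 12
    print(result)  # 96
    ```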