• Richard@lemmy.world
    1 day ago

    Untrue. There are now small models that produce better output than former "flagship" models like GPT-2. You can also achieve far more than we currently do with much less energy by developing novel, specialised hardware (neuromorphic computing).