• didnt_readit@sh.itjust.works
    6 months ago

    They’ve been doing ML locally on devices for like a decade, since well before all the AI hype. Their chips have also had dedicated ML inference cores for a long time, which helps with the battery life situation.