• Hawk@lemmynsfw.com · 6 months ago

    They were running CNN inference on a mobile device? I have no clue, but that would be costly, battery-wise at least.

    • didnt_readit@sh.itjust.works · edited · 6 months ago

      They’ve been doing ML locally on devices for like a decade, since way before all the AI hype. They’ve also had dedicated ML inference cores in their chips for a long time, which helps the battery-life situation (see the sketch below).
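
      A minimal sketch of what that looks like in practice, assuming Apple hardware: Swift/Core ML code that loads a model and asks the runtime to prefer the Neural Engine (the dedicated ML cores) over the CPU/GPU. The model file name and the "image" input name are hypothetical stand-ins, not from any specific app.

      ```swift
      import CoreML

      // Hedged sketch: "ImageClassifier.mlmodelc" and the "image" input
      // are placeholders for whatever model an app actually ships.
      func classify() throws {
          let config = MLModelConfiguration()
          // Prefer the dedicated ML cores (Apple Neural Engine); offloading
          // inference there is what keeps the battery cost manageable.
          config.computeUnits = .cpuAndNeuralEngine

          let url = URL(fileURLWithPath: "ImageClassifier.mlmodelc")
          let model = try MLModel(contentsOf: url, configuration: config)

          // Dummy 1x3x224x224 float tensor standing in for a camera frame.
          let input = try MLMultiArray(shape: [1, 3, 224, 224], dataType: .float32)
          let features = try MLDictionaryFeatureProvider(dictionary: ["image": input])
          let result = try model.prediction(from: features)
          print(result.featureNames)
      }
      ```

      If a layer isn’t supported by the Neural Engine, the runtime falls back to the CPU automatically with this compute-units setting, so the code stays correct either way.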