• RvTV95XBeo@sh.itjust.works · 14 hours ago

    Maybe I’m just getting old, but I honestly can’t think of any practical use case for AI in my day-to-day routine.

    ML algorithms are just fancy statistics machines, and on that front I can see plenty of research and industry applications where large datasets need to be assessed (weather, medicine, …) with human oversight.

    But for me in my day to day?

    I don’t need a statistics bot making decisions for me at work, because if it was that easy I wouldn’t be getting paid to do it.

    I don’t need a giant calculator telling me when to eat or sleep or what game to play.

    I don’t need a Roomba with a graphics card automatically replying to my text messages.

    Handing over my entire life’s data just so an ML algorithm might be able to tell me the name of that one website I visited 3 years ago that sold kangaroo testicles isn’t a filing system. There’s nothing I care about losing enough to go to the effort of setting up Copilot, yet not enough to just, you know, bookmark it or save it with a clear enough file name.

    Long rant, but really, what does Copilot actually do for me?

      • Dragonstaff@leminal.space · 4 hours ago

        We’ve had speech-to-text since the 90s. Current iterations have improved, like most technology has since the 90s. But no, I wouldn’t buy a new computer with glaring privacy concerns just for real-time subtitles in movies.

      • zurohki@aussie.zone · 4 hours ago

        I tried feeding Japanese audio to an LLM to generate English subs and it started translating silence and music as requests to donate to anime fansubbers.

        No, really. Fansub groups would put their donation message over the intro music or wherever there wasn’t any speech to sub, and the LLM learned that.

    • ByteJunk@lemmy.world · 11 hours ago

      I use it to speed up my work.

      For example, I can give it a database schema, describe what I need to achieve, and most of the time it will throw out a pretty good approximation or even get it right on the first go, depending on complexity and how well I phrase the request. I could write these myself, of course, but not in 2 seconds.
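
      Something like this minimal sketch of the round trip, assuming an OpenAI-style client; the model name and the two-table schema are made up for illustration, and whatever comes back still gets reviewed before it runs:

      ```python
      # Sketch: the schema goes into the prompt, a candidate query comes back.
      from openai import OpenAI  # pip install openai; reads OPENAI_API_KEY from the environment

      schema = """
      CREATE TABLE customers (id INT PRIMARY KEY, name TEXT, country TEXT);
      CREATE TABLE orders    (id INT PRIMARY KEY, customer_id INT REFERENCES customers(id),
                              placed_at TIMESTAMP, total NUMERIC);
      """
      ask = "Total revenue per country for 2024, highest first."

      client = OpenAI()
      resp = client.chat.completions.create(
          model="gpt-4o-mini",
          messages=[{"role": "user",
                     "content": f"Given this schema:\n{schema}\nWrite a SQL query: {ask}"}],
      )
      print(resp.choices[0].message.content)  # candidate SQL, checked before running
      ```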

      Same with text formatting, for example. I regularly need to format long strings in specific ways, adding brackets and changing upper/lower capitalization. It does it in a second, and really well.

      Then there are just convenience things. At what date and time will something end if it starts in two weeks and takes 400 h to do? There are tools for that, or I could figure it out myself, but the AI is just there and does it in a sec…
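
      For reference, a quick local sketch of that calculation, with a made-up reference date:

      ```python
      from datetime import datetime, timedelta

      # "Starts in two weeks, takes 400 h": elapsed hours from an assumed reference date.
      start = datetime(2025, 1, 20, 9, 0) + timedelta(weeks=2)
      end = start + timedelta(hours=400)
      print(end)  # 2025-02-20 01:00:00
      ```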

      • morbidcactus@lemmy.ca · 4 hours ago

        Gotta be real, LLMs for queries make me uneasy. We’re already in a place where data modeling isn’t as common and people don’t put indexes or relationships between tables (and some tools didn’t really support those either). LLMs might be alright at describing tables (Databricks has it baked in, for better or worse, and it’s usually pretty good at a quick summary of what a table is for), but throwing an LLM on top of that doesn’t really inspire confidence.

        If your data model is highly normalised, with FKs everywhere, good naming, and solid documentation, then yeah, I could totally see that helping. But if that’s the case, you already have good governance practices (which all ML tools benefit from, AFAIK). Without that, I’m dreading the queries; people are already perfectly capable of generating stuff that gives DBAs a headache. Simple cases, yeah, maybe, but for complex queries I’m not sold.

        Data understanding is part of the job anyhow, and that’s largely conceptual, which maybe LLMs could work as an extension for. But I really wouldn’t trust them to generate full-on queries in most of the environments I’ve seen; data is overwhelmingly messy and orgs don’t love putting effort towards governance.

      • Samskara@sh.itjust.works · 8 hours ago

        adding brackets and changing upper/lower capitalization

        I’ve used a system-wide service in macOS for that for decades now.

      • self@awful.systems · 10 hours ago

        it’s really embarrassing when the promptfans come here to brag about how they’re using the technology that’s burning the earth and it’s just basic editor shit they never learned. and then you watch these fuckers “work” and it’s miserably slow cause they’re prompting the piece of shit model in English, waiting for the cloud service to burn enough methane to generate a response, correcting the output and re-prompting, all to do the same task that’s just a fucking key combo.

        Same with text formatting, for example. I regularly need to format long strings in specific ways, adding brackets and changing upper/lower capitalization. It does it in a second, and really well.

        how in fuck do you work with strings and have this shit not be muscle memory or an editor macro? oh yeah, by giving the fuck up.

        • CarrotsHaveEars@lemmy.ml · 3 hours ago (edited)

          (100% natural rant)

          I can change a whole fucking sentence to FUCKING UPPERCASE by just pressing vf.gU in fucking vim, using a tiny fraction of the energy it would take to run a fucking marathon, which in turn is only a fraction of the energy the fucking AI cloud cluster burns to spit out the same shit. The comparison is like a ping pong ball to the Earth, and then to the fucking sun!

          Alright, bros, listen up. All these great tasks you claim AI does faster and better, I can write up a script or something to do even faster and better (see the sketch at the end of this rant). Fucking A! That high you get when you use AI comes from not knowing how to do it yourself, or whether it’s even possible. You!

          You prompt bros are blasting shit tons of energy just to achieve the same quality of work, if not worse, in a much fucking longer time.

          And somehow these executives claim AI improves fucking productivity‽
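
          Case in point, the bracket-and-capitalization job from upthread is a few lines of Python as a local filter; the exact rule (wrap in brackets, uppercase) and the file name are guesses, just to show the shape:

          ```python
          # usage: python3 bracketize.py < strings.txt
          import sys

          for line in sys.stdin:
              print("[" + line.rstrip("\n").upper() + "]")  # uppercase, then wrap in brackets
          ```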

    • Flipper@feddit.org · 12 hours ago

      Apparently it’s useful for extracting information out of a text into a format you specify. A friend is using it to pull transactions out of 500-year-old texts. However, to get rid of hallucinations the temperature needs to be 0, so the only way is to self-host.
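
      A minimal sketch of that kind of setup, assuming a self-hosted Ollama-style endpoint; the model name, the sample passage, and the field list are all made up:

      ```python
      # Sketch: structured extraction from a historical text via a locally hosted model.
      import json, urllib.request

      passage = "Item, paid to the mason for work on the bridge, 12 shillings 4 pence."  # made-up sample line
      prompt = (
          "Extract every transaction from the text below as JSON with the fields "
          '"payee", "purpose" and "amount". Text:\n' + passage
      )

      req = urllib.request.Request(
          "http://localhost:11434/api/generate",   # local server, nothing leaves the machine
          data=json.dumps({
              "model": "llama3",
              "prompt": prompt,
              "stream": False,
              "format": "json",                    # constrain output to JSON
              "options": {"temperature": 0},       # greedy decoding, as described above
          }).encode(),
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          print(json.loads(resp.read())["response"])
      ```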

      • OhNoMoreLemmy@lemmy.ml · 11 hours ago

        Setting the temperature to 0 doesn’t get rid of hallucinations.

        It might slightly increase accuracy, but it’s still going to go wrong.

      • daellat@lemmy.world · 12 hours ago

        Well, LLMs are capable (but prone to hallucination) and cost an absolute fuckton of energy. There have been purpose-trained, efficient ML models that we’ve used for years. Document understanding and computer vision are great, just don’t use an LLM for them.
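
        For instance, a small pretrained vision model does local image classification in a dozen lines, no LLM involved; the image path is a placeholder and ResNet-18 is just one example of such a model:

        ```python
        import torch
        from torchvision import models
        from PIL import Image

        # Load a small pretrained classifier; runs locally on CPU.
        weights = models.ResNet18_Weights.DEFAULT
        model = models.resnet18(weights=weights).eval()
        preprocess = weights.transforms()

        img = Image.open("photo.jpg")  # placeholder input image
        with torch.no_grad():
            probs = model(preprocess(img).unsqueeze(0)).softmax(dim=1)
        print(weights.meta["categories"][probs[0].argmax().item()])  # predicted class name
        ```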

    • Don_alForno@feddit.org · 12 hours ago

      Our boss all but ordered us to have IT set this shit up on our PCs. So far I’ve been stalling, but I don’t know how long I can keep doing it.

    • Ledericas@lemm.ee · 13 hours ago

      Same here, I mostly don’t even use it on the phone. My bro is into it though, thinking AI-generated pictures are good.

      • RvTV95XBeo@sh.itjust.works · 13 hours ago

        It’s a fun party trick for like a second, but at no point today did I need a picture of a goat in a sweater smoking three cigarettes while playing tic-tac-toe with a llama dressed as the Dalai Lama.

        • bampop@lemmy.world · 10 hours ago

          It’s great if you want to do a kids’ party invitation or something like that.

          • meowMix2525@lemm.ee · 4 hours ago

            That wasn’t that hard to do in the first place, and certainly isn’t worth the drinking water to cool whatever computer made that calculation for you.

    • AbsentBird@lemm.ee · 14 hours ago

      The only on-device AI feature that actually seems useful is voice-to-text that doesn’t need an Internet connection.
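
      Whisper-style models already do that entirely offline; a minimal sketch, where the model size and the audio file name are placeholders:

      ```python
      import whisper  # pip install openai-whisper; inference runs locally once the model is downloaded

      model = whisper.load_model("base")     # small multilingual model, fine on CPU
      result = model.transcribe("memo.m4a")  # placeholder audio file
      print(result["text"])
      ```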

      • RvTV95XBeo@sh.itjust.works · 13 hours ago

        As someone who hates orally dictating my thoughts, that’s a no from me dawg, but I can kinda understand the appeal (though I’ll note offline speech-to-text has been around for like a decade pre-AI).