• stingpie@lemmy.world · 10 months ago

      Recently, I’ve just given up trying to use CUDA for machine learning. Instead, I’ve been using (relatively) CPU-intensive activation functions and architectures to make up the difference. It hasn’t worked, but I can at least consistently inch forward.
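
      For the curious, a minimal sketch of the “just stay on the CPU” approach, assuming PyTorch; the layer sizes and the GELU activation are illustrative picks, not stingpie’s actual setup:

      ```python
      import torch
      import torch.nn as nn

      # Pin everything to the CPU so CUDA never enters the picture.
      device = torch.device("cpu")

      # A tiny MLP with a comparatively expensive activation: GELU costs more
      # per element than ReLU, which is the kind of trade-off described above.
      model = nn.Sequential(
          nn.Linear(128, 256),
          nn.GELU(),
          nn.Linear(256, 10),
      ).to(device)

      x = torch.randn(64, 128, device=device)  # a fake batch of inputs
      logits = model(x)
      print(logits.shape)  # torch.Size([64, 10])
      ```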

  • Gabu@lemmy.ml · 10 months ago

    I prefer ROCm:
    R -
    O -
    C -
    M -

    • Fuck me, it didn’t work again
  • Uranium3006@kbin.social · 9 months ago

    Some numbnut pushed NVIDIA driver code with compilation errors, and now I have to use an old kernel until it’s fixed

  • Gamma@beehaw.org · 10 months ago

    Related to D: today VS Code released an update that broke the remote tools on Ubuntu 18.04 (which is supported with security updates until 2028) 🥴 The only fix is to downgrade.

    • Skull giver@popplesburger.hilciferous.nl · 10 months ago

      That’s a problem many LTS distro users don’t seem to understand when they first install their distro of choice: the LTS guarantees only apply to the software made by the distro maintainers.

      Loads of tools (GUI or command line) used in development normally come from external repositories, because people deem the ones in the distro’s own repos “too old” (which is kind of the whole point of LTS distros, of course).

      You can still run 16.04/18.04/20.04 today, but you’ll be stuck with the software that was available back when these versions were released. The LTS versions of Ubuntu are great for postponing updates until the necessary bug fixes have been applied (say, a year after release) but staying more than one full LTS release behind is something I would only consider doing on servers.

    • scrion@lemmy.world · 10 months ago

      I started working with CUDA at version 3 (so maybe around 2010?) and it was definitely more than rough around the edges at that time. Nah, honestly, it was a nightmare - I discovered bugs and deviations from the documented behavior on a daily basis. That kept up for a few releases, although I’ll mention that NVIDIA was/is really motivated to push CUDA for general-purpose computing, and thus the support was top notch - it still was in no way pleasant to work with.

      That being said, our previous implementation was using OpenGL and did in fact produce computational results as a byproduct of rendering noise on a lab screen, so there’s that.

    • Skullgrid@lemmy.world · 10 months ago

      I don’t know wtf CUDA is, but the sentiment is pretty universal: please just fucking work, I want to kill myself

      • topinambour_rex@lemmy.world · 9 months ago

        CUDA turns a GPU into a very fast processor for specific operations. It won’t replace the CPU, just assist it.

        Graphics are just maths: plenty of operations to display beautiful 3D models with their beautiful lights, shadows, and shines.

        The same maths used to display 3D can be used to calculate other stuff, like ChatGPT’s engine.
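
        As a hedged sketch of that idea in PyTorch (the array sizes and names are made up for illustration): the same matrix multiply that transforms 3D points for rendering can be pushed through CUDA onto the GPU when one is available.

        ```python
        import torch

        # The kind of matrix maths used to transform 3D geometry...
        points = torch.randn(1_000_000, 4)   # a million homogeneous 3D points
        transform = torch.randn(4, 4)        # e.g. a model-view-projection matrix

        # ...runs through CUDA on the GPU when available, otherwise on the CPU.
        device = "cuda" if torch.cuda.is_available() else "cpu"
        projected = points.to(device) @ transform.to(device)
        print(projected.shape, projected.device)
        ```

        The same hardware and the same kind of dense linear algebra are what large models like ChatGPT’s engine lean on, which is the point being made above.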