• scrion@lemmy.world
      10 months ago

      I started working with CUDA at version 3 (so maybe around 2010?) and it was definitely more than rough around the edges at that time. Nah, honestly, it was a nightmare - I discovered bugs and deviations from the documented behavior on a daily basis. That kept up for a few releases, though I’ll mention that NVIDIA was/is really motivated to push CUDA for general-purpose computing, so the support was top notch - it still was in no way pleasant to work with.

      That being said, our previous implementation was using OpenGL and did in fact produce computational results as a byproduct of rendering noise on a lab screen, so there’s that.

    • Skullgrid@lemmy.world
      10 months ago

      I don’t know wtf cuda is, but the sentiment is pretty universal: please just fucking work I want to kill myself

      • topinambour_rex@lemmy.world
        10 months ago

        CUDA turns a GPU into a very fast CPU for specific operations. It won’t replace the CPU, just assist it.

        Graphics are just math. Displaying beautiful 3D models with beautiful lights, shadows, and shine takes plenty of operations.

        The same math used to display 3D can be used to calculate other stuff, like ChatGPT’s engine.
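
        The idea in the comment above can be sketched in a few lines. This is a minimal, illustrative CUDA example (not anything from the thread): a kernel that adds two arrays in parallel - the same kind of per-element arithmetic a GPU does for pixels, reused for general computation, with the CPU staying in charge of setup and teardown.

        ```cuda
        #include <cuda_runtime.h>

        // One GPU thread handles one array element.
        __global__ void vecAdd(const float *a, const float *b, float *c, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
            if (i < n)                                      // guard against overrun
                c[i] = a[i] + b[i];
        }

        int main() {
            const int n = 1 << 20;                // ~1M elements
            size_t bytes = n * sizeof(float);
            float *a, *b, *c;
            cudaMallocManaged(&a, bytes);         // unified memory: visible to CPU and GPU
            cudaMallocManaged(&b, bytes);
            cudaMallocManaged(&c, bytes);
            for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

            // The CPU launches thousands of GPU threads, then waits for them.
            int threads = 256;
            int blocks = (n + threads - 1) / threads;
            vecAdd<<<blocks, threads>>>(a, b, c, n);
            cudaDeviceSynchronize();              // GPU assists; CPU orchestrates

            cudaFree(a); cudaFree(b); cudaFree(c);
            return 0;
        }
        ```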