• Lucy :3@feddit.org · 16 days ago

        ~0.28€/kWh

        So 50€/month assumes an average of 263W drawn 24/7; considering I also have two switches and a workstation/backup server, plus the inefficiency of a UPS, that is realistic.
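        The arithmetic behind that estimate can be sanity-checked with a quick sketch (the 0.28€/kWh rate is explicitly approximate, so the implied wattage comes out slightly below the quoted 263W):

```python
# Rough sanity check of the homelab power-cost estimate.
# Assumptions: ~0.28 EUR/kWh (approximate, per the comment) and an
# average month of 730 hours (8760 h / 12).
RATE_EUR_PER_KWH = 0.28
HOURS_PER_MONTH = 8760 / 12  # 730 h

def monthly_cost_eur(avg_watts: float) -> float:
    """Monthly cost of a constant average draw at the assumed rate."""
    kwh = avg_watts / 1000 * HOURS_PER_MONTH
    return kwh * RATE_EUR_PER_KWH

def implied_watts(monthly_eur: float) -> float:
    """Average draw implied by a monthly bill at the assumed rate."""
    kwh = monthly_eur / RATE_EUR_PER_KWH
    return kwh / HOURS_PER_MONTH * 1000

print(f"263 W 24/7     -> {monthly_cost_eur(263):.2f} EUR/month")
print(f"50 EUR/month   -> {implied_watts(50):.0f} W average")
```

        At exactly 0.28€/kWh, a constant 263W comes to just under 54€/month; the quoted 50€ lines up with a rate closer to 0.26€/kWh, consistent with the "~".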

        • kalpol@lemmy.ca · 16 days ago

          Yeah, sounds about right. But we have really cheap power, something like 7 or 8 cents per kWh (US). Not sure why; lots of wind, I guess.

  • wise_pancake@lemmy.ca · 16 days ago

    This is how I feel about vibe coding

    “Claude can you help me with this”

    “Sure, taking a look”

    you’ve exceeded your allotted token limit

    Oops.

  • henfredemars@infosec.pub · 16 days ago

    I was promised 15 years ago that cloud computing would avoid unexpected bills and provide consistent expenses that project managers love so much.

    • towerful@programming.dev · 16 days ago

      Oh, it’s expected costs.
      Like, figure out the compute requirements of your code, multiply by the cost per compute unit (or whatever): boom, your cost.
      Totally predictable.
      Compared to suddenly having to replace a $20k server that dies in your data center.
      So much easier.

      Except when your code (let's be honest, the most likely place for an error, at least compared to the 4+ year old production hardware everyone runs) has a bug in it that requires 20x the compute.
      But maybe that's a popularity spike (the hug-of-death)! That's why you migrated to the #cloud anyway, right? To handle these spikes! And you've always paid your bills, so… Yeah, here's a 20x bill.
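      The "totally predictable" math really is that simple, which is the trap: a toy sketch (all rates and usage numbers here are made up for illustration) shows how a single usage multiplier, whether a runaway bug or a traffic spike, dominates the bill:

```python
# Toy metered-billing sketch; every number is hypothetical.
COST_PER_COMPUTE_UNIT = 0.05  # USD per compute unit (made up)

def monthly_bill(units_estimated: float, usage_multiplier: float = 1.0) -> float:
    """Metered platforms charge for actual usage: the estimate
    times whatever really happened in production."""
    return units_estimated * usage_multiplier * COST_PER_COMPUTE_UNIT

planned = monthly_bill(100_000)      # the figure the project manager signed off on
buggy = monthly_bill(100_000, 20)    # runaway loop or hug-of-death: 20x usage
print(f"planned: ${planned:,.2f}")   # planned: $5,000.00
print(f"actual:  ${buggy:,.2f}")     # actual:  $100,000.00
```

      The estimate itself is trivial; the unbounded multiplier is what turns "predictable expenses" into a surprise invoice, since nothing in the billing model caps actual usage at the estimate.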