• Maalus@lemmy.world
    6 days ago

    What’s clear is that you don’t realize how much energy AI actually uses, and you’ve swallowed the propaganda you’re now spreading. A query that runs for 20 seconds to generate an image on a card that draws at most 350 W during heavy gaming sessions isn’t magically going to doom the world. Chill out.
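The per-query figure in that comment can be checked with one line of arithmetic (a sketch assuming the stated 350 W draw and 20 s runtime; actual GPU power during inference varies by card and workload):

```python
def query_energy_wh(power_watts: float, seconds: float) -> float:
    """Energy in watt-hours for one query: power times time."""
    return power_watts * seconds / 3600  # 3600 seconds per hour

# 350 W for 20 s -> about 1.94 Wh, i.e. a rounding error next to an hour of gaming
print(round(query_energy_wh(350, 20), 2))  # 1.94
```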

    • utopiah@lemmy.world
      6 days ago

      I’ll assume you didn’t misread my question on purpose: I didn’t ask about inference, I asked about training.

      • Maalus@lemmy.world
        6 days ago

        How much energy was used to bring that one truckload of groceries into the shop so that hundreds of people could use it?

        • utopiah@lemmy.world
          6 days ago

          Great point. So are you saying there is a certain threshold above which training is energetically useful, but below which it is not? E.g. if a large model is trained and then used by only 1 person, it is not sustainable, but if 1 million people use it (assuming it’s used productively, not for spam or scams) then it is fine?
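The threshold being described is a simple amortization: a one-off training cost split across everyone who uses the model, plus each user's own inference energy. A minimal sketch (the 1,287 MWh figure is an often-cited published estimate for GPT-3-scale training; treat both numbers as illustrative assumptions, not measurements):

```python
def energy_per_user_wh(training_wh: float, users: int,
                       inference_wh_per_user: float = 0.0) -> float:
    """Amortized energy per user: shared training cost plus own inference."""
    return training_wh / users + inference_wh_per_user

TRAINING_WH = 1_287e6  # ~1,287 MWh, assumed GPT-3-scale training energy

# One user bears the entire training cost; a million users share it.
print(energy_per_user_wh(TRAINING_WH, 1))          # 1287000000.0 Wh
print(energy_per_user_wh(TRAINING_WH, 1_000_000))  # 1287.0 Wh, about 1.3 kWh each
```

Whether 1.3 kWh per user is "fine" then depends on what the usage displaces, which is exactly the point under debate.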

            • utopiah@lemmy.world
              6 days ago

              Results? I have no idea what you are talking about. I thought we were discussing the training cost (my initial question), and that the truckload was an analogy arguing that the impact of that upfront cost is spread among users.