Fuck this shit, why does every fucking thing need an LLM?

    • Hot Potato
      -8 points · 4 months ago

      This can run locally on your device, which means it probably doesn't consume that much energy.

        • Hot Potato
          -1 point · 4 months ago

          But that was already done. They are using Mistral, which was already trained. Proton didn't train a new AI for this.

          • @anyhow2503@lemmy.world
            5 points · 4 months ago

            Going by your initial comment, the whole premise of this discussion was technological progress and growth. That means refining existing models and training new ones, which is going to cost a lot of energy. The way this industry is going, even privacy-conscious usage of open-source models will contribute to the insane energy usage by creating demand and popularizing the technology.