Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse gas emissions than commercial flights. In 2018, for instance, the 5bn YouTube views of the viral song Despacito consumed roughly the same amount of energy it would take to heat 40,000 US homes for a year.

Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained GPT-3 at Microsoft’s datacentres.

Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water stress in parts of the world that are already dry.

Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water usage and can lead to pollution, undermining water security. The extraction of these minerals is also often linked to human rights violations and poor labour standards. Pursuing one climate goal, limiting our dependence on fossil fuels, can thus compromise another: ensuring that everyone has a safe and accessible water supply.

Moreover, allocating significant energy resources to tech-related endeavours can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.

In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.

  • gandalf_der_12te@discuss.tchncs.de · 1 month ago

    AI training is a flexible energy consumer: it can be switched on and off at will, so it can soak up excess solar power during daylight hours and provide extra income to solar farms. The important thing is to install solar panels; then AI training is no longer an environmental problem. (A rough sketch of this scheduling idea appears after the thread.)

    • SandbagTiara2816@lemmy.dbzer0.com · 1 month ago

      We already have a more elegant solution than training AI when solar arrays produce more electricity than the grid needs: batteries. It strikes me as a better option to save the energy for later use than to burn it off to train AI.

      • Buddahriffic@lemmy.world · 1 month ago

        It looks like you and the commenter you replied to are talking about two different problems. You’re talking about what to do with excess solar energy; they’re talking about how to power AI training in an environmentally friendly way.

        • SandbagTiara2816@lemmy.dbzer0.com · 1 month ago

          Ah, that makes sense! Yeah, I’m out of my depth when it comes to how to train an AI model. I tend to leap into defense mode when intermittency of renewable energy comes up, because it’s very often an anti-renewables talking point, when we actually do have a lot of solutions for it.

      • gandalf_der_12te@discuss.tchncs.de · 1 month ago

        I would say that both are interesting proposals to look at. Of course, doing the math and crafting the best approach is work and takes time, and I can’t give many details in a lemmy comment.

        • SandbagTiara2816@lemmy.dbzer0.com · 1 month ago

          It is for sure a tricky question. Another comment pointed out that we may be coming at the topic from different directions. I’ll admit that the energy demands of AI make me nervous when I consider how hard the transition to renewables already is without the added load, but I’m not familiar enough with work in that space to know how AI training could be made less energy-intensive. What options are being worked on?

          (Other than SMR or betting on fusion)
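
A minimal sketch of the “flexible consumer” idea raised in the thread above, assuming a checkpointed training loop and a made-up solar-surplus signal; the surplus curve, the threshold and the training-step function are all illustrative placeholders, not a real grid feed or training framework:

```python
import math

SURPLUS_THRESHOLD_MW = 5.0  # assumed cut-off: only train when at least this much excess solar exists

def solar_surplus_mw(hour: float) -> float:
    """Toy stand-in for a real surplus feed: a midday bell curve, nothing at night."""
    return max(0.0, 20.0 * math.exp(-((hour - 13.0) ** 2) / 8.0) - 3.0)

def train_one_step(step: int, hour: float) -> None:
    """Placeholder for one checkpointed optimizer update."""
    print(f"{hour:4.1f}h  surplus available, running step {step}")

def run_flexible_training(total_steps: int) -> None:
    """Advance the job only while surplus is above the threshold; otherwise sit idle."""
    step, hour = 0, 0.0
    while step < total_steps and hour < 24.0:
        if solar_surplus_mw(hour) >= SURPLUS_THRESHOLD_MW:
            train_one_step(step, hour)  # a real job would resume from its last checkpoint here
            step += 1
        hour += 0.5                     # simulated clock advances in half-hour slots

if __name__ == "__main__":
    run_flexible_training(total_steps=12)
```

In practice the surplus signal would come from an electricity-price or grid-carbon feed rather than a hard-coded curve, and “pausing” would mean checkpointing the job, but the shape of the loop, running only while surplus power is available, stays the same.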