The race to meet the energy demands of Artificial Intelligence (AI) is on, and Google Cloud is tackling it with a three-pronged strategy. Thomas Kurian, CEO of Google Cloud, has called energy the 'most problematic thing' in AI development, and says the company is taking proactive steps to address it. In a recent interview, Kurian noted that Google Cloud was working on AI well before large language models took the spotlight, and that its machines were designed from the start for high energy efficiency.

The scale of the problem is stark. The International Energy Agency estimates that some AI-focused data centers consume as much electricity as 100,000 homes, and demand for data center capacity is expected to grow by 46% over the next two years.

Google Cloud's response rests on three approaches. First, it is diversifying the energy sources that power AI computation, recognizing that not all forms of generation can absorb sudden spikes in demand. Second, it is optimizing energy efficiency, including reusing energy within data centers and applying AI to control systems that monitor thermodynamic exchanges. Third, it is exploring fundamental new technologies for generating energy.

This strategy matters because energy supply is as critical to AI development as innovations in chips and improved language models. The ability to build data centers is another potential chokepoint, and one where China currently holds an advantage over the US. The open question is whether Google Cloud's strategy will be enough to meet AI's energy demands, or whether a bottleneck in the energy supply chain lies ahead.