Because some use cases are truly more pressing than others.
The latest edition of The Good AI Conversation went live last month, covering the relationship between artificial intelligence and the world's energy crisis. We invited four guest speakers, each working on applying AI to address that crisis, to share their expertise and viewpoints.
Who Were the Guests?

First, we had Priya Donti, the Co-founder and Executive Director of Climate Change AI, and an incoming assistant professor at MIT. Climate Change AI is a non-profit organization that was founded in 2019 to mobilize work to address climate change using machine learning. Since its founding, the organization has built a global community, provided education, created infrastructure, and advanced the discourse on the use of AI for a more sustainable future.
Second, we had Georg Rollmann, Head of Advanced Analytics and AI at Siemens Energy, one of the world’s leading energy technology companies. The company works with its customers and partners on energy systems for the future, thus supporting the transition to a more sustainable world. With its portfolio of products, solutions, and services, Siemens Energy covers almost the entire energy value chain, from power generation and transmission to storage. Georg looks across the value chain to identify where AI can be applied to optimize the entire process.
Third, we had Nicolas Bossé, Chief Energy Transition Officer at Brainbox AI. Brainbox AI is a start-up founded in 2019 and based in Montreal, Canada. The company’s goal is to decarbonize commercial real estate and provide grid flexibility. It uses deep learning, cloud-based computing, and a proprietary process to support 24/7 self-operating buildings that require no human intervention and enable maximum energy efficiency. They manage the HVAC systems, which are responsible for around 50% of energy consumption in commercial buildings.
Lastly, we had Anton Frisk, Co-founder and COO at Southern Lights, based in Stockholm, Sweden. Southern Lights works in the area of green hydrogen, looking at how traditional industries can transition to greener energy supplies. The company is creating the next generation of AI-powered digital tools to help project developers in hydrogen production, using AI to speed up the development and scaling of this energy transition.
AI Applications to Mitigate the Energy Crisis

The guest speakers emphasized the ways in which AI serves as an optimization tool across the energy value chain: monitoring methane leaks and gas systems, collecting location data, forecasting prices, markets, supply, and demand, and accelerating scientific discovery.
For example, computer vision technologies are being used to better understand the degradation of components in the field, and to help technicians predict and accommodate issues before they arise.
Historically, energy efficiency and energy demand were managed independently. But as the grid has evolved, the two have become intertwined and now demand a more integrated approach. AI can be a key tool for managing these interdependencies, building on the tools and assets that already exist in the sector.
Global energy systems and requirements differ, and certain events can further widen and complicate this disparity. Such is the case with the energy crisis exacerbated by the war in Ukraine and the surrounding political tension. Priya Donti points out the need to ensure that populations have universal access to reliable energy sources. In this arena, AI has contributed by estimating current demand and extrapolating latent or future demand, to optimize energy distribution overall.
Furthermore, Anton adds that AI will be crucial for the designers and developers of hydrogen production projects. It will give these professionals a broader understanding of the best available configurations given the data at hand, leading to better efficiency and better decisions overall. And Georg emphasizes the learning aspect of AI in dynamic scenarios: in the Internet of Things arena, for example, algorithms must be able to adapt and reconfigure themselves as situations change.

However, Priya also points out that there are places in the energy system where AI may not be appropriate, for example where strong design decisions are being made about the values we want to embed in our energy systems. An illustration of this is the design of an energy market, where strong value judgments are made about the distribution of prices, which can negatively affect different strata of society. In such a scenario, Priya points out, it would be problematic to try to use AI to automate the system design completely, since doing so would likely exacerbate existing inequities in the system.
Some of the Roadblocks to the Use of AI for the Energy Crisis

The first roadblock, highlighted by Georg, is the acceptance and understanding of AI. There is a need for education on topics like transparency and trustworthiness, so that people can understand what AI can and cannot do, and capitalize on that capacity.
From a practical point of view, Nicolas believes that AI literacy is critical. At Brainbox AI, they work with commercial real estate and energy markets, which are highly conservative and reluctant to adopt new and disruptive technologies; AI is one of the toughest technologies to sell. Anton agrees entirely, further pointing out that machine learning models require more data openness, which feeds back into the need for the entire energy sector to better understand AI.
Another roadblock is the need to further develop the methodologies on the AI and machine learning side. Priya points out that there are no shared simulation environments or agreed-upon metrics for evaluating whether a method has succeeded, made correct assumptions, and works well. Ideal versions of these simulations, or digital twins, would be invaluable, but waiting for them might mean waiting too long. As such, Priya believes what the industry needs is a good-enough proxy for what a power grid generally looks like, together with agreed-upon metrics, which would allow for broader deployment.
Last but not least, they discussed the need for global governmental agreements, and how companies can and should act now rather than wait for those agreements to be established. This roadblock also ties in with data usage and privacy, which should be handled democratically and without demonizing AI: not all data are created equal, and not everything needs to be private.
Green AI: The Energy Problem of AI Itself
The green AI problem refers to the fact that deploying AI itself requires a significant amount of energy, presenting a cost-benefit conundrum. Caroline states that, according to a paper on the footprint of computing, ICT will account for between 7% and 20% of global demand by 2030.
Priya comments that the industry needs more alignment even on how to measure the impact itself: the numbers are hotly contested, with the International Energy Agency estimating closer to 2% for ICT. Monitoring and measuring therefore require more than a micro-level view; we need to pinpoint which aspects of ICT are driving which impacts, so that we can better forecast and address the issue today. Moreover, in terms of addressing the emissions arising from AI, there are a couple of aspects to consider.
First is the operational aspect: the electricity consumed to run AI algorithms. Then there are the emissions embodied in the hardware those algorithms run on. At first this may seem incredibly worrying; however, Priya points out that the emissions associated with ICT and data centres have actually stayed roughly constant over the last decade, due in large part to hardware efficiency improvements that have kept pace with the growth in computing. But Priya warns that we should not count on this trend continuing; there is a need to be proactive.
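To make the operational/embodied distinction concrete, here is a minimal back-of-the-envelope sketch (ours, not the panel's) of how one might split the footprint of a single training run. Every number, function name, and parameter below is an illustrative assumption, not a measured value.

```python
# Rough split of a training run's footprint into the operational and
# embodied components discussed above. All inputs are illustrative placeholders.

def operational_emissions_kg(energy_kwh: float, grid_intensity_g_per_kwh: float) -> float:
    """Emissions from the electricity consumed while the job runs."""
    return energy_kwh * grid_intensity_g_per_kwh / 1000.0  # grams -> kilograms

def embodied_emissions_kg(hardware_footprint_kg: float,
                          hardware_lifetime_hours: float,
                          job_hours: float) -> float:
    """Share of the hardware's manufacturing footprint amortised over this job."""
    return hardware_footprint_kg * (job_hours / hardware_lifetime_hours)

job_hours = 24.0              # assumed length of the training run
energy_kwh = job_hours * 0.4  # assumed 400 W average draw
total_kg = (
    operational_emissions_kg(energy_kwh, grid_intensity_g_per_kwh=300.0)
    + embodied_emissions_kg(hardware_footprint_kg=150.0,
                            hardware_lifetime_hours=4 * 365 * 24,
                            job_hours=job_hours)
)
print(f"Estimated footprint: {total_kg:.2f} kg CO2e")
```

The point of the sketch is simply that both terms matter: cutting the operational term (cleaner electricity, shorter runs) does nothing for the embodied term, which only amortises over the hardware's useful life.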
Data centres replace their hardware quite frequently to keep it efficient and mitigate computational emissions. But as the practice continues, the embodied emissions of that hardware keep growing; it is a wicked problem spanning the hardware and computational sides. From a practical perspective, Nicolas describes Brainbox AI's approach as always aiming to reduce the amount of data used to train their models while still driving the energy efficiencies they deliver to their clients.
Brainbox AI is also careful to train its models when the grid is less carbon intensive, to ensure it contributes to, rather than worsens, the planet's situation. Admittedly, this is a question they keep asking themselves at the company: how can they do more with less? More concretely, they train their models at night and in locations where the grid's carbon intensity is lower. For Southern Lights, Anton says they are currently supporting a company that replaces data centres' diesel backup generators with hydrogen systems to cut emissions. Batteries, he adds, can act as further replacements, reducing not necessarily the energy usage but the carbon footprint.
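As a rough illustration of this kind of carbon-aware scheduling (a sketch of ours, not Brainbox AI's actual system), a training job could simply defer itself until the grid's carbon intensity drops below a threshold. The `fetch_grid_intensity` function is a hypothetical stand-in for whatever regional grid-data source a team actually uses.

```python
import time

CARBON_THRESHOLD_G_PER_KWH = 200.0  # assumed cutoff for "clean enough" electricity
POLL_INTERVAL_SECONDS = 30 * 60     # re-check the grid every half hour

def fetch_grid_intensity() -> float:
    """Hypothetical placeholder: return the current grid carbon intensity (gCO2/kWh).
    A real system would query a regional grid-data provider here."""
    return 180.0  # fixed dummy value so the sketch runs end to end

def wait_for_clean_grid() -> None:
    """Block until the grid drops below the carbon-intensity threshold."""
    while fetch_grid_intensity() > CARBON_THRESHOLD_G_PER_KWH:
        time.sleep(POLL_INTERVAL_SECONDS)

def run_training_job() -> None:
    """Placeholder for the actual model-training routine."""
    print("training while the grid is relatively clean")

if __name__ == "__main__":
    wait_for_clean_grid()  # defer training until electricity is less carbon intensive
    run_training_job()
```

In practice the same idea extends to choosing *where* to train, not just when, by comparing intensities across regions before dispatching the job.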
Lastly, Georg points out the role of reusability in greener AI: perhaps a more incremental approach, which takes advantage of whatever is already available, would make the creation of AI models more sustainable.
Priya adds to this by outlining three approaches research has often taken:
- Can we create sparser models with fewer parameters that still perform well, and therefore use less energy for computation?
- Can we come up with a base model that generalizes well and can simply be fine-tuned for a particular purpose, consuming less energy overall?
- Can we start with a good-enough model powered by online learning, so that when new data arrives we simply incorporate it rather than re-train an entirely new model, which would consume more energy? (A rough sketch of this idea follows the list.)
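As one hedged illustration of the third approach, an online-learning model such as scikit-learn's `SGDRegressor` can absorb new batches of data via `partial_fit` instead of being re-trained from scratch. The load-forecasting framing and the synthetic data here are our own assumptions, not something the panel specified.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)

def make_batch(n=256):
    """Synthetic stand-in for a new batch of measurements (e.g. weather -> load)."""
    X = rng.normal(size=(n, 4))
    y = X @ np.array([2.0, -1.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=n)
    return X, y

# Fit once on an initial batch...
model = SGDRegressor(learning_rate="constant", eta0=0.01)
X0, y0 = make_batch()
model.partial_fit(X0, y0)

# ...then, as new data arrives, update the existing model incrementally
# instead of re-training a new model from scratch each time.
for _ in range(10):
    X_new, y_new = make_batch()
    model.partial_fit(X_new, y_new)

print("updated coefficients:", np.round(model.coef_, 2))
```

Each incremental update touches only the new batch, which is the energy-saving property Priya's third question points at.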
Georg and Nicolas close on this idea by pointing out that models trained in these ways would prove beneficial across the world: we could import and export models as the need arises, lowering the worldwide energy consumption of AI. Lastly, Priya closes with the importance of a cost-benefit analysis for the further development of AI models, weighing the advantage they create against the extra energy required to gain it.
Conclusion
The use of AI to mitigate the world's energy crisis is not only powerful but essential. Together, the guests pointed out the various ways in which AI can be used across the entire value chain, from energy production to optimization and forecasting. With the right applications and interventions in place, the benefits are huge.
But numerous roadblocks remain to fully realizing what AI can do for the energy crisis. Some are more difficult than others, but the first step to solving any problem is awareness, and the fact that this conversation exists is a good sign. We at The Good AI hope the conversation has brought you not only insight but also the inspiration to act; it certainly galvanized our team!
ABOUT THE AUTHOR
Michelle Diaz …