The sustainability of AI

Mar 29, 2024

While AI promises unparalleled efficiency, productivity, and innovation, questions regarding its environmental impact loom large. How can AI’s development be maintained against a backdrop of environmental responsibility?
Article published by Silicon UK on 28th March 2024, featuring atNorth CSMO Fredrik Jansson

At its core, AI relies on vast computational power to process and analyse data, driving machine learning algorithms and powering intelligent decision-making processes. This insatiable demand for computing resources comes with a hefty environmental price tag. According to the IEA, datacentres currently consume 1.5% of all electricity generated globally. And with NVIDIA estimated to be shipping 1.5 million AI servers by 2027, the energy consumption of these vast compute centres would exceed the energy needs of some small countries.
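To put that comparison in perspective, a back-of-envelope calculation can be sketched in a few lines of Python. The per-server power draw and utilisation used here are illustrative assumptions (roughly what a fully loaded multi-GPU server might draw), not figures from the article or the IEA.

```python
# Back-of-envelope estimate of the annual electricity use of a fleet of AI servers.
# The per-server power draw and utilisation below are illustrative assumptions,
# not figures from the article or the IEA.

NUM_SERVERS = 1_500_000        # AI server shipments projected for 2027 (from the article)
AVG_POWER_KW = 10.0            # assumed average draw of a multi-GPU AI server, in kW
UTILISATION = 0.8              # assumed fraction of time servers run near that draw
HOURS_PER_YEAR = 24 * 365

annual_kwh = NUM_SERVERS * AVG_POWER_KW * UTILISATION * HOURS_PER_YEAR
annual_twh = annual_kwh / 1e9  # 1 TWh = 1 billion kWh

print(f"Estimated annual consumption: {annual_twh:.0f} TWh")
# ~105 TWh/year under these assumptions -- roughly the annual electricity
# consumption of a mid-sized European country.
```

Even if the assumed per-server figures are off by a factor of two in either direction, the result stays at a national scale, which is the point the shipment projection is making.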

The energy consumption of AI infrastructure, including datacentres and high-performance computing facilities, contributes significantly to carbon emissions. As AI applications proliferate and data volumes soar, the strain on energy resources intensifies. In addition, the manufacturing and disposal of AI hardware components, such as specialised chips and servers, further exacerbate environmental concerns, contributing to electronic waste accumulation.

“As AI becomes integral to business modernisation and social progress, we have a responsibility to embed sustainable principles throughout its implementation,” Vikram Nair, President of EMEA Business at Tech Mahindra, explained to Silicon UK. “Organisations with foresight, which are investing in long-term research and planning around AI rather than focusing on short-term gains, are positioning AI as part of their Net Zero strategies and integrating it into demand prediction, waste reduction, and efficiency improvements across operations. By investing in robust, sustainable infrastructure and establishing clear standards and accountability, businesses can use AI to improve their entire organisational processes in ways that are both efficient and environmentally responsible.”

As the urgency to address climate change intensifies, the imperative for sustainable AI practices becomes ever more apparent. From research laboratories to corporate boardrooms, stakeholders across the AI ecosystem are embracing sustainability as a guiding principle in technology development and deployment.

One notable initiative is the pursuit of renewable energy sources to power AI infrastructure. Tech giants such as Google, Microsoft, and Amazon have committed to powering their datacentres with renewable energy, reducing their carbon footprint and setting a precedent for sustainable computing practices. Additionally, collaborations between academia, industry, and policymakers are fostering innovative solutions for sustainable AI development.

Powering the machine
Amid growing apprehensions about AI’s environmental impact, industry stakeholders are increasingly focusing on enhancing energy efficiency and mitigating carbon emissions. Innovations in AI hardware design, including the development of low-power processors and energy-efficient architectures, aim to optimise performance while minimising energy consumption.

Fredrik Jansson, Chief Strategy and Marketing Officer at atNorth, explained how the location of datacentres will have a profound impact on their sustainability credentials:

“AI requires a significant investment in digital infrastructure to allow for the storage and almost instantaneous processing of vast amounts of data. Datacentres that accommodate these workloads require significant cooling systems that use a huge amount of energy at considerable environmental and financial cost. It might seem that businesses face an almost impossible task in balancing the need for the best possible infrastructure in the right location to support digitisation with the need to drive increasingly critical sustainability initiatives. Yet the solution lies in the choice of datacentre.”

Jansson concluded: “Datacentres located in regions with a consistently cool climate and a surplus of renewable energy can offer a stable, long-term power supply that is significantly cheaper than natural gas alternatives. Modern datacentres built in cooler regions such as the Nordics utilise the climate to enable more energy-efficient infrastructure and allow for heat recovery technology that permits excess heat to be reused to heat local communities.”
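The efficiency gain Jansson describes is often expressed through power usage effectiveness (PUE), the ratio of total facility power to IT power. The short sketch below compares two hypothetical sites; the PUE values and IT load are illustrative assumptions rather than atNorth figures.

```python
# Illustrative comparison of total facility energy at different PUE
# (power usage effectiveness) values. Both PUE figures and the IT load are
# generic assumptions, not numbers from atNorth or the article.

IT_LOAD_MW = 10.0          # assumed IT (server) load of a datacentre, in megawatts
HOURS_PER_YEAR = 24 * 365

def annual_facility_gwh(pue: float) -> float:
    """Total annual facility energy in GWh for a given PUE (facility power / IT power)."""
    return IT_LOAD_MW * pue * HOURS_PER_YEAR / 1000  # MWh -> GWh

warm_site = annual_facility_gwh(1.6)  # assumed mechanically cooled site in a warm climate
cool_site = annual_facility_gwh(1.2)  # assumed free-cooled site in a cool climate

print(f"Warm-climate site: {warm_site:.1f} GWh/year")
print(f"Cool-climate site: {cool_site:.1f} GWh/year")
print(f"Saving:            {warm_site - cool_site:.1f} GWh/year")
# ~35 GWh/year less cooling and overhead energy for the same IT load --
# before counting any excess heat that is recovered and reused.
```

Under these assumptions the cooler site avoids tens of gigawatt-hours of overhead energy per year, and heat recovery adds further value on top of that saving.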

Read the complete article online here.
