Environment

Computation kills the planet
The inspiration for this post came from a recent paper that estimated the carbon footprint of machine learning models. The authors estimated how much energy it would take to train one of the most sophisticated natural language processing models ever developed. Based on current hardware, it's an environmental disaster: the model was estimated to leave a carbon footprint the size of the one six cars produce over their entire lifetimes. Ouch.


Yea, ouch.

But this shouldn't be a surprise
Data centers, which are often used to run complex AI/ML calculations, use a lot of energy - and they're growing quickly. In 2016, data centers accounted for 2% of all energy consumption in the US. Three years later, that number had roughly doubled to almost 5%. Obviously, that's a lot of energy.

To counter the enormous energy consumption, cloud providers have been moving to renewable energy. For the second year in a row, Google purchased enough renewable energy to match 100% of their electricity consumption. Google is also planning to build massive solar farms in the southern United States.

Amazon - Chief Environment Destroyer
Others haven't been so good. Amazon announced an initiative to go 100% renewable in 2014 and reached about 50% in 2018. However, Amazon hasn't announced any new renewable deals since 2016, and research by Greenpeace shows that its recent data center builds in Virginia haven't included any renewable sources at all. To top it off, Gizmodo speculates that the lack of renewable focus is because AWS is trying to win clients in the oil and gas sector.

EVEN FUNNIER - when you look at the top 500 supercomputers in the world, energy companies (Total and Eni) run the two most powerful machines in the private sector, and the US Department of Energy has a hand in seven of the top 20 supercomputers in the world (including the most powerful).

So, energy companies burn a tremendous amount of energy to find a tremendous amount of energy - which people then burn.

Anyways, getting back to machine learning
The irony in all of this is that the brain, which has more computing power than we can imagine, uses almost no energy at all. It runs on about 20 watts, less power than a light bulb, and hundreds of millions of times less energy than it took to train the machine learning model in our original example.
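To get a feel for the scale, here's a back-of-envelope comparison in Python. The 650 MWh training figure is an assumed round number for illustration only, not the paper's actual estimate; the point is the order of magnitude, not the exact ratio.

```python
# Back-of-envelope sketch: total energy to train one large model vs. the
# energy a 20 W brain uses in an hour. The 650 MWh training figure is an
# assumed, illustrative number, not the paper's exact estimate.

BRAIN_POWER_W = 20                   # rough power draw of the human brain
TRAINING_ENERGY_WH = 650_000 * 1000  # assumed ~650 MWh, in watt-hours

brain_wh_per_hour = BRAIN_POWER_W * 1   # 20 Wh for one hour of thinking
ratio = TRAINING_ENERGY_WH // brain_wh_per_hour

print(f"Training energy ≈ {ratio:,} brain-hours")  # tens of millions
```

However you slice the comparison, a single training run works out to tens of millions of hours of brain power - which is what makes the 20-watt brain such a striking benchmark.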

Companies are now starting to reduce the amount of power AI/ML chips use, and are trying to mimic the brain. Graphcore, a UK-based startup, is developing specialized AI/ML chips that run at 120W, about the same as a light bulb. Tesla is also trying to reduce the power consumption of the chips in its self-driving cars, since an electric car with power-hungry chips is a recipe for disaster.

You would think AI/ML could help us
In theory, AI/ML should help us find ways to improve the environment. The World Economic Forum published a report in January 2018 outlining how AI/ML could help, listing applications such as better management of electricity, predicting solar flares to protect power grids, and autonomous deep-sea assessments.

Using AI/ML to deploy energy more efficiently seems to be the low-hanging fruit. Google has already deployed AI/ML to improve energy utilization. More recently, it started using the technology to better predict the output of wind farms. GE is doing something similar, using AI/ML to “adjust how the power plant equipment runs in real time to help achieve more efficiency, flexibility and capacity.”

There are other AI/ML environmental projects kicking off, but they are still in the early days. We aren't close to having all energy be part of a “smart” grid or having chips that use little power, but the hope is that one day we'll get there.

Let's just hope we don't kill the environment with AI by trying to have AI save it.