In 2020, Akamai’s global platform used 10 times less energy per unit of capacity than it did in 2015, according to the company. At the same time, traffic grew by more than 350%. These gains are due to the company’s widely distributed network and focus on edge computing.
Akamai uses machine learning to cache the most relevant data on its edge servers, eliminating repetitive fetches to the origin. This saves storage space and CPU power, which in turn reduces overall energy use and carbon emissions.
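The idea of caching at the edge to avoid repeated origin fetches can be illustrated with a minimal sketch. This is not Akamai's implementation; the class and names are hypothetical, and a real CDN would add eviction, expiry, and ML-driven prefetching on top of this basic pattern.

```python
# Minimal illustration of edge caching: only the first request for an
# object travels to the origin; every later request is served locally.

class EdgeCache:
    def __init__(self, fetch_origin):
        self._fetch_origin = fetch_origin  # callable simulating an origin fetch
        self._store = {}
        self.origin_fetches = 0

    def get(self, key):
        if key not in self._store:  # cache miss: one trip to the origin
            self._store[key] = self._fetch_origin(key)
            self.origin_fetches += 1
        return self._store[key]     # cache hit: request stays at the edge

cache = EdgeCache(lambda k: f"object:{k}")
for _ in range(100):
    cache.get("/popular/asset.js")
print(cache.origin_fetches)  # 1 — 99 of 100 requests never left the edge
```

Every origin round-trip avoided is network transfer and server work that never happens, which is where the energy savings come from.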
The company has about 325,000 servers in 1,435 networks in 135 different countries.
Michael Gooding, the company’s lead web performance architect, said the company is constantly looking for energy efficiencies in data center operations.
“Given the size of our network, being even 1% more efficient adds up to being quite a lot,” he said.
The company is also working to use more green energy. Its U.S. sources include wind farms in Texas and Illinois and a solar array in Virginia.
“Those three projects alone cover 23% of our global power needs already,” Gooding said.
The company announced its sustainability goals on Thursday. These targets include:
- 100% renewable energy by 2030
- 50% more energy-efficient platform
- 100% platform emissions mitigation
- Responsible supply chain management
- Global expansion of 100% electronic waste recycling program
Gooding said that reducing power consumption is good for Akamai’s bottom line and also could provide a competitive advantage.
“We’ve seen a 30% increase in RFPs with mentions of sustainability, so we know that our customers are starting to care about this,” he said.
Gooding said customers are looking to offload compute and storage needs to Akamai and get help optimizing operations.
“We can detect what device is coming and automatically send the right image size down to the device, which also helps the user with battery use,” he said.
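Device detection followed by image-size selection might look something like the sketch below. The breakpoints, device classes, and user-agent heuristics are illustrative assumptions, not Akamai's actual logic, which would be far more sophisticated.

```python
# Hypothetical sketch: pick an image width appropriate to the requesting
# device, so phones never download desktop-sized images.

IMAGE_WIDTHS = {"phone": 480, "tablet": 1024, "desktop": 1920}  # assumed breakpoints

def pick_image_width(user_agent: str) -> int:
    ua = user_agent.lower()
    if "mobile" in ua:                   # crude phone check for illustration
        return IMAGE_WIDTHS["phone"]
    if "tablet" in ua or "ipad" in ua:   # crude tablet check
        return IMAGE_WIDTHS["tablet"]
    return IMAGE_WIDTHS["desktop"]

print(pick_image_width("Mozilla/5.0 (iPhone; Mobile)"))   # 480
print(pick_image_width("Mozilla/5.0 (Windows NT 10.0)"))  # 1920
```

Sending a 480-pixel-wide image instead of a 1920-pixel one cuts the bytes transferred and the decode work on the handset, which is the battery benefit Gooding describes.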
Gooding shared the example of a retail client that wanted to personalize geolocation services. Sending a ZIP code lookup to the cloud took about 500 milliseconds, but doing the same task at the edge took only 10 milliseconds.
“If you’ve already broken out this task into a separate service, it’s really easy to move that to the edge,” he said.
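The pattern Gooding describes can be sketched as follows: a ZIP code lookup that is already a small, self-contained service with no other dependencies, so the same function can run in a central cloud or on an edge node unchanged. The lookup table and function name here are illustrative assumptions, not the retailer's actual service.

```python
# A self-contained ZIP-code-to-store lookup. Because it has no external
# dependencies, deploying it at the edge is trivial — the same code that
# took ~500 ms round-tripping to the cloud answers in ~10 ms nearby.

ZIP_TO_STORE = {  # hypothetical data for illustration
    "10001": "Store #12 - Manhattan",
    "60601": "Store #7 - Chicago Loop",
}

def lookup_nearest_store(zip_code: str) -> str:
    return ZIP_TO_STORE.get(zip_code, "No nearby store found")

print(lookup_nearest_store("60601"))  # Store #7 - Chicago Loop
```

The 500 ms versus 10 ms figures from the article imply roughly a 50x latency reduction, which is the payoff for moving an already-isolated service to the edge.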
Gooding said the company’s distributed network gets the CDN closer to users, which may improve service overall as internet traffic continues to grow.
“We predict that as the core of the internet gets more and more congested, for companies to get the bandwidth they need, they’ll have to have capacity at the edge because it doesn’t exist in the middle,” he said.