Living on the green edge with localized data centers

Data is the new oil, but is it just as dirty? Living on the edge may be the more sustainable answer.
09 November 2023

Scrolling through TikTok has become one of the world’s most popular pastimes. However, behind each short-form video is a package of data that needs to be processed before the next skit or dance video can appear. Along with TikTok, the rise of many other social media platforms in the last decade has led to an explosion in global data demand.

Large data centers, or hyperscalers, have long sustained this demand. However, these huge data-processing facilities produce high levels of carbon emissions – they make up over 3.5 percent of global greenhouse gas emissions. Each center can use up to five million gallons of water daily to cool its thousands of data server racks.

Demand for data shows no signs of abating, but the data industry cannot continue relying on hyperscalers. A new data solution is necessary to ensure that the industry does not end up harming the environment in its bid to meet data demand.

Safer to live on the edge

Instead of dreaming big, the industry needs to start thinking on a micro level. Enter edge data centers.

Unlike hyperscalers, edge data centers are smaller and built closer to the end-user. They are housed at strategic locations nearer to the source of the data they are receiving, at the edge of the cloud.

These centers employ edge computing, meaning that data is processed locally in the city or nearby regions, rather than in large hyperscalers located many miles away or overseas. Sending data over a shorter distance means lower latency, leading to faster loading times for websites on devices.
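
As a rough, hypothetical illustration of why distance matters (the distances below are assumed, not drawn from any specific deployment): light travels through optical fiber at roughly 200,000 km per second, so the round-trip propagation delay alone grows in direct proportion to how far the data has to travel.

\[
t_{\text{round trip}} \approx \frac{2d}{200{,}000\ \text{km/s}},
\qquad
d = 50\ \text{km} \;\Rightarrow\; \approx 0.5\ \text{ms},
\qquad
d = 5{,}000\ \text{km} \;\Rightarrow\; \approx 50\ \text{ms}
\]

Real-world latency also includes routing, queuing, and processing delays, but the propagation term alone shows the advantage of keeping data close to the user.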

Local edge data networks also translate to greener operations. They reduce the amount of data that needs to be sent to hyperscalers located far offshore for processing. This significantly reduces energy consumption. Devices also work more efficiently, given the shorter data-processing time.

Edge data centers take up less physical real estate and have the advantage of low latency, which improves data transmission speeds.

As the industry begins to recognize the benefits of edge computing, several data companies have started setting up edge data centers in Southeast Asia to meet growing demand. Singapore-based companies are looking beyond the tiny island to build their edge data centers overseas, in a bid to circumvent land limitations and capitalize on lower costs.

As part of the “SG+ Strategy,” these companies have found feasible locations nearby – such as Batam in Indonesia, and Johor in Malaysia – to construct their centers and provide high-quality data coverage in Singapore.

One such example is Digital Edge, a Singapore-based data center company that opened a facility in Manila in March this year. Besides providing faster data processing for its clients, Digital Edge has also reaped the environmental benefits of building an edge data center: it reported a Power Usage Effectiveness (PUE), which measures the energy efficiency of a data center, of 1.193 at its Manila facility, significantly lower than the global average of 1.5 and close to the perfect efficiency score of 1.0.
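
For context, PUE is simply the ratio of a facility's total energy draw to the energy consumed by its IT equipment; the kilowatt figures below are purely illustrative and are not taken from Digital Edge's reporting.

\[
\text{PUE} = \frac{\text{total facility energy}}{\text{IT equipment energy}},
\qquad
\text{e.g. } \frac{1{,}193\ \text{kW}}{1{,}000\ \text{kW}} = 1.193
\]

A PUE of 1.0 would mean every watt entering the facility goes to computing rather than to cooling, lighting, or power conversion losses.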

The journey from massive hyperscalers to more efficient edge data centers marks a turning point in the data industry's environmental responsibility. By embracing edge computing and strategic geographical placement, companies are not only processing data faster; they are also doing it greener.

Maintaining the green edge

Edge data centers can work in tandem with clean energy sources to reduce the overall carbon footprint.

However, edge data centers are only as sustainable as their infrastructure.

Greener sources such as wind, solar, and geothermal energy are becoming the go-to options for powering data centers. The industry is also weighing other energy sources, including natural gas and nuclear power, and is likely to make location decisions based on access to such power.

One of the most popular potential locations for new data centers is Indonesia. Home to some of the world's largest geothermal potential and significant hydropower capacity, the nation is becoming an attractive location for green data centers. Amazon, for instance, signed wind and solar power purchase agreements in both India and Indonesia in late 2022. Similarly, Princeton Digital Group has bought renewable energy certificates for geothermal energy for its three data centers in Indonesia.

Keeping it cool

Cooling systems are essential for data centers, but they need to take a more sustainable route in the long run.

Most importantly, data centers use large volumes of water and energy in the heat management systems that cool their server racks. The water and resources needed to operate edge data centers should be scaled and monitored to fit their smaller footprint and avoid wastage.

Innovative infrastructure is required to ensure edge data center operations are green. One such solution is high-density liquid cooling. This method includes “direct-to-chip” cooling, where a refrigerant, water, or non-conductive liquid is piped directly into the server chassis. This brings the coolant straight to the heat source, the processing chips themselves, rather than to the surrounding environment as traditional cooling systems do.

Full immersion cooling is another approach: servers are submerged in a bath of non-conductive liquid. Unlike traditional systems, there is no constant flow of water to the server racks, so less energy and water are used.

Pressure to address carbon emissions is pushing the search for new sites that make the most of natural and renewable resources for powering and cooling data centers. For example, some companies have built data centers in countries such as Iceland, where ample infrastructure for harnessing renewable electricity and natural air cooling from the cold climate make it an ideal place to locate a data center.

Other companies have also found unconventional locations to house new data centers. Highlander, a Chinese company, has just built its first commercial underwater edge data center, while Singapore-based Atlas Technology plans to do the same soon.

Building data centers in low-temperature environments means less energy is required to cool server racks that are already at a lower base temperature. Additionally, power from the surrounding waves can be harnessed to run these centers, allowing them to keep functioning with a minimal carbon footprint.

As the world navigates the digital landscape, demand for data will inevitably continue to surge. By reshaping data centers and ensuring that data is harnessed responsibly, edge computing can be the data industry’s key to redefining its footprint as it paves a greener path forward.
