A member of the family that founded LG Electronics has teamed up with a provincial government in South Korea to build what is being billed as the world's largest AI data centre – a venture that could significantly reshape how AI infrastructure is distributed globally.
Brian Koo, co-founder of Stock Farm Road Inc., has signed a Memorandum of Understanding with Jeollanam-do Province's Governor Kim to build the massive facility. Construction is scheduled to begin this year, with completion targeted for 2028.
The size is breathtaking
At full build-out, the data centre is planned to reach 3 gigawatts of capacity – a scale never before seen in a single facility. To put that in perspective, 3 gigawatts is roughly the combined output of three large nuclear reactors, enough electricity to supply millions of homes.
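For a rough back-of-the-envelope sense of that scale (assuming, purely for illustration, an average household draw of about 1.2 kW – a figure not taken from the announcement), the arithmetic looks like this:

```python
# Back-of-the-envelope: how many average homes could a 3 GW facility supply?
# The 1.2 kW per-household figure is an illustrative assumption, not from the announcement.
DATA_CENTRE_CAPACITY_W = 3e9     # 3 gigawatts at full build-out
AVG_HOUSEHOLD_DRAW_W = 1.2e3     # ~1.2 kW average draw per home (assumed)

homes_equivalent = DATA_CENTRE_CAPACITY_W / AVG_HOUSEHOLD_DRAW_W
print(f"Roughly {homes_equivalent:,.0f} homes")  # prints: Roughly 2,500,000 homes
```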
The project is expected to generate about $3.5 billion in revenue annually from its very first year, with a potential total value of $35 billion over time. Those figures are not wishful thinking – they reflect the enormous demand for AI computing infrastructure created by the global rush towards generative AI.
Sophisticated infrastructure
According to a press release, the complex will feature not only advanced cooling systems – needed to handle the extreme heat produced by AI workloads – but also regional and international fibre bandwidth. Most importantly, it is designed to cope with sudden, heavy fluctuations in load on the power grid.
AI data centres often see rapid shifts in demand for computing resources during model training or unexpected surges in usage, and they depend on a reliable power supply to stay up and running. The ability to ramp power up and down quickly is therefore a major advantage.
More than just infrastructure
The project is being driven by Koo's new venture Fir Hills Inc. (a part of Stock Farm Road Inc.). Amin Badr-El-Din, a co-founder alongside Koo, positioned the project as "more than just a technological milestone."
"It's taking great steps into the future for the country in the world of technology," he uttered. "We feel proud to be partnered late with Stock Farm Road and the Jeollanam-do government to this crucial infrastructure of creating something that will help us establish the ground for new AI."
Koo, who has long been closely involved in Asia's technology sector, believes the ramifications are even wider: "Having been fortunate enough to see first-hand how technologically powerful Asia's biggest companies are, I believe this project has great potential to lift Korea and the whole region, both technologically and economically. This data centre is not an ordinary infrastructure project but the path to a new digital industrial revolution."
The global context
The project comes at a time when nations around the world are competing to build AI infrastructure. The United States holds first place, with enormous data centres run by Microsoft, Google, Amazon, and others. China is making rapid gains despite US restrictions on advanced chip exports. And now South Korea appears to be making a bold bid to become a major AI hub.
For South Korea, it is a strategically smart move. The country already hosts two of the largest memory chip manufacturers – Samsung and SK Hynix – along with cutting-edge manufacturing technology, a stable power supply, and government backing for technology initiatives. What it hasn't had is a hyperscale AI data centre; this project will fill that gap.
The challenges ahead
Building a 3-gigawatt data centre by 2028 is a formidable challenge. The undertaking will demand a massive power supply – no small feat even in a country with South Korea's infrastructure. It must secure vast amounts of state-of-the-art computing hardware, including GPUs that are in short supply worldwide. And it must actually attract the firms whose AI workloads the centre is meant to host.
The $35 billion valuation also assumes that demand for AI compute will be sustained for years to come. Current market trends support that assumption, but the AI market is highly unpredictable, and it remains uncertain whether demand will keep growing at its current rate or eventually plateau.
Down the line
South Korea's planned AI data centre marks the latest turn in the worldwide competition for AI infrastructure supremacy. Pairing the LG founding family's financial power with government backing joins private-sector expertise to public-sector endorsement – a formula that has served South Korea well in earlier technological leaps.
Whether the site ultimately earns the label of "the largest AI data centre in the world" will depend on how the project is executed and on the rival infrastructure built in the meantime. The ambition, however, is unmistakable: South Korea is positioning itself as a major player in the AI infrastructure layer that will underpin the coming technological era.
For India and other nations watching this race, the signal is explicit: building top-tier AI capability requires not only talented software developers and startups but also large-scale investment in physical infrastructure. The AI revolution will not run on laptops; it needs data centres that consume gigawatts, cooling systems that can handle extreme heat, and financial partnerships that can deploy billions in capital.


