As artificial intelligence (AI) and machine learning (ML) applications expand, so do the demands for higher bandwidth, performance, and reliability in data center interconnectivity. Industries like banking and finance require secure, high-capacity networks to support AI-driven applications while ensuring compliance with strict regulations.

Experts from Nokia, a global leader in telecommunications and information technology, explore how the rise of bespoke training and inference models is driving the need for scalable, resilient infrastructure. In collaboration with Nokia, Kyndryl, a multinational IT service provider, examines strategies to address these evolving challenges.

AI-driven transformation in banking and finance

The rise of AI-driven technologies is transforming industries, and nowhere is this more apparent than in banking and finance.


Just as the stars in a constellation are connected by invisible lines, the data centers powering these industries must align securely to ensure compliance, performance, and reliability. Manfred Bürger, global program lead for mission-critical programs for optical networks at Nokia, highlights the key concerns facing the banking and finance sectors:

“Our customers in banking and finance are really concerned about attacks on their infrastructure, such as ‘harvest now, decrypt later.’ With AI adoption accelerating, a critical question emerges: What AI resources should be deployed in the public cloud, private cloud, or a hybrid approach? These are the strategic questions the sector is asking as it navigates AI integration and security challenges.”

As banking and finance enterprises embrace digital transformation, they are evolving into fully digitalized entities, from the supply chain to value creation. Bürger explains:

“The more digital assets they create, the more operational optimization practices they seek. This includes leveraging the cloud, whether through a full cloud, hybrid, or cloud-like architecture.”

Bürger emphasizes that cloud adoption introduces a separation between an enterprise’s physical infrastructure and its cloud-based resources. In turn, trusted cloud access and secure data center interconnects are necessary to protect a company’s digital assets – the foundational value of the business.

Goodbye analog, hello digitalization

According to current market forecasts, the data center interconnect market is expected to grow from approximately $7 billion in 2024 to around $11.5 billion by 2029, driven largely by the increasing adoption of cloud computing, the rise of data-intensive applications such as AI and ML, and the need for high-speed data transfer between data centers. Bürger underscores this trend:

“We see that growth, and behind that is a massive global investment in data center infrastructure – on one hand, driven by AI, but also digitalization in general. In one of McKinsey’s studies, they talk about 60 gigawatts of data center capacity available today around the globe, and they forecast growth to be between 170 and 220 gigawatts by the end of the decade – almost tripling today’s capacity in such a short timeframe.”

“The more data centers, the more data center interconnects, which translates into a massive increase in the traffic demand that we see between data centers.”
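Taken at face value, these projections imply double-digit compound annual growth. A quick back-of-the-envelope check, assuming the capacity forecast runs roughly 2024 to 2030:

```python
# Sanity-checking the growth figures cited above as compound annual
# growth rates (CAGR). The six-year horizon for capacity is an assumption.
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

print(f"DCI market:  {cagr(7.0, 11.5, 5):.1%} per year (2024-2029)")              # ~10.4%
print(f"DC capacity: {cagr(60, 170, 6):.1%} to {cagr(60, 220, 6):.1%} per year")  # ~19-24%
```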

As enterprises seek to create added value for their customers, the trend toward digitalization is expanding, increasing the need for robust interconnectivity between enterprises, the cloud, their customers, suppliers, and partners.

James Knights, director and principal architect at Kyndryl, further emphasizes how the proliferation of AI and ML is shaping this transformation, with enterprises seizing the opportunity to deploy private cloud solutions on-site:

“While InfiniBand leads in low latency, it also leads in cost. The Ultra Ethernet Consortium is developing cost-effective alternatives with broad vendor support. With 800G interfaces available now and 1.2T per wavelength already in optical solutions, the future is closer than ever.”

This push for high-speed, cost-effective connectivity will be critical in sustaining the rapid growth of data center interconnects in the coming years.

Two-fold digitalization: Faster and wider

With both port speeds and the scale of data center interconnects rising to support AI/ML training and inference, the challenge of digitalization becomes two-fold: it is not just about meeting current demands, but about adopting forward-thinking strategies that support continuous growth and optimization.

Scaling infrastructure to accommodate digitalization places a particular strain on operations teams, especially given the shortage of highly skilled IT professionals to manage this large-scale infrastructure advancement.

Bürger points out that enterprises need more than just increased bandwidth to address their challenges – they also require versatility and scalability in the solutions they adopt:

“Instead of simply focusing on immediate needs, such as upgrading from 10 gigabits per second (Gbps) to 100Gbps, enterprises should implement automation frameworks and scalable architectures that allow for smooth expansion to 400Gbps or even 800Gbps and beyond as their business grows. This flexibility will help to scale bandwidth and connectivity efficiently, which is the key to long-term success.”
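To make the arithmetic behind that flexibility concrete, the sketch below estimates how many wavelengths a DCI link needs at each optics generation to carry the same aggregate demand; the demand figure is purely a hypothetical placeholder:

```python
# How many wavelengths does a given aggregate demand require at each
# per-wavelength rate? Upgrading optics shrinks the count dramatically.
import math

def wavelengths_needed(demand_gbps: float, rate_gbps: float) -> int:
    return math.ceil(demand_gbps / rate_gbps)

demand = 3200  # hypothetical aggregate demand in Gbps
for rate in (100, 400, 800):
    print(f"{rate}G optics: {wavelengths_needed(demand, rate):>2} wavelengths")
```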

Options, options

As enterprises fully adopt digitalization and look to establish interconnectivity with the cloud, their own data centers, or third-party facilities, the challenge becomes finding the most cost-effective and strategically advantageous way to build that connectivity, all while ensuring it meets key requirements such as availability, flexibility, performance, and control.

Enterprises have a number of options to interconnect their data centers, and the choice largely depends on the type and availability of fiber. Bürger explains that for enterprises with internal networking capabilities, building their own data center interconnect (DCI) infrastructure is a viable solution:

“Building DCI links, if dark fiber is available, provides the largest possible level of control to the enterprise. They can purchase and manage their own equipment with maximum flexibility, security, and full control over when services are activated or deactivated.”

Alternatively, Bürger notes that enterprises can explore a range of options, including telco carrier retail services and managed private builds:

“Enterprises can engage a regional or global system integrator to design a bespoke data center interconnect or access solution that meets their specific requirements in terms of bandwidth, latency, distance, reach, and security.”

Lighting up the dark

Dark fiber is fiber optic cable that has been laid but is not currently transmitting data, and it offers key opportunities for scaling enterprise connectivity. By definition, dark fiber is “unlit,” but in some regions it is unbundled, allowing companies to lease and manage the fiber themselves to create dedicated networks. This contrasts with “lit” fiber, which is already carrying traffic and is typically sold as a managed service by the operator that lit it, rather than leased as raw infrastructure. Bürger explains:

“When there's an abundance of dark fiber, it presents an opportunity for enterprises to get a regional or global system integrator to manage a private build using that dark fiber.”

“So, while an enterprise may not have the resources or desire to host and build an interconnected data center or access solution on their own, a service-driven, managed private build by a system integrator with access to dark fiber can provide the necessary infrastructure.”

Knights adds that, depending on the distance, dark fiber is often the solution of choice. The main decision revolves around whether to lease individual strands or secure the right-of-way. For shorter distances that don’t require repeaters or amplifiers, he explains, leasing fiber is straightforward and cost-effective, as all channels and speeds are available without additional charges. However, when amplification is needed, vendors can charge per channel and speed, making the right-of-way lease a more attractive option.
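A rough sketch of that trade-off, with entirely hypothetical prices, shows how per-channel fees on amplified spans can tip the economics as channel counts grow:

```python
# Comparing flat-rate strand leasing (short, unamplified spans) with
# per-channel billing (amplified spans). All figures are illustrative.
def annual_lease_cost(channels: int, amplified: bool,
                      base_per_strand: float = 24_000,
                      per_channel_fee: float = 6_000) -> float:
    cost = base_per_strand
    if amplified:  # vendor charges per lit channel once amplification is involved
        cost += channels * per_channel_fee
    return cost

for channels in (4, 16, 40):
    flat = annual_lease_cost(channels, amplified=False)
    metered = annual_lease_cost(channels, amplified=True)
    print(f"{channels:>2} channels: unamplified ${flat:,.0f}/yr vs amplified ${metered:,.0f}/yr")
```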

Bürger emphasizes that having access to a dedicated physical infrastructure gives enterprises full control, making it a highly beneficial and often cost-effective option in the long term. He summarizes:

“By being able to deploy equipment whenever and wherever it's needed, you master your own destiny.”

Charting a course for success

One of the most pressing concerns in the data center industry today, especially with the rise of digitalization and the increasing demands of AI/ML workloads, is how to meet the current needs of the site while preparing for future scaling.


Bürger explains that with dark fiber access, enterprises can connect equipment that meets their immediate requirements and later upgrade or refresh the infrastructure as their needs evolve, all through direct or managed access to the dark fiber. Knights adds:

“You can scale bandwidth easily by upgrading the optics, without paying for unnecessary services or backbone infrastructure. Plus, security can be tailored with any encryption method required.”

Ultimately, the available interconnection options are often influenced by external factors. For example, geography-specific regulations may restrict or eliminate fiber availability, forcing enterprises to adopt a mixed approach. Bürger elaborates:

“There is no one-size-fits-all solution, as no enterprise operates on a single, homogeneous network. Networks are inherently heterogeneous, shaped by factors such as country-specific regulations, local resource availability, and infrastructure constraints.”

Quantum threats: Preparing for the inevitable

In this evolving landscape, with the integration of AI into daily operations, secure and trusted architectures are essential to ensure seamless, resilient cloud adoption.

Although quantum computers exist today, their full potential has yet to be realized. These machines will eventually be capable of breaking the encryption systems we have relied on for decades to protect sensitive information online.

For decades, cryptographic algorithms were designed to withstand attacks from traditional computers – machines that would require thousands of years to break these ciphers.

But quantum computers, due to their fundamentally different way of processing information, will one day be able to solve these mathematical problems with ease, rendering today’s security measures obsolete. Martin Charbonneau, head of quantum-safe networks at Nokia, contextualizes:

“As enterprises increasingly rely on digital assets for value creation and exchange, trust in these assets is only possible if the underlying cryptography remains secure. This is where the quantum threat presents a major disruption. Quantum computers are no longer just theoretical – they exist today.”

Knights reinforces this point, highlighting the growing threat of data breaches. With threat actors already damaging underwater cables and possessing the technology to tap fiber optic cables, the risk of data being intercepted and decrypted later is a real and pressing concern:

“It's not just financial institutions that need to safeguard data integrity and authenticity – health organizations, public utilities, defense agencies, and others are equally at risk.”

Though quantum computers are not yet capable of breaking encryption, it is only a matter of time before a cryptographically relevant quantum computer (CRQC) emerges. Its arrival will mark “Q-Day” – the day a quantum computer becomes powerful enough to break today’s widely used encryption methods.

So, it is not a matter of if, but when. On the bright side, organizations can start taking steps now to protect data confidentiality, authenticity, and integrity. Charbonneau states:

“The quantum threat is that a malicious actor with access to a CRQC could compromise the security of our digital infrastructure. How do we protect against that threat? Transitioning to quantum-safe network cryptography is the solution.”

By adopting quantum-safe cryptography solutions designed to withstand quantum attacks, businesses can attempt to safeguard their digital assets and maintain trust in their data as quantum computing continues to advance.

A lesson in defense

To prepare for the challenge that Q-Day presents, the industry has made significant strides in the development of new mathematical algorithms designed to resist these quantum attacks – referred to as post-quantum cryptography (PQC). Charbonneau offers a masterclass in the key cryptographic migration methods currently in progress:

Crypto agility: Seamlessly jumping from one new algorithm to another

Since no key exchange algorithm is guaranteed to be fully future-proof, crypto agility provides the ability to transition from one of these new PQC algorithms to another as quantum capabilities evolve, without requiring significant changes to existing systems.

“This ensures that if quantum computers weaken or break these new algorithms, enterprises can quickly transition to more secure alternatives.”
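In practice, crypto agility is often implemented as an abstraction layer in which algorithms are resolved by name rather than hard-coded. A minimal sketch, assuming the widely used Python cryptography package, with classical X25519 standing in as the only registered algorithm; a PQC KEM such as ML-KEM would be registered under its own name in the same way:

```python
# Key-exchange algorithms are looked up from a registry, so migrating to
# a new (e.g. post-quantum) algorithm is a configuration change, not a
# code change. X25519 is a classical placeholder, not a PQC algorithm.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey,
    X25519PublicKey,
)

class X25519Exchange:
    name = "x25519"

    def __init__(self) -> None:
        self._private = X25519PrivateKey.generate()

    def public_bytes(self) -> bytes:
        return self._private.public_key().public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw
        )

    def shared_secret(self, peer_public: bytes) -> bytes:
        return self._private.exchange(X25519PublicKey.from_public_bytes(peer_public))

# New algorithms are registered here; callers never name a class directly.
_REGISTRY = {X25519Exchange.name: X25519Exchange}

def new_key_exchange(algorithm: str):
    return _REGISTRY[algorithm]()

# The active algorithm comes from configuration, not code:
alice, bob = new_key_exchange("x25519"), new_key_exchange("x25519")
assert alice.shared_secret(bob.public_bytes()) == bob.shared_secret(alice.public_bytes())
```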

Crypto resilience: Doubling down on quantum physics

Given the critical reliance on digital assets for value creation and exchange, changing algorithms alone is not sufficient. Crypto resilience requires incorporating physics-based cryptography alongside mathematics-based techniques. This dual approach ensures that if a PQC algorithm is compromised, another cryptographic line of defense remains intact and resilient.


“The goal is that this complementary cryptography, when done correctly, will be unbreakable by a quantum computer, ensuring data remains secure even in the face of algorithm breaches.”
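One common way to realize this dual approach is to derive the traffic key from two independent secrets, so an attacker would have to compromise both. A minimal sketch, with random placeholders standing in for the PQC-derived and QKD-delivered inputs:

```python
# HKDF-style extract-then-expand over two concatenated secrets: the output
# stays pseudorandom as long as at least one input remains secret.
import hashlib, hmac, os

def combine_keys(pqc_secret: bytes, qkd_key: bytes, context: bytes) -> bytes:
    prk = hmac.new(b"resilience-salt", pqc_secret + qkd_key, hashlib.sha256).digest()
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()

pqc_secret = os.urandom(32)  # placeholder: shared secret from a PQC KEM
qkd_key = os.urandom(32)     # placeholder: key delivered over a QKD link
traffic_key = combine_keys(pqc_secret, qkd_key, b"dci-link-7")
```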

Defense in depth: Layering security for maximum protection

Finally, defense in depth combines crypto agility and crypto resilience, creating a multi-layered security strategy. By applying various types of quantum-safe cryptography across different network layers (optical, Ethernet, IP, message level, etc.), enterprises can establish a much stronger security framework.
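As a simple illustration of the layering principle, the sketch below wraps a payload in several independent encryption layers with independent keys, loosely mimicking, say, MACsec inside IPsec inside application-layer encryption; an attacker who breaks one layer still faces ciphertext:

```python
# Nested authenticated encryption with independent keys per layer,
# using AES-GCM from the Python 'cryptography' package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_layers(payload: bytes, keys: list[bytes]) -> bytes:
    for key in keys:  # innermost layer first
        nonce = os.urandom(12)
        payload = nonce + AESGCM(key).encrypt(nonce, payload, None)
    return payload

def decrypt_layers(blob: bytes, keys: list[bytes]) -> bytes:
    for key in reversed(keys):  # peel the outermost layer first
        nonce, ciphertext = blob[:12], blob[12:]
        blob = AESGCM(key).decrypt(nonce, ciphertext, None)
    return blob

keys = [AESGCM.generate_key(bit_length=256) for _ in range(3)]  # one key per layer
wrapped = encrypt_layers(b"settlement batch", keys)
assert decrypt_layers(wrapped, keys) == b"settlement batch"
```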

Aligning the stars for a safer tomorrow

Charbonneau summarizes that, at its foundation, Nokia’s mission is to build technologies that enable global connectivity. He adds, however, that in light of the quantum threat, this mission evolves:

“Our goal is to enable the world to connect in a quantum-safe manner.”

As Nokia and Kyndryl work together, they form a stellar alliance. As a global system integrator, Kyndryl offers managed private data center access and interconnect services, allowing enterprises to achieve quantum-safe connectivity today without requiring in-house expertise or capital investment. Bürger concludes with this insight:

“This partnership, uniting Nokia’s cutting-edge technology with Kyndryl’s global system integration expertise, provides enterprises with flexible solutions to safeguard their networks both now and in the future against the quantum threat.”

Schedule a consultation with the Nokia/Kyndryl experts here.