Eliminating USD 1.5 trillion in technical debt with NTT Data's AI-driven approach

Rajeev Singh of NTT DATA details how the company is leveraging agentic AI to eliminate vast technical debt, emphasising its "code-to-spec" approach and Smart AI Agent Ecosystem. He also covers NTT DATA's USD 1.5 billion India investment in AI infrastructure and its strategic OpenAI partnership.

Punam Singh

Rajeev Singh, Group Senior Vice President and GTM head for the Apps and BPS global practice at NTT Data

In a conversation with Rajeev Singh, Group Senior Vice President and GTM head for the Apps and BPS global practice at NTT Data, we delved into the critical issue of technical debt, its immense financial implications, and how cutting-edge technologies like agentic AI are transforming its eradication.

Rajeev also shed light on NTT Data's strategic investments in India, their innovative Smart AI Agent Ecosystem, and the significant partnership with OpenAI.

Could you define technical debt and explain why it has become such a critical issue, particularly for large enterprises, given its estimated value of USD 1.5 trillion?

Technical debt, a concept long understood by IT teams, is now a boardroom topic, signifying its broad implications. Consider the insurance industry: many large insurers still use COBOL or mainframe platforms for core processing. Onboarding new customers is not simple; it requires extensive data conversions, configuration changes, and backend integration. These outdated systems cause delays, hinder business growth and integration, and directly impact customer acquisition and revenue.

Legacy systems were built in silos, whereas today’s ecosystems are highly integrated, demanding rapid adaptation to drive business outcomes. IT systems are often designed for maintenance rather than agility, leading to significant expenditure simply to keep them operational. This "debt" means investment is directed toward maintaining outdated infrastructure instead of fostering growth.

The advent of cloud platforms enables much faster movement with architectures designed for rapid change. Moreover, AI requires seamless data access, which is complex with legacy systems. To effectively implement AI, reducing technical debt is essential. This explains why technical debt is a pressing concern from a business outcome perspective.

How does Agentic AI help eliminate technical debt, and what makes its approach unique compared to other AI formats for legacy systems?

Traditionally, removing technical debt involved resource-heavy and time-consuming approaches like lift-and-shift or complete rewrites, often taking 12 to 36 months.

With generative AI, we can now process vast amounts of old code, for instance, 16 million lines, with remarkable efficiency. Generative AI scans, analyses, and graphically maps system structure, providing causal analysis that was previously unattainable. This significantly streamlines the initial understanding phase.

Agentic AI extends this by taking autonomous action based on the analysis. It can consume information, trigger conversions, and support forward engineering, fundamentally altering our modernisation approach. Processes that once took years can now be completed in a fraction of the time. For example, 16 million lines of legacy COBOL code were processed to create a complete information base in just three and a half months, a feat impossible before. Organisations were aware of technical debt but hesitated due to high failure rates and complexity. These AI technologies allow us to address it more efficiently, reduce risk, and provide clients with clarity and speed. This is how generative and agentic AI are transforming technical debt management.
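
To make the "scan and structure" step concrete, here is a minimal, purely illustrative Python sketch that walks a folder of COBOL sources and extracts a paragraph-level call graph from PERFORM statements. It is not NTT Data's tooling; the directory name, file extension, and regular expressions are assumptions, and a real generative AI pipeline goes far beyond this kind of static pass.

```python
# Illustrative sketch only: a toy pass that scans COBOL sources and builds a
# call graph of paragraphs via PERFORM statements. The real analysis described
# above is far richer; the file layout and naming here are hypothetical.
import re
from collections import defaultdict
from pathlib import Path

PARAGRAPH = re.compile(r"^\s{0,7}([A-Z0-9-]+)\s*\.\s*$")      # e.g. "VALIDATE-POLICY."
PERFORM = re.compile(r"\bPERFORM\s+([A-Z0-9-]+)", re.IGNORECASE)

def build_call_graph(source_dir: str) -> dict[str, set[str]]:
    """Map each paragraph to the paragraphs it PERFORMs, across all .cbl files."""
    graph: dict[str, set[str]] = defaultdict(set)
    for path in Path(source_dir).rglob("*.cbl"):
        current = None
        for line in path.read_text(errors="ignore").splitlines():
            if m := PARAGRAPH.match(line):
                current = m.group(1)
            elif current and (m := PERFORM.search(line)):
                graph[current].add(m.group(1))
    return graph

if __name__ == "__main__":
    graph = build_call_graph("legacy_src")          # hypothetical source directory
    for para, callees in sorted(graph.items()):
        print(f"{para} -> {', '.join(sorted(callees))}")
```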

Can you describe the diagnostic phase for quantifying and prioritising technical debt, including the proprietary tools involved, and how the insights are consolidated to align with the new environment?

The fundamental reason for our advisory element is to help clients construct a successful roadmap that secures funding and ensures comfort with the process.

We utilise a comprehensive diagnostic framework, which we call Genesis. This suite of tools evolved from an initial partnership with a startup. Genesis allows us to scan specific client environments and create tailored roadmaps. A powerful aspect is our ability to demonstrate that full modernisation is not always necessary. Instead, we often recommend fragmenting the architecture and treating different layers differently based on business needs and risk profiles.

We meticulously map the modernisation path, help clients understand budgeting, and align them with clear milestones. Our diagnostic process extends beyond technical evaluation; it brings clarity, outlines risks, offers remediation strategies, and builds a shared vision for transformation. Genesis is a key enabler in fostering client confidence.
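
As a purely illustrative aside on what "quantifying and prioritising" technical debt can look like, the sketch below ranks components by a composite debt score. The metrics, weights, and sample data are invented for the example and do not reflect Genesis or any NTT Data methodology.

```python
# Illustrative only: a toy way to rank components for modernisation by a
# composite "debt score". Weights and thresholds are invented for the sketch.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    loc: int               # lines of code
    yearly_changes: int    # how often the business asks for changes
    incident_count: int    # production incidents in the last year
    test_coverage: float   # 0.0 - 1.0

def debt_score(c: Component) -> float:
    """Higher score = stronger modernisation candidate."""
    size_risk = min(c.loc / 100_000, 1.0)           # very large codebases score higher
    churn_risk = min(c.yearly_changes / 50, 1.0)    # frequently changed code hurts more
    ops_risk = min(c.incident_count / 20, 1.0)
    safety_gap = 1.0 - c.test_coverage
    return round(0.3 * size_risk + 0.3 * churn_risk + 0.2 * ops_risk + 0.2 * safety_gap, 3)

portfolio = [
    Component("policy-admin", loc=2_400_000, yearly_changes=60, incident_count=25, test_coverage=0.1),
    Component("claims-batch", loc=300_000, yearly_changes=5, incident_count=2, test_coverage=0.4),
]
for c in sorted(portfolio, key=debt_score, reverse=True):
    print(c.name, debt_score(c))
```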

What is the integration process for Crowdbotics with other AI systems, and what legacy transformations or managed service transitions does it facilitate?

At NTT Data, through our venture capital arm, we have built relationships with numerous startups, including Crowdbotics, which has been a valuable partner in developing our assets.

We leverage Crowdbotics in our "code-to-spec and spec-to-code" process. This involves reverse-engineering old legacy code to create a specification, then forward-engineering from that spec to develop new systems. While complex, once requirements are captured, the system can perform autonomous forward coding, automated testing, and even automated cloud deployment. We have built this entire chain around their platform, integrating various nuances of the generative AI and agentic AI ecosystem.
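
A highly simplified sketch of the "code-to-spec and spec-to-code" shape described above, with a human review gate between the two halves, is shown below. The `ask_llm` callable stands in for whichever model or agent platform is used; the prompts and stages are assumptions for illustration, not the Crowdbotics integration itself.

```python
# Sketch of a code-to-spec / spec-to-code pipeline. The prompts, stages, and
# review gate are illustrative assumptions, not NTT Data's implementation.
from typing import Callable

AskLLM = Callable[[str], str]   # prompt in, generated text out

def code_to_spec(legacy_code: str, ask_llm: AskLLM) -> str:
    """Reverse-engineer a functional specification from legacy code."""
    return ask_llm(
        "Read the following legacy code and write a plain-language functional "
        "specification covering inputs, outputs, business rules, and edge cases:\n\n"
        + legacy_code
    )

def spec_to_code(spec: str, ask_llm: AskLLM, target: str = "Java") -> str:
    """Forward-engineer new code from the reviewed specification."""
    return ask_llm(
        f"Implement the following specification in idiomatic {target}, "
        f"including unit tests:\n\n{spec}"
    )

def modernise(legacy_code: str, ask_llm: AskLLM, review: Callable[[str], str]) -> str:
    spec = code_to_spec(legacy_code, ask_llm)
    approved_spec = review(spec)          # human-in-the-loop gate before forward engineering
    return spec_to_code(approved_spec, ask_llm)
```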

Regarding legacy transformation and managed service transitions, we also maintain systems for our large deals, including application management. To enhance productivity, we have infused generative and agentic AI into the ecosystem. For instance, AI operations are embedded to eliminate noise and reduce service tickets. By incorporating conversational AI and full-stack observability and automating layers, AIOps can trigger self-healing mechanisms when issues are detected. This is a progressive journey, not an instant change.
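
The following sketch illustrates the self-healing idea in AIOps: when an alert matches a known playbook and the automated fix succeeds, no ticket is raised; otherwise the incident is escalated. The alert fields, playbooks, and stubs are hypothetical stand-ins for real orchestration and ITSM integrations, not any specific product.

```python
# Illustrative AIOps-style self-healing loop: try an automated remediation
# before raising a ticket. All functions below are hypothetical stubs.
from typing import Callable

def restart_service(name: str) -> bool:
    # Stub: a real setup would call the orchestrator and re-check service health.
    print(f"restarting {name} ...")
    return True

def purge_old_logs(path: str) -> bool:
    # Stub: a real setup would rotate logs and confirm free space was recovered.
    print(f"purging logs under {path} ...")
    return True

def open_ticket(alert: dict) -> str:
    # Stub: a real setup would create an incident in the ITSM tool.
    return f"INC-{abs(hash(alert['id'])) % 10000}"

PLAYBOOKS: dict[str, Callable[[dict], bool]] = {
    "service_down": lambda a: restart_service(a["service"]),
    "disk_full": lambda a: purge_old_logs(a["mount"]),
}

def handle_alert(alert: dict) -> str:
    """Self-heal when a playbook exists and succeeds; escalate otherwise."""
    playbook = PLAYBOOKS.get(alert["type"])
    if playbook and playbook(alert):
        return "self-healed"                  # ticket avoided, noise reduced
    return f"escalated:{open_ticket(alert)}"  # automation could not resolve it

print(handle_alert({"id": "a1", "type": "service_down", "service": "orders"}))
```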

Furthermore, a significant challenge in large deals is knowledge transition. Generative AI enables an AI-led approach to this, allowing us to ingest standard operating procedures, documents, and codebases to build an enterprise knowledge platform. This platform introduces intelligence and accelerates our ramp-up time. We actively drive this in all our deals.
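
To show the shape of such a knowledge platform, here is a deliberately naive sketch that ingests transition documents into chunks and retrieves them by keyword overlap. A production system would use embeddings, a vector store, and an LLM on top; the folder name and file format below are assumptions.

```python
# Illustrative sketch of an "enterprise knowledge platform" for transitions:
# ingest SOPs and runbooks into searchable chunks. The keyword index is only
# meant to show the shape; real systems use embeddings plus an LLM.
from pathlib import Path

def ingest(doc_dir: str, chunk_size: int = 80) -> list[dict]:
    """Split every document into word chunks tagged with their source file."""
    chunks = []
    for path in Path(doc_dir).rglob("*.md"):
        words = path.read_text(errors="ignore").split()
        for i in range(0, len(words), chunk_size):
            chunks.append({"source": str(path), "text": " ".join(words[i:i + chunk_size])})
    return chunks

def search(chunks: list[dict], query: str, top_k: int = 3) -> list[dict]:
    """Rank chunks by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = [(len(terms & set(c["text"].lower().split())), c) for c in chunks]
    return [c for score, c in sorted(scored, key=lambda s: -s[0])[:top_k] if score > 0]

knowledge = ingest("sops")   # hypothetical folder of transition documents
for hit in search(knowledge, "how do we restart the claims batch job"):
    print(hit["source"], "->", hit["text"][:80])
```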

In essence, we create this knowledge platform for transitions, then leverage it to power AI operations, promoting self-healing, embedding full-stack observability, reducing noise, and utilising conversational AI where feasible. This comprehensive approach targets the entire ecosystem.

We recently launched the Smart AI Agent Ecosystem, a significant effort recognising agentic AI as a game-changer. It is an ecosystem, not a point product, designed to address both domain-specific and horizontal requirements across industries like automotive, banking, and finance, with a business outcome-driven approach. We collaborate with partners like OpenAI, Aisera, and Kore.ai to guide clients through the agentic AI journey, making it simple, effective, and outcome-driven.

Could you provide a specific example of how infusing AI into transaction BPO, knowledge BPO, or application tech has significantly improved client outcomes?

Consider two processes. In insurance, within our TPA (third-party administration) business, the partial payment process used to take approximately four hours due to multiple steps, approvals, and deviations in the traditional workflow. By agentising this process and redesigning it to run autonomously, we reduced processing time from four hours to three to four minutes. This exemplifies the productivity improvement achievable with precise implementation, despite the inherent probabilistic nature of AI.
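
As an illustration of what "agentising" a workflow like this can mean, the sketch below shows an autonomous decision step with an explicit escalation path for cases the automation should not decide alone. The business rules and thresholds are invented for the example; they are not the actual TPA logic.

```python
# Illustrative only: an "agentised" partial-payment decision with an explicit
# escalation path. Rules and thresholds are invented for the sketch.
from dataclasses import dataclass

@dataclass
class PartialPaymentRequest:
    claim_id: str
    approved_amount: float
    requested_amount: float
    policy_active: bool

def decide(req: PartialPaymentRequest) -> str:
    """Autonomous decision; a human adjuster handles anything outside policy."""
    if not req.policy_active:
        return "reject: policy inactive"
    ratio = req.requested_amount / req.approved_amount
    if ratio <= 0.5:
        return "auto-approve"                    # clearly within the partial-payment band
    if ratio <= 1.0:
        return "auto-approve with audit flag"    # approved, but logged for sampling review
    return "escalate to adjuster"                # outside policy, a human decides

print(decide(PartialPaymentRequest("CLM-104", 10_000.0, 4_000.0, True)))
```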

Another example involves a global automotive client facing frequent assembly line breakdowns impacting production. Over three months, we collected and analysed their operating manuals, device documentation, and technical content. Using generative AI, we built an organised intelligence network and a smart chatbot five to six months ago. Now, assembly line workers access real-time solutions via the chatbot, eliminating waits for subject matter experts. This significantly improved productivity by resolving operational stoppage issues. The client is now expanding this solution.

We are also progressively integrating agentic functions, such as guided repair instructions and automated ordering of replacement parts. Hyster-Yale has recognised our advisory support in building their multi-agent ecosystem. These demonstrate the real and measurable impact of our AI-led transformations.

Where are NTT DATA's USD 1.5 billion India investments allocated: across campuses, AI innovation hubs, and data centre infrastructure?

The USD 1.5 billion investment over the next three years is a significant allocation, aiming to position India among NTT Data's top five global markets.

Our investments target four main segments. First, India serves as a hub for our global customers. Second, we cater to our India-based clients. Third, we heavily invest in data centres and AI infrastructure, crucial for digital transformation. Finally, the payments industry is a major focus, where we have expanded our presence through acquisition and organic growth.

Most investment aligns with these areas. Recent announcements include our submarine cable project and a new data centre in Hyderabad. The objective is to bring NTT-owned intellectual property and high-grade infrastructure, like IOWN (Innovative Optical and Wireless Network), to India. We see this optical network technology as a foundational layer for future infrastructure.

The second investment area focuses on innovation centres globally, not steady-state IT services. These centres develop cutting-edge technologies, such as blockchain in payments, computer vision, and improving AI-driven vision system stability. We seek talent with deep expertise in these emerging technologies. The goal of these centres is to create intellectual property that becomes part of our mainstream service offerings, such as the recently developed attribute-based encryption, part of NTT Data’s global research output backed by our USD 3.6 billion R&D investment.

Overall, our investments prioritise digital transformation, infrastructure, payments, innovation centres, and strengthening campus capabilities for global customers. With over 40,000 employees, India remains an extremely important market and delivery centre for NTT Data, and we are committed to expanding our presence.

NTT Data unveiled a "Smart AI Agent Ecosystem" and a patented plug-in solution to transform legacy bots into autonomous intelligent agents. How does this align with your strategy for addressing technical debt, especially for enterprises with existing bot infrastructure?

This initiative addresses a critical challenge: the vast number of existing bots deployed across BPO processes. While efficient, the emergence of agentic AI poses a risk of these becoming another layer of technical debt if not managed carefully.

We have created a structured path for customers to convert their existing bots into an agentic framework, making them as autonomous as possible. This provides a solid foundation for evolution without discarding current investments. While further transformation may be needed over time, this initial conversion is a significant step forward.

Our objective is to prevent legacy bots from becoming a burden. For instance, the reduction of partial payment processing time from four hours to four minutes was only achieved after transitioning the bot into an agentic framework and applying necessary changes. We aim to help customers confidently move forward, unhindered by outdated technology. This approach is a key component of our Smart AI Agent Ecosystem.
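
One common pattern for this kind of conversion, sketched below purely for illustration, is to expose an existing deterministic bot as a "tool" that an agent's planner can invoke, so current bot investments keep working inside the agentic framework. The bot, tool registry, and hard-coded plan are hypothetical; this is not the patented plug-in solution itself.

```python
# Illustrative pattern: wrap an existing RPA-style bot as a callable "tool"
# inside an agentic framework. Names and the hard-coded plan are hypothetical.
from typing import Callable

def legacy_partial_payment_bot(claim_id: str, amount: float) -> str:
    # Stand-in for an existing deterministic bot (screen flows, API calls, etc.).
    return f"payment of {amount} recorded against {claim_id}"

TOOLS: dict[str, Callable[..., str]] = {
    "record_partial_payment": legacy_partial_payment_bot,
}

def run_agent_step(plan: dict) -> str:
    """An agent's planner chooses a tool and arguments; the legacy bot does the work."""
    tool = TOOLS[plan["tool"]]
    return tool(**plan["args"])

# In a real system the plan would come from an LLM-based planner; here it is hard-coded.
print(run_agent_step({"tool": "record_partial_payment",
                      "args": {"claim_id": "CLM-104", "amount": 4000.0}}))
```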

NTT Data has actively pursued strategic collaborations, including with OpenAI. How do these partnerships, especially in the Generative AI space, enhance NTT Data's capabilities?

We consider our collaboration with OpenAI a strategic, long-term partnership.

OpenAI is a major player in the generative AI space. Our focus is to leverage OpenAI’s capabilities to co-develop next-generation solutions for our customers. For example, in contact centres, OpenAI enables the creation of autonomous agents capable of independent conversations.

Our goal at NTT Data is to apply this capability to real-world applications. We have established a Centre of Excellence focused on OpenAI’s technology to work closely with them, explore new use cases, and design impactful solutions for business customers. We also aim to scale these solutions through our Innovation Centres.

Regarding return on investment, we approach it from a long-term perspective. These technologies are crucial for the sustained success of both our clients and ourselves. Generative AI is a mainstream capability that will only strengthen over time, with rapid evolution evident internally.

Therefore, ROI is not a simple cost-benefit analysis here. It is about participating in an evolving ecosystem, adapting, and co-creating new solutions in emerging markets. We are also exploring AI as a native service, a concept that did not exist previously. This represents the long-term strategic value we aim for through this collaboration.

NTT Data is significantly expanding its data centre footprint globally, including doubling capacity in India and investing in a new 400 MW AI data centre campus in Hyderabad with 25,000 GPUs. How critical is this physical infrastructure to NTT Data's AI-led modernisation strategy?

The GPU component has become critical in data centres today. Modern AI workloads simply cannot run on traditional CPU-based infrastructure.

As AI workloads increase, the fundamental need for GPU-powered infrastructure grows. Data centres must be designed with core GPU capabilities, along with essential factors like advanced cooling technologies. Our data centre business focuses on building AI-ready infrastructure to support both our clients and our compute-heavy solutions.

Concurrently, we approach AI with a strong sense of responsibility, designing every solution with a secure and responsible framework. We maintain robust internal governance to mitigate risks, especially at scale.

Regarding regulatory compliance, we collaborate closely with local authorities to ensure our data centres and entire ecosystem align fully with standards. Data localisation and regulatory adherence are top priorities. We view compliance not only as a responsibility but also as an opportunity for new engagement and innovation.

Our investments in the Smart AI Agent Ecosystem and the GPU-based AI data centre in Hyderabad aim to build a strong foundation. Our objective is to be fully equipped to support clients throughout their AI journey, which represents a major part of our future investment.