In a recent in-depth interview, we had the privilege of speaking with Deepak Dastrala, CTO of Intellect Design. Deepak provided a comprehensive overview of Intellect AI's mission, its innovative use of Artificial Intelligence in financial services, and the strategic partnership with MongoDB.
MongoDB's Role in AI-Powered Full-Stack Modernisation
This is an important point. We developed this data platform, commencing its journey approximately eight years ago. Initially, our use case focused on commercial insurance, specifically underwriting. In North America, the underwriting of commercial insurance involves some of the most extensive use of data to make critical decisions, such as whether to underwrite particular corporate clients across various lines of business. This could involve property policies or workers' compensation policies for employees.
When we began this endeavour, we sought a suitable database for managing unstructured data. Our collaboration with Mongo commenced eight years ago for this purpose. Given our unstructured use cases, MongoDB's document database was an appropriate fit. Throughout our AI journey, MongoDB has become an integral component of our data platform.
In the last couple of years, while continuing my work with AI, I also assumed responsibility for our wealth business and capital markets business. This provided an opportunity to identify friction points within our wealth platform. Our wealth platform is utilised by some of the largest banks and sovereign wealth funds. The trigger point for this particular use case emerged from challenges with our analytics, where we experienced delays in completing data loads and encountered performance bottlenecks. Having previously worked with unstructured data, I understood that the complexities and challenges associated with it are considerably greater than those with structured data use cases.
It then became clear why we were addressing this problem in a siloed manner. Historically, MongoDB has been primarily recognised as a document database, without widespread understanding of its relevance for OLTP (online transaction processing) and OLAP (online analytical processing) use cases. This led to my collaboration with MongoDB. We aimed to demonstrate how MongoDB is equally, if not more, relevant for structured data use cases, transaction use cases, and analytics use cases. My objective was not merely to resolve the immediate analytics problem, but also to future-proof our data strategy for the wealth platform and for future AI initiatives.
Therefore, I began working with MongoDB on the modernisation journey of our wealth platform. We determined how it not only addresses current concerns regarding transaction and analytics use cases, but also how it can be made relevant for AI use cases. In this manner, I perceived an opportunity to advance significantly in the AI journey. The goal was to establish our wealth platform as a system of record and a system of engagement. With this unified data platform approach, we can also transform it into a system of intelligence. This outlines the strategy.
Results After the MongoDB Collaboration
Regarding the analytics use case I referenced earlier, upon completing the modernisation, the immediate observation was a performance improvement of 50 to 70%. Second, end-of-day loads, which previously required more than four hours, now complete in less than 50 minutes. Third, the total cost of ownership is only 30% of the annual expenditure on our prior infrastructure. These three metrics represent a direct, like-for-like comparison. Additionally, we can now leverage the platform for AI use cases, a capability our previous database did not offer.
Improvements in Batch Processing Framework and Architecture
The initial problem was a purely ETL (Extract, Transform, Load) approach, which forced processing into a fixed window at a specific time of day. This presented a fundamental constraint. Second, the prior use of a SQL loader imposed a very rigid schema. Third, the PL/SQL was tightly coupled and ran largely sequentially. All of these components have been replaced with MongoDB aggregations. This means we no longer need to wait for the end of the day. We are capable of near real-time streaming, encompassing both change data capture and aggregations. This has made processing roughly 70% faster in a like-for-like comparison. More significantly, we can perform these operations during daytime hours, because a key advantage of MongoDB, which many underestimate, is its performance on real-time dashboards.
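As a rough illustration of the pattern Dastrala describes, not Intellect's actual pipeline, a MongoDB change stream can drive incremental aggregation in place of an end-of-day batch load. The collection and field names here (trades, positions_daily, and so on) are hypothetical, and change streams require a replica-set deployment.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # illustrative connection string
db = client["wealth"]

def refresh_positions():
    """Replace a batch PL/SQL job with an on-demand aggregation:
    group trades by account and instrument, then upsert the summary."""
    db.trades.aggregate([
        {"$group": {
            "_id": {"account": "$account", "instrument": "$instrument"},
            "netQty": {"$sum": "$quantity"},
            "notional": {"$sum": {"$multiply": ["$quantity", "$price"]}},
        }},
        # $merge writes the results straight into the summary collection.
        {"$merge": {"into": "positions_daily", "whenMatched": "replace"}},
    ])

# Change data capture: the stream delivers each insert as it happens,
# so summaries refresh in near real time instead of at end of day.
with db.trades.watch([{"$match": {"operationType": "insert"}}]) as stream:
    for change in stream:
        trade = change["fullDocument"]
        print("new trade:", trade["account"], trade["instrument"])
        refresh_positions()  # naive full recompute, kept simple for the sketch
```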
Scalability and Operational Efficiency for Customers
From our perspective, many of our customers are cloud-based and distributed across various regions. The expectation is that we can independently scale OLTP (Online Transaction Processing) or vector use cases without concern for the entire system. Particularly with vector use cases, given the volume of incoming data, extensive prior planning is often not feasible. We must operate in near real-time. That, in my assessment, constitutes a significant differentiation.
Future-Proofing AI Solutions and Data Trends
That is an excellent question. The true power of MongoDB resides in its flexible schema. This eliminates concerns about specific data types or rigid structures. In conventional systems, the data model is paramount, and any modification to it can trigger a ripple effect throughout the application. In this scenario, however, the flexibility to even alter the schema at runtime provides significant advantages.
Allow me to offer another illustrative example. Consider a structured data model where specific entities can be tagged for an AI use case. This means you can generate an embedding of the document for that entity and also treat that particular entity as unstructured data, thereby making it more relevant and useful for an AI application. This enables a hybrid of unstructured data within a structured framework.
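A minimal sketch of that hybrid document shape, assuming MongoDB Atlas with a vector search index named entity_vec_index on a profileEmbedding field; the collection, field names, and the embed() stub are hypothetical stand-ins, not Intellect's schema.

```python
from pymongo import MongoClient

def embed(text: str) -> list[float]:
    # Placeholder: a real system would call an embedding model here.
    return [0.0] * 1024

client = MongoClient("mongodb://localhost:27017")  # $vectorSearch itself requires an Atlas cluster
clients_col = client["wealth"]["clients"]

# One document carries both structured fields (OLTP-friendly) and an
# embedding of its own unstructured profile text (AI-friendly).
clients_col.insert_one({
    "clientId": "C-1001",
    "segment": "sovereign_fund",
    "aum": 2_400_000_000,
    "profileText": "Long-horizon fund, ESG mandate, low liquidity needs.",
    "profileEmbedding": embed("Long-horizon fund, ESG mandate, low liquidity needs."),
})

# Atlas $vectorSearch finds semantically similar entities; it assumes a
# vector index named "entity_vec_index" exists on "profileEmbedding".
similar = clients_col.aggregate([
    {"$vectorSearch": {
        "index": "entity_vec_index",
        "path": "profileEmbedding",
        "queryVector": embed("funds with ESG mandates"),
        "numCandidates": 100,
        "limit": 5,
    }},
    {"$project": {"clientId": 1, "segment": 1, "_id": 0}},
])
print(list(similar))
```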
Partnership Benefits Beyond Technicalities
With AI, every business, whether technology-oriented or otherwise, undergoes a fundamental re-evaluation. The critical aspect is speed to value—the ability to evolve rapidly and address market needs. If speed to value is the primary driver for all, then a partner capable of innovating at the same pace is essential. For instance, the concept of MCP (Model Context Protocol) in AI did not exist a few months ago, yet MongoDB is already supporting it.
Secondly, one must begin to consider what an AI-native database truly entails, because, unlike before, embeddings are now central. In the future, everything will likely be stored as embeddings. MongoDB's acquisition of Voyage AI and its integration of native embedding and re-ranking models help us reduce the cost of relying on third-party solutions. For me, to maintain a competitive business advantage, my partner must also remain at the forefront of innovation. Failure to do so would be detrimental. That is one key area.
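As a hedged sketch of what using such native models looks like in practice, here is the Voyage AI Python client generating embeddings and re-ranking candidates; the model names (voyage-3, rerank-2) and API details are assumptions that may have shifted since the acquisition.

```python
import voyageai

vo = voyageai.Client()  # assumes VOYAGE_API_KEY is set in the environment

docs = [
    "Quarterly rebalancing policy for sovereign wealth mandates.",
    "Workers' compensation underwriting guidelines for North America.",
]

# Embeddings: each document becomes a vector, ready for vector search.
emb = vo.embed(docs, model="voyage-3", input_type="document")
print(len(emb.embeddings), len(emb.embeddings[0]))

# Re-ranking: score the candidates against a query, best match first.
rr = vo.rerank("underwriting rules for employee injury cover", docs, model="rerank-2")
for r in rr.results:
    print(round(r.relevance_score, 3), r.document)
```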
A second area pertains to our global business, with clients worldwide. In my personal view, operating in AI without a cloud model presents significant challenges in managing data scale and scaling requirements. Each of our clients maintains preferred cloud providers due to long-term commitments. Therefore, my choice of database must coexist with their priorities. Whether they operate on Azure, GCP, or AWS, my database must support all these environments and adhere to region-specific compliance. This is where MongoDB assists us in capitalising on global opportunities within our clients' preferred cloud environments.
Security Layers in BFSI Data Handling
It is important to note that a key differentiation lies in query capabilities. Almost everything is encrypted to a significant extent, and MongoDB's ability to query encrypted data is, in my assessment, unparalleled among competitors. Typically, data is encrypted, then decrypted for querying. The ability to query encrypted data directly, without decryption, is a notable feature. That is one example.
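A heavily simplified sketch of that capability using pymongo's Queryable Encryption options, with a throwaway local key for illustration; real deployments use a cloud KMS, need the pymongo[encryption] extra plus the automatic-encryption shared library, and the namespace and field names below are hypothetical.

```python
import os
from pymongo import MongoClient
from pymongo.encryption import ClientEncryption
from pymongo.encryption_options import AutoEncryptionOpts
from bson.codec_options import CodecOptions

# Local 96-byte master key for illustration only; production would use a
# cloud KMS (AWS KMS, Azure Key Vault, GCP KMS) instead.
kms_providers = {"local": {"key": os.urandom(96)}}
key_vault_namespace = "encryption.__keyVault"

opts = AutoEncryptionOpts(kms_providers, key_vault_namespace)
client = MongoClient(auto_encryption_opts=opts)

# Declare an encrypted field that still supports equality queries.
encrypted_fields = {
    "fields": [{
        "path": "accountNumber",
        "bsonType": "string",
        "queries": [{"queryType": "equality"}],
        "keyId": None,  # let the helper create a data key for the field
    }]
}

client_encryption = ClientEncryption(
    kms_providers, key_vault_namespace, client, CodecOptions()
)
# Creates the collection and its encrypted-field metadata in one step.
client_encryption.create_encrypted_collection(
    client["wealth"], "accounts", encrypted_fields, kms_provider="local"
)

# The driver encrypts the predicate client-side; the server matches against
# ciphertext, so the plaintext account number never reaches the database.
client["wealth"]["accounts"].insert_one({"accountNumber": "ACC-1001", "balance": 10})
doc = client["wealth"]["accounts"].find_one({"accountNumber": "ACC-1001"})
```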
Generally, security in transit and security at rest are expected. The crucial question, however, is how to meet every compliance requirement. As you rightly stated, operating in the UK and EU necessitates compliance with GDPR. Two aspects are critical: first, what we, as an application or platform, implement; and second, whether equivalent compliance, from a data sovereignty perspective and other considerations, is inherent in MongoDB; if it were not, that would present a challenge for us. MongoDB provides all of these capabilities by default. This is why some of the largest banks are its customers; in a cloud era it is considerably challenging to comply with and meet the expectations of banks and financial services, particularly given that regulations in Europe and the UK are far more stringent than elsewhere in the world. I believe this is where MongoDB's default offerings and native innovations in encryption, compliance, and overall data sovereignty significantly assist us.
Adhering to the Zero Trust Model and Evolving Compliance
Yes, it does. Adherence is one aspect, but the ability to audit and demonstrate compliance to any third party is also crucial. We perform all the procedures necessary for this, as the certification process itself is quite rigorous. And while much of this discussion pertains to structured data, the same expectations apply to unstructured data.
Currently, as an organisation, we are pursuing ISO 42001 certification. This ensures that our AI system complies with the expected regulations for any AI system. Therefore, we are actively adopting new compliances as they evolve due to AI. Our differentiation lies not in merely restricting ourselves to past compliances but in adapting to the evolving regulatory landscape, which is critical, as every system will, in some form, become an AI system due to the inherent nature of the changes occurring. Technology is evolving. We cannot rely solely on a single compliance framework.
Intellect AI's Future Perspective and AI Expansion
Firstly, having initiated the development of AI products significantly earlier than others and having operationalised them at scale, we possess a profound understanding of what change management entails in an AI-driven environment, as opposed to a typical digital one. The transformation is significant because, historically, especially in highly regulated industries like BFSI, the digital journey was expected to be deterministic. This means if you anticipated a value of one thousand rupees, it should be precisely one thousand rupees. With AI, however, it might indicate "with 99% confidence, it is one thousand."
Therefore, how to foster business acceptance of this probabilistic journey is a key focus. We ensure that change management is an integral, thoroughly considered process for businesses, particularly enterprises, to adopt AI. This is an area where we possess understanding and continuously strive for improvement. Without this focus, everything risks remaining in the Proof of Concept (POC) phase, never reaching production. That is our primary priority.
Secondly, we maintain an active partnership with Columbia University. Together, we established a lab named DAP lab, which stands for Data Agents and Process. The objective of this lab is significant because currently, there is an excessive focus solely on models, often overlooking the broader picture of what is required to build an AI-native enterprise. For example, I believe we are currently at a stage comparable to 1996 or 1997 in the internet's evolution, where we had Hotmail and Yahoo Mail. However, the evolution into internet banking, then mobile banking, and subsequently cloud computing and other advancements, occurred when fundamental business models transformed, not merely with the advent of Hotmail or Yahoo Mail.
To illustrate, while the internet gained popularity in 1995, internet banking only emerged around 2000-2001, taking approximately five to six years. This delay was necessary to establish trust in conducting financial transactions over the internet. Considering this, reflect on the extent of innovation required across the infrastructure space. Currently, all infrastructure, including cloud, is designed for applications, not for agents. In the future, there will be billions of agents. Consider how an agent from SBI can securely and safely communicate with an agent from Axis Bank to facilitate a transaction. What does trust signify in such a scenario?
All of these aspects are central to our focus. We are entirely concentrated on defining what a trusted infrastructure looks like across data, application, and security domains, particularly for highly regulated industries, to enable trust in AI and operationalise AI at scale. Essentially, our goal is to identify the innovations necessary for any enterprise to become an AI-native enterprise. That is our core focus.