In an exclusive interview with Data Quest, Vijayant Rai, Managing Director for India at Snowflake, sheds light on the transformative power of generative AI and its potential to revolutionize industries. He explains Snowflake's innovative approach to integrating generative AI within its robust data platform, highlighting the significance of a strong data strategy.
He also discusses Snowflake's new enterprise-grade large language model, Arctic, and its applications, while addressing key considerations for data security and governance. This interview provides a comprehensive overview of Snowflake's strategies and future plans, emphasizing its commitment to empowering businesses with cutting-edge AI technologies.
Excerpts
DQ: Can you explain the significance of generative AI and Snowflake's approach to it?
Vijayant Rai: Generative AI represents a significant technological transformation, comparable to the advent of the internet or mobile technology. It has made AI accessible to everyone, allowing for easy data querying, rapid application development, and the creation of AI-powered tools like co-pilots.
To utilize generative AI effectively, organizations must first have a robust data strategy. The foundational element is well-organized data, as there can be no AI strategy without a data strategy. Data is the raw material for generative AI, and having it in the right shape is crucial for deriving meaningful outcomes.
Snowflake is at the forefront of this revolution. As a leading cloud-based data platform, we help customers consolidate their data from various silos into one place, ensuring the foundations are correct. In addition to our data platform services, we offer our own generative AI solutions. We have announced a new large language model called Arctic, which is an open-source model.
For our customers, the process involves integrating all their data—HR data, customer data, etc.—into one platform. From there, they can achieve various outcomes, such as advanced analytics, application development, and applying generative AI to derive new insights. We provide both our own generative AI solutions and access to a variety of open-source large language models (LLMs) within our platform.
DQ: Can you explain Snowflake's new large language model and its enterprise applications?
Vijayant Rai: Snowflake has announced Arctic, an enterprise-grade large language model (LLM). Unlike public LLMs like ChatGPT, which are often used for general-purpose tasks such as writing essays, Arctic is designed specifically for enterprises. It focuses on security, governance, and maintaining data boundaries, making it a secure choice for companies.
In addition to Arctic, we offer use-case-based solutions built on our LLM. These include:
- Document AI: Summarizes and analyzes documents.
- Universal Search: An LLM-based enterprise search solution.
- SQL Co-Pilot: Helps developers write SQL code quickly.
These solutions, along with Arctic, provide enterprises with powerful tools to create specific use cases and achieve various outcomes securely and efficiently.
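As a rough illustration of how such in-platform LLM features are typically invoked, the sketch below builds a SQL statement that calls an LLM completion function from inside the data platform. The function name `SNOWFLAKE.CORTEX.COMPLETE` and the model name `snowflake-arctic` follow Snowflake's published Cortex interface, but this is only a sketch: no connection is opened, no query is executed, and actually running the statement would require a Snowflake session.

```python
# Illustrative sketch: calling an in-platform LLM function via SQL.
# The statement is only constructed here, not executed; running it
# would require an authenticated Snowflake session.

def build_completion_query(model: str, prompt: str) -> str:
    """Build a SQL statement asking an in-database LLM function to
    complete a prompt, so the data never leaves the platform."""
    escaped = prompt.replace("'", "''")  # basic SQL string-literal escaping
    return (
        "SELECT SNOWFLAKE.CORTEX.COMPLETE("
        f"'{model}', '{escaped}') AS response"
    )

query = build_completion_query(
    "snowflake-arctic",
    "Summarize last quarter's support tickets",
)
print(query)
```

The point of the pattern is that the prompt and the governed data meet inside the platform's own SQL engine, rather than the data being exported to an external model endpoint.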
DQ: How is generative AI being adopted across different industries?
Vijayant Rai: Generative AI is a revolutionary technology that no one wants to miss. Across various industries, we are seeing significant experimentation and testing of generative AI through proofs of concept (POCs) and pilot projects. While some applications, like chatbots, have already gone into production, most industries are still in the experimentation phase. Over the next 12 months, we expect to see more serious, generative AI-driven products emerge.
Currently, contact and service centers are seeing substantial experimentation aimed at increasing efficiency. Financial services firms are aggressively adopting generative AI for customer experience, employee efficiency, and modernizing business processes. Digital natives, such as startups and unicorns, are ahead in leveraging generative AI to develop innovative products and solutions.
Manufacturing is also showing interest, although it may take some time to scale. Additionally, the service industry, including hotels and airlines, is exploring generative AI in a significant way.
DQ: What considerations should be kept in mind regarding risk, security, and governance when working with large language models (LLMs)?
Vijayant Rai: Generative AI relies on data, which is a valuable asset. Protecting this data is crucial to prevent fraud and security breaches. Key considerations include:
- Data Security: Ensure that your data remains within secure platforms and is not accessible to unauthorized actors. At Snowflake, we ensure that all data stays within our secure, governed platform. We regularly apply the latest security patches and implement strict access controls to ensure that only authorized personnel can access the data.
- Application to Data Approach: Instead of sending data out of secure environments, bring applications or LLMs to the data. This minimizes the risk associated with data leaving secure premises and potentially falling into the wrong hands.
- Governance: Maintain robust governance policies to oversee data access and usage. This includes continuous monitoring and updating of security measures to adapt to evolving threats.
By focusing on these aspects, organizations can mitigate risks and ensure their data is protected, even as data volumes continue to grow in the digital age.
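The "application to data" and governance points above can be sketched in a few lines. The code below is a hypothetical illustration, not a Snowflake API: a governed store enforces a role policy on every read, and the summarizing "application" (a stand-in for an LLM workload) runs against the data in place, so rows never leave the governed boundary. All names are invented for the example.

```python
# Hypothetical sketch of the "application to data" pattern with a
# simple role-based access gate; not a real platform API.
from dataclasses import dataclass, field


@dataclass
class GovernedStore:
    """In-platform data store that checks a role policy on every read."""
    policies: dict                        # table name -> roles allowed to read
    tables: dict = field(default_factory=dict)

    def read(self, role: str, table: str):
        allowed = self.policies.get(table, set())
        if role not in allowed:
            raise PermissionError(f"role {role!r} may not read {table!r}")
        return self.tables[table]


def summarize_in_place(store: GovernedStore, role: str, table: str) -> str:
    # The application comes to the data: rows are read and summarized
    # inside the governed store rather than exported elsewhere.
    rows = store.read(role, table)
    return f"{len(rows)} rows summarized from {table}"


store = GovernedStore(
    policies={"hr_data": {"hr_analyst"}},
    tables={"hr_data": [{"id": 1}, {"id": 2}]},
)
print(summarize_in_place(store, "hr_analyst", "hr_data"))
```

An unauthorized role (say, `"intern"`) would raise `PermissionError` instead of leaking data, which is the essence of the governance controls described above.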
DQ: What sets Snowflake apart in the adoption of generative AI?
Vijayant Rai: Snowflake's unique approach lies in our cloud-based platform, which focuses solely on data, AI, and generative AI. This holistic focus offers several advantages. We provide a fully cloud-based platform that handles data storage, compute resources, and security. Customers don’t need to worry about the technical infrastructure; we manage it all, allowing them to focus on their core business.
Our platform offers seamless scalability. For example, during peak seasons, such as holidays for an e-commerce company, we ensure that all computing and storage needs are met without the customer needing to manage it. Because we manage all infrastructure elements, customers can quickly derive value from our platform, achieving business outcomes faster.
Our platform is a single product that integrates various capabilities, including generative AI and machine learning. Customers can simply turn on these features without needing to buy and integrate separate products. This comprehensive, customer-focused approach makes Snowflake uniquely differentiated in the generative AI space.
DQ: What is Snowflake's vision for growth in India, and what are your expectations for the Indian market?
Vijayant Rai: India represents a significant growth opportunity for Snowflake, given its strong and steadily growing economy. The potential here is massive, as much of India's data is still on-premises and has not yet moved to the cloud. This presents a huge opportunity for a cloud-focused data platform like ours.
The total addressable market across all verticals, including financial services, manufacturing, IT companies, digital natives, and the public sector, remains vast. We are focused on scaling and expanding across these verticals by building specialized teams for each sector, such as financial services, digital natives, and the public sector, over the next 12 to 18 months.
We are also targeting the SME and mid-market segments, which play a crucial role in driving India's GDP. Our commercial team is being scaled up to enhance our reach and support in these areas. Overall, we aim to build stronger customer partnerships by expanding our organization and investing in our technical teams to demonstrate the value of data and how our platform can be utilized effectively.
Additionally, we are focusing on developing a robust partner ecosystem. We collaborate with top Indian and global service integrators, advisory firms, and local partners to deliver the full value of our platform.