Confluent has unveiled a set of new features for its Confluent Cloud for Apache Flink, designed to simplify the development of real-time AI applications. The enhancements include Flink Native Inference, Flink Search, and built-in machine learning (ML) functions, offering developers a more streamlined experience in deploying and managing AI models within a single platform.
The announcement, made on 19 March in Singapore, highlights Confluent's continued efforts to bring powerful AI capabilities to organisations without requiring deep technical expertise or fragmented tools.
Shaun Clowes, Chief Product Officer at Confluent, said, "Building real-time AI applications has been too complex for too long, requiring a maze of tools and deep expertise just to get started. With the latest advancements in Confluent Cloud for Apache Flink, we're breaking down those barriers, bringing AI-powered streaming intelligence within reach of any team. What once required a patchwork of technologies can now be done seamlessly within our platform, with enterprise-level security and cost efficiencies baked in."
Addressing the complexity of real-time AI
According to research from McKinsey, 92% of businesses plan to increase their AI investments in the next three years. However, the journey to building AI applications remains challenging. Many developers currently navigate an ecosystem filled with multiple tools, programming languages, and interfaces to deploy ML models and extract context from various data sources. This complexity often leads to inefficiencies, operational delays, and unreliable AI outputs, sometimes referred to as "AI hallucinations."
To address this, Confluent's updated Flink features are focused on three core areas: inference, search, and accessibility.
Flink Native Inference allows development teams to run open-source AI models directly within Confluent Cloud, removing the need to manage separate model-serving infrastructure. Because inference takes place within the platform, it adds another layer of data protection and lowers costs.
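In practice, this means a model can be registered and invoked from the same Flink SQL environment that processes the stream. The sketch below follows the general shape of Confluent's CREATE MODEL and ML_PREDICT statements for model inference; the option keys, model name, and table name are illustrative assumptions rather than exact product configuration.

```sql
-- Sketch only: model registration and inference from Flink SQL.
-- CREATE MODEL and ML_PREDICT reflect Confluent's model inference statements;
-- the option key and the table/model names are assumptions for illustration.
CREATE MODEL sentiment_model
  INPUT (review STRING)
  OUTPUT (label STRING)
  WITH (
    'task' = 'classification'   -- assumed option key
  );

-- Enrich a stream of reviews with model output, with no separate serving layer.
SELECT *
FROM product_reviews,
     LATERAL TABLE(ML_PREDICT('sentiment_model', review));
```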
Flink Search, another new addition, gives developers a single interface to access data across multiple vector databases like MongoDB, Elasticsearch, and Pinecone. These vector searches are essential in helping large language models (LLMs) provide accurate and relevant responses. By simplifying the process of retrieving context-rich data, Flink Search eliminates the need for complex ETL (Extract, Transform, Load) pipelines and manual data handling.
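The pattern this replaces is a hand-built pipeline that embeds a query, calls each vector store's own client, and joins the results back into the stream. A rough sketch of how that retrieval step could be expressed in Flink SQL is shown below; the connector options and the VECTOR_SEARCH call are illustrative placeholders, not confirmed Flink Search syntax.

```sql
-- Sketch only: the connector option and the VECTOR_SEARCH call are
-- illustrative placeholders, not confirmed Flink Search syntax.
CREATE TABLE product_docs (
  doc_id    STRING,
  chunk     STRING,
  embedding ARRAY<FLOAT>
) WITH (
  'connector' = 'mongodb'   -- assumed: an external vector index exposed as a table
);

-- Fetch the three most relevant chunks for each incoming question, ready to be
-- passed to an LLM as grounding context (the retrieval step of RAG).
SELECT q.question, r.chunk
FROM questions AS q,
     LATERAL TABLE(
       VECTOR_SEARCH(TABLE product_docs, 3, DESCRIPTOR(embedding), q.question_embedding)
     ) AS r;
```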
Making data science more accessible
The built-in ML functions in Flink SQL bring common AI use cases such as forecasting, anomaly detection, and real-time data visualisation into reach for users without advanced data science backgrounds. By embedding these features directly into SQL workflows, Confluent enables teams across industries to generate insights and improve decision-making faster.
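For example, anomaly detection on a sensor stream could be written as an ordinary SQL query rather than a separate data science job. In the sketch below, the function name and signature are assumptions based on the use cases Confluent describes, and the source table is hypothetical.

```sql
-- Sketch only: the ML_DETECT_ANOMALIES name and signature are assumptions
-- based on the anomaly-detection use case described above.
SELECT
  sensor_id,
  event_time,
  reading,
  ML_DETECT_ANOMALIES(reading) OVER (
    PARTITION BY sensor_id
    ORDER BY event_time
  ) AS is_anomaly
FROM sensor_readings;
```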
Commenting on the platform's impact, Steffen Hoellinger, Co-founder and CEO at Airy, said, "Confluent helps us accelerate copilot adoption for our customers, giving teams access to valuable real-time, organisational knowledge. Confluent's data streaming platform with Flink AI Model Inference simplified our tech stack by enabling us to work directly with large language models (LLMs) and vector databases for retrieval-augmented generation (RAG) and schema intelligence, providing real-time context for smarter AI agents. As a result, our customers have achieved greater productivity and improved workflows across their enterprise operations."
Stewart Bond, Vice President of Data Intelligence and Integration Software at IDC, added, "The ability to integrate real-time, contextualised, and trustworthy data into AI and ML models will give companies a competitive edge with AI. Organisations need to unify data processing and AI workflows for accurate predictions and LLM responses. Flink provides a single interface to orchestrate inference and vector search for RAG, and having it available in a cloud-native and fully managed implementation will make real-time analytics and AI more accessible and applicable to the future of generative AI and agentic AI."
More features available in early access
Confluent Cloud for Apache Flink remains the only serverless stream processing solution combining real-time and batch processing in a single platform. With the new AI, ML, and analytics capabilities, businesses can reduce operational overhead and simplify development processes further. These features are now available in an early access programme for current Confluent Cloud users.
Other enhancements in Confluent Cloud include Tableflow, Freight Clusters, Confluent for Visual Studio Code, and the Oracle XStream CDC Source Connector, providing teams with even more tools to manage and process real-time data.