Confluent Platform is a streaming platform built by the original creators of Apache Kafka. It enables organizations to organize and manage streaming data from various sources. Confluent launched its IPO in June this year and raised $828 million to further expand its business. Confluent Platform was brought to several public cloud vendor marketplaces last year as Confluent Cloud, and the offering is currently available in the Azure, AWS and GCP marketplaces. Furthermore, the company strengthened its partnership with Microsoft at the beginning of this year, establishing Confluent Cloud as a fully managed Apache Kafka service directly available on Microsoft Azure. Azure customers gain access to Confluent's extensive library of pre-built connectors, a unified billing model with options to apply Azure committed spend to Confluent Cloud, and deeper integrations with Azure services.
Organizations are moving toward real-time event streaming as data proliferates across various business types. The benefit of event-stream processing is that it can connect to various data sources; normalize, enrich and filter data; and automatically apply rules to the data to reveal patterns, relationships or trends in real time. This enables organizations to analyze data immediately after it is created. It also allows personnel to add contextual data from various sources to ensure proper interpretation of events and then apply real-time business logic and rules, or machine learning (ML), to trigger an action. We assert that by 2024, more than one-half of all organizations' standard information architectures will include streaming data and event processing, allowing organizations to be more responsive and provide better customer experiences.
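The normalize-enrich-filter-act pattern described above can be sketched in a few lines of plain Python. This is an illustrative sketch only, not Confluent's or Kafka's API; the event fields, reference table and threshold are all hypothetical, and in practice each stage would read from and write to Kafka topics.

```python
# Minimal event-stream pipeline sketch: normalize raw events, enrich them
# with contextual reference data, filter by a business rule, and yield the
# events that should trigger an action. All field names are hypothetical.

def normalize(event):
    # Standardize key casing and coerce the amount to a number.
    e = {k.lower(): v for k, v in event.items()}
    e["amount"] = float(e["amount"])
    return e

def enrich(event, regions):
    # Add contextual data from a reference source (here, a lookup table).
    event["region"] = regions.get(event["store_id"], "unknown")
    return event

def high_value(event, threshold=100.0):
    # A real-time business rule: flag transactions above a threshold.
    return event["amount"] >= threshold

def process(stream, regions):
    # Apply the stages to each event as it arrives and emit matches.
    for raw in stream:
        event = enrich(normalize(raw), regions)
        if high_value(event):
            yield event

events = [
    {"Store_Id": "s1", "Amount": "250.0"},
    {"Store_Id": "s2", "Amount": "20.0"},
]
alerts = list(process(events, {"s1": "EMEA"}))
print(alerts)  # only the s1 event passes the rule, enriched with its region
```

In a production deployment the same stages would typically be expressed with a stream-processing layer such as Kafka Streams or ksqlDB rather than hand-rolled loops.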
Confluent recently announced the general availability of Confluent for Kubernetes 2.0.0, a cloud-native control plane for deploying and managing Confluent in a private cloud environment. It offers a standard interface to customize, deploy and manage Confluent Platform through a declarative API. Confluent for Kubernetes can detect process failures and restart processes automatically, minimizing the risk of business disruption and limiting data loss. The platform can also automatically generate configurations, schedule and run new broker processes, and ensure data is balanced across brokers so that clusters can be utilized efficiently. With the ability to spin Confluent clusters up and down, organizations can control infrastructure costs while quickly scaling to meet business needs.
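The self-healing behavior described above rests on the reconciliation pattern common to Kubernetes control planes: continuously compare the declared desired state against the observed state and emit corrective actions. The following is a conceptual sketch of that loop in plain Python, not Confluent's actual controller code; the broker names and state shapes are hypothetical.

```python
# Conceptual reconcile loop: given a declared desired state and the
# currently observed state, compute the actions that close the gap.
# A real controller would run this repeatedly against cluster state.

def reconcile(desired, observed):
    """Return (action, broker) pairs that move observed state toward desired."""
    actions = []
    # Start brokers that are declared but not running (e.g. after a failure).
    for broker in desired["brokers"]:
        if broker not in observed["running"]:
            actions.append(("start_broker", broker))
    # Stop brokers that are running but no longer declared (scale-down).
    for broker in observed["running"]:
        if broker not in desired["brokers"]:
            actions.append(("stop_broker", broker))
    return actions

desired = {"brokers": ["kafka-0", "kafka-1", "kafka-2"]}
observed = {"running": ["kafka-0", "kafka-2"]}  # kafka-1 has failed
print(reconcile(desired, observed))  # [('start_broker', 'kafka-1')]
```

Because the operator only ever acts on the difference between the two states, the same loop handles failure recovery and scaling with no imperative runbooks.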
Confluent also announced the final release of Project Metamorphosis, which enables organizations to improve Kafka scalability in the cloud and adds elasticity that automates the scaling process. It also announced the release of infinite retention in Confluent Cloud, which creates a centralized platform for all current and historic event streams with limitless storage and retention. Confluent Hub provides a single marketplace with more than 120 pre-built connectors for Kafka. Confluent, together with its partners, also offers support for connectors to systems including Amazon S3, MongoDB, Elasticsearch, MQTT, Salesforce and Azure Data Lake. Many of these connectors are also available as fully managed services in Confluent Cloud, along with marketplace integrations on AWS, Azure and Google Cloud.
Event-streaming technology brings myriad benefits for the new era of big data. Organizations often require real-time data access across a shared data pool, within both multi-cloud and on-premises infrastructures. Kafka can deliver the low latency required for streaming data across distributed applications and infrastructures. With Confluent for Kubernetes, organizations can deploy and operate Confluent Platform in a private datacenter and connect the data from their cloud environments to Confluent Cloud. I recommend that organizations looking to process multiple streams of data in real time evaluate the capabilities of Confluent.