Confluent, Inc. has announced several new capabilities for Confluent Cloud, including expanded role-based access control (RBAC), new Metrics API insights, and a fully managed Oracle CDC Source Connector.
“Every company is in a race to transform their business and take advantage of the simplicity of cloud computing,” said Ganesh Srinivasan, Chief Product Officer, Confluent. “However, migrating to the cloud often comes with tradeoffs on security, monitoring insights, and uptime guarantees. With this launch, we make it possible to achieve those fundamental requirements without added complexity, so organizations can start innovating in the cloud faster.”
Data security is paramount in any organisation, especially when migrating to public clouds. To operate efficiently and securely, organisations need to ensure the right people have access to only the right data. However, controlling access to sensitive data all the way down to individual Apache Kafka topics takes significant time and resources because of the complex scripts needed to manually set permissions.
Last year, Confluent introduced RBAC for Confluent Cloud, enabling customers to streamline this process for critical resources like production environments, sensitive clusters, and billing details, making role-based permissions as simple as clicking a button. With today’s launch, RBAC now covers access to individual Kafka resources, including topics, consumer groups, and transactional IDs. Organisations can set clear roles and responsibilities for administrators, operators, and developers, so each can access only the data specifically required for their jobs across both the data and control planes.
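As a rough illustration of what topic-level RBAC looks like in practice, a role binding can be granted from the Confluent CLI. This is a sketch only: the principal, environment and cluster IDs, and topic name below are placeholders, and the exact flags may vary by CLI version, so the official CLI reference should be consulted.

```shell
# Grant a developer read-only access to a single topic.
# All IDs here are placeholders, not real resources.
confluent iam rbac role-binding create \
  --principal User:u-ab1234 \
  --role DeveloperRead \
  --environment env-xyz123 \
  --cloud-cluster lkc-abc456 \
  --resource Topic:orders
```

Bindings of the same shape can scope roles to consumer groups or transactional IDs instead of topics.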
“Here at Neon we have many teams with different roles and business contexts that use Confluent Cloud,” said Thiago Pereira de Souza, Senior IT Engineer, Neon. “Therefore, our environment needs a high level of security to support these different needs. With RBAC, we can isolate data for access only to people who really need to access it, and operations to people who really need to do it. The result is more security and less chance of failure and data leakage, reducing the scope of action.”
Expanded Confluent Cloud Metrics API delivers enterprise-wide observability to optimise data streaming performance across the entire business
Businesses need a strong understanding of their IT stack to effectively deliver high-quality services their customers demand while efficiently managing operating costs. The Confluent Cloud Metrics API already provides the easiest and fastest way for customers to understand their usage and performance across the platform. Today, Confluent is introducing two new insights for even greater visibility into data streaming deployments, alongside an expansion to our third-party monitoring integrations to ensure these critical metrics are available wherever they are needed:
- Customers can now easily understand organisational usage of data streams across their business and sub-divisions to see where and how resources are used. This capability is particularly important for enterprises that are expanding their use of data streaming and need to manage internal chargebacks by business unit. Additionally, it helps teams identify where resources are over- or underutilised, down to the level of an individual user, to optimise resource allocation and improve cost savings.
- New capabilities for consumer lag monitoring help organisations ensure their mission-critical services are always meeting customer expectations. With real-time insights, customers are able to identify hotspots in their data pipelines and can easily identify where resources need to be scaled to avoid an incident before it occurs. Additionally, with records exposed as a time series, teams are equipped to make informed decisions based upon deep historical context when setting or adjusting SLOs.
- A new, first-class integration with Grafana Cloud gives customers deep visibility into Confluent Cloud from within the monitoring tool they already use. Along with recently announced integrations, this update allows businesses to monitor their data streams directly alongside the rest of their technology stack through their service of choice.
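Consumer lag itself is simple to reason about: for each partition, it is the gap between the log end offset and the consumer group’s last committed offset. The minimal sketch below (function and topic names are ours, not Confluent’s) shows the arithmetic behind the new lag insights; in a real deployment the offsets would come from the cluster and the group’s commits rather than literals.

```python
# Sketch: per-partition consumer lag, the metric surfaced by the new
# consumer-lag monitoring. Keys are (topic, partition) tuples.
def consumer_lag(end_offsets, committed_offsets):
    """Return lag per partition: how far the group trails the log end.

    A partition with no committed offset is treated as fully behind
    (lag measured from offset 0).
    """
    return {
        tp: max(end - committed_offsets.get(tp, 0), 0)
        for tp, end in end_offsets.items()
    }

def total_lag(end_offsets, committed_offsets):
    """Aggregate lag across partitions -- a simple hotspot indicator."""
    return sum(consumer_lag(end_offsets, committed_offsets).values())
```

Tracking these values as a time series is what lets teams set SLOs against historical context rather than a single snapshot.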
To enable easy and cost-effective integration of more data from high-value systems, Confluent’s Premium Source Connector for Oracle Change Data Capture (CDC) is now available for Confluent Cloud. The fully managed connector enables users to capture valuable change events from an Oracle database and see them in real time within Confluent’s leading cloud-native Kafka service without any operational overhead.
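To give a sense of how little operational work a fully managed connector involves, the JSON below sketches the general shape of a source connector configuration. The property names and values are illustrative placeholders based on typical Kafka Connect conventions, not the connector’s documented settings, which should be taken from Confluent’s documentation.

```json
{
  "name": "oracle-cdc-orders",
  "connector.class": "OracleCdcSource",
  "oracle.server": "db.example.com",
  "oracle.port": "1521",
  "oracle.sid": "ORCL",
  "oracle.username": "cdc_user",
  "oracle.password": "********",
  "table.inclusion.regex": "ORCL\\.SALES\\.ORDERS",
  "kafka.api.key": "<api-key>",
  "kafka.api.secret": "<api-secret>",
  "tasks.max": "1"
}
```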
“ksqlDB made it super easy to get started with stream processing thanks to its simple, intuitive SQL syntax,” said Jeffrey Jennings, Vice President of Data and Integration Services, ACERTUS. “By easily accessing and enriching data in real-time with Confluent, we can provide the business with immediately actionable insights in a timely, consistent, and cost-effective manner across multiple teams and environments, rather than waiting to process in silos across downstream systems and applications. Plus, with the Stream Processing Use Case recipes, we will be able to leverage ready-to-go code samples to jumpstart new real-time initiatives for the business.”
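The “simple, intuitive SQL syntax” described above looks like the following sketch, which enriches a stream of orders with customer reference data. The topics and columns are invented for illustration; only the general CREATE STREAM / JOIN pattern reflects ksqlDB itself.

```sql
-- Declare a stream and a table over existing Kafka topics
-- (topic and column names are hypothetical).
CREATE STREAM orders (order_id VARCHAR, customer_id VARCHAR, amount DOUBLE)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');

CREATE TABLE customers (customer_id VARCHAR PRIMARY KEY, region VARCHAR)
  WITH (KAFKA_TOPIC='customers', VALUE_FORMAT='JSON');

-- Continuously enrich each order with the customer's region.
CREATE STREAM enriched_orders AS
  SELECT o.order_id, o.amount, c.region
  FROM orders o
  LEFT JOIN customers c ON o.customer_id = c.customer_id;
```

The final statement runs as a persistent query, writing enriched records to a new topic as events arrive rather than in downstream batch jobs.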
Read the latest edition of PCR’s monthly magazine here: