
Confluent platform update targets developer choice, security

The vendor's latest platform update follows its recent acquisition of WarpStream and adds new security features, along with support for the Table API, giving Apache Flink users new options.

Data streaming specialist Confluent on Tuesday unveiled its latest platform update, including new security capabilities and support for the Table API that makes the Apache Flink platform accessible to Java and Python developers.

The release, which includes generally available features as well as some in preview, closely follows Confluent's Sept. 9 acquisition of WarpStream, another streaming data vendor.

Based in Mountain View, Calif., Confluent develops a streaming data platform built on Apache Kafka, an open source technology developed by Confluent co-founders Jay Kreps, Neha Narkhede and Jun Rao when they were working at LinkedIn. Kafka, which was first released in 2011, enables users to ingest and process data as it is produced in real time.
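For readers unfamiliar with Kafka, the minimal sketch below shows what ingesting and reading data as it is produced looks like in practice, using the open source confluent-kafka Python client. The broker address, topic name and payload are illustrative assumptions, not details from Confluent's announcement.

```python
# Minimal sketch of Kafka's produce/consume model with the confluent-kafka
# Python client. Broker address, topic and payload are assumptions.
from confluent_kafka import Producer, Consumer

# Publish one event to a topic as it happens.
producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("page-views", key="user-123", value='{"page": "/pricing"}')
producer.flush()  # block until the broker confirms delivery

# Read the event back as part of a consumer group.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "demo-readers",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["page-views"])
msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()
```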

Using Kafka as a foundation, Confluent offers Confluent Cloud as a managed service and Confluent Platform for on-premises users.

Apache Flink, meanwhile, launched in 2014 and is an open source stream processing framework similar to Confluent's proprietary platforms. Flink provides a compute layer that enables users to filter, combine and enrich data as it's produced and processed to foster real-time analysis.

Confluent unveiled support for Flink in March, giving users the option of running it as a managed service rather than operating it themselves.

New capabilities

Just as adding support for Flink gave Confluent users more choice as they build their streaming data infrastructure, support for the Table API -- now in open preview -- adds more choice to the Confluent platform while also opening it to a new set of potential users.

When Confluent first provided customers with Flink as an option, it did so with a SQL API that enabled developers to build data streams using SQL code. However, not all developers know SQL. And even among those who do, SQL may not be their preferred way to write pipelines.

The Table API, like the SQL API, enables Flink users to develop pipelines by writing code, but in Java or Python rather than SQL.
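As an illustration, the following is a minimal PyFlink sketch of the kind of pipeline the Table API describes: filtering and aggregating records with Python method calls instead of SQL text. The table, column names and local execution environment are assumptions for demonstration, not Confluent Cloud specifics.

```python
# Minimal sketch of a Flink Table API pipeline in Python (PyFlink), run
# locally for illustration; table, columns and values are assumptions.
from pyflink.table import EnvironmentSettings, TableEnvironment
from pyflink.table.expressions import col

table_env = TableEnvironment.create(EnvironmentSettings.in_batch_mode())

# A small in-memory table standing in for a stream of order events.
orders = table_env.from_elements(
    [("alice", 42.00), ("bob", 7.50), ("alice", 19.99)],
    ["customer", "amount"],
)

# Filter and aggregate with method calls instead of SQL statements.
result = (
    orders.filter(col("amount") > 10)
    .group_by(col("customer"))
    .select(col("customer"), col("amount").sum.alias("total_spend"))
)

result.execute().print()
```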

Choice is important as developers create environments for data management and analytics. It not only enables enterprises to avoid vendor lock-in but also lets them use the tools that best fit their needs for a given task or that users know best and prefer. Therefore, Confluent's addition of support for the Table API is a logical step for the vendor following its initial support for Flink, according to David Menninger, an analyst at ISG's Ventana Research.


"It will be significant to developers that would prefer to write code rather than SQL statements," he said. "In some cases, developers may not be very well versed in SQL. In some cases, it may just be a preference."

Beyond support for the Table API, Confluent's addition of new security features is important, according to Menninger.

Specifically, Confluent's platform now offers private networking support for Flink so users of private networks rather than public clouds can take advantage of Flink's capabilities. In addition, the platform now includes client-side field-level encryption, which enables customers to encrypt individual fields within data streams to ensure security and regulatory compliance.
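Conceptually, client-side field-level encryption means the producer encrypts a sensitive field before the record ever leaves the application. The sketch below illustrates that idea generically with the Python cryptography library and a Kafka producer; it is not Confluent's managed encryption API, and the key handling, topic and field names are assumptions.

```python
# Conceptual sketch of client-side field-level encryption: encrypt a
# sensitive field before producing the record. Generic illustration only,
# not Confluent's managed API; key handling here is deliberately naive.
import json
from cryptography.fernet import Fernet
from confluent_kafka import Producer

field_key = Fernet.generate_key()  # in practice, fetched from a KMS
cipher = Fernet(field_key)

record = {"order_id": 1001, "email": "jane@example.com", "amount": 59.90}
# Encrypt only the regulated field; the rest of the payload stays readable.
record["email"] = cipher.encrypt(record["email"].encode()).decode()

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed broker
producer.produce("orders", value=json.dumps(record))
producer.flush()
```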

Data volume is growing at an exponential rate, and so is the complexity of data. To keep sensitive information private, many organizations maintain hybrid data storage environments, with their less-regulated data stored in public clouds such as AWS and Azure and their more heavily regulated data, such as personally identifiable information, kept on premises or in private clouds.

By enabling customers to use Flink in private networks, Confluent opens its streaming data capabilities to organizations that previously may have been unable to use its platform due to security concerns.

Specific features of Confluent's private networking support for Flink, which is generally available on AWS for Confluent Enterprise users, include:

  • Safeguards for in-transit data, including a private network to provide secure connections between private clouds and Flink.
  • Simple configuration that enables users without extensive networking expertise to set up private connections between their private data storage environments and Flink.
  • Flexible data stream processing of Kafka clusters within the secure environment so that private cloud users can benefit from the same speed and efficiency as other Confluent users.

"It may not be very sexy, but new security features including private networking and client-side field-level encryption will be welcomed additions," Menninger said. "Enterprises have a heightened focus on governance, compliance and security. The lack of these capabilities may, in fact, have prevented certain organizations from using Flink previously."

Confluent's impetus for including support for the Table API and the new security features -- along with an extension for the Visual Studio Code development platform -- came from a combination of customer interactions and observation of market trends, according to Jean-Sébastien Brunner, Confluent's director of product management.

Confluent maintains a feedback loop with its users and takes information gathered from that feedback into account when deciding what to add in any given platform update, he said.

In addition, the vendor pays close attention to industry trends to make sure its tools are consistent with those being offered by competing platforms such as Cloudera, Aiven and streaming data tools from tech giants such as AWS, Google Cloud and Microsoft.

Finally, with its roots in the open source community, a focal point for Confluent is making sure that technologies such as Kafka and Flink are accessible and easy to use.

"We look at several signals," he said.

While Confluent's platform update aims to meet customer needs and respond to industry trends, the vendor's acquisition of WarpStream was designed to expand Confluent's reach within an enterprise's data stack by adding new applications for its platform, according to Kreps, Confluent's CEO.

Confluent, which was founded in 2014, offers managed and self-managed deployments that fit certain companies. WarpStream provides different capabilities, such as a bring-your-own-cloud (BYOC) architecture that enables users to deploy the streaming data platform in their own cloud accounts rather than the vendor's.

In a sense, BYOC is similar to Confluent's private networking support for Flink. However, as a native architecture, it is a foundation rather than an add-on.

"Our goal is to make data streaming the central nervous system of every company," Kreps said. "To do that we need to make it something that is a great fit for a vast array of use cases and companies. The big thing they did that got our attention was their next-generation approach to BYOC architectures."

Once integrated, WarpStream's BYOC capabilities should help Confluent accomplish its aim of providing customers with more deployment options, according to Menninger.

He noted that some vendors offer a managed cloud service or a self-managed option that can be run in the cloud. Other vendors that are more mature offer both. Both options have benefits and drawbacks. For example, managed cloud versions reduce management burdens but can be expensive. Self-managed versions can be less expensive but require more labor.

WarpStream provides a third choice.

"WarpStream offers an option in between," Menninger said. "Enterprises can offload some of the management and administrative responsibilities, but not all of them."


Plans

As Confluent plots future platform updates, adding security and networking capabilities to ensure regulatory compliance remains a focus, according to Brunner. So does enabling customers to connect to external sources to better foster real-time analysis and insights.

"We remain focused on helping our customers get insights faster by making data accessible once it's generated," Brunner said.

Menninger, meanwhile, suggested that Confluent could further meet the needs of customers by enabling them to more easily combine streaming data with data at rest.

While streaming data is an imperative for real-time decision-making, it can have broader applications when used together with data at rest. For example, as enterprises increasingly develop generative AI tools, streaming data could be used to keep models current.

However, despite potential real-world applications for streaming data and data at rest being used together, the two are too often kept separate, according to Menninger. Therefore, anything vendors such as Confluent can do to bring streaming data together with data at rest would be beneficial.

"The worlds of streaming data and data at rest are coming closer together, but they are still largely separate worlds that can be integrated or co-exist," Menninger said. "I'd like to see Confluent and others create a more unified platform across both streaming data and data at rest."

Eric Avidon is a senior news writer for TechTarget Editorial and a journalist with more than 25 years of experience. He covers analytics and data management.
