A new Getting Started Experience with Kafka & Conduktor

Big Update! Try Kafka in a couple of minutes and dive into Conduktor's unique features for Kafka developers and organizations.

Stuart Mould

Dec 18, 2023

We're happy to announce that we have massively improved the Conduktor experience in our latest release! Check out our changelog to read all the nitty-gritty details. Keep reading to learn how to enjoy this new experience.

Let’s dive into what’s new!

New Kafka Getting Started Experience

Our Getting Started installation provides an embedded Kafka cluster, continuously fed with real-time data. Beyond Kafka exploration and troubleshooting assistance, it also helps you set up data encryption and data masking, protect the cluster from bad topic configuration (safeguarding it), and use other advanced features not available in classic Kafka. Better yet, connect to your own Kafka and see for yourself.

To accompany the latest release of Conduktor and the new Getting Started experience, I'll share with you:

  • How to quickly explore your Kafka

  • How to encrypt your topic data securely

  • How to partially reveal/hide topic data with data masking

  • How to protect your Kafka infrastructure from misbehaving clients

  • How to quickly adjust this setup to see the benefits on your own Kafka!

Copy this command and drop it into your terminal to get started:

# Start Conduktor in seconds, powered with Redpanda
curl -L https://releases.conduktor.io/quick-start -o docker-compose.yml \
  && docker compose up -d --wait

This command runs a Docker Compose file, as Conduktor is just a bunch of Docker containers. You'll need to install Docker; my preference is Homebrew for those on Mac! When our Docker Compose is running, you'll get everything: a Kafka cluster, a data generator, and Conduktor! Once running, visit localhost:8080 to set up your profile and get started.
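If you want to double-check that everything came up before opening the UI, here's a quick sketch (assuming you run it from the directory containing the downloaded docker-compose.yml):

# List the containers started by the quick-start compose file
docker compose ps

# Open the Console in your browser (macOS; use xdg-open on Linux)
open http://localhost:8080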

After login, you'll be greeted by the home screen where you can quickly navigate through common options: connect your cluster, manage groups & permissions with RBAC, or jump straight to all the topics. Let's start by seeing how to explore Kafka data.

Exploring Kafka topics

As mentioned, the embedded Kafka is continuously fed by a data generator of ours, which produces sample ecommerce data to a few topics: customers, products, purchases, and returns. If you're observant, you'll notice that the customers topic also comes in encrypted and masked versions. We'll delve into those shortly.
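If you prefer the command line over the UI, you can also peek at these sample topics directly. This is just a sketch: it assumes you have the Kafka CLI tools installed and that the embedded broker is exposed on localhost:9092 (check the downloaded docker-compose.yml for the actual port mapping):

# List the topics created by the data generator
# (localhost:9092 is an assumption; adjust to your exposed broker port)
kafka-topics.sh --bootstrap-server localhost:9092 --list

# Peek at a few records from the customers topic
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic customers --from-beginning --max-messages 5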

Below is a view of the topic customers_masked. If you open a record, you'll notice you cannot see all the values in plain text:

From the sidebar, you can quickly access the core parts of Kafka: Schema Registry, Consumer Groups, Kafka Connectors, or zoom into the topics, where there is a wealth of features to explore your data.

You should be able to navigate through all these topics on your getting started cluster (named cdk-gateway):

The Magic to Encrypt and Mask data

Now that you're familiar with exploring topics, let's delve into the magic of data encryption and decryption.

In our example, the data within the 'customers' topics is considered sensitive and thus requires encryption.

  • Switch your view to the Kafka cluster 'local-kafka' (using the cluster selector at the top left). Here, you'll find the 'customers_encrypted' topic. This contains encrypted data that's unreadable to you. You might recall seeing it decrypted in the first cluster, 'cdk-gateway,' and wonder why. Yes, you're right – let's unravel this mystery!

  • Return to the 'cdk-gateway' cluster. Here, you can view the data in plain text. This decryption happens seamlessly because you have the necessary authorization. Enhanced by Conduktor, this Kafka cluster ensures that data are securely encrypted at rest in your Kafka. 'cdk-gateway' provides a view of what your user is permitted to see in your Kafka.

Upon exploring the 'customers_masked' topic, you'll observe that certain fields are masked, making their full values unreadable. This feature is crucial for protecting PII data from being visible to unauthorized users in your organization. This topic, too, has been enhanced by Conduktor. The original cluster retains the non-masked, original values.

The magic here is powered by what we call Conduktor interceptors. Let's delve deeper into these interceptors by navigating to the Gateway Interceptors menu, as illustrated below:

Out of the box, we have added a few interceptors for you to get a sense of this new experience:

  • Encrypt on Produce: we encrypt data produced by the data generator in some topics

  • Decrypt on Consume: we decrypt the data when you consume data

  • Field level Data Masking: we mask parts of the data when you consume data

On the Gateway Interceptors Page, we can see each of these security plugins and their configuration.

For instance, click on the masking interceptor to see its configuration:

  • See the topics on which it is applied

  • See which fields are fully masked: creditCardNumber and email

  • See the various masking strategies, like masking the first 9 characters of phone

These interceptors add a layer of security and new features to your Kafka without modifying your Kafka cluster, duplicating data, or changing your applications.

Let's see how to protect your Kafka cluster from misbehaving or faulty Kafka clients! This is a common problem we see in the field, as Kafka producers/consumers come with a lot of configuration options that can be misused.

Advanced Kafka Protection: A Guide to Safeguard Interceptors

Conduktor is compatible with all the Kafka providers (open source, Confluent, AWS MSK, Aiven, Redpanda, etc.) as we speak the general Kafka protocol.

In our getting started cluster, we have added a few interceptors to protect the Kafka cluster from misbehaving clients (for good or bad reasons). Let's take a look at what is protected and how you can configure these protections.

On the Gateway Interceptors Page, you can see the following plugins:

  • Guard on Produce: only allow producers if they are using compression.type

  • Guard create Topics: topics starting with project- must have between 1 and 3 partitions

Try for yourself:

  • Create a topic project-xxx with 4 partitions. This will result in an error, as sketched below.
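If you'd rather attempt this from the CLI, here's a sketch of the same request. The bootstrap address below is a placeholder for the Gateway listener (check your docker-compose.yml for the actual port), and project-xxx is just an example topic name:

# Try to create a topic that violates the partition policy
# (localhost:6969 is a placeholder for the Gateway listener address)
kafka-topics.sh --bootstrap-server localhost:6969 \
  --create --topic project-xxx --partitions 4 --replication-factor 1

# Expected: the Gateway rejects the request with a policy violation error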

We've set the interceptor to BLOCK such actions. Alternatively, you can choose to ALLOW & LOG requests, facilitating a smoother migration. Additionally, there's the option to ALTER the request, tailoring it to meet specific criteria while overriding the client's input (use this feature cautiously).

It's important to note that this behavior isn't limited to the Conduktor Console, which is merely a client in this scenario. The same error will occur if you attempt these actions with your own applications or through GitOps.

There are many more protective measures available, and a vast array of Kafka properties can be safeguarded. Visit the Marketplace to explore all the available options.

Your Business, Your Own Interceptors

You will have your own criteria and rules to govern your Kafka ecosystem. Let's have a go now at creating an example policy for topic creation.

First, delve into the configuration of the existing guard-create-project-topics interceptor. You can find this by inspecting the relevant interceptor in the list. Once located, copy its configuration directly from the text editor. This configuration will serve as the foundation for your new interceptor.

Now, back to the main list, click on + Interceptor at the top right, search 'create' and locate the 'Create Topic Policy' plugin.

  • Paste the configuration you copied earlier

  • Change the interceptor name (to not conflict with the existing one)

  • Modify the configuration to suit your requirements. Let's say you want to apply a global configuration for all topics: remove the "topic" field

  • Click on Deploy Interceptor to activate it

Now, no one will be able to create a topic with more than 3 partitions!
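For reference, the configuration you end up with might look roughly like the sketch below. The key names are assumptions carried over from the guard-create-project-topics configuration you copied, not a documented schema, so trust the embedded editor over this example:

{
  "numPartition": { "min": 1, "max": 3, "action": "BLOCK" }
}

With no "topic" field, the policy applies globally, and the BLOCK action is what turns a violating request into an error.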

Need some creative inspiration? Delve into the embedded documentation for further examples of configurations. Alternatively, explore our range of demos for practical insights: Conduktor Gateway Demos.

No more topics with 5000 partitions or consumers suffering data loss because they did not set acks=all!
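As a reminder of what "good" client settings look like on the producer side, here's a minimal sketch using the standard console producer; the bootstrap address and topic name are placeholders:

# Produce with safe settings: acks=all and compression enabled
# (localhost:6969 and my-topic are placeholders for your Gateway address and topic)
kafka-console-producer.sh --bootstrap-server localhost:6969 \
  --topic my-topic \
  --producer-property acks=all \
  --producer-property compression.type=gzip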

How do you Protect your own Kafka clusters?

You might be eager to delve into a Kafka environment that's more familiar to you: your own. Linking to your personal Kafka cluster is both quick and straightforward. Starting from the Home screen, follow the wizard Connect your Kafka clusters -> or select 'Manage Clusters' from the cluster dropdown menu.

Once you're in the 'Manage Clusters' panel, follow the intuitive on-screen instructions to add your cluster configuration. You can also refer to one of our handy guides for step-by-step assistance. With your cluster connected, you're all set to explore your Kafka topics and delve deeper into your Kafka infrastructure.

To fully leverage the interceptors on your own Kafka cluster, a Gateway deployment connected to your cluster is required. For a streamlined setup, refer to our comprehensive Docker deployment documentation. For more extensive deployment options, explore our detailed guide on other deployment methods (LB, SNI).

Behind the Code: Uncovering Our Engineering Achievements

In discussing how we enhance your Kafka experience with features like encryption, data masking, and cluster protection, you might wonder about the mechanics behind it and the role of the cdk-gateway cluster, especially when switching from the local-kafka cluster.

Conduktor operates through two primary components:

  • a user-friendly UI that connects directly to your clusters like any standard application

  • a Kafka proxy strategically positioned between your applications and your Kafka.

When you connect to your own Kafka, you gain all the advantages offered by the UI. To extend these benefits across all your applications, you connect them to the Conduktor Gateway.

This proxy is designed to envelop your Kafka clusters, is fully Kafka-wire compatible, and does not require any changes to your applications (it works whether they use Java, Spring, Python, or even the Kafka command line!). Simply update bootstrap.servers to point to a Gateway, and continue working normally.
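Concretely, switching an existing client to the Gateway is a one-line change. Here's a sketch with the console producer, where gateway.internal:6969 is a placeholder for your actual Gateway address:

# Same client, same topic: only the bootstrap address changes
# (gateway.internal:6969 is a placeholder for your Gateway listener)
kafka-console-producer.sh --bootstrap-server gateway.internal:6969 --topic customers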

Conclusion

In summary, we've explored the substantial upgrades in Conduktor's latest release, showcasing how it enhances any Kafka experience. From an easy start with an embedded Kafka cluster to advanced features like data encryption and masking, we've covered many capabilities that are often mandatory in organizations.

If you haven't done it yet, Get Started with Conduktor. If you are ready to get serious about Kafka, book a chat with us.