Set up a Kafka Connect Cluster in IBM Cloud Pak for Integration.

Mohamed Hassan
4 min read · Mar 13, 2021

Learning objectives

In this tutorial, you will learn how to set up a Kafka Connect cluster in IBM Cloud Pak for Integration.

Prerequisites

To complete this tutorial, you need:

· A Red Hat OpenShift cluster in your preferred cloud. In this tutorial, I’m using Red Hat OpenShift on IBM Cloud.

· Cloud Pak for Integration in your preferred cloud. In this tutorial, I’m using Cloud Pak for Integration on IBM Cloud.

· An Event Streams cluster. (To set up your cluster, please refer to this tutorial.)

· The OpenShift CLI (oc).

Steps

If you already have the kafka-connect-s2i.yaml file, you can skip steps 1 to 3.

1. In the Event Streams UI, click Find more in the Toolbox tile.

2. Scroll to the Set up a Kafka Connect environment tile and click Set up.

3. Click the Download Kafka Connect ZIP tile, then download the compressed file and extract its contents to your preferred location.

4. Enter your Connect cluster name in the downloaded kafka-connect-s2i.yaml file.

5. Update the bootstrap servers with the address of your configured listener. You can refer to the Generate Kafka credentials to connect to your cluster tutorial.
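
A hedged sketch of how the top of the edited kafka-connect-s2i.yaml could look after steps 4 and 5 (keep the apiVersion and kind from your downloaded file; the bootstrap address shown here is a placeholder, so use the listener address from your own cluster):

apiVersion: eventstreams.ibm.com/v1beta1
kind: KafkaConnectS2I
metadata:
  name: demo-connect-cluster    # step 4: your Connect cluster name
spec:
  replicas: 1
  bootstrapServers: demo-min-prod-kafka-bootstrap.my-namespace.svc:9093    # step 5: your listener address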

6. Enter the Event Streams version that you used in your Event Streams cluster installation. I used the default version, 10.1.0, in my installation. If you changed it, edit the version to match your cluster’s version.

7. Enter the container name by editing productChargedContainers in the YAML file. The name consists of the cluster name that you used in step 4 concatenated with “-connect”. I used “demo-connect-cluster” as the cluster name, so my container name is “demo-connect-cluster-connect”.

8. Enter your Cloud Pak for Integration version. In this tutorial, I use 2020.3.1.
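
Steps 6 to 8 correspond to licensing annotations in the metadata section of the same file. As an illustrative sketch (the exact annotation keys come from your downloaded template and may differ between releases), the edited values could look like this:

metadata:
  annotations:
    productVersion: "10.1.0"                                  # step 6: Event Streams version
    productChargedContainers: demo-connect-cluster-connect    # step 7: cluster name + "-connect"
    cloudpakVersion: "2020.3.1"                               # step 8: Cloud Pak for Integration version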

9. To establish a secure connection between Kafka Connect and Event Streams, enter the name of the secret that contains your CA certificate. The CA certificate is added as a secret to the namespace/project in which you installed your Event Streams cluster. The secret name consists of the Event Streams cluster name concatenated with “-cluster-ca-cert”. I used “demo-min-prod” as the cluster name, so my secret is “demo-min-prod-cluster-ca-cert”.

You can verify that the certificate exists by running the following OpenShift command:

oc get secret YOUR_SECRET_NAME -n YOUR_NAMESPACE
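
In the YAML file, the CA secret is referenced under the tls section. A minimal sketch, assuming the standard trustedCertificates structure and that the certificate key inside the secret is ca.crt:

spec:
  tls:
    trustedCertificates:
      - secretName: demo-min-prod-cluster-ca-cert    # step 9: <Event Streams cluster name>-cluster-ca-cert
        certificate: ca.crt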

10. Enter the username that you previously created, or create a new one. You can refer to the Generate Kafka credentials to connect to your cluster tutorial to create a new user.

11. Event Streams generates a password for your user and persists it in an OpenShift secret. The secret has the same name as the username that you used. You can refer to the Generate Kafka credentials to connect to your cluster tutorial to retrieve your existing password or create a new user.

12. To reference the password in the secret, change the value of the password property to “password”.
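
Steps 10 to 12 together map to the authentication section of the file. A sketch, assuming SCRAM-SHA-512 credentials and a hypothetical username my-connect-user (replace it with the user you created):

spec:
  authentication:
    type: scram-sha-512
    username: my-connect-user        # step 10: your Kafka user
    passwordSecret:
      secretName: my-connect-user    # step 11: the secret has the same name as the user
      password: password             # step 12: key inside the secret that holds the password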

13. Now you are ready to set up Kafka Connect in OpenShift. Run the following command to create your Kafka Connect cluster:

oc apply -f kafka-connect-s2i.yaml -n YOUR_NAMESPACE
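
Because this is the Source-to-Image (S2I) variant of Kafka Connect, creating the resource also triggers an OpenShift build for the Connect image. Optionally, you can watch the build progress with a standard OpenShift command such as:

oc get builds -n YOUR_NAMESPACE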

14. You can verify that the Connect cluster was created by running the following command. The Connect cluster name that you created should be listed in the result.

oc get kafkaconnects2i YOUR_CLUSTER_NAME -n YOUR_NAMESPACE

The result will be similar to the following:

NAME                   DESIRED REPLICAS
demo-connect-cluster   1

15. You can verify the Connect pod status by running the following command:

oc get pods -l eventstreams.ibm.com/name=YOUR_CONTAINER_NAME -n YOUR_NAMESPACE

YOUR_CONTAINER_NAME is the name that you used in step 7 of this tutorial.

The result will be similar to the following:

NAME                                   READY   STATUS    RESTARTS   AGE
demo-connect-cluster-connect-1-j9z9b   1/1     Running   0          3h41m

Your Kafka Connect cluster has now been set up and is ready to be used.
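
As an optional sanity check, you can call the Kafka Connect REST API and list the connectors; a freshly created cluster should return an empty list ([]). This assumes the default Connect REST service, which is typically named after your Connect cluster with a “-connect-api” suffix and listens on port 8083:

oc port-forward service/demo-connect-cluster-connect-api 8083:8083 -n YOUR_NAMESPACE
curl http://localhost:8083/connectors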

Next: To learn how to deploy and configure the IBM Cloudant Kafka connector for IBM Event Streams, you can refer to this tutorial.
