Confluent Platform: Connecting SAP HANA to Kafka

Introduction

Today we deal with a great deal of real-time streaming data and events, coming from many different sources, from which we can derive statistics, facts and information that are crucial to modern businesses.

Processing and storing these real-time streams in a scalable and fault-tolerant manner is now possible using Kafka as an Event Streaming Platform and KsqlDB as an Event Streaming Database. These technologies have made Event Stream Processing Applications thrive.

I have planned a series of articles explaining and demonstrating how to connect the following data sources to the Kafka cluster using Kafka connectors:

In this article, I will be focusing on setting up the SAP Source Connector using a containerized SAP HANA and Confluent Platform.

Having Kafka linked to SAP will allow us to store real-time SAP streams in a Kafka topic and to process them using the event streaming database KsqlDB.

The principles behind Kafka and KsqlDB have been explained in my previous articles:

In this demonstration, I will be using:

  1. Docker in order to run the Kafka cluster.
  2. Confluent Platform in order to have the necessary command-line tools.

Requirements

First, you need to have Docker installed. To install Docker on Ubuntu, follow these steps:

  1. Install Docker Engine
  2. Install Docker Compose
  3. Complete the post-installation steps

Then, you need to install Confluent Platform. It is an enterprise-grade event streaming platform built around Apache Kafka that allows you to build real-time data streaming applications.

After that, add the Confluent binaries to your PATH as follows:

Add the following to /etc/profile:
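A minimal sketch of those lines, assuming Confluent Platform was unpacked to /opt/confluent (adjust the path to wherever you installed it):

    export CONFLUENT_HOME=/opt/confluent    # assumption: install location
    export PATH=$PATH:$CONFLUENT_HOME/bin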

Then reload it:

    source /etc/profile

The Confluent Hub Client is bundled with Confluent Platform. It allows you to install the different connectors used to connect KsqlDB to your data sources.

Here are the connectors that you need to install for this series of demos. However, for this demonstration, we will install the SAP Kafka Source Connector manually. During installation, select the second option: 2. / (installed rpm/deb package). The connectors will be installed in /usr/share/confluent-hub-components.
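For reference, installing a connector through the Confluent Hub Client follows this pattern (the owner/name below is a placeholder, not one of the connectors used in this series):

    confluent-hub install <owner>/<connector-name>:latest
    # when prompted for the installation directory, choose:
    #   2. / (installed rpm/deb package)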

SAP Kafka Source Connector setup

First, you need to have a Java JRE installed.
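On Ubuntu, one way to install a JRE (an assumption on my part; any recent JRE should work) is:

    sudo apt-get update
    sudo apt-get install -y default-jre
    java -version    # confirm the installation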

Then, go to the SAP Development Tools website and download the following:

Then run the following:
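Assuming you downloaded the Linux x86_64 client archive, the steps look roughly like this (the archive name is a placeholder, and the installer's default target is ~/sap/hdbclient):

    tar -xvzf hanaclient-*-linux-x64.tar.gz
    cd client
    ./hdbinst    # accept the default installation path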

You will have all the files extracted to ~/sap/hdbclient/ . In this directory you will find ngdbc.jar, the SAP HANA JDBC driver.

Next, you need to set up the Kafka Source Connector for SAP.
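A sketch of that manual setup, assuming the connector is SAP's open-source kafka-connect-sap project and that Git and Maven are available (the build output name and target directory are assumptions to verify against the project's README):

    git clone https://github.com/SAP/kafka-connect-sap.git
    cd kafka-connect-sap
    mvn clean install -DskipTests
    # copy the built connector jar and the JDBC driver into the Connect plugin path
    sudo mkdir -p /usr/share/confluent-hub-components/kafka-connect-sap
    sudo cp target/*.jar /usr/share/confluent-hub-components/kafka-connect-sap/
    sudo cp ~/sap/hdbclient/ngdbc.jar /usr/share/confluent-hub-components/kafka-connect-sap/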

Confluent Platform along with SAP using Docker

First, create the /opt/docker-compose directory and put the following docker-compose.yml file in it (a trimmed sketch of the relevant services appears after the notes below).

  • Make sure you have a Docker Hub account and run docker login with your credentials before running docker-compose up -d, so that the SAP HANA image can be pulled.
  • Create a data/hana/ directory (mkdir data/hana/) inside /opt/docker-compose to persist SAP HANA data outside of the container. This directory is mounted as ./data/hana:/hana/mounts/ in the volumes section of the hana container inside docker-compose.yml, as shown in the sketch below.
  • Make sure you grant the right permissions to data/hana, as shown below.
  • The installed connectors will also be mounted into the connect container inside docker-compose.yml, as shown below.
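The snippets referenced in the notes above are not reproduced here, so the following is a trimmed sketch of what they describe. Image tags, the exposed port and the password file are assumptions on my part, and a complete docker-compose.yml also needs the usual zookeeper, broker, schema-registry, ksqldb and control-center services. For the permissions note, SAP's Docker instructions rely on the hxeadm user and sapsys group IDs:

    cd /opt/docker-compose
    mkdir -p data/hana
    sudo chown 12000:79 data/hana    # 12000 = hxeadm, 79 = sapsys

    # docker-compose.yml (fragment, sketch only)
    services:
      hana:
        image: saplabs/hanaexpress:latest          # pulling this image requires docker login
        container_name: hana
        hostname: hana
        ports:
          - "39041:39041"                          # HXE tenant SQL port (assumption)
        volumes:
          - ./data/hana:/hana/mounts/              # persist HANA data on the host
        command: --passwords-url file:///hana/mounts/password.json --agree-to-sap-license
      connect:
        image: confluentinc/cp-kafka-connect:latest   # tag is an assumption
        container_name: connect
        volumes:
          - /usr/share/confluent-hub-components:/usr/share/confluent-hub-components
        environment:
          CONNECT_PLUGIN_PATH: /usr/share/confluent-hub-components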

Demonstration

First, run docker-compose:
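From the directory holding docker-compose.yml:

    cd /opt/docker-compose
    docker-compose up -d
    docker-compose ps    # all services should eventually show as Up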

Then run the following command to check SAP HANA's logs:
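Assuming the container is named hana as in the compose sketch above:

    docker logs -f hana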

Once done, access the hana container to confirm that the SAP HANA installation was successful:
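For example:

    docker exec -it hana bash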

Then, inside the hana container, run the following:
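The original command is missing here; a simple check that matches the expected result below is:

    whoami
    # expected output: hxeadm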

You should be logged in as hxeadm, the default SAP HANA, express edition user.
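To list the running HANA services, the hxeadm user can use the standard HDB utility:

    HDB info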

You should also see the following SAP HANA services running:

  • hdbnameserver
  • hdbcompileserver
  • hdbdiserver
  • hdbwebdispatcher

Before using the connector, you can verify that you can access your SAP database as follows:

Method 1: Inside the hana container

Inside the hana container, you can log into your tenant database with the following command:
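A typical invocation for the HXE tenant database (instance number 90 and the SYSTEM user are SAP HANA, express edition defaults; the password is the one defined in your password file):

    hdbsql -i 90 -d HXE -u SYSTEM -p <your-password>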

Then, run the following to check the available tables:
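For example, by querying the TABLES system view (a sketch; adjust the filter to the schema you are interested in):

    SELECT SCHEMA_NAME, TABLE_NAME FROM TABLES LIMIT 10;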

Run the following to display the content of the DUMMY table:
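DUMMY is the built-in one-row system table, so this is simply:

    SELECT * FROM DUMMY;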

Method 2: Outside the hana container

To connect to our SAP database from outside the container, we use the following command:
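Using the client installed earlier under ~/sap/hdbclient, and assuming the tenant SQL port 39041 is published to the host as in the compose sketch:

    ~/sap/hdbclient/hdbsql -n localhost:39041 -u SYSTEM -p <your-password>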

Once you have confirmed that you can successfully access your SAP database, run the SAP Kafka Connector as follows:

Method 1

  1. Access your Confluent Control Center:
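Control Center listens on port 9021 by default, so open:

    http://localhost:9021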

2. In the Connect section, click on "Upload connector", then upload the following connector configuration:
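The uploaded file itself is not reproduced here; below is a sketch of what such a configuration could look like, assuming SAP's kafka-connect-sap source connector and the compose setup sketched earlier (host, port, credentials and the source table are all assumptions to adapt to your environment):

    name=sap-hana-source
    connector.class=com.sap.kafka.connect.source.hana.HANASourceConnector
    tasks.max=1
    topics=test_topic_1
    connection.url=jdbc:sap://hana:39041/?databaseName=HXE
    connection.user=SYSTEM
    connection.password=<your-password>
    test_topic_1.table.name="SYSTEM"."DUMMY"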

You can check the connector's logs as follows:
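For example, assuming the Connect container is named connect:

    docker logs connect | grep -i hana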

You should have your connector up and running. Then, you can verify that the data has been successfully imported into the test_topic_1 topic.

3. Consume the imported data from the test_topic_1 topic as follows:
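A sketch using the console consumer from the local Confluent Platform installation (the broker port 9092 is an assumption; use kafka-avro-console-consumer with the Schema Registry URL instead if the connector writes Avro):

    kafka-console-consumer --bootstrap-server localhost:9092 \
      --topic test_topic_1 --from-beginning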

Method 2

Then, create the connector:
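I am assuming Method 2 drives the Kafka Connect REST API directly; with the configuration above saved as JSON ({"name": ..., "config": {...}}) in a hypothetical file named sap-hana-source.json, the call would be:

    curl -s -X POST -H "Content-Type: application/json" \
      --data @sap-hana-source.json \
      http://localhost:8083/connectors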

Finally, you can verify that everything worked correctly:
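For example, by checking the connector status through the REST API (using the hypothetical connector name from above) and consuming the topic again:

    curl -s http://localhost:8083/connectors/sap-hana-source/status
    kafka-console-consumer --bootstrap-server localhost:9092 \
      --topic test_topic_1 --from-beginning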

You can run the following commands to stop and clean up Docker:
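One way to do that:

    cd /opt/docker-compose
    docker-compose down        # stop and remove the containers and the network
    docker system prune -f     # optional: remove stopped containers, unused networks and dangling images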

References
