Produce/consume messages in KAFKA with SAP Netweaver using Java Connector – Part 3/3

This is the final part of my blog… now we are going to join the dots.

The funny thing about doing a proof of concept on AWS is that it’s like taking a taxi ride… every minute counts… 🙂

Let’s recap.

We now have an EC2 instance running in AWS with:

  • SAP NW backend
  • RFC destination setup
  • KAFKA setup, and we can produce and consume messages from the topic “my-kafka-topic”
  • SAP JCo Server setup and connection established with the SAP NW

Now, back to the fun part.

Produce a message from SAP to KAFKA

I copied the Java code from “” and put it inside the example provided by SAP.

(I will not take credit for this code.)

Compile and run the code

See code here

You need the following files:

Now, this file is a combination of the SAP example “” and some code I copied from here. I didn’t spend much time making the code pretty and neat; I just did what was necessary to make it work, so please don’t judge.

What did I change in


In the handleRequest method, I added the following:

String message = function.getImportParameterList().getString("REQUTEXT");

// Assign topicName to string variable
String topicName = "my-kafka-topic";

// create instance for properties to access producer configs
Properties props = new Properties();

// Assign localhost id
props.put("bootstrap.servers", "localhost:9092");

// Set acknowledgements for producer requests.
props.put("acks", "all");

// If the request fails, the producer can automatically retry
props.put("retries", 0);

// Specify buffer size in config
props.put("batch.size", 16384);

// Reduce the no of requests less than 0
props.put("", 1);

// The buffer.memory controls the total amount of memory available to the
// producer for buffering.
props.put("buffer.memory", 33554432);

props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

producer = new KafkaProducer<String, String>(props);
producer.send(new ProducerRecord<String, String>(topicName, message, message));

function.getExportParameterList().setValue("RESPTEXT", "Message sent successfully");
System.out.println("Message sent successfully");
// producer.close();

What the code does is take the text passed from the STFC_CONNECTION function module and call the Apache KAFKA producer API with it. It’s that simple.
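Most of the handler is really just producer configuration. As a minimal sketch, the settings can be isolated into a small helper so they can be inspected with the JDK alone, without a broker or the Kafka client jar on the classpath (the class name ProducerConfigSketch is mine, not from the original code):

```java
import java.util.Properties;

// Hypothetical helper isolating the producer settings used in the handler,
// so they can be sanity-checked without a running Kafka broker.
public class ProducerConfigSketch {

    public static Properties baseConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // broker from the KAFKA setup
        props.put("acks", "all");             // wait for full acknowledgement of each send
        props.put("retries", 0);              // do not retry failed sends automatically
        props.put("batch.size", 16384);       // max batch size in bytes per partition
        props.put("buffer.memory", 33554432); // total memory available for buffering records
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(baseConfig().getProperty("bootstrap.servers"));
    }
}
```

Keeping the configuration in one place like this also makes it easier to swap the broker address later, instead of hard-coding localhost inside handleRequest.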

Compile and run. Notice that I now need both the JCo and KAFKA libraries on the classpath.

export KAFKA_HEAP_OPTS="-Xmx512M -Xms256M"
javac -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/* *.java

Run the JCo Server

java -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/*:. StepByStepServer

Start KAFKA consumer

/opt/kafka/bin/ --bootstrap-server localhost:9092 --topic my-kafka-topic
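The script name in the command above was lost in formatting. Assuming a stock Kafka install under /opt/kafka, the console consumer that ships with Kafka would be invoked like this (the script name is my assumption about what the original command was):

```shell
# Assumed full command: Kafka's bundled console consumer
/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-kafka-topic
```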

Call RFC function module


Here’s a link to the YouTube video of the result

Consume a message using the Java connector client and call RFC

Now let’s try the other way. Someone changes a transaction outside SAP and publishes it to a KAFKA topic, and SAP wants to know about it and do something with it.

The flow is:

  1. Message is produced to KAFKA topic
  2. Java client (with Java Connector) consumes the message.
  3. Java client calls the SAP RFC.
  4. The SAP RFC does something with the message.
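To make the sequencing concrete, here is a stand-in sketch in plain Java: a BlockingQueue plays the role of the KAFKA topic, and a local method plays STFC_CONNECTION (which simply copies REQUTEXT into ECHOTEXT). None of this is the real Kafka or JCo API, just the shape of the flow:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Stand-in for the four-step flow above; no real Kafka or JCo involved.
public class FlowSketch {

    // Plays STFC_CONNECTION: the function module echoes REQUTEXT back as ECHOTEXT.
    public static String stfcConnectionEcho(String requtext) {
        return requtext;
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> topic = new ArrayBlockingQueue<>(10); // stands in for "my-kafka-topic"
        topic.put("Transaction changed outside SAP");               // 1. message produced to topic
        String message = topic.take();                              // 2. Java client consumes it
        String echo = stfcConnectionEcho(message);                  // 3./4. client calls the RFC
        System.out.println("ECHOTEXT: " + echo);
    }
}
```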

I copied the Java code from “” and put it inside the example provided by SAP.

(I will not take credit for this code.)

Setup and run the JCo Client

See code here

You need the following files:

Now let’s explain. This is a copy of the code I borrowed from here, as mentioned, combined with the code from the SAP example.

The code can already consume a message from the KAFKA topic “my-kafka-topic”; I take that message and call function STFC_CONNECTION in SAP with it.

The function will echo back the text showing it has successfully received it.

In the doWork method, which is called when a message is received, I’ve added the code to call function STFC_CONNECTION. What the code does should be straightforward.
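For reference, the JCo 3 calls inside doWork would look roughly like this. It is only a sketch: the destination name "ABAP_AS" is my assumption (use whatever destination was configured in the earlier parts), exception handling is omitted, and it needs sapjco3.jar on the classpath:

```java
// Sketch only: assumes a JCo destination named "ABAP_AS" has been configured.
JCoDestination destination = JCoDestinationManager.getDestination("ABAP_AS");
JCoFunction function = destination.getRepository().getFunction("STFC_CONNECTION");
if (function == null)
    throw new RuntimeException("STFC_CONNECTION not found in system");

function.getImportParameterList().setValue("REQUTEXT", message); // the consumed KAFKA message
function.execute(destination);

// STFC_CONNECTION echoes the text back, proving SAP received it
System.out.println("ECHOTEXT: " + function.getExportParameterList().getString("ECHOTEXT"));
System.out.println("RESPTEXT: " + function.getExportParameterList().getString("RESPTEXT"));
```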

Compile and run the Java client

javac -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/* -Xlint:deprecation *.java
java -cp ~/sapjco30/sapjco3.jar:/opt/kafka/libs/*:. SapKafkaConsumeDemo

Produce a message to KAFKA topic

/opt/kafka/bin/ --broker-list localhost:9092 --topic my-kafka-topic
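As with the consumer command, the script name above was lost in formatting. Assuming the stock scripts under /opt/kafka, Kafka’s bundled console producer would be invoked like this (the script name is my assumption about what the original command was):

```shell
# Assumed full command: Kafka's bundled console producer
/opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic my-kafka-topic
```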



Here’s a YouTube video.

In summary, it is possible, and not too difficult, to do.

What we now need to explore is how to productionize this solution, how to make it HA/DR, etc. I still have some unanswered questions about how this can handle massive volume in an enterprise environment, and whether the JCo server and client should run on a separate instance.

Anyway, thanks for your time, I hope you found this interesting. Leave me some comments below.