Analyzing ServiceNow processes using Process Mining: ServiceNow linked to Logpickr
Introduction
In today’s world, we deal with large volumes of real-time streaming data and events coming from many different sources, from which we can derive statistics, facts, and insights that are crucial for today’s businesses.
Processing and storing these real-time streams in a scalable and fault-tolerant manner is now possible using Kafka as an Event Streaming Platform and KsqlDB as an Event Streaming Database. These technologies have made Event Stream Processing Applications thrive.
In today’s article, I will discuss the benefits that the Logpickr Process Explorer 360 platform can bring to any business that uses ServiceNow as its task management platform, and the contribution of the Confluent Platform in such a context.
ServiceNow is a task management platform that manages several types of organizational processes.
Logpickr Process Explorer 360 is a platform that brings together Process Mining and Machine Learning technologies to help you move your business forward by analyzing, anticipating, and optimizing any type of incoming process and, most importantly, by predicting the future behavior of each analyzed process. This allows you to accelerate your digital transformation and your RPA automation projects.
Linking ServiceNow to Logpickr Process Explorer 360 gives any organization a better overview of its ServiceNow processes and a better understanding of their current and future behavior.
This data-driven decision making, provided by Logpickr Process Explorer 360, will greatly accelerate the organization’s digital transformation.
In order to realize such a combination, the Confluent Platform is used. It is an enterprise-grade event streaming platform built around Kafka that allows building real-time data streaming applications. Recently, Confluent has launched KsqlDB, an event streaming database built on top of Kafka for stream processing applications. This component plays a key role in processing the ServiceNow data that has been imported into a Kafka topic before it is sent to Logpickr Process Explorer 360 for further analysis.
Confluent Platform in Action
The following figure describes the architecture that has been set up by Logpickr to link ServiceNow to its Logpickr Process Explorer 360 platform.
Demonstration
In this demonstration I will be using:
- Docker in order to run the Kafka cluster.
- Confluent Platform in order to have the necessary commands to install connectors etc.
- ServiceNow account to get the demo processes.
- ServiceNow Kafka Source Connector.
Download Docker for Ubuntu
Download the Confluent Platform for Ubuntu
You need to install the Confluent Platform, which provides the Kafka distribution and the tooling used throughout this demo. Extract the downloaded archive:
tar -xzf confluent-5.5.1-2.12.tar.gz
After that, add Confluent to your PATH as follows:
sudo su
nano /etc/profile
Add the following lines:
export CONFLUENT_HOME=/home/ubuntu/confluent-5.5.1
export PATH=$PATH:$CONFLUENT_HOME/bin
Then reload the profile:
source /etc/profile
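You can quickly verify the setup (these commands only check that the variable is set and that the Confluent tools are reachable):
echo $CONFLUENT_HOME
which confluent-hub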
ServiceNow Source Connector
The Confluent Hub Client is integrated within the Confluent Platform. It allows you to install the different connectors that link Kafka and KsqlDB to your data sources.
confluent-hub
Here is the ServiceNow Source Connector that you need to install for this demo. During the installation, select the second option, 2. (installed rpm/deb package). The connectors will then be installed in /usr/share/confluent-hub-components.
# To Connect ServiceNow
confluent-hub install confluentinc/kafka-connect-servicenow:2.0.1
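You can verify the installation by listing the plugin directory mentioned above:
ls /usr/share/confluent-hub-components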
ServiceNow account
1. Go to the ServiceNow developer website and create an account.
2. Once done, create an instance and access it.
3. You can then run some API calls on the available demo tables. For this demo, I have chosen the “incident” table, which displays information about the incident processes.
When running the GET API call, ServiceNow provides you with your API URL:
GET https://devxxxx.service-now.com/api/now/v1/table/{tableName}
This is the URL that we will be using to connect ServiceNow to Kafka.
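Before wiring ServiceNow to Kafka, you can check that the endpoint responds with a quick curl call (sysparm_limit is a standard Table API parameter that limits the number of returned records; replace devxxxx and the credentials with your own):
curl -u 'xxxx:xxxxx' "https://devxxxx.service-now.com/api/now/v1/table/incident?sysparm_limit=1"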
Connecting ServiceNow to Kafka
First, create the /opt/docker-compose directory and put the following docker-compose.yml file in it:
sudo mkdir /opt/docker-compose
sudo nano /opt/docker-compose/docker-compose.yml
The installed connectors will also be mounted in the connect container inside docker-compose.yml as follows:
volumes:
- /usr/share/confluent-hub-components/:/usr/share/confluent-hub-components
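Since the exact file depends on your environment, here is a minimal sketch of what this docker-compose.yml can look like. It is based on Confluent’s cp-all-in-one example for version 5.5.1; image tags, ports, and settings are assumptions that you may need to adapt:
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.5.1
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  broker:
    image: confluentinc/cp-kafka:5.5.1
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      # Containers in the compose network reach the broker at broker:29092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1

  schema-registry:
    image: confluentinc/cp-schema-registry:5.5.1
    depends_on:
      - broker
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: broker:29092

  connect:
    image: confluentinc/cp-kafka-connect:5.5.1
    depends_on:
      - broker
      - schema-registry
    ports:
      - "8083:8083"
    volumes:
      # Expose the connectors installed with confluent-hub to the container
      - /usr/share/confluent-hub-components/:/usr/share/confluent-hub-components
    environment:
      CONNECT_BOOTSTRAP_SERVERS: broker:29092
      CONNECT_REST_ADVERTISED_HOST_NAME: connect
      CONNECT_GROUP_ID: connect-cluster
      CONNECT_CONFIG_STORAGE_TOPIC: _connect-configs
      CONNECT_OFFSET_STORAGE_TOPIC: _connect-offsets
      CONNECT_STATUS_STORAGE_TOPIC: _connect-status
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter
      CONNECT_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_PLUGIN_PATH: /usr/share/java,/usr/share/confluent-hub-components

  ksqldb-server:
    image: confluentinc/cp-ksqldb-server:5.5.1
    depends_on:
      - broker
      - connect
    ports:
      - "8088:8088"
    environment:
      KSQL_BOOTSTRAP_SERVERS: broker:29092
      KSQL_LISTENERS: http://0.0.0.0:8088
      KSQL_KSQL_SCHEMA_REGISTRY_URL: http://schema-registry:8081
      # Lets CREATE SOURCE CONNECTOR statements target the Connect cluster
      KSQL_KSQL_CONNECT_URL: http://connect:8083

  ksqldb-cli:
    image: confluentinc/cp-ksqldb-cli:5.5.1
    depends_on:
      - ksqldb-server
    entrypoint: /bin/sh
    tty: true

  control-center:
    image: confluentinc/cp-enterprise-control-center:5.5.1
    depends_on:
      - broker
      - connect
    ports:
      - "9021:9021"
    environment:
      CONTROL_CENTER_BOOTSTRAP_SERVERS: broker:29092
      CONTROL_CENTER_CONNECT_CLUSTER: http://connect:8083
      CONTROL_CENTER_REPLICATION_FACTOR: 1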
Run docker-compose to deploy the Kafka cluster provided by Confluent.
docker-compose up -d
Create the topic where the ServiceNow data will be imported and stored:
docker-compose exec broker kafka-topics --create --bootstrap-server \
broker:29092 --replication-factor 1 --partitions 1 --topic servicenow
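You can check that the topic now exists:
docker-compose exec broker kafka-topics --list --bootstrap-server broker:29092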
Method 1: Using the Connect Component
1. Access your Confluent Control Center:
http://localhost:9021
2. In the Connect section, click on Upload connector, then upload your connector configuration file.
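A configuration along these lines should work (a sketch that mirrors the KsqlDB connector definition from Method 2 below; the instance URL and credentials are placeholders to replace with your own):
name=ServiceNowSourceConnector
connector.class=io.confluent.connect.servicenow.ServiceNowSourceConnector
kafka.topic=servicenow
servicenow.url=https://devxxxx.service-now.com/
servicenow.table=incident
servicenow.user=xxxx
servicenow.password=xxxxx
servicenow.since=2019-01-01
tasks.max=1
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081
confluent.topic.bootstrap.servers=broker:29092
confluent.topic.replication.factor=1
poll.interval.s=10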
- servicenow.url is the API URL as described before.
- servicenow.table specifies the table on which you run the API GET call.
- kafka.topic is the topic created previously, in which the imported ServiceNow data will be stored for further processing.
- servicenow.user and servicenow.password are the credentials that allow you to access the instance on which you executed the API call previously.
You can check the connector’s status as follows:
curl -s -X GET http://127.0.0.1:8083/connectors/ServiceNowSourceConnector/status
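A healthy connector returns a response of roughly this shape (the standard Kafka Connect REST status payload; worker IDs and values will differ in your setup):
{"name":"ServiceNowSourceConnector","connector":{"state":"RUNNING","worker_id":"connect:8083"},"tasks":[{"id":0,"state":"RUNNING","worker_id":"connect:8083"}],"type":"source"}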
You should have your connector up and running. Then, you can verify that the data has been successfully imported into the servicenow topic.
3. Consume your imported data from the servicenow topic as follows:
docker-compose exec broker \
kafka-console-consumer --bootstrap-server broker:29092 --topic servicenow --from-beginning
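Note that the connector writes Avro-encoded records (it is configured with the AvroConverter), so kafka-console-consumer will print them in a raw, partly unreadable form. For decoded output you can use kafka-avro-console-consumer instead, run here from the schema-registry container that ships with the tool in the standard Confluent images (an assumption to adapt to your setup):
docker-compose exec schema-registry \
kafka-avro-console-consumer --bootstrap-server broker:29092 --topic servicenow \
--from-beginning --property schema.registry.url=http://schema-registry:8081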
Method 2: Using the KsqlDB Component
Start a KsqlDB CLI session:
docker-compose exec ksqldb-cli ksql http://ksqldb-server:8088
Then, create the connector:
CREATE SOURCE CONNECTOR ServiceNowSourceConnector WITH (
  'connector.class' = 'io.confluent.connect.servicenow.ServiceNowSourceConnector',
  'kafka.topic' = 'servicenow',
  'servicenow.url' = 'https://devxxx.service-now.com/',
  'servicenow.table' = 'incident',
  'servicenow.user' = 'xxxx',
  'servicenow.password' = 'xxxxx',
  'servicenow.since' = '2019-01-01',
  'tasks.max' = '1',
  'key.converter' = 'io.confluent.connect.avro.AvroConverter',
  'key.converter.schema.registry.url' = 'http://schema-registry:8081',
  'value.converter' = 'io.confluent.connect.avro.AvroConverter',
  'value.converter.schema.registry.url' = 'http://schema-registry:8081',
  'confluent.topic.bootstrap.servers' = 'broker:29092',
  'confluent.topic.replication.factor' = '1',
  'confluent.license' = '',
  'poll.interval.s' = '10'
);
Finally, you can verify that everything worked correctly:
show connectors;
describe connector SERVICENOWSOURCECONNECTOR;
show topics;
print 'servicenow' from beginning;
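At this point you can also pre-process the imported events in KsqlDB before they reach Logpickr Process Explorer 360. As a minimal sketch, declare a stream over the topic and inspect a few records (incident_events is a name chosen for this example; the Avro schema is fetched automatically from the Schema Registry):
CREATE STREAM incident_events WITH (KAFKA_TOPIC = 'servicenow', VALUE_FORMAT = 'AVRO');
SELECT * FROM incident_events EMIT CHANGES LIMIT 5;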
Once the data arrives in the servicenow topic, it is handed over to Logpickr Process Explorer 360 for further analysis. The following figure describes the obtained result.
You can run the following commands to stop and clean up Docker:
docker stop $(docker ps -a -q) && docker rm $(docker ps -a -q)
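Alternatively, since the cluster was started with docker-compose, you can tear down only this stack and its volumes (run from the /opt/docker-compose directory):
docker-compose down -v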