
Introduction

KVM stands for Kernel-based Virtual Machine and, according to Red Hat, it is an open-source virtualization technology built into Linux. Specifically, KVM lets you turn Linux into a hypervisor that allows a host machine to run multiple, isolated virtual environments called guests or virtual machines (VMs).

In today’s article, I will mainly cover the topic of backing up KVM VMs.

Back up your KVM VM

First, log in as a user with sudo privileges and list all of your available KVM virtual machines.
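
If your VMs are managed through libvirt, listing them typically looks like the following (a minimal sketch using virsh):

# List all defined VMs, whether they are running or not
sudo virsh list --all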

Next, you need to shut down the VM you want to back up.
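
For a libvirt-managed VM, a graceful shutdown can be requested with virsh; a minimal sketch, assuming a VM named vm1 (replace it with the name reported by virsh list):

# Ask the guest operating system to shut down cleanly
sudo virsh shutdown vm1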

Then run the following to verify that the…



What is DevOps

According to IBM, DevOps has become an increasingly common approach to software delivery that development and operations teams use to build, test, deploy, and monitor applications with speed, quality, and control.

DevOps is essential for any business aspiring to be lean, agile, and capable of responding rapidly to changing marketplace demands. It is an approach on the journey to lean and agile software delivery that promotes closer collaboration between lines of business, development, and IT operations while removing barriers between stakeholders and customers.

Development teams need to design, develop, deliver and run the software as quickly and reliably as possible…



Introduction

In my previous articles, I covered monitoring using Prometheus along with Grafana, and alerting using Alertmanager integrated with Prometheus.

In today’s article, I will mainly cover integrating the Postgres exporter with Prometheus in order to collect PostgreSQL database metrics for monitoring purposes.

Setting up the Postgres exporter

Download the Postgres Exporter Binary

First, you need to download the postgres_exporter binary tar archive and untar it as follows:

cd /opt/postgres_exporter
wget https://github.com/wrouesnel/postgres_exporter/releases/download/v0.5.1/postgres_exporter_v0.5.1_linux-amd64.tar.gz
tar -xzvf postgres_exporter_v0.5.1_linux-amd64.tar.gz
cd postgres_exporter_v0.5.1_linux-amd64
sudo cp postgres_exporter /usr/local/bin

Prepare the env File

Then, you need to prepare an env file for the postgres exporter where we will set a variable known as…
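
As an illustration, such an env file usually just holds the exporter’s connection string. The sketch below assumes the DATA_SOURCE_NAME variable documented by postgres_exporter, a local PostgreSQL instance, and placeholder credentials (the file path is hypothetical):

# /opt/postgres_exporter/postgres_exporter.env (hypothetical path)
# DATA_SOURCE_NAME is read by postgres_exporter at startup; user, password and database below are placeholders
DATA_SOURCE_NAME="postgresql://postgres_exporter:password@localhost:5432/postgres?sslmode=disable"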


Introduction

In today’s world we deal a lot with real-time streaming data and events that come from many different sources and from which we can derive stats, facts, and information that are crucial for today’s businesses.

Processing and storing these real-time streams in a scalable and fault-tolerant manner is now possible using Kafka as an Event Streaming Platform and KsqlDB as an Event Streaming Database. These technologies have made Event Stream Processing Applications thrive.

In today’s article I will discuss the huge benefits that the Logpickr Process Explorer 360 platform can bring to any kind of business that uses ServiceNow…


Introduction

The Splunk Source connector provided by Confluent does not support receiving data from a Splunk Universal Forwarder or Splunk Heavy Forwarder.

This connector simply listens on a network port for data from applications that would otherwise post it to Splunk.
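
To illustrate, an application that would normally post to a Splunk HTTP Event Collector can point at the connector’s listening port instead. A minimal sketch, assuming the connector listens on localhost port 8889 without TLS and with a placeholder token (the actual port and TLS settings depend on the connector configuration):

# Post one HEC-style JSON event to the port the connector listens on
curl http://localhost:8889/services/collector/event \
  -H "Authorization: Splunk 00000000-0000-0000-0000-000000000000" \
  -d '{"event": "hello from my app", "sourcetype": "demo"}'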

In this article, I will be focusing on setting up the Splunk Source Connector using a containerized Splunk and Confluent Platform.

For this demonstration, I will be using:

  1. Docker in order to run the Kafka cluster.
  2. Confluent Platform in order to have the necessary commands.

Requirements

First, you need to have Docker installed. To install Docker on Ubuntu, follow these steps:

  1. Install Docker Engine
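
For step 1, a common shortcut on Ubuntu is Docker’s convenience script (a sketch; the official apt repository method described in the Docker documentation works just as well):

# Download and run Docker's convenience install script
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# Optionally, allow the current user to run docker without sudo
sudo usermod -aG docker $USER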


Introduction

In today’s world we deal a lot with real-time streaming data and events that come from many different sources and from which we can derive stats, facts, and information that are crucial for today’s businesses.

Processing and storing these real-time streams in a scalable and fault-tolerant manner is now possible using Kafka as an Event Streaming Platform and KsqlDB as an Event Streaming Database. These technologies have made Event Stream Processing Applications thrive.

I have planned a series of articles that explain and demonstrate how to connect the following data sources to the Kafka cluster using Kafka connectors…



Introduction

With the increasing amount of company data, the task of protecting it becomes more challenging, especially in the cloud. As a result, the demand for reliable backup and recovery solutions has never been greater.

According to IBM, backup and restore refers to technologies and practices for making periodic copies of data to a separate, secondary device and then using those copies to recover critical company data when it is lost or damaged due to events such as a power outage, cyberattack, human error, or disaster.

The aim behind this article is to explain how to backup a…



Introduction

In my previous articles, I explained how to mount an Object Storage bucket as a file system on Flexible Engine’s ECS instances.

In this article, I will be using the s3fs-fuse project along with the Swift S3 API to mount an OVH Object Storage bucket on an OVH instance.
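
At a high level, the mount boils down to storing the S3 credentials in a password file and pointing s3fs at the bucket and the region’s S3 endpoint. A minimal sketch, where the bucket name, credentials, and endpoint URL are placeholders to adapt to your OVH project:

# Store the S3 access key and secret key for s3fs (placeholders)
echo "ACCESS_KEY:SECRET_KEY" > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# Create a mount point and mount the bucket (replace my-bucket and the endpoint URL for your region)
sudo mkdir -p /mnt/my-bucket
sudo s3fs my-bucket /mnt/my-bucket \
  -o passwd_file=~/.passwd-s3fs \
  -o url=https://s3.REGION.cloud.ovh.net \
  -o use_path_request_style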

What is Object Storage

Object Storage allows you to store a large amount of unstructured data in a highly scalable manner using REST APIs.

The benefits of using such a service are:

Durability: data is kept safe in the cloud.

Availability: data is available and…


Introduction

In today’s world we deal a lot with real-time streaming data and events that come from many different sources and from which we can derive stats, facts, and information that are crucial for today’s businesses.

Processing and storing these real-time streams in a scalable and fault-tolerant manner is now possible using Kafka as an Event Streaming Platform and KsqlDB as an Event Streaming Database. These technologies have made Event Stream Processing Applications thrive.

I explained the basics behind Apache Kafka and ZooKeeper in my previous article. However, in this article, I will be dealing with the concepts…



Introduction

According to Packer’s documentation, Packer is an open-source tool for creating identical machine images for multiple platforms from a single source configuration.

A machine image is a single static unit that contains a pre-configured operating system and installed software which is used to quickly create new running machines.

There are many advantages to using Packer:

  • Super-fast infrastructure deployment. Packer images allow you to launch completely provisioned and configured machines in seconds.
  • Multi-provider portability. Because Packer creates identical images for multiple platforms, you can use those images to run your machines on any cloud provider platform.
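
In practice, the workflow is typically to write a template, validate it, and then build the image. A minimal sketch of the commands, assuming a hypothetical HCL template named image.pkr.hcl:

# Install the plugins the template requires
packer init image.pkr.hcl

# Check the template for errors, then build the image
packer validate image.pkr.hcl
packer build image.pkr.hcl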
