Visualize Azure Activity logs on the Elastic-Kibana stack!

As cloud adoption grows, it has become more important than ever to monitor your cloud resources, the activity in your cloud accounts, and the events happening within your services.

Azure tracks all the events in your Azure account/subscription and publishes them to the Azure Activity Log service. As the number of events grows, it becomes really difficult to filter these logs and translate them into readable information.



This is where the Elasticsearch-Kibana stack makes life easy. All you need to do is stream these logs to Elasticsearch and then use Kibana to visualize them.

Architecture (Local ELK Stack – Elasticsearch-Logstash-Kibana)


The above architecture shows the ELK stack set up on a Linux or Windows VM in a public subnet.

Setting up Elasticsearch, Kibana and Logstash is not in the scope of this article. We will cover the setup in a separate article and link it here once it is ready. Let's get started.


Steps:

  • Log in to the instance where the Logstash service is running and add the following to the input section of the Logstash config file:

input {
  http {
    codec => "json"
    port => 8080
  }
}

The following configuration should be present in the output section to push the ingested data to Elasticsearch.

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "azure-%{+YYYY.MM}"
    document_type => "%{[@metadata][type]}"
  }
  stdout {}
}

This configuration enables the http input plugin to listen for requests on port 8080 and outputs the ingested data to the Elasticsearch endpoint. Make sure this port is open in the VM firewall's inbound rules as well as in the Network Security Group.
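Before wiring up anything in Azure, you can sanity-check the Logstash HTTP input by posting a hand-made event to it. Below is a minimal sketch using Python's standard library; the sample record fields and the localhost URL are assumptions for illustration, matching the config above.

```python
import json
import urllib.request

# Hypothetical sample event shaped like a single Azure Activity Log record.
sample_record = {
    "operationName": "Microsoft.Compute/virtualMachines/start/action",
    "level": "Informational",
}

def build_request(record, url="http://localhost:8080/"):
    """Build the same kind of POST the function app will send to Logstash."""
    body = json.dumps(record).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_request(sample_record)
    try:
        # Requires the Logstash http input from the config above to be listening.
        with urllib.request.urlopen(req, timeout=5) as resp:
            print("Logstash replied:", resp.status)
    except OSError as exc:
        print("Logstash not reachable:", exc)
```

If the pipeline is up, the posted record should appear in Elasticsearch a few seconds later.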

  • Start the Logstash service (it should keep running in the background).
  • Log in to the Azure Portal and open the Event Hubs service.
  • Click +Add to create a namespace. Enter the namespace details and click Create.


  • Once the namespace is created, click the +Event Hub option to create an event hub.


  • Open App Services and click +Add to create a Function App.



  • Select the newly created Function App and click + to create a function. Choose Custom function from the right pane.


  • Select the Event Hub trigger from the set of triggers and fill in the trigger values. Click New to select an Event Hub connection.


  • Select the connection values (the namespace and event hub created earlier) and click Create.


  • Now open the function, add the code below to the run.csx file, and save it.

using System;
using System.Net;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public static void Run(string myEventHubMessage, TraceWriter log)
{
    log.Info($"C# Event Hub trigger function processed a message: {myEventHubMessage}");

    // Parse the message and forward each Activity Log record to Logstash.
    JToken token = JObject.Parse(myEventHubMessage);
    string url = "http://<Logstash Server Public IP>:8080/";
    foreach (var record in token.SelectToken("records"))
    {
        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.ContentType] = "application/json";
            string result = client.UploadString(url, "POST", record.ToString());
        }
    }
}
  • The above C# code uses the Newtonsoft.Json library, which is not included in a Function App by default, so we need to reference it from our function. Add a file named project.json, paste the code below, and save it.
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Newtonsoft.Json": "10.0.3"
      }
    }
  }
}

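The heart of the function is simply: parse the Event Hub message, then forward each element of its records array. As a cross-check of that logic, here is a minimal Python sketch of the same record-splitting step; the sample message is an assumption shaped like what Activity Logs publishes to Event Hub.

```python
import json

def split_records(event_hub_message: str):
    """Parse an Activity Log message and return its individual records,
    mirroring token.SelectToken("records") in the C# function above."""
    token = json.loads(event_hub_message)
    return token.get("records", [])

# Hypothetical Event Hub message carrying two Activity Log records.
message = json.dumps({
    "records": [
        {"operationName": "Microsoft.Compute/virtualMachines/start/action"},
        {"operationName": "Microsoft.Storage/storageAccounts/write"},
    ]
})

records = split_records(message)
# Each record would then be POSTed to the Logstash endpoint one at a time.
```

Splitting before shipping means each record lands in Elasticsearch as its own document, which is what makes the Kibana filtering useful later.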
  • The function configuration is now complete. Now open the Azure Activity log service and click Export.
  • Select the subscription and regions, and configure the logs to be exported to the event hub created earlier.


The configuration is complete. Logs will start flowing from the Activity Log service –> Event Hub –> Function App –> Logstash –> Elasticsearch.
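Recall that the Logstash output used the index pattern azure-%{+YYYY.MM}, so the data is split into one index per month. A quick sketch of how an event's date maps to a concrete index name (the pattern is taken from the config above; the date is just an example):

```python
from datetime import datetime, timezone

def azure_index_name(when: datetime) -> str:
    """Render the Logstash sprintf pattern azure-%{+YYYY.MM} for a given date."""
    return when.strftime("azure-%Y.%m")

# An event ingested in March 2018 lands in:
print(azure_index_name(datetime(2018, 3, 15, tzinfo=timezone.utc)))  # azure-2018.03
```

This is the index name to look for when defining the Kibana index pattern (e.g. azure-*).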

Open Kibana and verify that the index has been created and data has started flowing in. Below are some sample dashboard screenshots showing important Azure events.




Thanks for checking out.

Please comment in case of any issues or questions.


