Logging to Elastic Cloud from MuleSoft Using the HTTP Appender

Mazhar Ansari
5 min read · May 11, 2020


Note: When copying code snippets from this blog, please convert the curly (smart) quotes to straight quotes.

What is ELK?

  • ELK is the acronym for three open source projects: Elasticsearch, Logstash, and Kibana
  • Elasticsearch is an open source, full-text search and analytics engine based on the Apache Lucene search engine
  • Logstash is a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a “stash” like Elasticsearch
  • Kibana lets users visualize the data in Elasticsearch with charts and graphs

Why do we need a system like ELK?

  • Log aggregation and efficient searching
  • Generic Search

There are three main reasons we need ELK

  • It’s Interoperable
  • It’s Open Source
  • It’s Managed

What is Elastic Cloud?

Elastic Cloud is Elastic’s managed ELK offering, hosted on AWS (among other cloud providers). It offers a few benefits:

  • On-demand computing
  • Pay only for what you use
  • Failover and fault tolerance
  • Common coding
  • Ease of implementation

Register with Elastic.co:

https://elastic.co
  • When you log in for the first time, it will ask you to create a default ELK deployment, which installs Elasticsearch and Kibana as shown below
Deployed Instances
  • It will also create a username and password for Elasticsearch. Please note them down, as they will be useful in later stages
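The username and password noted above are what go into the HTTP appender’s `Authorization` property later: the header value is simply `Basic` followed by the Base64 encoding of `username:password`. A minimal sketch of that encoding (the credentials here are placeholders, not real ones):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthHeader {
    // Builds the value for the HTTP appender's "Authorization" property:
    // "Basic " + base64("username:password")
    static String basicAuth(String username, String password) {
        String credentials = username + ":" + password;
        return "Basic " + Base64.getEncoder()
                .encodeToString(credentials.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        // "changeme" is a placeholder; use the password generated by Elastic Cloud
        System.out.println(basicAuth("elastic", "changeme"));
    }
}
```

This also makes it clear why the token should be treated as a secret: anyone can Base64-decode it back into the plain credentials.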

Enable Custom Logging for the On-Premises MuleSoft Runtime:

  • Go to Project
  • Open src/main/resources/log4j2.xml
  • Add the below XML tag inside Configuration/Appenders
<Http name="ELK-Cloud" url="https://bf1718d968f54023b4b4e75badd4fee2.us-east-1.aws.found.io:9243/applogs/_doc">
<Property name="Authorization" value="Basic ZWxhc3RpYzoxREd6b0RTMWdTNTdNYlJMbjJ5S2UyU1k=" />
<Property name="Content-Type" value="application/json" />
<PatternLayout pattern="%msg" />
</Http>
  • The url should follow the pattern https://<Elastic-Server-Name>:<Port>/<Index-Name>/_doc

Elastic-Server-Name: server name or IP address of the Elasticsearch server

Port: port number of the Elasticsearch server; the default value is 9243

Index-Name: the index name you want to create for these logs.
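The URL pattern above can be sketched as a small helper that assembles the pieces (the server name here is a hypothetical placeholder, not a real endpoint):

```java
public class ElasticUrlBuilder {
    // Assembles the HTTP appender URL from its parts:
    // https://<Elastic-Server-Name>:<Port>/<Index-Name>/_doc
    static String appenderUrl(String serverName, int port, String indexName) {
        return "https://" + serverName + ":" + port + "/" + indexName + "/_doc";
    }

    public static void main(String[] args) {
        // hypothetical server name; substitute your own Elastic Cloud endpoint
        System.out.println(appenderUrl("my-cluster.us-east-1.aws.found.io", 9243, "applogs"));
    }
}
```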

Then reference the appender under Configuration/Loggers:

<AppenderRef ref="ELK-Cloud" />
  • Run Application
  • See the Console Log
Console Output

Note: For this exercise I am using the JSON Logger for logging. Also make sure you use the PatternLayout as mentioned above.
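The JSON Logger matters here because the appender posts with Content-Type application/json and a PatternLayout of %msg, so the logged message itself must be a valid JSON document for Elasticsearch to index it. A sketch of the kind of payload involved (the field names are illustrative, not the exact JSON Logger schema):

```java
public class JsonLogMessage {
    // Builds a minimal JSON log entry of the kind the %msg pattern would emit
    // when the application logs structured JSON (field names are illustrative).
    static String jsonEntry(String timestamp, String priority, String message) {
        return String.format(
            "{\"timestamp\":\"%s\",\"priority\":\"%s\",\"message\":\"%s\"}",
            timestamp, priority, message);
    }

    public static void main(String[] args) {
        System.out.println(jsonEntry("2020-05-11T10:00:00Z", "INFO", "order processed"));
    }
}
```

If the message is plain text rather than JSON, Elasticsearch will reject the document, so plain loggers and this layout do not mix.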

Note: The Kibana and Elasticsearch server URLs are different. Please make sure you use the correct URLs.

Kibana Portal -> Management
  • Click on Create Index Pattern
  • You can see that a new index, applogs, has been created. Select it and click Next Step
Kibana Portal Index Creation Step 1
  • Click the drop-down, select @timestamp, and click Create Index Pattern
Kibana Portal Index Creation Step 2
  • Start Mule application
  • Run a few cases so the Mule application can generate logs
Console Output
Kibana Portal -> Discover
  • In the search bar, you can write any suitable expression to search for specific text in the logs
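Besides Kibana’s Discover view, the same index can be queried directly over Elasticsearch’s URI search API (GET /<index>/_search?q=<query>). A sketch that builds such a URL (the server name is a hypothetical placeholder):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class SearchUrlBuilder {
    // Builds an Elasticsearch URI-search URL of the form
    // https://<server>:<port>/<index>/_search?q=<query>
    static String searchUrl(String server, int port, String index, String query) {
        return "https://" + server + ":" + port + "/" + index
                + "/_search?q=" + URLEncoder.encode(query, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // hypothetical endpoint; substitute your own Elastic Cloud server
        System.out.println(searchUrl("my-cluster.us-east-1.aws.found.io", 9243,
                "applogs", "message:error"));
    }
}
```

The request would carry the same Basic Authorization header used by the appender.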

Enable Custom Logging For CloudHub Application:

Before enabling custom logging for a CloudHub application, you need to disable CloudHub logs. By default this option is not available; you need to raise a ticket with MuleSoft to have it enabled.

MuleSoft Application Folder Structure

Once you disable CloudHub logs, MuleSoft is not responsible for the following:

  • MuleSoft is not responsible for logging data lost due to misconfiguration of your own log4j appender.
  • MuleSoft is also not responsible for misconfigurations that result in performance degradation, running out of disk space, or other side effects.
  • When you disable the default CloudHub application logs, only the system logs are available. For application worker logs, check your own application’s logging system. Downloading logs is not an option in this scenario.
  • Only asynchronous log appenders can be used; synchronous appenders should not be used.
  • Use asynchronous loggers rather than synchronous ones to avoid threading issues; synchronous loggers can block threads while waiting for responses.
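The async-versus-sync point above can be illustrated with a tiny sketch (this is an illustration of the principle, not MuleSoft’s actual appender internals): an asynchronous appender enqueues the event and returns immediately, while a background worker performs the slow write.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class AsyncAppenderSketch {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    final List<String> shipped = new ArrayList<>();
    private final Thread worker;

    AsyncAppenderSketch() {
        worker = new Thread(() -> {
            try {
                String event;
                while (!(event = queue.take()).equals("__STOP__")) {
                    // a real appender would do a slow HTTP POST here;
                    // the calling thread is not blocked while this runs
                    shipped.add(event);
                }
            } catch (InterruptedException ignored) { }
        });
        worker.start();
    }

    // Returns immediately; the slow write happens on the worker thread.
    void append(String event) { queue.add(event); }

    // Signals the worker to finish and waits for it to drain the queue.
    void stop() {
        queue.add("__STOP__");
        try { worker.join(); } catch (InterruptedException ignored) { }
    }

    public static void main(String[] args) {
        AsyncAppenderSketch appender = new AsyncAppenderSketch();
        appender.append("event-1");
        appender.append("event-2");
        appender.stop();
        System.out.println(appender.shipped);  // [event-1, event-2]
    }
}
```

A synchronous appender would instead perform the network write inside `append`, making every log statement wait on the remote endpoint.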

You need to create log4j2.xml at location src/main/resources.

You need to add an Http appender in log4j2.xml. Provide the Elastic Cloud URL, the authorization token, the content type, the pattern layout, etc.

<Http name="ELK-Cloud" url="https://bf1718d968f54023b4b4e75badd4fee2.us-east-1.aws.found.io:9243/applogs/_doc">
<Property name="Authorization" value="Basic ZWxhc3RpYzoxREd6b0RTMWdTNTdNYlJMbjJ5S2UyU1k=" />
<Property name="Content-Type" value="application/json" />
<PatternLayout pattern="%msg" />
</Http>

You can add more appenders, like Log4J2CloudhubLogAppender, to your log4j2.xml to enable logging on the CloudHub log console of your application.

<Log4J2CloudhubLogAppender name="CLOUDHUB"
addressProvider="com.mulesoft.ch.logging.DefaultAggregatorAddressProvider"
applicationContext="com.mulesoft.ch.logging.DefaultApplicationContext"
appendRetryIntervalMs="${sys:logging.appendRetryInterval}"
appendMaxAttempts="${sys:logging.appendMaxAttempts}"
batchSendIntervalMs="${sys:logging.batchSendInterval}"
batchMaxRecords="${sys:logging.batchMaxRecords}"
memBufferMaxSize="${sys:logging.memBufferMaxSize}"
journalMaxWriteBatchSize="${sys:logging.journalMaxBatchSize}"
journalMaxFileSize="${sys:logging.journalMaxFileSize}"
clientMaxPacketSize="${sys:logging.clientMaxPacketSize}"
clientConnectTimeoutMs="${sys:logging.clientConnectTimeout}"
clientSocketTimeoutMs="${sys:logging.clientSocketTimeout}"
serverAddressPollIntervalMs="${sys:logging.serverAddressPollInterval}"
serverHeartbeatSendIntervalMs="${sys:logging.serverHeartbeatSendIntervalMs}"
statisticsPrintIntervalMs="${sys:logging.statisticsPrintIntervalMs}">
<PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n" />
</Log4J2CloudhubLogAppender>

Below is the full log4j2.xml, which can be used to enable custom logging on CloudHub along with the HTTP appender for Elastic Cloud.

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="INFO" name="cloudhub"
packages="com.mulesoft.ch.logging.appender,org.apache.logging.log4j">
<Appenders>
<Log4J2CloudhubLogAppender name="CLOUDHUB"
addressProvider="com.mulesoft.ch.logging.DefaultAggregatorAddressProvider"
applicationContext="com.mulesoft.ch.logging.DefaultApplicationContext"
appendRetryIntervalMs="${sys:logging.appendRetryInterval}"
appendMaxAttempts="${sys:logging.appendMaxAttempts}"
batchSendIntervalMs="${sys:logging.batchSendInterval}"
batchMaxRecords="${sys:logging.batchMaxRecords}"
memBufferMaxSize="${sys:logging.memBufferMaxSize}"
journalMaxWriteBatchSize="${sys:logging.journalMaxBatchSize}"
journalMaxFileSize="${sys:logging.journalMaxFileSize}"
clientMaxPacketSize="${sys:logging.clientMaxPacketSize}"
clientConnectTimeoutMs="${sys:logging.clientConnectTimeout}"
clientSocketTimeoutMs="${sys:logging.clientSocketTimeout}"
serverAddressPollIntervalMs="${sys:logging.serverAddressPollInterval}"
serverHeartbeatSendIntervalMs="${sys:logging.serverHeartbeatSendIntervalMs}"
statisticsPrintIntervalMs="${sys:logging.statisticsPrintIntervalMs}">
<PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n" />
</Log4J2CloudhubLogAppender>
<Http name="ELK-Cloud" url="https://bf1718d968f54023b4b4e75badd4fee2.us-east-1.aws.found.io:9243/applogs/_doc">
<Property name="Authorization" value="Basic ZWxhc3RpYzoxREd6b0RTMWdTNTdNYlJMbjJ5S2UyU1k=" />
<Property name="Content-Type" value="application/json" />
<PatternLayout pattern="%msg" />
</Http>
</Appenders>
<Loggers>
<AsyncLogger name="org.mule.runtime.core.internal.processor.LoggerMessageProcessor" level="INFO" />
<AsyncLogger name="com.mulesoft.agent" level="INFO" />
<AsyncRoot level="INFO">
<AppenderRef ref="CLOUDHUB" />
<AppenderRef ref="ELK-Cloud" />
</AsyncRoot>
<AsyncLogger name="com.gigaspaces" level="ERROR" />
<AsyncLogger name="com.j_spaces" level="ERROR" />
<AsyncLogger name="com.sun.jini" level="ERROR" />
<AsyncLogger name="net.jini" level="ERROR" />
<AsyncLogger name="org.apache" level="WARN" />
<AsyncLogger name="org.apache.cxf" level="WARN" />
<AsyncLogger name="org.springframework.beans.factory" level="WARN" />
<AsyncLogger name="org.mule" level="INFO" />
<AsyncLogger name="com.mulesoft" level="INFO" />
<AsyncLogger name="org.jetel" level="WARN" />
<AsyncLogger name="Tracking" level="WARN" />
<AsyncLogger name="org.mule.extensions.jms" level="INFO" />
<AsyncLogger name="org.mule.service.http.impl.service.HttpMessageLogger" level="INFO" />
<AsyncLogger name="org.mule.extension.salesforce" level="INFO" />
<AsyncLogger name="org.mule.extension.ftp" level="INFO" />
<AsyncLogger name="org.mule.extension.sftp" level="INFO" />
<AsyncLogger name="com.mulesoft.extension.ftps" level="INFO" />
<AsyncLogger name="org.mule.modules.sap" level="INFO" />
<AsyncLogger name="com.mulesoft.extension.mq" level="INFO" />
<AsyncLogger name="com.mulesoft.mq" level="INFO" />
<AsyncLogger name="org.mule.extension.db" level="INFO" />
<AsyncLogger name="httpclient.wire" level="DEBUG" />
<AsyncLogger name="org.mule.transport.email" level="DEBUG" />
</Loggers>
</Configuration>

This is how you can enable Elastic Cloud logging using the HTTP Appender for MuleSoft applications.

Written by Mazhar Ansari

I am a seasoned Integration Architect with 18+ years of experience. I have worked extensively on TIBCO and MuleSoft, mainly in EAI, ESB, SOA, API, and BPM projects.
