Logging using Elastic Cloud for MuleSoft using HTTP Appender
Note: When copying the code snippets from this blog, please convert any smart quotes (single and double) to straight quotes.
What is ELK?
- ELK is the acronym for three open source projects: Elasticsearch, Logstash, and Kibana
- Elasticsearch is an open source, full-text search and analytics engine based on the Apache Lucene search engine
- Logstash is a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" like Elasticsearch
- Kibana lets users visualize Elasticsearch data with charts and graphs
Why do we need a system like ELK?
- Log aggregation and efficient searching
- Generic search across systems
There are three main reasons to choose ELK:
- It's interoperable
- It's open source
- It's managed
What is Elastic Cloud?
Elastic Cloud is the managed ELK offering from Elastic, hosted on cloud providers such as AWS. It provides a few key benefits:
- On-demand computing
- Pay only for what you use
- Failover and fault tolerance
- Common coding
- Ease of implementation
Register with Elastic.co:
- Go to https://elastic.co/
- Click on Free Trial if you do not already have an account
- When you log in for the first time, it will ask you to create a default ELK deployment, which installs Elasticsearch and Kibana as shown below
- It will also generate a username and password for Elasticsearch; note them down, as they will be useful in later stages
Enable Custom Logging for On-Premises MuleSoft Runtime:
- Go to the project
- Open src/main/resources/log4j2.xml
- Add the below XML tag under Configuration/Appenders
<Http name="ELK-Cloud" url="https://bf1718d968f54023b4b4e75badd4fee2.us-east-1.aws.found.io:9243/applogs/_doc">
<Property name="Authorization" value="Basic ZWxhc3RpYzoxREd6b0RTMWdTNTdNYlJMbjJ5S2UyU1k=" />
<Property name="Content-Type" value="application/json" />
<PatternLayout pattern="%msg" />
</Http>
- The url should follow the pattern https://<Elastic-Server-Name>:<Port>/<Index-Name>/_doc
Elastic-Server-Name: Server name or IP address of the Elasticsearch server
Port: Port number of the Elasticsearch server; the default value is 9243
Index-Name: Index name you want to create for these logs
- Authorization is base64(username + ":" + password); you can also generate it using a Basic Authentication Header Generator
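If you prefer to generate the Authorization value yourself, a few lines of Python will do it. This is a minimal sketch; the username elastic and the password changeme below are placeholders — substitute the credentials from your own deployment:

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Build the value for the HTTP Authorization header: base64(username:password)."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

# Placeholder credentials -- substitute your own:
print(basic_auth_header("elastic", "changeme"))  # Basic ZWxhc3RpYzpjaGFuZ2VtZQ==
```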
- Add the below XML tag under Configuration/Loggers/AsyncRoot
<AppenderRef ref="ELK-Cloud" />
- Run the application
- Check the console log
Note: For this exercise, I am using JSON Logger for logging; also make sure you use the PatternLayout as mentioned above.
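Before relying on the appender, it can help to sanity-check the endpoint and credentials by reproducing what the HTTP appender does: POSTing one JSON log document to the index's _doc endpoint. The following sketch uses only the Python standard library; the URL and Authorization values are placeholders to be replaced with your own:

```python
import json
import urllib.request

def build_log_request(base_url: str, index: str, auth_header: str, doc: dict) -> urllib.request.Request:
    """Build the same POST the Log4j HTTP appender sends: one JSON document to <index>/_doc."""
    return urllib.request.Request(
        url=f"{base_url}/{index}/_doc",
        data=json.dumps(doc).encode("utf-8"),
        headers={"Authorization": auth_header, "Content-Type": "application/json"},
        method="POST",
    )

req = build_log_request(
    "https://your-elastic-server:9243",        # placeholder: your Elasticsearch endpoint
    "applogs",
    "Basic ZWxhc3RpYzpjaGFuZ2VtZQ==",          # placeholder: base64(username:password)
    {"message": "hello from MuleSoft", "level": "INFO"},
)
# To actually send it (requires a reachable cluster):
#   with urllib.request.urlopen(req) as resp:
#       print(resp.status)  # 201 means the document was indexed
```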
- Now go to Kibana (https://96c0390b651146119124148b6605cc3c.us-east-1.aws.found.io:9243/ or https://<Kibana-Server-Name>:<Port>) -> Management -> Index Patterns (Data Views)
- Kibana-Server-Name: Server name or IP address of the Kibana server
- Port: Port number of the Kibana server; the default value is 9243
Note: The Kibana and Elasticsearch server URLs are different. Please make sure you use the correct URLs.
- Click on Create Index Pattern
- You can see that a new index, applogs, has been created. Select it and click on Next Step
- Click on the drop-down, select @timestamp, and click on Create Index Pattern
- Start the Mule application
- Run a few test cases so the Mule application generates logs
- Go to Kibana (https://96c0390b651146119124148b6605cc3c.us-east-1.aws.found.io:9243/) -> Discover
- Select the index pattern created in the previous step
- In the search bar, you can write any suitable expression to find specific text in the logs
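The same free-text filtering Kibana performs can also be expressed directly against Elasticsearch's _search API with a query_string query, which accepts the same Lucene syntax as Kibana's search bar. A minimal sketch of building such a request body (the index name applogs and the field names are the ones used in this walkthrough):

```python
import json

def build_search_body(expression: str, size: int = 10) -> str:
    """Build an Elasticsearch _search request body using Lucene query-string
    syntax, e.g. 'message:error' or 'level:INFO AND message:order'."""
    return json.dumps({
        "size": size,
        "sort": [{"@timestamp": {"order": "desc"}}],
        "query": {"query_string": {"query": expression}},
    })

# POST this body to https://<Elastic-Server-Name>:9243/applogs/_search
print(build_search_body("message:error"))
```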
Enable Custom Logging for a CloudHub Application:
Before enabling custom logging for a CloudHub application, you need to disable the default CloudHub logs. By default this option is not available; you need to raise a ticket with MuleSoft to have it enabled.
Once you disable CloudHub logs, note the following:
- MuleSoft is not responsible for logging data lost due to misconfiguration of your own Log4j appender.
- MuleSoft is also not responsible for misconfigurations that result in performance degradation, running out of disk space, or other side effects.
- When you disable the default CloudHub application logs, only the system logs remain available. For application worker logs, check your own application's logging system. Downloading logs is not an option in this scenario.
- Only asynchronous log appenders can be used; synchronous appenders should not be used. Asynchronous loggers avoid threading issues, whereas synchronous loggers can block threads while waiting for responses.
You need to create log4j2.xml at src/main/resources.
Add an Http appender in log4j2.xml. Provide the URL to connect to Elastic Cloud, along with the Authorization header, content type, pattern layout, etc.
<Http name="ELK-Cloud" url="https://bf1718d968f54023b4b4e75badd4fee2.us-east-1.aws.found.io:9243/applogs/_doc">
<Property name="Authorization" value="Basic ZWxhc3RpYzoxREd6b0RTMWdTNTdNYlJMbjJ5S2UyU1k=" />
<Property name="Content-Type" value="application/json" />
<PatternLayout pattern="%msg" />
</Http>
You can add more appenders, like Log4J2CloudhubLogAppender, to your log4j2.xml to keep logging to the CloudHub log console of your application.
<Log4J2CloudhubLogAppender name="CLOUDHUB"
addressProvider="com.mulesoft.ch.logging.DefaultAggregatorAddressProvider"
applicationContext="com.mulesoft.ch.logging.DefaultApplicationContext"
appendRetryIntervalMs="${sys:logging.appendRetryInterval}"
appendMaxAttempts="${sys:logging.appendMaxAttempts}"
batchSendIntervalMs="${sys:logging.batchSendInterval}"
batchMaxRecords="${sys:logging.batchMaxRecords}" memBufferMaxSize="${sys:logging.memBufferMaxSize}"
journalMaxWriteBatchSize="${sys:logging.journalMaxBatchSize}"
journalMaxFileSize="${sys:logging.journalMaxFileSize}"
clientMaxPacketSize="${sys:logging.clientMaxPacketSize}"
clientConnectTimeoutMs="${sys:logging.clientConnectTimeout}"
clientSocketTimeoutMs="${sys:logging.clientSocketTimeout}"
serverAddressPollIntervalMs="${sys:logging.serverAddressPollInterval}"
serverHeartbeatSendIntervalMs="${sys:logging.serverHeartbeatSendIntervalMs}"
statisticsPrintIntervalMs="${sys:logging.statisticsPrintIntervalMs}">
<PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n" />
</Log4J2CloudhubLogAppender>
Below is the full log4j2.xml, which enables custom logging on CloudHub along with the HTTP appender for Elastic Cloud.
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="INFO" name="cloudhub"
packages="com.mulesoft.ch.logging.appender,org.apache.logging.log4j">
<Appenders>
<Log4J2CloudhubLogAppender name="CLOUDHUB"
addressProvider="com.mulesoft.ch.logging.DefaultAggregatorAddressProvider"
applicationContext="com.mulesoft.ch.logging.DefaultApplicationContext"
appendRetryIntervalMs="${sys:logging.appendRetryInterval}"
appendMaxAttempts="${sys:logging.appendMaxAttempts}"
batchSendIntervalMs="${sys:logging.batchSendInterval}"
batchMaxRecords="${sys:logging.batchMaxRecords}" memBufferMaxSize="${sys:logging.memBufferMaxSize}"
journalMaxWriteBatchSize="${sys:logging.journalMaxBatchSize}"
journalMaxFileSize="${sys:logging.journalMaxFileSize}"
clientMaxPacketSize="${sys:logging.clientMaxPacketSize}"
clientConnectTimeoutMs="${sys:logging.clientConnectTimeout}"
clientSocketTimeoutMs="${sys:logging.clientSocketTimeout}"
serverAddressPollIntervalMs="${sys:logging.serverAddressPollInterval}"
serverHeartbeatSendIntervalMs="${sys:logging.serverHeartbeatSendIntervalMs}"
statisticsPrintIntervalMs="${sys:logging.statisticsPrintIntervalMs}">
<PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n" />
</Log4J2CloudhubLogAppender>
<Http name="ELK-Cloud" url="https://bf1718d968f54023b4b4e75badd4fee2.us-east-1.aws.found.io:9243/applogs/_doc">
<Property name="Authorization" value="Basic ZWxhc3RpYzoxREd6b0RTMWdTNTdNYlJMbjJ5S2UyU1k=" />
<Property name="Content-Type" value="application/json" />
<PatternLayout pattern="%msg" />
</Http>
</Appenders>
<Loggers>
<AsyncLogger name="org.mule.runtime.core.internal.processor.LoggerMessageProcessor" level="INFO" />
<AsyncLogger name="com.mulesoft.agent" level="INFO" />
<AsyncRoot level="INFO">
<AppenderRef ref="CLOUDHUB" />
<AppenderRef ref="ELK-Cloud" />
</AsyncRoot>
<AsyncLogger name="com.gigaspaces" level="ERROR" />
<AsyncLogger name="com.j_spaces" level="ERROR" />
<AsyncLogger name="com.sun.jini" level="ERROR" />
<AsyncLogger name="net.jini" level="ERROR" />
<AsyncLogger name="org.apache" level="WARN" />
<AsyncLogger name="org.apache.cxf" level="WARN" />
<AsyncLogger name="org.springframework.beans.factory" level="WARN" />
<AsyncLogger name="org.mule" level="INFO" />
<AsyncLogger name="com.mulesoft" level="INFO" />
<AsyncLogger name="org.jetel" level="WARN" />
<AsyncLogger name="Tracking" level="WARN" />
<AsyncLogger name="org.mule.extensions.jms" level="INFO" />
<AsyncLogger name="org.mule.service.http.impl.service.HttpMessageLogger" level="INFO" />
<AsyncLogger name="org.mule.extension.salesforce" level="INFO" />
<AsyncLogger name="org.mule.extension.ftp" level="INFO" />
<AsyncLogger name="org.mule.extension.sftp" level="INFO" />
<AsyncLogger name="com.mulesoft.extension.ftps" level="INFO" />
<AsyncLogger name="org.mule.modules.sap" level="INFO" />
<AsyncLogger name="com.mulesoft.extension.mq" level="INFO" />
<AsyncLogger name="com.mulesoft.mq" level="INFO" />
<AsyncLogger name="org.mule.extension.db" level="INFO" />
<AsyncLogger name="httpclient.wire" level="DEBUG" />
<AsyncLogger name="org.mule.transport.email" level="DEBUG" />
</Loggers>
</Configuration>
This is how you can enable Elastic Cloud logging using an HTTP appender for MuleSoft applications.