JSON Logger Baltimore Meetup

ManjuKumaraGH 701 views 40 slides Jul 05, 2022

About This Presentation

JSON Logger in Mule 4


Slide Content

Baltimore Meetup will start shortly... Meanwhile, attendees: kindly introduce yourself in the chat (name, company, location, Mule experience).

JSON Logger in MuleSoft | Baltimore MuleSoft Meetup | Snowflake Logging Framework | Start Recording...

Agenda: Introduction; JSON Logger & Logging Framework; Async Scope; Demo; Q&A; Trivia/Quiz & Prize; Meetup Feedback & Upcoming Events

Introductions. About the organizers: Manju Kumara Hanumanthappa, Ravi Padmanavar. About the sponsor: … A show of hands: attendees, kindly introduce yourself in the chat (name, company, location & Mule experience).

‹#› Meet your Baltimore Meetup Leaders

Safe Harbour Statement: Both the speaker and the host are organizing this meetup in an individual capacity only; we are not representing our companies here. This presentation is strictly for learning purposes. The organizer/presenter does not guarantee that the same solution will work for your business requirements. This presentation is not meant for any promotional activity. This meeting will be recorded and shared.

Today’s Meetup Speaker: Sai Krishna Sanjapu, Software Engineer at OneSingleView. He has 5 years of experience developing various projects and currently works at SingleView (a fintech startup).

JSON Logger & Framework

JSON Logger: How does the JSON Logger differ from the normal Logger? The out-of-the-box Logger component offered by the Mule runtime is easy to use and understand, but it hardly enforces any standards and does not facilitate the creation of structured logs. The JSON Logger has the capability to log data to your console in a properly structured manner. Capabilities of the JSON Logger: Location Info, Disabled Fields, Sending Data to an External Destination.

Adding the JSON Logger to Studio. Step 1: Clone the repo https://github.com/mulesoft-consulting/json-logger/tree/mule-4.x/json-logger Step 2: Inside the template-files/ folder there is a file called settings.xml that can be used as a reference to configure your Maven settings. ANYPOINT_PLATFORM_USER and ANYPOINT_PLATFORM_PASS are the credentials you normally use to log in to the Anypoint Platform; just make sure you have the Exchange Contributor role granted. Step 3: Open the json-logger folder and run the deploy script: ./deploy-to-exchange.sh <YOUR_ORG_ID>

JSON Logger Features

Location Info: The JSON Logger offers the ability to obtain context information for our operations, for instance access to the Location object, which gives you everything you want to know about where you are in your application. Critical troubleshooting information such as the name of the flow (rootContainer), the name of the XML configuration file (fileName), and the exact line number where the logger is located (lineInFile) can now be part of the log metadata and configured through the global config element.

Disabled Fields: If we want to filter sensitive data, the JSON Logger decouples functional log data, which provides meaningful application-state information without including sensitive data, from troubleshooting data, which contains the raw data coming in and out of different components. Message: a field meant for meaningful, non-sensitive functional messages such as “Request received from Customer with ID: 1234”. Content: a field meant for raw data, typically useful during development and test phases (e.g. payload, attributes, vars). The JSON Logger skips any field defined as disabled; multiple fields can be listed comma-separated, e.g. content,message. We can also assign an environment variable (e.g. ${logger.disabled.fields}), which in lower environments could be null but in stage and production could be set to “content”.
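The property-driven approach described above might look like the following sketch. This is illustrative only: the attribute name disabledFields is an assumption based on the feature name, so check the schema of your json-logger version.

```xml
<!-- Sketch, not a definitive config: "disabledFields" attribute name is assumed.
     The value is resolved per environment via a property placeholder. -->
<json-logger:config name="JSON_Logger_Config"
                    disabledFields="${logger.disabled.fields}" />

<!-- dev.properties:  logger.disabled.fields=            (log everything)     -->
<!-- prod.properties: logger.disabled.fields=content     (skip raw payloads)  -->
```

With this setup the same application logs full content in lower environments while production logs omit the content field entirely.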

Content Field Data Parsing: In the JSON Logger, data inside the content field was by default always “stringified”. The new release (2.0.0) provides choice and convenience: if the content field output is proper JSON, it is parsed as part of the main logger output; if the content is anything other than JSON, it is stringified as usual. If you still want the previous behavior where everything is stringified, that can be defined globally in the logger configuration by setting “Parse content field in JSON output” to false.
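The global toggle mentioned above would be set on the config element, roughly as below. The attribute name is an assumption derived from the checkbox label in the slide; verify it against your json-logger version.

```xml
<!-- Sketch: restore the pre-2.0.0 behavior where content is always stringified.
     Attribute name "parseContentFieldsInJsonOutput" is assumed from the UI label. -->
<json-logger:config name="JSON_Logger_Config"
                    parseContentFieldsInJsonOutput="false" />
```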

DataWeave Logger Module: stringifyAny(<DATA>): “stringifies” any <DATA> regardless of content type. stringifyNonJSON(<DATA>): “stringifies” any <DATA> unless its content type is “application/json”, in which case it preserves the JSON content type and structure. stringifyAnyWithMetadata(<DATA>): “stringifies” any <DATA> regardless of content type and includes the metadata contentLength and dataType. stringifyNonJSONWithMetadata(<DATA>): “stringifies” any <DATA> unless its content type is “application/json”, in which case it preserves the JSON content type and structure, and includes the metadata contentLength and dataType. Note: it is not required to convert payloads to JSON on every single logger; the JSON Logger aims to avoid the additional conversion overhead and performance impact by keeping raw contents in their native format as much as possible (e.g. when dealing with XML or CSV payloads).
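A logger invocation using one of these functions might look like the sketch below. The module import path is an assumption; the exact import for the DataWeave helper functions depends on the json-logger release you installed.

```xml
<!-- Sketch: content expression calls stringifyNonJSON() so an XML or CSV payload
     is stringified but a JSON payload keeps its structure.
     The import path "dw::JsonLoggerModule" is an assumption. -->
<json-logger:logger config-ref="JSON_Logger_Config"
    message="Customer lookup completed"
    content="#[%dw 2.0 import stringifyNonJSON from dw::JsonLoggerModule --- stringifyNonJSON(payload)]" />
```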

External Destinations: To facilitate pushing logs to external platforms, the JSON Logger provides a feature called “External Destinations”. Supported destinations: Anypoint MQ, JMS, AMQP. Log categories are used to filter which logs are sent to the external destination. If we define “logger.category.amqp.external” as the category, only logs with that category are forwarded externally; if we do not specify a category, the default is org.mule.extension.jsonlogger.JsonLogger and all logs are published to the external destination.
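An external-destination setup could be sketched as below. The nested element and attribute names here are illustrative only, they are not taken from the json-logger schema; consult the repository's README for the exact configuration of your release.

```xml
<!-- Illustrative sketch only: element/attribute names are assumptions.
     Only log entries whose category matches "logger.category.amqp.external"
     are forwarded to the AMQP destination. -->
<json-logger:config name="JSON_Logger_Config">
    <json-logger:external-destination>
        <json-logger:amqp exchange="logs.exchange"
                          logCategories="logger.category.amqp.external" />
    </json-logger:external-destination>
</json-logger:config>
```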

Data Masking: To prevent logging sensitive data such as credentials and customer account information, the JSON Logger provides masking capabilities that: do not have a detrimental impact on performance; preserve the original data structures (for readability); work only at the content field level (or any field marked as type “content”). JSON paths: $.payload.addresses[1] indicates that the second address object under the payload parent element should be masked; $..state indicates that any child field named “state” should be masked. Note: we need to provide CSVs (with no spaces in between) with the names of the JSON fields and/or JSON paths (relative to the root element inside the content field) that we want masked.
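Combining plain field names and JSON paths, a masking configuration could look like this sketch. The attribute name is an assumption; the comma-separated, no-spaces format follows the note above.

```xml
<!-- Sketch: mask two named fields plus two JSON-path selections inside content.
     Attribute name "contentFieldsDataMasking" is assumed; list has no spaces. -->
<json-logger:config name="JSON_Logger_Config"
    contentFieldsDataMasking="password,creditCardNumber,$.payload.addresses[1],$..state" />
```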

Logger Scope: If we want to find the response time taken by an API request, we use the JSON Logger scope. Use case: for scenarios where we are particularly interested in measuring the performance of an outbound call, the Scope Logger introduces the notion of a “scope elapsed time” (scopeElapsed), a calculation specific to whatever was executed inside the scope (e.g. calling an external API). Tracepoints: DATA_TRANSFORM_SCOPE, OUTBOUND_REQUEST_SCOPE, FLOW_LOGIC_SCOPE.
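Wrapping an outbound call in the scope might look like the following. The scope element name, attribute name, and the HTTP request config are assumptions for illustration; only the tracepoint value comes from the slide.

```xml
<!-- Sketch: the Scope Logger wraps an outbound HTTP call so scopeElapsed
     measures just that call. "Customer_API" config and the path are hypothetical. -->
<json-logger:logger-scope config-ref="JSON_Logger_Config"
                          scopeTracePoint="OUTBOUND_REQUEST_SCOPE">
    <http:request method="GET" config-ref="Customer_API" path="/customers/1234" />
</json-logger:logger-scope>
```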

Custom Logging: What is a custom logging framework? A logging framework is a utility specifically designed to standardize the process of logging in your application. To get more specific, we can conceive of a logging framework as encapsulating three main concerns: recording, formatting, and appending. When you want to capture runtime information about your application, you start by signaling what to record. Then, you determine how to format that information. And, finally, you append it to something. Classically, this is a file, but you can also append to RDBMS tables, document databases, or really anywhere capable of receiving data. If you have a piece of code dedicated to solving these problems, you have yourself a logging framework.

Custom Logging in MuleSoft: How do we implement custom logging in an application? Before implementing custom logging we should consider: the impact on your application’s runtime; a robust architecture for the logging framework; the security perspective of logs in the application; the future scope of the logs.

Components to Implement a Common Logging Framework: JSON Logger, ActiveMQ, VM, Snowflake Connector, XML SDK Custom Connector

Snowflake Connector. Add the dependency:
<dependency>
    <groupId>net.snowflake</groupId>
    <artifactId>snowflake-jdbc</artifactId>
    <version>3.12.17</version>
    <type>jar</type>
</dependency>
Account Name: HostName (sb80335.ap-south-1.aws)
Warehouse: COMPUTE_WH

Building a Common Project: As developers we need to build a robust API and use it as a common framework across all APIs. This can be done in two ways: 1. Package the project as a jar file, publish it to Anypoint Exchange as a connector, and add the Maven dependency to your pom.xml. 2. Package the project as a jar file, push it to a Maven repo, and add the Maven dependency to your pom.xml: mvn install:install-file -Dfile=<jar file name> -DgroupId=<groupId> -DartifactId=<artifactId> -Dversion=<snapshot version> -Dpackaging=jar
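Either way, each consuming API then declares the shared framework in its pom.xml. The coordinates below are hypothetical placeholders, and the mule-plugin classifier is an assumption about how the shared project is packaged.

```xml
<!-- Hypothetical coordinates for the shared logging framework; replace with
     your organization's groupId/artifactId. Classifier assumes Mule packaging. -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>common-logging-framework</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <classifier>mule-plugin</classifier>
</dependency>
```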

Logging Architecture

Async Scope: When we call the logging framework from all our APIs, the log payload travels to Snowflake via the logging framework. If we process it on a synchronous thread, the thread waits for the Snowflake database response before the payload moves on to the next message processor; to overcome this we need to make the logging asynchronous.
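In Mule 4 the core Async scope gives exactly this fire-and-forget behavior. The flow names below are hypothetical; the sketch only shows where the scope sits in a calling API.

```xml
<!-- Sketch: the logging call runs on a separate thread inside <async>,
     so the main flow does not wait for the Snowflake write to complete.
     Flow names are hypothetical. -->
<flow name="order-api-main-flow">
    <!-- ... main processing ... -->
    <async>
        <flow-ref name="common-logging-framework-flow" />
    </async>
    <!-- subsequent processors continue immediately -->
</flow>
```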

Pub-Sub Mechanism: The log payload from the JSON Logger needs to be sent to Snowflake, but there is no direct configuration available in the JSON Logger for the Snowflake connector, so we set up a pub-sub mechanism between the JSON Logger and the Snowflake connector. First we send the JSON payload to a queue; then a listener listens to that particular queue and actively sends the data to Snowflake. Note: as per the available documentation, the supported queues are AMQ and JMS.

VM Configuration: As per the official notes, no VM-connector configuration is available; the JSON Logger is not compatible with VM. So we need to build a robust architecture to use VM in our logging: we fall back to a traditional style of logging, storing all required data in variables in every API, formatting it as proper JSON, and sending that data to the VM publish framework.
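The "traditional" VM-based path above could be sketched as a publisher flow that builds the log entry by hand and pushes it to a VM queue, with a separate listener flow draining the queue into Snowflake. Flow names, queue name, and the log-entry fields are assumptions for illustration.

```xml
<!-- Sketch: manual log entry published to a VM queue; a listener flow
     (not shown) consumes "logging-queue" and writes to Snowflake.
     Names and fields are hypothetical. -->
<flow name="log-publisher-flow">
    <ee:transform>
        <ee:message>
            <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
{
    correlationId: correlationId,   // DataWeave correlationId binding
    api:           app.name,        // running application's name
    message:       vars.logMessage  // message set by the calling flow
}]]></ee:set-payload>
        </ee:message>
    </ee:transform>
    <vm:publish queueName="logging-queue" config-ref="VM_Config" />
</flow>
```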

Use Cases for Demo

Logging

Demo Time !

Logging in Led Architecture

Q&A

Trivia/Quiz

Trivia Rules: All questions are multiple choice. Respond with answers in the chat window. The first correct answer for each question wins that question. One voucher per month, across all meetups. Note for trivia winners: make sure the host has your full name, email address and LinkedIn profile before leaving. Vouchers are sent to winners within 10 days.

Question 1: Which of the following external destinations is not provided by the JSON Logger? Anypoint MQ / JMS / VM / AMQP

Question 2: Which of the following DataWeave module functions “stringifies” any <DATA> unless its content type is “application/json”, and includes contentLength and dataType? stringifyAnyWithMetadata(<DATA>) / stringifyNonJSON(<DATA>) / stringifyNonJSONWithMetadata(<DATA>) / stringifyAny(<DATA>)

Question 3: What is the default max batch size for queue destinations in the JSON Logger? 30 / 25 / 35 / 20

Meetup Feedback

Share: Tweet using the hashtags #MuleSoftMeetups #MuleMeetup. Invite your network to join: https://meetups.mulesoft.com/baltimore/ Feedback: fill out the survey and suggest topics for upcoming events; contact MuleSoft at [email protected] for ways to improve the program. Nominate yourself as a meetup speaker: an amazing opportunity for public speaking, broadening your skills and expanding your network. Knowledge shared is knowledge squared!

Meetup Photo