Custom Log Parser in OMC Log Analytics to deal with Extended Logging in WebLogic Access Log

We recently enabled Extended HTTP Access Logging on our WebLogic Managed Servers. This adds the response time per request and potentially several other valuable fields to the WebLogic Access Log files. Unfortunately, the WebLogic Access Log file parser in Oracle Management Cloud Log Analytics does not know how to handle the Extended Log File Format (ELFF) that is now used for formatting the log files. As a result, no structured analysis can be performed on the access log files: OMC does not get the URI, action, status code, or response time.

It is my strong opinion that OMC Log Analytics should provide a parser for the ELFF format and should automatically recognize the fact that WebLogic Access Log files are produced in that format.

However, I can have any opinion in the world; that does not alter the capabilities of OMC Log Analytics right here and now. So, hands on the keyboard to create my own workaround, which turns out to be not very challenging at all. OMC LA allows me to define custom log parsers that can be associated with specific files harvested by the Log Analytics agent from designated entities. Once I have defined a parser that can interpret the ELFF format, I simply associate a log source based on that parser with the access log files on the WebLogic Managed Servers that have extended logging switched on. Let me show you how that works.

Creating a Workaround

After switching on extended logging, the format of the WLS Access Log Files looks as follows:

2018-01-18 16:39:19 GET /domainA/serviceB/1.0 200 - - 1.406 4:39:20 PM
2018-01-18 16:40:03 POST /domainZ/serviceX/2.0 200 - - 0.898 4:40:03 PM
2018-01-12 20:41:40 POST /soa-infra/services/default/HelloWorldProject/helloworldprocess_client_ep 200 - - 13.13 8:41:40 PM
2018-01-12 20:41:41 POST /soa-infra/services/default/HelloWorldProject/helloworldprocess_client_ep 200 - - 0.047 8:41:41 PM
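To see what a parser must deal with, these extended entries can be matched with a regular expression along the following lines. This is a sketch in Python's `re` syntax; the field names are my own and the trailing wall-clock column is deliberately ignored:

```python
import re

# Sketch of a regex for the extended (ELFF) access log lines shown above.
# Field names are my own choice, loosely modelled on the ELFF directives
# (date, time, cs-method, cs-uri, sc-status, time-taken).
ELFF_PATTERN = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2})\s+"
    r"(?P<time>\d{2}:\d{2}:\d{2})\s+"
    r"(?P<method>GET|POST|PUT|DELETE|HEAD|OPTIONS|PATCH)\s+"
    r"(?P<uri>\S+)\s+"
    r"(?P<status>\d{3})\s+"
    r"\S+\s+\S+\s+"              # two unused '-' columns
    r"(?P<time_taken>\d+\.\d+)"  # response time in seconds
)

entry = ELFF_PATTERN.match(
    "2018-01-18 16:39:19 GET /domainA/serviceB/1.0 200 - - 1.406 4:39:20 PM"
)
print(entry.group("uri"), entry.group("status"), entry.group("time_taken"))
# → /domainA/serviceB/1.0 200 1.406
```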

The default parsing in OMC Log Analytics does not handle this format well. It probably expects – and can only handle – the non-extended HTTP Access Log format, which is a bit different:

- weblogic [07/Jun/2019:16:26:51 +0200] "POST /sbconsole/sbconsole.portal?_nfpb=true&_pageLabel=wliReportManagementPortlet&wliReportManagementPortlet_actionOverride=/com/bea/alsb/console/reporting/jmsprovider/filterMessages HTTP/1.1" 302 487
- SYSMAN [03/Oct/2013:02:07:12 -0700] "GET /em/faces/logon/core-uifwk-console-login HTTP/1.1" 200 5905
- - [03/Oct/2013:02:07:12 -0700] "GET /empbs/check HTTP/1.1" 200 231
- weblogic [07/Jun/2019:16:44:39 +0200] "GET /sbconsole/ HTTP/1.1" 302 317

As a result, the Log Explorer simply does not have values for some of the most relevant fields.


Solution: create a custom log parser to handle WLS Access Logs for which HTTP Extended Logging has been turned on (see for example this earlier article on enabling extended logging). The steps:


  • Create a new Log Parser in OMC
  • Create a new Log Source in OMC associated with the new parser
  • Upload a local file in the right log format and associate it with the new Log Source
  • Verify that the parser was able to extract all desired fields from this file
  • Associate the WLS Access Log files on the WLS Managed Servers with Extended Logging enabled with my custom Log Parser, instead of the out-of-the-box parser
  • (Publish blog article and make OMC Product Management aware of this situation)
  • Start leveraging the results from the enhanced parsing of the extended access log files

These steps are well documented in the OMC Documentation, so I will just show and describe some of the highlights.

Create the Log Parser

Press the Create button on the Log Parser overview page to create a new Log Parser:


Specify the name of the new parser, provide a description and copy / paste or upload a sample of the log file the parser will process:


Press Next.

Highlight a single log entry, and press next:


The easiest way to prepare the regular expression used for processing the log entries is in Guided mode. Here, we can select individual elements in the selected log file entry that we want to extract, one at a time:


We can specify the field definition – name, type and regular expression. Bit by bit – field by field – we construct the entire expression.

Press Next, and confirm the field definitions. Now press Create Parser.

Next, the Log Parser definition is validated


Parse Expression (Edit: added expression to capture URI GET Parameters):


and when done, the parser is added to the OMC Log Analytics instance. Note that the parser definition can be exported and shared to other OMC instances.
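The actual parse expression is only visible in the screenshot, so what follows is an assumed approximation of it in Python's `re` syntax, including the extra group for URI GET parameters: the URI is split into a path and an optional query string so that the GET parameters land in a field of their own.

```python
import re

# Assumed approximation of the parse expression: the same ELFF columns as in
# the sample log lines, with the URI split into path and optional query string.
PARSE_EXPRESSION = re.compile(
    r"(\d{4}-\d{2}-\d{2})\s+"   # 1: date
    r"(\d{2}:\d{2}:\d{2})\s+"   # 2: time
    r"(\S+)\s+"                 # 3: HTTP method (action)
    r"([^?\s]+)"                # 4: URI path, up to '?' or whitespace
    r"(?:\?(\S*))?\s+"          # 5: optional GET parameters after '?'
    r"(\d{3})\s+"               # 6: status code
    r"\S+\s+\S+\s+"             # two unused '-' columns
    r"(\d+\.\d+)"               # 7: time taken (seconds)
)

m = PARSE_EXPRESSION.match(
    "2018-01-18 16:40:03 POST /domainZ/serviceX/2.0?verbose=true 200 - - 0.898 4:40:03 PM"
)
print(m.group(4), m.group(5))  # path and query string
# → /domainZ/serviceX/2.0 verbose=true
```

For entries without a query string, group 5 simply stays empty and the path still parses cleanly.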


Create a New Log Source

A Log Source defines, for specific files associated with specific Entity Type(s), which parser is to be used to process them.

I will create a new Log Source


for the WLS Server Extended Access file. It uses the new Log Parser and is associated with the WebLogic Server entity type.


Once the Log Source is saved, we can make use of it – for processing sample log files, and for permanent association with the standard log files produced by WebLogic Server entities in our landscape under scrutiny.


Upload Sample Log File (to test Custom Parser)

Log files can be uploaded for analysis to OMC Log Analytics. This feature can be used to add log data from systems that cannot be observed by OMC Agents – for whatever reason. It can also be used to add files in a new log file format and associate them with a new log source, to find out whether the log parser does its job correctly. When this test is successful, a real association can be created between the actual entities and the new log source.
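Before uploading, a rough local pre-flight check can catch format problems early. A sketch, using an assumed ELFF line pattern (not the actual OMC parse expression), that reports every line the parser would presumably fail to match:

```python
import re

# Assumed ELFF line shape: date, time, method, URI, status, two '-' columns,
# and a decimal response time. The trailing wall-clock column is ignored.
ELFF_LINE = re.compile(
    r"\d{4}-\d{2}-\d{2}\s+\d{2}:\d{2}:\d{2}\s+\S+\s+\S+\s+\d{3}\s+"
    r"\S+\s+\S+\s+\d+\.\d+"
)

def report_unmatched(path):
    """Return (line number, line) pairs that do not match the ELFF pattern."""
    unmatched = []
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            line = line.strip()
            if line and not ELFF_LINE.match(line):
                unmatched.append((lineno, line))
    return unmatched

# Usage: report_unmatched("access.log") – an empty result means every line
# should parse once the file is uploaded against the new Log Source.
```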

I have prepared a log file (locally, on my laptop) with the ELFF format. I want to upload this file and have my new parser attempt to parse this file. These are the steps:


Click on New Upload.

Drag and drop the file and specify a name for this upload:


Press the Next icon.

Select the Log Source (and indirectly the Log Parser) for this uploaded file:


Also specify with which entity the log file entries should be associated (one of the OSB WLS Managed Servers).

Press the Next icon.

Confirm the selections and press Upload:


The file is now uploaded and will be parsed using the new Log Parser:


After a little while, processing is done. The log parser has parsed the log entries in the uploaded file and these were added to the Log Analytics repository. At this point, these entries are similar to those gathered and processed by the agent via the regular route. You will not see any distinction in the Log Explorer between agent-harvested log entries and entries retrieved from uploaded log files.

Now we quickly check in the Log Explorer if the log entries from the sample log file have been processed correctly, can be inspected and contain the fields we are after: URI, status code, action, response time.

Here is my first glance at the Log Explorer:


The fields Status Code, URI, Action and Host IP Address are identified by the parser in this special log file. These fields can be displayed, as well as used for filtering and aggregating:


This satisfies the sanity check for the new parser. I believe it is up to the task of processing the access log files produced by the WebLogic Managed Servers.

Specify File Pattern and Associate Log Source with WLS Managed Servers

Specify the file pattern for the newly defined Log Source – use the same file pattern used for the out-of-the-box WLS Server Access Log Source: {domain_home}/servers/{server_names}/logs/access*.log*
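To illustrate which files that pattern picks up, here is a small sketch; the domain path is hypothetical and '*' stands in for the {server_names} placeholder, so every managed server's access logs (including rotated ones) are covered:

```python
import glob
import os

# Hypothetical WebLogic domain home; adjust to your environment.
domain_home = "/u01/app/oracle/user_projects/domains/osb_domain"

# '*' takes the place of the {server_names} placeholder in the Log Source
# pattern; 'access*.log*' also matches rotated files such as access.log0001.
pattern = os.path.join(domain_home, "servers", "*", "logs", "access*.log*")

for path in sorted(glob.glob(pattern)):
    print(path)  # e.g. .../servers/osb_server1/logs/access.log
```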


Associate the WLS Managed Server entities with this Log Source (and dissociate them from the out-of-the-box Log Source):


Save the change and confirm that the log agents have to be reconfigured with this update:


The new association was added successfully:


And dissociate:


And action…

After creating the new association between the WLS Managed Server entities and the newly defined Log Source, it will take a few minutes before log entries are parsed by the ELFF-aware log parser. When the parser kicks in, we get what we are after: log entries with clearly identified URI, action, status and response time.

In the Log Explorer, I check for all access log entries with status codes 400 and above – indicative of errors that we should look into:


Another valuable insight is provided by looking at the response times that are higher than intended. Let’s check for all calls that took longer than 3 seconds to produce a response, and let’s find out which endpoints are primarily responsible:


Here we see which service endpoints are responsible for most slow responses. Let’s act on that information!
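Expressed in plain Python over already-parsed entries, the two Log Explorer queries above amount to the following. The field names follow the custom parser defined earlier, and the sample records are made up for illustration:

```python
from collections import Counter

# Hypothetical sample of parsed access log entries (field names as used in
# the custom parser sketched earlier).
entries = [
    {"uri": "/soa-infra/services/default/HelloWorldProject/helloworldprocess_client_ep",
     "status": 200, "time_taken": 13.13},
    {"uri": "/domainZ/serviceX/2.0", "status": 500, "time_taken": 0.2},
    {"uri": "/domainA/serviceB/1.0", "status": 200, "time_taken": 1.406},
]

# All calls with status code 400 and above - errors worth looking into.
errors = [e for e in entries if e["status"] >= 400]

# Endpoints responsible for the most slow responses (over 3 seconds).
slow_endpoints = Counter(e["uri"] for e in entries if e["time_taken"] > 3)
for uri, count in slow_endpoints.most_common(15):
    print(count, uri)
```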

Using the information gathered by the Custom Log Parser, I have composed a custom Dashboard that can be used by the OSB Application Operator (and that contains some built-in alerts that trigger automatic notifications) to keep a good overview of events inside the OSB:


The dashboard is configurable and supports drill-down. It shows the (almost) current and past activity (load, in terms of number of service calls) and its distribution across the OSB instances. It shows the top 15 most popular services and the number of calls to these services in the selected time window. It shows all calls recorded in the access log files with status 40X or 50X. And it shows slow service responses (currently defined as a response time > 3 seconds).


OMC LA Docs: Configure Log Source and Create Custom Parser

Sample Parser Expressions:

Perform Advanced Analytics with Links on Access Logs