
NetWitness Community Blog
Subscribe to the official NetWitness Community blog for information about new product features, industry insights, best practices, and more.

Extracting Event Time from Logs

DavidWaugh1 (Employee)
2017-01-24 04:47 AM

Last Updated: 12:41 February 27th 2017

Latest Version: 17

 

I had a customer who wished to extract the raw event time for particular logs, because they use this raw event time for further analysis of events in a third-party system. The raw event time may differ greatly from the actual log decoder processing time, especially if there is a delay in the logs reaching the log decoder, perhaps due to network latency or file collection delays.

 

Currently they use the event.time field. However, this has some limitations:

  • If the timestamp is incomplete, this field is empty. For example, many Unix systems generate a timestamp that does not contain the year.
  • Even for the same device type, event source date formats can differ. For example, a US-based system may log the date in MM/DD/YY format, whereas a UK-based system may log it in DD/MM/YY format. A date of 1/2/2017 could be interpreted as either 1 February 2017 or 2 January 2017.
  • The event.time field is actually a 64-bit TimeT field, which cannot be manipulated within the 32-bit LUA engine that currently ships with the product.
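The ambiguity in the second point is easy to reproduce. GNU date, for instance, assumes US-style MM/DD ordering for slash-separated dates (this is just an illustration; how an ambiguous date should actually be read depends on the event source):

```shell
# GNU date reads slash-separated dates as MM/DD/YYYY, so "1/2/2017"
# becomes 2 January 2017 rather than 1 February 2017.
date -u -d "1/2/2017" +%Y-%m-%d    # 2017-01-02
```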

 

All these issues are being addressed in future releases of the product, but the method outlined here gives something that can be used today.

 

Create some new meta keys for our Concentrators

 

We add the following to the index-concentrator-custom.xml files:

 

<key description="Epoch Time" level="IndexValues" name="epoch.time" format="UInt32" defaultAction="Open" valueMax="100000" />
<key description="Event Time String" level="IndexValues" name="eventtimestr" format="Text" valueMax="2500000"/>
<key description="UTCAdjust" level="IndexValues" name="UTCAdjust" format="Float32" valueMax="1000"/>

The meta key epoch.time will be used to store the raw event time in Unix Epoch format, i.e. seconds since 1 January 1970.

The meta key eventtimestr will be used to store a timestamp that we create in the next step.

The meta key UTCAdjust will hold how many hours to add to or remove from our timestamp.
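As a quick illustration of what epoch.time will contain, GNU date can convert between a human-readable timestamp and epoch seconds (the timestamp below is just an example value):

```shell
# Convert a raw event time to Unix epoch seconds and back again.
ts="Tue, 24 Jan 2017 10:23:36 GMT"
epoch=$(date -u -d "$ts" +%s)
echo "$epoch"                                # 1485253416
date -u -d "@$epoch" "+%Y-%m-%d %H:%M:%S"    # 2017-01-24 10:23:36
```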

 

Create a Feed to tag events with a timestamp

Within a NetWitness LUA parser we are restricted in which functions we can use. In particular, the os.date functions are not available, so we need another method of getting a timestamp for our logs.

 

To do this, create a cronjob on the SA Server that runs every minute and writes the following CSV file.

 

#!/bin/bash
# Write a timestamp feed file once a second for sixty seconds; scheduled
# from cron every minute, this keeps the feed continuously up to date.
devicetypes="rsasecurityanalytics rhlinux securityanalytics infobloxnios apache snort squid lotusdomino rsa_security_analytics_esa websense netwitnessspectrum bluecoatproxyav alcatelomniswitch vssmonitoring voyence symantecintruder sophos radwaredp ironmail checkpointfw1 websense rhlinux damballa snort cacheflowelff winevent_nic websense82 fortinet unknown"

for i in {1..60}
do
    mydate=$(date -u)
    echo "#Device Type, Timestamp" >/var/netwitness/srv/www/feeds/timestamp.csv
    # Emit one row per device type, all sharing the same UTC timestamp
    for j in $devicetypes
    do
        echo "$j,$mydate" >>/var/netwitness/srv/www/feeds/timestamp.csv
    done
    sleep 1
done
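Assuming the script above is saved as /root/scripts/timestampfeed.sh (a hypothetical path; adjust it to wherever you keep the script), the crontab entry to run it every minute would look like:

```shell
# m h dom mon dow  command  -- runs at the top of every minute
* * * * * /root/scripts/timestampfeed.sh
```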

This will generate a CSV file, in the following format, that we can use as a feed:

checkpointfw1,Wed Jan 25 09:17:49 UTC 2017
citrixns,Wed Jan 25 09:17:49 UTC 2017
websense,Wed Jan 25 09:17:49 UTC 2017
rhlinux,Wed Jan 25 09:17:49 UTC 2017
damballa,Wed Jan 25 09:17:49 UTC 2017
snort,Wed Jan 25 09:17:49 UTC 2017
cacheflowelff,Wed Jan 25 09:17:49 UTC 2017
winevent_nic,Wed Jan 25 09:17:49 UTC 2017

Here the first column of the CSV is our device.type and the part after the comma is our UTC timestamp.

 

We use this as a feed which we push to our Log decoders.

 

feed-definition.png

eventtimestr.png

The CSV file is created every minute, and we also refresh the feed every minute, so the timestamp could potentially be up to two minutes out of date compared with our logs.

 

Here is an example of the timestamp visible in our websense logs:

eventtimestr holds our dummy timestamp

epoch.time holds the actual epoch time that the raw log was generated. 

 

websense_2017-01-25_09-11-35.png

 

Create an App Rule to Tag Sessions without a UTC Time with an Alert

Create an App Rule on your log decoders that generates an alert if no UTCAdjust meta key exists. This saves you from having to define a UTC offset of 0 for devices that are already logging in UTC.

 

[ADM] _ldecoder_ config_2017-02-06_10-21-59.png

It is important that the following are entered for the rule:

Rule Name: UTC Offset Not Specified

Condition: UTCAdjust !exists

Alert On: Alert

The Alert box is ticked.

 

Create a feed to specify how to adjust the calculated time per device IP and device type

Create a CSV file with the following columns:

#DeviceIP,DeviceType,UTC Offset
192.168.123.27,rhlinux,0
192.168.123.27,snort,1.0
192.168.123.27,securityanalytics,-1.5
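The third column is a UTC offset in fractional hours, so applying it to an epoch value is just a multiply and an add. A minimal sketch of the arithmetic (awk stands in here for the LUA parser, since plain shell cannot do fractional maths):

```shell
# Apply a UTC offset of -1.5 hours (the securityanalytics row above)
# to an epoch timestamp.
epoch=1485253416   # Tue, 24 Jan 2017 10:23:36 GMT
offset=-1.5        # hours
awk -v e="$epoch" -v o="$offset" 'BEGIN { printf "%d\n", e + o*3600 }'   # 1485248016
```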

Copy the attached UTCAdjust.xml. This is the feed definition file for a multi-indexed feed keyed on both device.ip and device.type; such feeds cannot be created through the GUI.

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<FDF xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="feed-definitions.xsd">
  <FlatFileFeed comment="#" separator="," path="UTCAdjust.csv" name="UTCAdjust">
    <MetaCallback name="Device IP" valuetype="IPv4">
      <Meta name="device.ip"/>
    </MetaCallback>
    <MetaCallback name="DeviceType" valuetype="Text" ignorecase="true">
      <Meta name="device.type"/>
    </MetaCallback>
    <LanguageKeys>
      <LanguageKey valuetype="Float32" name="UTCAdjust"/>
    </LanguageKeys>
    <Fields>
      <Field type="index" index="1" key="Device IP"/>
      <Field type="index" index="2" key="DeviceType"/>
      <Field key="UTCAdjust" type="value" index="3"/>
    </Fields>
  </FlatFileFeed>
</FDF>

 

Run the UTCAdjustFeed.sh script to generate the feed. This script downloads the CSV file from a webserver, generates the actual feed, and then copies it to the correct directories on any log and packet decoders (there really isn't any reason to copy it to a packet decoder). It could be scheduled as a cronjob depending on how often the feed needs to be updated.

 

#!/bin/bash
# Fetch the latest CSV, rebuild each feed from its XML definition, push
# the .feed files to the decoders, then tell them to reload their feeds.
wget http://localhost/feeds/UTCAdjust.csv -O /root/feeds/UTCAdjust.csv --no-check-certificate

find /root/feeds | grep xml >/tmp/feeds
for feed in $(cat /tmp/feeds)
do
    FEEDDIR=$(dirname $feed)
    FEEDNAME=$(basename $feed)
    echo $FEEDDIR
    echo $FEEDNAME
    cd $FEEDDIR
    NwConsole -c "feed create $FEEDNAME" -c "exit"
done
# Copy the generated feeds to each decoder...
scp *.feed root@192.168.123.3:/etc/netwitness/ng/feeds
scp *.feed root@192.168.123.2:/etc/netwitness/ng/feeds
scp *.feed root@192.168.123.44:/etc/netwitness/ng/feeds
# ...and notify each decoder to reload its feeds
NwConsole -c "login 192.168.123.2:50004 admin netwitness" -c "/decoder/parsers feed op=notify" -c "exit"
NwConsole -c "login 192.168.123.3:50002 admin netwitness" -c "/decoder/parsers feed op=notify" -c "exit"
NwConsole -c "login 192.168.123.44:50002 admin netwitness" -c "/decoder/parsers feed op=notify" -c "exit"

UTCAdjust.png

 

Use a LUA parser to extract the event time and then calculate Epoch time.

 

We then use a LUA parser to extract the event time from the raw log and calculate the epoch time. Our approach is as follows:

 

  • Define the device types that we are interested in.
  • Create a regular expression to extract the timestamps from the logs we are interested in.
  • Add any missing information to this timestamp in order to calculate the epoch time. For example, for timestamps without a year, I assume the current year and then check that this does not create a date that is too far in the future; if it does, I use the previous year. This accounts for logs around the 31 December / 1 January boundary.
  • Finally, adjust the calculated time according to the UTC Offset feed.
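The year-inference step in the third bullet can be sketched in shell (the real parser is LUA; this is just the decision logic, and the two-day tolerance is my own assumption):

```shell
# Infer the year for a timestamp that lacks one ("Dec 31 23:59:00").
# Assume the current year; if that puts the event more than two days
# in the future, assume it is from the previous year instead.
infer_year_epoch() {
    stamp="$1"   # e.g. "Dec 31 23:59:00"
    now="$2"     # current time as epoch seconds
    year=$(date -u -d "@$now" +%Y)
    candidate=$(date -u -d "$stamp $year" +%s)
    if [ "$candidate" -gt $((now + 2*86400)) ]; then
        candidate=$(date -u -d "$stamp $((year - 1))" +%s)
    fi
    echo "$candidate"
}

# A log saying "Dec 31 23:59:00" seen on 2 January 2017 must be from 2016;
# this prints the epoch for 31 Dec 2016 23:59:00 UTC.
infer_year_epoch "Dec 31 23:59:00" "$(date -u -d '2017-01-02 00:00:00 UTC' +%s)"
```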

 

I've attached and copied the code below. This should be placed in /etc/netwitness/ng/parsers on your log decoders.

Currently this LUA parser supports:

 

  • windows event logs
  • damballa logs
  • snort logs
  • rhlinux
  • websense

 

The parser could be expanded further to account for different timestamp formats for particular device.ip values. You could create another feed to tag the locale of your devices and then use this within the LUA parser to make decisions about how to process dates.

 

Here is the finished result:

 

epoch.png

 

I can now investigate on epoch.time.

 

For example, 1485253416 converted with an Epoch/Unix timestamp converter is:

GMT: Tue, 24 Jan 2017 10:23:36 GMT

 

and all the logs contain an event time of 24 January 10:23:36.

 

epochtime.png

 

EpochTime.lua.zip
utcadjust.xml.zip
UTCAdjust.csv.zip
2 Comments
TomiReiman (Beginner)
2017-02-02 04:06 PM

I might have missed something big time, but... What is the actual purpose of 'eventtimestr' here? If I read correctly, it is populated from a feed that is generated on a core device. Don't the collected logs already get populated with a collection timestamp in the 'time' meta? Doesn't 'eventtimestr' here contain the exact same thing - except not as accurate? Also, for the logs from which you are currently able to parse the timestamp, couldn't the extracted timestamp be just as well populated into 'eventtimestr' in addition to 'time.epoch', assuming that Lua lets you handle date objects so that you can print out in a format of your liking?

 

Also, on a related matter (talking timestamps): Do you know how to properly use the UTC function (UTC(msgSegment, formatString, inputParm …)) that's available for use in the XML parsers? I failed to notice how this function would take in the timezone as its parameter...so how on earth would it be possible to 'normalize' the timestamp found in a log event? Similarly, I assume that times contained in 'event.time' (TimeT) do not get adjusted with a user's timezone profile either in the GUI since we never tell NetWitness what timezone the original log uses?

DavidWaugh1 (Employee)
2017-02-03 05:05 AM

I've made some adjustments here, but let me answer your questions:

 

What is the actual purpose of 'eventtimestr' here?

Basically, within the product it is not possible to manipulate TimeT values with LUA. This is logged as internal issue ASOC-29050. I also don't have access to the os.date functions within LUA, so I needed another way of getting a timestamp that was fairly current. That is the purpose of eventtimestr: it enables me, within LUA, to get a timestamp that I know is accurate to within a few minutes. This is all basically to work around the fact that I currently can't work with TimeT values.

If logs are parsed correctly then the event.time.str meta gets populated, so I couldn't use that meta key, as it would sometimes contain values that were not in the same format as the timestamp I was checking.

 

I'm afraid I don't know how to use the UTC function in the XML parser. Perhaps someone else here might?

