RSA NetWitness has a number of integrations with threat intel data providers, but two that I have come across recently were not listed (MISP and MineMeld), so I figured it would be a good challenge to see whether they could be made to provide data in a way that NetWitness understood.
Now set up the NetWitness recurring feed to pull from the local feed location, and map the ip-dst values (for this script) to the 3rd column, with the other columns mapped as required.
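As a sketch of that mapping step, the one-liner below reorders a MISP CSV export so the ip-dst value lands in the 3rd column for the NetWitness feed definition. The input column order (ip-dst first), the sample row, and the file paths are assumptions for illustration only.

```shell
# Sample MISP export row (assumed layout: ip-dst, uuid, category)
printf '203.0.113.5,uuid-1,Network activity\n' > /tmp/misp_export.csv

# Reorder so ip-dst becomes the 3rd column, as the feed definition expects
awk -F',' -v OFS=',' '{print $2, $3, $1}' /tmp/misp_export.csv > /tmp/misp_feed.csv

cat /tmp/misp_feed.csv
```

The output file can then be dropped wherever the recurring feed is configured to look.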
MineMeld is another free intel aggregation tool, this one from Palo Alto Networks, and it can be installed in many ways (I tried a number of installs on different Ubuntu releases and had difficulties); the one that worked best for me was via a Docker image:
docker run -it --tmpfs /run -v /somewhere/minemeld/local:/opt/minemeld/local -p 9443:443 jtschichold/minemeld
To run it as a daemon after testing, replace the -it flags with -d so the container keeps running after you exit the terminal.
After installing, log in and set up a new output node to take your feeds and map them to a format and output that RSA NetWitness can use. (If you do this right, you can include a certificate in the initial build of the container, which helps with certificate trust to NetWitness.)
This is the pipeline we will create; it maps a sample threat intel list to an output node so that NetWitness can consume the information.
It is defined by editing the yml configuration file (specifically, the section that creates the outboundhcvalues node that NetWitness reads).
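I don't have the exact node definition to hand, but a MineMeld output node in the committed-config yml generally looks something like the sketch below. Only the node name outboundhcvalues comes from this walkthrough; the prototype and input node names are assumptions, so check them against your own configuration.

```yaml
nodes:
  outboundhcvalues:
    # output node that exposes the aggregated indicators as a feed URL
    prototype: stdlib.feedHCGreenWithValue   # assumed prototype
    inputs:
      - aggregatorIPv4                       # assumed upstream aggregator
    output: false
```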
The feed output URL accepts a few parameters:
- Translate IP ranges into CIDRs (this option can also be used with v=json and v=csv).
- v=csv returns the indicator list in CSV format.
- The list of attributes is specified by using the f parameter one or more times. The default name of a column is the name of the attribute; to specify a column name, add |column_name in the f parameter value.
- The h parameter controls the generation of the CSV header. When unset (h=0), the header is not generated. Default: set.
- Encoding is UTF-8. By default, no UTF-8 BOM is generated; if ubom=1 is added to the parameter list, a UTF-8 BOM is generated for compatibility.
The f parameters are the column names from the feed. Testing the command in a browser drops a file that you can inspect to make sure you have the data and columns that you want.
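As an illustration of what such a feed URL can look like when the parameters are combined; the host, port and attribute names here are assumptions, not taken from the original setup:

```shell
# Assemble the MineMeld CSV feed URL from the parameters described above
base="https://minemeld.local:9443/feeds/outboundhcvalues"
url="${base}?v=csv&f=indicator&f=confidence|conf&h=1"
echo "$url"
# open the URL in a browser to test, or fetch it directly:
# curl -sk "$url" -o outboundhcvalues.csv
```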
Once you are confident in the process and the output format, you can script the pull and crontab it so the output drops into the local feed location on the head server (I did this because I couldn't figure out how to get NetWitness to accept the self-signed certificate from the Docker image).
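As a sketch of that scheduled pull, a single cron entry can fetch the CSV output and drop it straight into the local feed directory. The schedule, host, feed name and paths below are all assumptions; adjust them for your deployment.

```
# /etc/cron.d/minemeld-feed -- illustrative; adjust schedule, URL and paths
*/30 * * * * root curl -sk "https://minemeld.local:9443/feeds/outboundhcvalues?v=csv&f=indicator" -o /var/netwitness/feeds/minemeld.csv
```

The -k flag skips certificate verification, which is what works around the self-signed certificate from the Docker image.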