2014-03-03 01:42 PM
Is there a way to automate this backup process? I am not looking for content, just the configs, so I don't lose them. This would include the table-map.xml and the -custom files.
2014-03-04 03:35 AM
Hi Sean,
I automate it using cron jobs and define which files/directories I want backed up in a script.
Does this help?
Kind regards,
Patrick
2014-03-04 06:45 AM
That is what I was likely going to do, but I have no idea where the config files are stored. I know where the index files are stored, but I am talking about the app rules, reports, and so on. Do you have any idea where they are stored?
2014-03-04 07:05 AM
This is what I back up:
I don't know specifically whether the app rule configs reside within the above, but I know that when I previously restored a log hybrid, the event source config came back, including the checkpoint keys, so it may do what you want.
Alternatively, I think you can manually extract rules from within the UI. Granted that isn't automated but at least it gives you a place in time.
Kind regards,
Patrick
2014-03-10 08:14 PM
App rules are stored in the NwDecoder.cfg file in /etc/netwitness/ng (or /etc/netwitness/9.0 before 9.8).
If you back up and restore that file you'll get app rules, as well as defined application-level user accounts and groups, and other app-level configuration.
I pretty much just grab all of /etc to be safe. This gets almost everything you need to reapply the OS-, network-, and application-level configurations if you have to rebuild.
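The "grab all of /etc" approach above can be sketched as a dated, compressed tar. This is only a minimal illustration, not the script discussed later in the thread; the destination directory and file naming are assumptions you would adjust:

```shell
#!/bin/bash
# Minimal sketch: archive /etc with a date stamp so app rules, user
# accounts, and other config survive a rebuild. The destination path
# and naming scheme here are examples, not RSA-provided defaults.
dest=/tmp/backup                      # adjust to your backup location
mkdir -p "${dest}"
stamp=$(date +%Y-%m-%d)
tar -czf "${dest}/etc-${stamp}.tar.gz" /etc 2>/dev/null
ls -lh "${dest}/etc-${stamp}.tar.gz"
```

Restoring is then just untarring the archive over the rebuilt system (carefully, and ideally file by file).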
2014-03-12 12:24 PM
I wish that in the future they would put all the configuration under the same folder, and the logs as well; during troubleshooting you keep having to change folders.
2014-03-12 02:02 PM
FYI, I do have a script that is working. Please test it in your environment first, but it backs up pretty much everything that you will need to restore a system. This script works on all devices. I will first post the instructions I was given, then I will post the script after.
This script works for Brokers, Concentrators, Decoders, Log
Decoders, ESA Appliances and SA Servers. It will backup the directories
listed below which contain a majority of the configuration files. The
script will automatically compress the contents into a tar file. There is
also a line to scp the backup file(s) after it is compressed in a tar file,
however that line is commented out by default. Define your host at the
top of the script and uncomment the scp lines to enable automatic secure copy
off of the box(es). The script will dump the tar file into /tmp/backup by
default, but the variables are configurable and listed at the top. The
script is attached.
SA Server: /home/rsasoc/rsa/soc/reporting-engine (excluding /resultstore/ and /formattedReports/)
SA Server: /var/lib/netwitness/uax
ESA: /opt/rsa/esa
Broker/Concentrator/Decoder/Log Decoder : /etc/netwitness/ng
You can set the script up using a cron job and run it at whatever
intervals you would like.
To test:
Place Script in /usr/sbin
Make executable using chmod +x backup.sh
Test it:
cd /usr/sbin
./backup.sh
cd /tmp/backup
look for files...
Schedule cron job:
crontab -e
(in the editor, type ':wq!' on a blank line to save and exit)
(the above commands create the crontab file if one does not exist)
cd /var/spool/cron
edit root
(I recommend doing this using WinSCP and a file-format-aware GUI text editor such as EditPlus instead of 'vi', because errors in root's crontab can corrupt the system and render it unbootable.)
Add these entries:
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin
07 00 * * * /usr/sbin/backup.sh > /dev/null 2>&1
NOTE: This script is not RSA Supported. Customers should use their commercial
backup product and point to the directories indicated.
#!/bin/bash
# The routine moves config files into the archive area and compresses the files.
# Old logs are also removed if over "n" days old.
#
#
#
#
#----- Define Variables
HOST="`hostname`"
day=`date +%d`
month=`date +%b`
year=`date +%Y`
BACKUP=/tmp/backup
DECCONBRK1=/etc/netwitness/ng
SASERVER1=/home/rsasoc/rsa/soc/reporting-engine
SASERVER2=/var/lib/netwitness/uax
ESASERVER1=/opt/rsa/esa
GZIP="/bin/gzip -f"
# Check for the backup directory; create it if it does not exist
if [ ! -d "${BACKUP}" ]; then
    mkdir -p "${BACKUP}"
fi
########################################################
## Decoder, Concentrator, or Broker
########################################################
if [ -d "${DECCONBRK1}" ]; then
    tar -cvf ${BACKUP}/etc-netwitness-ng.tar.$year.$day.$month-12AM ${DECCONBRK1}
    #scp ${BACKUP}/etc-netwitness-ng.tar.$year.$day.$month-12AM user@host:/path/$HOST.etc-netwitness-ng.tar.$year.$day.$month-12AM
fi
########################################################
## SA Server
########################################################
if [ -d "${SASERVER1}" ]; then
    tar --exclude='/home/rsasoc/rsa/soc/reporting-engine/formattedReports/*' --exclude='/home/rsasoc/rsa/soc/reporting-engine/resultstore/*' -cvf ${BACKUP}/home-rsasoc-rsa-soc-reporting-engine.tar.$year.$day.$month-12AM ${SASERVER1}
    #scp ${BACKUP}/home-rsasoc-rsa-soc-reporting-engine.tar.$year.$day.$month-12AM user@host:/path/$HOST.home-rsasoc-rsa-soc-reporting-engine.tar.$year.$day.$month-12AM
fi
if [ -d "${SASERVER2}" ]; then
    tar -cvf ${BACKUP}/var-lib-netwitness-uax.tar.$year.$day.$month-12AM ${SASERVER2}
    #scp ${BACKUP}/var-lib-netwitness-uax.tar.$year.$day.$month-12AM user@host:/path/$HOST.var-lib-netwitness-uax.tar.$year.$day.$month-12AM
fi
########################################################
## ESA
########################################################
if [ -d "${ESASERVER1}" ]; then
    tar -cvf ${BACKUP}/opt-rsa-esa.tar.$year.$day.$month-12AM ${ESASERVER1}
    #scp ${BACKUP}/opt-rsa-esa.tar.$year.$day.$month-12AM user@host:/path/$HOST.opt-rsa-esa.tar.$year.$day.$month-12AM
fi
#----- Cleanup the Backup Area: remove backup files older than 15 days
find ${BACKUP} -type f -mtime +15 -exec rm {} \;
2014-03-12 08:24 PM
You can also use rsync to back up and move data between systems. It's installed by default in the CentOS 5.7 build, but I believe you have to install it manually in 6.x. It works smoothly using ssh (with keys) as the transport mechanism.
backup_base=/your/local/backup/dir/here
user=root    # account with read access on the remote appliances
hosts="decoder concentrator broker"
for hostname in ${hosts}; do
    # Create a directory for this host's backups
    [ -d ${backup_base}/${hostname} ] || mkdir -p ${backup_base}/${hostname}
    # Recursively copy remote:/dir to /your/local/backup/dir/here
    rsync -az -e ssh ${user}@${hostname}:/dir ${backup_base}/${hostname}
done
Since rsync is written intelligently and is "unix-aware", you can even use standard Bash brace expansion to specify multiple directories in the same command:
rsync -azr -e ssh ${user}@${hostname}:{/etc/netwitness/ng,/home/rsasoc/rsa/soc/reporting-engine,/var/lib/netwitness/uax,/opt/rsa/esa} ${backup_base}/${hostname}
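For the rsync loop to run unattended (e.g. from cron), the ssh transport needs key-based authentication set up once beforehand. A minimal sketch; the key filename and host names are placeholders, and the push loop is commented out by default in the same spirit as the scp lines in the backup script above:

```shell
#!/bin/bash
# One-time key setup so rsync-over-ssh can run without a password
# prompt. Key path and host names are assumptions; adjust as needed.
mkdir -p ~/.ssh
# Generate a dedicated, passphrase-less key if one does not exist yet
test -f ~/.ssh/id_rsa_backup || ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa_backup
# Push the public key to each appliance (uncomment and adjust hosts):
#for hostname in decoder concentrator broker; do
#    ssh-copy-id -i ~/.ssh/id_rsa_backup.pub "root@${hostname}"
#done
```

You would then point rsync at the key with `-e "ssh -i ~/.ssh/id_rsa_backup"`.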
2014-03-13 05:52 PM
We got the following two attached files from support. Run the normal NWTECH first, then backup.sh. They do work for Archiver and ESA.
Also note these were modified under Windows, so make sure you open them in vi and do a ':set ff=unix' and then ':wq!'.
Then write a cron job to run them and scp them off.
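The "cron job plus scp" step could look something like the fragment below. This only generates an example crontab file for inspection; the script names, run times, remote user, and host are all assumptions to adapt before pasting into root's crontab:

```shell
#!/bin/bash
# Sketch: write an example crontab fragment that runs the support
# scripts nightly and copies the results off-box. Script names,
# times, user, and host are placeholders, not values from support.
cat > /tmp/backup-cron.example <<'EOF'
30 01 * * * /usr/sbin/nwtech.sh > /dev/null 2>&1
45 01 * * * /usr/sbin/backup.sh > /dev/null 2>&1
55 01 * * * scp /tmp/backup/*.tar backupuser@backuphost:/backups/ > /dev/null 2>&1
EOF
cat /tmp/backup-cron.example
```

Review the fragment, then merge it into the crontab with `crontab -e` as described earlier in the thread.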
2016-09-25 12:26 AM
good stuff!