2021-04-08 10:15 AM - edited 2021-04-08 12:12 PM
Hello all,
I am unable to work out how the ESA secondary functions or why it is required.
I want to understand the relationship between the ESA primary and the ESA secondary.
Do the primary and secondary ESAs operate in warm-standby mode or load-sharing mode?
Can I achieve HA with two primary ESA hosts within a single NetWitness environment?
Please advise.
Thanks.
2021-06-11 11:08 AM
Hi,
The ESA Primary is simply the one that holds:
- the alerts mongo database (contains alerts/incidents triggered on any ESA)
- the Context Hub service (there can only be one Context Hub, and it must be installed on the ESA Primary)
You can only have one ESA Primary in your environment.
An ESA Secondary is any ESA other than the Primary. It doesn't contain a mongo database or Context Hub; all alerts triggered on ESA Secondaries are stored in the ESA Primary's database. You can install any number of ESA Secondaries, for example when the ESA Primary is already overloaded and you need to correlate more data, or when you want to deploy distinct ESA appliances for different regions, subnets, device groups, etc.
At the moment there is no HA/warm-standby/load balancing in the ESA service. However, you can perform a backup of the ESA configuration and data (mongo DB) and then restore it to recover from a DR scenario, using the nw-recovery-tool:
nw-recovery-tool --export --dump-dir /var/netwitness/backup --category ESAPrimary --component mongo
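For completeness, the matching restore on a rebuilt host would look something like the sketch below. The `--import` mode mirroring the export command is an assumption on my part; verify the exact syntax against the official NetWitness Recovery Tool documentation before relying on it:

```shell
# Sketch only: restore the ESA Primary mongo data exported earlier.
# The --import flag and paths below are assumptions mirroring the export
# command above; check the NetWitness recovery docs before running.
nw-recovery-tool --import --dump-dir /var/netwitness/backup --category ESAPrimary --component mongo
```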
I think RSA Engineering is already aware of the HA requirement for ESA, but if you want you can let them know by submitting a new idea:
https://community.rsa.com/t5/rsa-netwitness-platform-ideas/idb-p/netwitness-ideas
2021-07-22 05:55 AM
We are going to decommission our Primary ESA and would like to promote a brand-new ESA in a new datacenter to become the primary. This will include a new hostname, IP, etc.
What do you think would be the best approach to migrate?
2021-12-21 01:49 PM
Just want to share an experience: taking a backup of the Primary ESA server is a very lengthy process. Once you issue the command, it pulls the data and compresses it, which takes time.
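Since the export can run for a long time, it may be worth launching it so that it survives an SSH disconnect. A minimal sketch, assuming the same export command shown earlier in the thread and a hypothetical log path:

```shell
# Run the lengthy export detached from the terminal, logging output so
# progress can be checked later. Log path is illustrative.
nohup nw-recovery-tool --export --dump-dir /var/netwitness/backup \
    --category ESAPrimary --component mongo > /tmp/esa-backup.log 2>&1 &

# Follow progress (Ctrl-C stops watching, not the export itself).
tail -f /tmp/esa-backup.log
```

Running it inside `screen` or `tmux` would achieve the same effect.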