Configure the Destination

You can configure the destination using NFS, SFTP, or WebHDFS. Using NFS, you can change the destination to which the Warehouse Connector service writes the collected data for either of the following deployments:

  • NetWitness Warehouse (MapR) deployments
  • Commercial MapR M5 Enterprise Edition for Apache Hadoop deployments

You can also configure the Warehouse Connector to write to a remote destination using Secure File Transfer Protocol (SFTP). The remote destination can be a remote server that is NFS-mounted to the MapR cluster, or it can be a remote staging server.

By default, the Warehouse Connector writes data to the remote destination in the following directory structure:

  • /<staging_folder>/rsasoc/v1/sessions/data/<year>/<month>/<day>/<hour>/
  • /<staging_folder>/rsasoc/v1/logs/data/<year>/<month>/<day>/<hour>/
    Where <staging_folder> is the folder on the remote server where the Warehouse Connector writes the data.
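As an illustration, the sketch below builds that default layout locally. The staging path `/tmp/wc_staging` is a hypothetical stand-in for your actual staging folder; on a real deployment the Warehouse Connector creates these directories itself.

```shell
#!/bin/sh
# Simulate the default Warehouse Connector directory layout under a local
# staging folder. /tmp/wc_staging is a hypothetical path for illustration.
STAGING=/tmp/wc_staging
YEAR=$(date -u +%Y); MONTH=$(date -u +%m); DAY=$(date -u +%d); HOUR=$(date -u +%H)

# One branch for session data, one for log data, partitioned by hour
mkdir -p "$STAGING/rsasoc/v1/sessions/data/$YEAR/$MONTH/$DAY/$HOUR"
mkdir -p "$STAGING/rsasoc/v1/logs/data/$YEAR/$MONTH/$DAY/$HOUR"

# Inspect the resulting tree
find "$STAGING" -type d
```

The hour-level partitioning is what allows downstream Hadoop jobs to select data by time range without scanning the whole tree.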

If you are using a remote staging server as the remote destination, you must manually copy or move the directory structure to one of the following deployments:

  • NetWitness Warehouse (MapR)
  • Commercial MapR M5 Enterprise Edition for Apache Hadoop
  • HortonWorks HD

To generate reports from the data written by the Warehouse Connector, make sure that your Hadoop deployment maintains the same directory structure that the Warehouse Connector creates in the remote destination.
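A minimal sketch of the manual copy step follows. All hostnames and paths here (`stage.example.com`, `/tmp/wc_src`, `/tmp/wc_dest`) are hypothetical; the remote `rsync` pull and the `hadoop fs -put` load are shown as comments because they require a live staging server and Hadoop edge node. The point is that the `rsasoc/v1/...` tree must arrive in the deployment intact.

```shell
#!/bin/sh
# Hedged sketch: move the staging tree into a Hadoop deployment while
# preserving the directory structure the Warehouse Connector created.

SRC=/tmp/wc_src     # local copy pulled from the remote staging server
DEST=/tmp/wc_dest   # stand-in for the Hadoop ingest directory

# Sample tree standing in for data pulled from the staging server:
mkdir -p "$SRC/rsasoc/v1/logs/data/2024/01/01/00"

# 1. On a real system, pull the tree from the staging server (commented):
# rsync -a sftpuser@stage.example.com:/staging/rsasoc/ "$SRC/rsasoc/"

# 2. Copy it onward without flattening the hierarchy:
mkdir -p "$DEST"
cp -R "$SRC/." "$DEST/"

# 3. On a Hadoop edge node, load it into HDFS (commented):
# hadoop fs -put "$DEST/rsasoc" /rsasoc

find "$DEST" -type d
```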

The following illustration shows how you can use SFTP to write data from the Warehouse Connector to a remote destination.

[Figure: netwitness_sftp_image.png — Warehouse Connector writing data to a remote destination over SFTP]

You can configure the Warehouse Connector service to write the collected data to a Hadoop-based distributed computing system that supports WebHDFS.
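For reference, the WebHDFS REST API exposes HDFS over HTTP, so you can check the destination path from any machine with `curl`. The sketch below only constructs and prints the request URL; the hostname, port, and HDFS path are hypothetical, and the actual request is commented out because it requires a live WebHDFS-enabled cluster.

```shell
#!/bin/sh
# Hedged sketch: inspect a Warehouse Connector destination over WebHDFS.
# NAMENODE_HOST, NAMENODE_PORT, and HDFS_PATH are hypothetical values.
NAMENODE_HOST=namenode.example.com
NAMENODE_PORT=50070                  # common WebHDFS port on Hadoop 2.x
HDFS_PATH=/rsasoc/v1/sessions/data

# WebHDFS URLs take the form http://<host>:<port>/webhdfs/v1/<path>?op=...
URL="http://$NAMENODE_HOST:$NAMENODE_PORT/webhdfs/v1$HDFS_PATH?op=LISTSTATUS"
echo "$URL"

# List the directory via the REST API (commented; needs a live cluster):
# curl -s "$URL"
```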