Configure the Destination Using SFTP
Make sure that you have:
- Installed the Warehouse Connector service or virtual appliance in your network environment.
- Added the Warehouse Connector service to NetWitness. For more information, see "Add a Service to a Host" in the Hosts and Services Getting Started Guide.
- For the SFTP destination type, the destination host should be listed in the /root/.ssh/known_hosts file used by the ssh service (for example, sshd) running on the Warehouse Connector.
Add Destination from Warehouse Connector Host
To add the destination host to the /root/.ssh/known_hosts file, from the Warehouse Connector host, initiate a secure connection to the destination host:
- Log in to the Warehouse Connector.
- Enter ssh root@<SAWIP> or ssh username@<SAWIP>.
- Enter yes when you are prompted to continue connecting, and then enter the password.
- Verify that the host key has been added to the /root/.ssh/known_hosts file.
Note: After you upgrade the Warehouse Connector to 11.0, you must make sure that the destination host is listed in the /root/.ssh/known_hosts file used by the SSH service (for example, sshd) running on the Warehouse Connector. If you do not perform this action, the streams configured with SFTP in the Warehouse Connector will not start.
- If you want to use SFTP to write data into the destination using SSH key-based access, you need to configure SSH key-based access between the Warehouse Connector and the Warehouse host or Hadoop node. For more information, see Configure SSH Keys below.
Note: If you want to enable checksum validation to validate the integrity of the AVRO files that are transferred from the Warehouse Connector to the destinations, make sure that you generate the keys without setting the passphrase and do a key exchange between Warehouse Connector and the warehouse nodes.
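As an alternative to the interactive login above, the destination's host key can be captured non-interactively. A minimal sketch using ssh-keyscan, where <SAWIP> is the destination host address placeholder from the steps above:

```shell
# Sketch: record the destination host's ECDSA key in known_hosts
# without an interactive ssh login. <SAWIP> is a placeholder for the
# destination host address.
ssh-keyscan -t ecdsa <SAWIP> >> /root/.ssh/known_hosts

# Confirm that the host key is now present.
ssh-keygen -F <SAWIP> -f /root/.ssh/known_hosts
```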
Configure SSH Keys
To configure SSH key-based access between the Warehouse Connector and the Warehouse host or Hadoop node:
- Generate SSH keys on the Warehouse Connector at the default location. Perform the following:
- SSH to the Warehouse Connector.
- Type the following command and press ENTER:
$ OWB_FORCE_FIPS_MODE_OFF=1 ssh-keygen -t ecdsa -b 521
- The command prompts you to enter the file in which to save the generated key:
Enter file in which to save the key (/root/.ssh/id_ecdsa):
- Enter the file in which you want to save the key and press ENTER.
The command prompts you to enter and confirm the passphrase.
Note: If you want to enable checksum validation to validate the integrity of the AVRO files that are transferred from the Warehouse Connector to the destinations, make sure that you do not set a passphrase. In that case, the passphrase-related steps that follow (steps e through h) are not applicable.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
The public key is generated and is saved in the location that you provided.
Note: If the SSH key is not generated in the default location (/root/.ssh/id_ecdsa), you need to configure the destination for the Warehouse Connector through the Explore view. For more information, see "To configure the destination through Explore view" below.
- Change the directory by entering the following command:
cd /root/.ssh/
- Move the generated key aside by renaming it:
mv ~/.ssh/id_ecdsa ~/.ssh/id_ecdsa.old
- Type the following command and press ENTER:
$ OWB_FORCE_FIPS_MODE_OFF=1 openssl pkcs8 -topk8 -v2 des3 -in id_ecdsa.old -out id_ecdsa
The command prompts you to enter and confirm the passphrase.
- Enter the encryption passphrase.
- Run the following command to change the file permission:
chmod 600 ~/.ssh/id_ecdsa
- Copy the generated public key to the remote Warehouse host or Hadoop node:
ssh-copy-id -i ~/.ssh/id_ecdsa.pub root@<destination host ipaddress>
- SSH to the remote Warehouse host or Hadoop node:
  - If the identity key file is at the default location, use ssh <user>@<ip address>.
  - If the identity key file is not at the default location, use ssh <user>@<ip address> -i <identity file path>.
- Append the generated public key to the remote Warehouse host or Hadoop node's authorized keys list located at ~/.ssh/authorized_keys.
Note: Make sure that you copy the public key to the Hadoop node, and when copying it, provide the login details of the user that will be used to add the WebHDFS destination.
You can now securely communicate between Warehouse Connector and Warehouse nodes or Hadoop nodes.
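The key-generation and conversion steps above can be sketched end to end as a non-interactive script. This is a sketch under stated assumptions: it writes to a scratch directory instead of /root/.ssh, and the passphrase changeit is supplied inline where the documented commands prompt interactively.

```shell
# Sketch of the key setup above, made non-interactive for illustration.
# Assumptions: scratch directory instead of /root/.ssh, inline passphrase
# "changeit" instead of the interactive prompts.
KEYDIR=/tmp/wc_sketch
mkdir -p "$KEYDIR"

# Generate an ECDSA-521 key pair (PEM format so openssl can read it).
# -N "" skips the ssh-keygen passphrase so the sketch runs unattended.
ssh-keygen -t ecdsa -b 521 -m PEM -f "$KEYDIR/id_ecdsa" -N ""

# Set the original key aside, then re-wrap the private key as encrypted
# PKCS#8, mirroring the documented mv and openssl pkcs8 steps.
mv "$KEYDIR/id_ecdsa" "$KEYDIR/id_ecdsa.old"
openssl pkcs8 -topk8 -v2 des3 -in "$KEYDIR/id_ecdsa.old" \
  -out "$KEYDIR/id_ecdsa" -passout pass:changeit

# Restrict permissions on the private key, as in the chmod step.
chmod 600 "$KEYDIR/id_ecdsa"
```

The OWB_FORCE_FIPS_MODE_OFF variable from the documented commands is specific to the Warehouse Connector host and is not needed when trying the commands elsewhere.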
Configure the Warehouse Connector to Use an SFTP Destination
Note: If the SSH key is not generated in the default location (/root/.ssh/id_ecdsa), you need to configure the destination through the Explore view. For more information, see "To configure the destination through Explore view" below.
To configure the destination through User Interface:
- Log on to NetWitness.
- Go to (Admin) > Services.
- In the Services view, select the added Warehouse Connector service, and select View > Config.
The Services Config view of Warehouse Connector is displayed.
- On the Sources and Destinations tab, in the Destination Configuration section, click the add icon.
- In the Add Destination dialog, select SFTP from the Type drop-down list.
- In the Name field, enter a unique symbolic name for the destination.
Note: The Name field does not support spaces or special characters except underscore (_).
- In the Host field, enter the remote server IP address.
- In the Port field, retain the default port, 22.
- In the Username field, enter the SSH username.
Note: In the case of HortonWorks HD, ensure that the username is gpadmin. For password-based access, use the gpadmin password; for passphrase-based access, use the passphrase used to generate the keys for the gpadmin user.
- In the Password/Passphrase field, enter one of the following:
- SSH password - If you are using SFTP to write data into the destination using password-based access.
- SSH passphrase - If you are using SFTP to write data into the destination using SSH key-based access.
- In the Remote Path field, enter the path of the directory present on the SFTP server.
- Click Save.
- (Optional) If you want to enable checksum validation, perform the following:
- Go to (Admin) > Services.
- In the Services view, select the added Warehouse Connector service, and select View > Explore.
The Explore view of Warehouse Connector is displayed.
- In the options panel, navigate to warehouseconnector/destinations/sftp/config.
- Set the parameter isChecksumValidationRequired to 1.
- Restart the respective stream.
To configure the destination through Explore view:
- Go to (Admin) > Services.
- In the Services view, select the added Warehouse Connector service, and select View > Explore.
The Explore view of Warehouse Connector is displayed.
- Right-click the warehouseconnector node and select Properties.
- Select the add property and manually enter the following configuration parameters:
name=<destination name>
destination=sftp://<destination path>
host=<destination host ipaddress:port>
type=hdfs
port=22
username=<username>
password=<password>
privKeyFile=<private key file path>
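For illustration only, a filled-in version of these properties might look like the following; every value here is a hypothetical placeholder, not a product default:

```
name=sftp_dest1
destination=sftp:///data/sftp
host=10.10.10.10:22
type=hdfs
port=22
username=sftpuser
password=<password or passphrase>
privKeyFile=/root/.ssh/id_ecdsa
```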
Aggregate Metas and Raw Logs for a Log Session
To aggregate raw logs and metas from the Log Decoder into a single AVRO file instead of two folders:
- Go to ADMIN > Services.
- Select a Warehouse Connector service and click View > Explore.
The Explore view for the Warehouse Connector is displayed.
- Open warehouseconnector/streams/<stream name>/loader/config and in the right pane, select the export.logAndsession.avro.enabled parameter.
- Change the value to yes.
- Restart the service.
- Go to ADMIN > Services.
- Select a Warehouse Connector service and click View > Config.
- On the Streams tab, select the stream that you want to reload.
- Click Reload.