Target and Process


Target and Process begins the Information Lifecycle Management process by ingesting data from computers on your network into SoleraTec's Phoenix VCM Information Repository. Target and Process and its policies manage all ingest jobs from local file systems: they initiate processing, identify data eligible for ingesting, and push data to Vaults.

The tools for Target and Process are:

Target and Process Policies - Lets you set up policies for ingesting files from your system into a Vault. Policies designate how and when your files are ingested and can include jobs that run regularly. See Target and Process Policies.
Target and Process - The Target and Process application lets you ingest files for a one-time backup or move files from your computer into a Vault. See Ingest Files Manually.
Target and Process Service - This service ingests data into the Information Repository, extracting metadata from the data and cataloging it in databases. The service constantly watches saved Target and Process policy files for date changes, runs based on the saved policies, and updates when the policies change. The service reads an automatically created configuration file that includes all policies for a single desktop client; this configuration file is what enables the Target and Process Service to execute policies.

Target and Process is responsible for ingesting files from their source into the Information Repository. It backs up or permanently moves files, while the Target and Process clients control all reporting, scheduling, and network bandwidth. Each Target and Process client runs independently, managing error logging, job completion times, and other information for its own computer platform and file systems. Any client can read data from any computer it has rights to and write data to any Vault it has rights to.

Keep in mind the following when working with Target and Process:

Offline Media - When data is ingested, offline media is never used as a destination.
Conserve Bandwidth - Run jobs when there is little network and client activity, such as during non-business hours or when users are off the network.
Monitor Jobs - Keep an eye on Network Activity to see when jobs are running.
Move Data - After ingesting files, use Data Service Policies to migrate, replicate, and purge data from any Vault in the Information Repository, enabling you to create an efficient and powerful multi-tiered Information Repository.
Load Balancing - Use storage pools as part of the destination criteria for jobs. If multiple Vaults have media belonging to a storage pool, clients pick the Vault with the most available free space, keeping the entire Information Repository load balanced.
Data Verification - Phoenix VCM continually verifies the completeness and accuracy of your data using a cyclic redundancy check (CRC) as the job runs and as files are written to storage media.
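The load-balancing rule above (pick the Vault whose media has the most free space in the pool) can be sketched generically. This is only an illustration of the idea, not Phoenix VCM's implementation; the `Vault` type and its field names are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Vault:
    name: str
    free_bytes: int  # available free space on this Vault's media

def pick_vault(pool: list[Vault]) -> Vault:
    """Choose the Vault with the most available free space,
    so new data spreads evenly across the pool over time."""
    return max(pool, key=lambda v: v.free_bytes)

pool = [Vault("vault-a", 120), Vault("vault-b", 560), Vault("vault-c", 300)]
print(pick_vault(pool).name)  # → vault-b
```

Because each job re-evaluates free space at selection time, the fullest Vaults naturally stop receiving new data until the others catch up.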
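The kind of cyclic redundancy check described above can be illustrated with Python's standard `zlib.crc32`, which accepts a running checksum value so the CRC can be updated chunk by chunk as a job writes data. This is a generic sketch of the technique, not the product's code:

```python
import zlib

def crc_of_stream(chunks):
    """Update a CRC-32 incrementally as each chunk is written,
    so verification runs alongside the job rather than afterward."""
    crc = 0
    for chunk in chunks:
        crc = zlib.crc32(chunk, crc)
    return crc

data = b"camera footage"
whole_file_crc = zlib.crc32(data)
streamed_crc = crc_of_stream([data[:7], data[7:]])
print(whole_file_crc == streamed_crc)  # True: incremental CRC matches
```

Comparing the checksum computed during the write against one computed when the file is read back detects any corruption introduced on the storage media.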