Data Service Policies Panel


The Data Service Policies Panel within the Management Console is the ultimate data management tool, providing automated digital asset management for Windows, Mac, and Linux environments. The Data Services Service is the engine that manages all the data within your Information Repository.

From data and video ingest all the way to storage, Phoenix DEM stands out among other data management systems because your data assets are under complete control and quality is preserved. The Data Services Service delivers the peace of mind that data and video from yesterday or five years ago that is stored in the Information Repository can be searched, played back, and exported to any desktop on the network at any time.

By creating a Data Service Policy, you direct the Data Services Service to perform duties such as migrating, replicating, and purging data between Vaults, managing data and video overall, and moving files to other Vaults on the network.

The migration, replication, and purging of data are key to Phoenix DEM because those functions allow for a highly scalable solution. Many types of storage devices may be used with Phoenix DEM and can be added as you need them. This multi-tiered architecture proves cost effective and even more scalable for long-term retention.

While the Data Services Service manages the data between Vaults within the Information Repository, the Locator Service tracks which Vaults on the network are active.

The product architecture proves advantageous with the following:

Load Balancing - You can mark a specific storage pool as part of the destination criteria for jobs. The clients locate the Vault(s) that contain the designated storage pool and store the defined files to it. If there are multiple Vaults with media belonging to the storage pool defined for the initial job, clients pick the one with the most available free space. This keeps the entire Information Repository load balanced.
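The selection rule described above can be sketched in Python. The `Vault` type, its field names, and the `pick_destination` helper are illustrative assumptions for this sketch, not the actual Phoenix DEM API:

```python
# Sketch of the load-balancing rule: among the Vaults that host the
# destination storage pool, pick the one with the most free space.
# All names here are hypothetical, not the product's real interfaces.
from dataclasses import dataclass
from typing import List, Set


@dataclass
class Vault:
    name: str
    storage_pools: Set[str]
    free_bytes: int


def pick_destination(vaults: List[Vault], pool: str) -> Vault:
    """Return the Vault hosting `pool` with the most available free space."""
    candidates = [v for v in vaults if pool in v.storage_pools]
    if not candidates:
        raise ValueError(f"No active Vault hosts storage pool {pool!r}")
    return max(candidates, key=lambda v: v.free_bytes)


vaults = [
    Vault("vault-a", {"tier1"}, 200 * 2**30),
    Vault("vault-b", {"tier1", "archive"}, 750 * 2**30),
    Vault("vault-c", {"archive"}, 900 * 2**30),
]
print(pick_destination(vaults, "tier1").name)  # vault-b
```

Because the choice is recomputed per job, new Vaults added to a storage pool naturally start absorbing work as soon as the Locator Service reports them active.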
Complete and Accurate Stored Files - As a job is running, the completeness and accuracy of the stored files are constantly being verified. Most other products that perform data verification read data from the media piece and compare it to the original data on the client computer. This method is resource intensive and time consuming. Phoenix DEM uses a cyclic redundancy check (CRC) as the files are written to storage media.
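A minimal sketch of this write-time verification, with `zlib.crc32` standing in for whichever checksum the product actually computes:

```python
# Compute a CRC over each chunk as it is written, rather than
# re-reading the stored copy afterwards. zlib.crc32 is a stand-in
# for the product's actual checksum.
import io
import zlib


def store_with_crc(src, dst, chunk_size=64 * 1024):
    """Copy src to dst, returning the CRC computed during the write."""
    crc = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        crc = zlib.crc32(chunk, crc)  # running CRC, no second read pass
    return crc


data = b"example payload" * 1000
dst = io.BytesIO()
crc = store_with_crc(io.BytesIO(data), dst)
assert crc == zlib.crc32(data)  # single-pass CRC matches the whole payload
```

The point of the design is that the data is only read once: the checksum falls out of the write itself instead of a separate verification pass over the media.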

If an error occurs while running jobs that prevents files from being processed, clients add the files to an internal retry list. At the end of the job, the client retries the files that failed on the first pass. If a file fails again during the retry session, the failure is logged to a retry file and report, which is included in the Information Repository's error logs. When a subsequent job starts, the system looks for the retry file, and the files slated for retry are added with first priority to the list of files to be captured.
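That retry flow can be sketched as follows; the `run_job` helper, the JSON retry file, and the callback interface are hypothetical stand-ins for the product's internal mechanics:

```python
# Sketch of the retry flow: failures go on an in-job retry list,
# anything that fails the end-of-job retry pass is written to a retry
# file, and the next job captures that file's entries first.
import json
import os
import tempfile


def run_job(files, process, retry_file):
    """Process files, honoring a retry file left by a previous job."""
    carried = []
    if os.path.exists(retry_file):
        with open(retry_file) as f:
            carried = json.load(f)
        os.remove(retry_file)
    # Carried-over files are captured with first priority.
    retry = [p for p in carried + list(files) if not process(p)]
    # End-of-job retry pass; repeat failures are logged for the next job.
    failed = [p for p in retry if not process(p)]
    if failed:
        with open(retry_file, "w") as f:
            json.dump(failed, f)
    return failed


retry_file = os.path.join(tempfile.mkdtemp(), "retries.json")
attempts = {}


def flaky(path):
    """Succeeds for every file except 'bad', which needs three attempts."""
    attempts[path] = attempts.get(path, 0) + 1
    return path != "bad" or attempts[path] >= 3


print(run_job(["a", "bad", "b"], flaky, retry_file))  # ['bad'] -> retry file
print(run_job(["c"], flaky, retry_file))              # [] - 'bad' retried first
```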

The Data Service Policies Panel is divided into two main panes. The left pane provides an overview of all of the Data Service Policies in the Information Repository. The right pane consists of all policies in the Information Repository when Policies is selected and the Source, Criteria, Destination, Scheduling, and Advanced tabs when an individual policy is selected.

Use the Data Service Policies toolbar to create a new policy by clicking the Add button. The toolbar consists of the following buttons:

- Save all policy settings.

- Add a new policy.

- Delete the selected policy.

- Refresh the view to display the latest version of each policy. Because Data Service Policies are implemented across the whole Information Repository, refreshing ensures the Data Service Policies view reflects the latest updates, including changes that could affect the contents of Vaults, media names, and storage pools. Save before you refresh; otherwise, any changes made or added policies will be lost.

The Data Service Policies window is described below:

Clicking the Data Service Policies heading collapses and expands the accordion. When Policies is selected, a list of policies along with general information is displayed in the right pane.

Workflow - Data Service Policy

An integral part of data management is working with Data Service Policies. The following steps provide an overview of the data management workflow:

1. From your suite of product applications, open the Management Console application (ManagementConsole.exe). For further explanation of how to launch applications using different operating systems, see Getting Started.
2. Click Data Service Policies to expand the panel.
3. If a policy does not open by default, create one; a policy must be created and opened before you can set parameters and initialize the process. Add a policy by right-clicking Policies and selecting Add Policy from the shortcut menu, or by clicking the green plus sign at the bottom of the panel.

If the green plus sign at the bottom of the panel is grayed out, make sure either Policies or one of the individual policies is already selected to make the option available.

4. Select a policy.
5. The Policy tab appears first. Set the name, description, policy type, and activity setting.
6. Select the Source tab to identify a Vault, media, or storage pool from which to migrate, replicate, or purge files. Also set the type and speed of the media.
7. Select the Criteria tab to designate files to migrate, replicate, or purge. Within this tab, you can also add file and directory filters.
8. Select the Destination tab to specify where the files are to be migrated or replicated.
9. Select the Scheduling tab to define job occurrences. By default, Data Service Policies are not scheduled to run.
10. Select the Advanced tab to set options such as pre-processing and post-processing commands, log output location, connection sharing, and throttling.
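Taken together, the tabs above define a policy that can be pictured as a simple data structure. Every field name and default below is illustrative for this sketch, not the product's actual schema:

```python
# Hypothetical shape of a Data Service Policy, grouped by the tab
# each field is set on. Names and defaults are assumptions.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class DataServicePolicy:
    # Policy tab
    name: str
    description: str = ""
    policy_type: str = "migrate"       # "migrate", "replicate", or "purge"
    active: bool = True
    # Source tab: Vault, media, or storage pool the files come from
    source: str = ""
    # Criteria tab: which files qualify
    file_filters: List[str] = field(default_factory=list)
    directory_filters: List[str] = field(default_factory=list)
    # Destination tab: where migrated/replicated files go (unused by purge)
    destination: Optional[str] = None
    # Scheduling tab: policies are unscheduled by default
    schedule: Optional[str] = None
    # Advanced tab
    pre_command: Optional[str] = None
    post_command: Optional[str] = None
    log_path: Optional[str] = None


policy = DataServicePolicy(
    name="nightly-archive",
    policy_type="migrate",
    source="tier1",
    destination="archive",
    file_filters=["*.mov"],
    schedule="daily 02:00",
)
```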