Import/Export Data Service Policies

Policy import and export functionality offered by SoleraTec's Phoenix RSM allows you to save policies, restore policies, and copy sets of policies from one computer to another. After creating a set of policies, you can export them to a policy file. Then, at a later time, if you want to re-implement the set of policies or implement them for the first time on another computer, they are there for import.

Use the import and export functionality to take a snapshot of the working policy configuration, bring new computers online, test different policy configurations, or restore user-corrupted policy configurations.

By default, recording policies are exported to CameraPolicies.cfg, and Data Service policies are exported to DataSvcsSvc.cfg.

Although each policy editor uses only one policy set at a time, you can export multiple policy sets to separate policy files. Use any file naming convention for policy files, and keep track of where policies are exported if the location is other than the default (the user's home directory).

Before restoring a set of policies to a computer that already has policies set up, delete all of the current policies before importing; otherwise, the imported policies are added to the current policies.

Data Service policies are incompatible with other client policies. If you try to import a Data Service policy into a non-Data Service Policies client (or vice versa), an error message appears and no policies are added.

Import Policies

Imported policies are added to the policies that are already in the policy editor.

1. From the application's main menu (or policy editor), click File > Import from. The Import from window appears.
2. If you are not importing policies from your local computer, navigate to the networked computer that contains the policies that you need.
3. Select the policy file that contains the policies that you want to use.
4. Click Open. The policies in the configuration file are implemented.

Export Policies

1. Create policies.
2. From the application's main menu (or policy editor), click File > Export to. The Export to window appears.
3. To save your policies with the default policy file name, click Save without entering a new file name. Or, enter a new name in the Export to window, and then click Save.

Import/Export Policy File Structure

An exported policy file is block structured but free from any column or order formatting. The tokens listed below specify job criteria. By default, policy files are exported to and imported from <install-dir>/Config/DataSvcsSvc.cfg.

Files that contain policies are formatted as follows. Each policy is a single block. The beginning of each block is indicated by policy_name="<policy name>".

policy_name="<policy name>"
[
 <token>=<value>
 <token>=<value>
 . . .
]

policy_name="<policy name>"
[
 <token>=<value>
 <token>=<value>
 . . .
]

. . .
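
Because the layout above is so simple, an exported policy file is easy to read from a script. The following Python sketch parses a policy file into dictionaries, assuming only the block structure documented here (a policy_name="..." line followed by token=value pairs between [ and ], plus # comment lines). The file path, function name, and simplified handling of repeatable tokens are illustrative assumptions, not part of Phoenix RSM.

import re
from pathlib import Path

# Matches the line that opens a policy block, e.g. policy_name = "LongTerm"
POLICY_START = re.compile(r'^policy_name\s*=\s*"(?P<name>[^"]*)"')
# Matches a token/value line inside a block, e.g. job_type = "migrate"
TOKEN_LINE = re.compile(r'^(?P<token>\w+)\s*=\s*(?P<value>.+?)\s*$')

def load_policies(path):
    """Return a dict that maps each policy name to its token/value pairs."""
    policies = {}
    current = None
    for raw in Path(path).read_text().splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue                      # skip blank lines and comments
        start = POLICY_START.match(line)
        if start:
            current = {}
            policies[start.group("name")] = current
            continue
        if line in ("[", "]"):
            continue                      # block delimiters carry no data
        pair = TOKEN_LINE.match(line)
        if pair and current is not None:
            # Tokens that may repeat (file_spec, absolute_start, ...) simply
            # overwrite earlier values in this simplified sketch.
            current[pair.group("token")] = pair.group("value").strip('"')
    return policies

if __name__ == "__main__":
    # Default export location described above; replace the relative path with
    # <install-dir>/Config/DataSvcsSvc.cfg on your system.
    for name, tokens in load_policies("Config/DataSvcsSvc.cfg").items():
        print(name, tokens.get("job_type", "migrate"), tokens.get("active", "true"))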

Token/Value Pairs

A token is a job attribute that is given a value that determines how the job runs.

Each entry below shows the token/value syntax, followed by its description and, where applicable, its valid values and default.

absolute_start = <yyyy>/<mm>/<dd>-<hh>:<mm>:<ss>

The exact start time and date that the job is to run. Multiple absolute_start tokens may be entered.1, 2

active = <boolean>

Define if the policy is active or inactive.2

Valid values = true | false

Default = true

complete = <boolean>

Print all summary information, plus the path names of the migrated files, to the file indicated by the list_pn option.

Valid values = true | false

Default = false

content_filter = "<character string>"

A word or phrase in the file to be processed.

Default = ""

data_paths = <number>

The maximum number of concurrent files permitted to be sent during a job.

Default = 1

deleted_files = <boolean>

Include files tagged for deletion.

Valid values = true | false

Default = false

description = "<character string>"

A description of the policy.2

Default = ""

diag = <boolean>

Print diagnostic information to the path indicated in the list_pn option.

Valid values = true | false

Default = false

dir_spec = "<pathname>"

Directories to include. Multiple dir_spec options can be specified. The dir_spec check is made for every directory found. This includes the directories contained in the path name list that the job traverses.

Default = "*" (any)

display_only = <boolean>

Search for target files without processing them. List the files that would have been processed.

Valid values = true | false

Default = false

dst_media_name = "<character string>"

The destination media name. If not specified, the first available unit of media is used.

Default = "*" (any)

dst_media_speed = <character string>

The destination media to be used according to media speed.

Valid values =

S0 - S10
Unspecified

Default = Unspecified

dst_media_type = <character string>

The destination media type to be used.

Valid Values =

Hdisk - hard disk
8mm - Eight-millimeter tape
Dds - Digital data storage
Dlt - Digital linear tape
Ait - Advanced intelligent tape
SuperAit - Super advanced intelligent tape
Vxa - Exabyte VXA tape
Travan - Travan tape
Lto - Linear tape open
MagOptical - Read/write optical
Unspecified

Default = Unspecified

dst_storage_pool = "<character string>"

Specify the name of the destination storage pool to be used. Use "" for no storage pool name.

Default = "*" (any)

dst_vault_name = "<character string>"

Specify the destination vault to be used.

Default = "*" (any)

dst_volume_format

Internal use only.

dst_volume_name

Internal use only.

dst_volume_pack

Internal use only.

dt_access = <alpha><yyyy>/<mm>/<dd>-<hh>:<mm>:<ss>

Process files based on the accessed date and time.

Valid values =

AO (Absolute Old) – Select files with a dtu (date_time_use) equal to or before the specified date. Enter the year as a 4-digit number.
AN (Absolute New) – Select files with a dtu equal to or after the specified date. Enter the year as a 4-digit number.
RO (Relative Old) – Select files with a dtu equal to or before the relative specified date.
RN (Relative New) – Select files with a dtu equal to or after the relative specified date.
I (Inactive) – Do not use date.

Default = I

dt_stored = <alpha><yyyy>/<mm>/<dd>-<hh>:<mm>:<ss>

Process files based on the date and time stored on the Vault.

Valid values =

AO (Absolute Old) – Select files with a dtu (date_time_use) equal to or before the specified date. Enter the year as a 4-digit number.
AN (Absolute New) – Select files with a dtu equal to or after the specified date. Enter the year as a 4-digit number.
RO (Relative Old) – Select files with a dtu equal to or before the relative specified date.
RN (Relative New) – Select files with a dtu equal to or after the relative specified date.
I (Inactive) – Do not use date.

Default = I

erase_media_if_unref = <boolean>

Erase the media when all the files on the media are marked for deletion.

Valid values = true | false

Default = false

error_log = "<pathname>"

The path for the error log file. If you specify "/dev/tty", error messages display on the default output device. Supports token substitution.2

Default path for the configuration file = "%R/Logs/%P.err"

exclude_dir_spec = "<pathname>"

Directories to omit. Multiple exclude_dir_spec options can be specified.

Default = "" (no directories excluded)

exclude_file_spec = "<pathname>"

Files to omit. Multiple exclude_file_spec options can be specified.

Default = "" (no files excluded)

file_spec = "<pattern>"

Files to be processed. This token/value pair can be specified multiple times to include multiple file filters.

Default = "*" (any)

hostname = "<host name>"

Select the files that are to be processed based on the original host name for the file.

Default = ""

include_offline = <boolean>

Process files from offline media.

Valid values = true | false

Default = false

job_type = <alpha>

Specify the job type.

Valid values =

replicate
migrate
purge

Default = migrate

list_pn = "<pathname>"

The filename where all report information is written. Supports token substitution.

Default path for the policy file = "%R/Logs/%P.log"

max_file_size = <number>

The maximum size in bytes of a source file. The min_file_size and max_file_size options are mutually exclusive.

Default = 18446744073709551615

min_file_size = <number>

The minimum size in bytes of a source file. The max_file_size and min_file_size options are mutually exclusive.

Default = 0 (none)

out_of_band = <boolean>

Enable the application to constantly monitor its processes and select the network transmission protocol that best suits the files being processed.

Valid values = true | false

Default = true

post_proc_cmd = "<command>"

A command that runs immediately after the job is finished. Supports token substitution.

Default = ""

pre_proc_cmd = "<command>"

A pre-process command that runs immediately before a job starts. Supports token substitution.

Default = ""

process_directories = <boolean>

Directories will be processed.

Valid values = true | false

Default = true

process_files = <boolean>

Files will be processed.

Valid values = true | false

Default = true

repeat_interval = <hh:mm:ss>

The amount of time that elapses before a job is repeated.1, 2

Default = 24:00:00

src_high_water_mark = <number>

The maximum percent (%) of media that is to be used. If the % used is greater than the entered value, the file is processed.

Valid values = 1 - 100

Default = 0

src_low_water_mark = <number>

The minimum percent (%) of media used after the high watermark has been reached. If the % of media used is greater than this value and the high watermark has been reached, the file is processed.

Valid values = 1 - 100

Default = 0

src_media_name = "<character string>"

The source media name.

Default = "*" (any)

src_media_speed = "<character string>"

The source media speed.

Valid values =

S0 - S10
Unspecified

Default = Unspecified

src_media_type = "<character string>"

The source media type from where the file should be processed.

Hdisk - Hard disk.
8mm - Eight-millimeter tape.
Dds - Digital data storage.
Dlt - Digital linear tape.
Ait - Advanced intelligent tape.
SuperAit - Super advanced intelligent tape.
Lto - Linear tape open.
Vxa - Exabyte Vxa tape.
Travan - Travan tape.
MagOptical - Read/write optical.
NullDevice
Unspecified

Default = Unspecified

src_storage_pool = "<character string>"

The source storage pool.

Default = "*" (any)

src_vault_name = "<character string>"

The source vault name.

Default = "*" (any)

src_volume_format

Internal use only.

src_volume_name

Internal use only.

src_volume_pack

Internal use only.

start_days = <mtwtfss>

The days the job is to run. Use - to indicate days the job should not run.

For example, if the job is to run on Thursday, you would enter ---t---.1, 2

Default = mtwtfss

start_time = <hh:mm:ss>

The time the job should start for each defined start day in start_days.1, 2

Default = 0:0:0 (midnight)

stop_time = R<yyyy>/<mm>/<dd>-<hh>:<mm>:<ss>

The date and time the job is to stop. The date and time entered is relative to the start of the job.2

Default = R0000/00/00-00:00:00 (no stop time)

summary = <boolean>

Print the start and stop times and final statistics of the job to the file indicated by the list_pn option.

Valid values = true | false

Default = true

throttle = <number>

The throttle controls the amount of bandwidth used by jobs (and, therefore, the speed at which files are sent to Vaults). Throttle values are relative to Information Repository capacity and range from 1% to 100%. A lower setting reduces the amount of available bandwidth that is used.

Valid values = 1 - 100

Default = 100

type = <alpha>

The policy type of the policy file.

Default = "ds"

vcs = <alpha><number>

Specify the number of files (F) or number of bytes (S) to store per single Vault connection.

Valid values =

F – files
S – bytes
1 -

Default = F500

version = "<number>"

The version number of the policy. The current version number is "1.0".2

volume_use

Internal use only.

1Note: Timing tokens are mutually exclusive; only one timing mechanism may be set. In other words, you may set absolute_start, or repeat_interval, or start_days and start_time, but not a combination of these (see the sketch after these notes).

2Note: Token/Value pair is used for policy only.
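
As a worked illustration of the timing note above, the minimal sketch below checks a parsed policy (a plain dict of token/value pairs, for example as produced by the loader sketched earlier) for more than one timing mechanism. The function and variable names are illustrative assumptions and are not part of Phoenix RSM.

def timing_mechanisms(tokens):
    """Return the timing mechanisms set in a policy's token/value pairs."""
    used = []
    if "absolute_start" in tokens:
        used.append("absolute_start")
    if "repeat_interval" in tokens:
        used.append("repeat_interval")
    if "start_days" in tokens or "start_time" in tokens:
        used.append("start_days/start_time")
    return used

# Example: a policy that mixes a repeat interval with scheduled start days.
policy = {"repeat_interval": "24:00:00", "start_days": "---t---"}
used = timing_mechanisms(policy)
if len(used) > 1:
    print("More than one timing mechanism set:", ", ".join(used))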

Sample Data Service Policy

If you were to export a policy, it would look similar to the following:

#
# Start of object rule for policy name 'Virtualization2LongTerm'
#
policy_name = "Virtualization2LongTerm"
[
 type                     = "ds"
 description              = ""
 version                  = "1.0"
 active                   = false
 complete                 = false
 process_files            = true
 process_directories      = true
 content_filter           = ""
 data_paths               = 1
 deleted_files            = false
 diag                     = false
 display_only             = false
 dst_vault_name           = "*"
 dst_media_name           = "*"
 dst_storage_pool         = "LongTerm"
 dst_media_speed          = Unspecified
 dst_media_type           = Unspecified
 dst_volume_format        = Unspecified
 dst_volume_name          = "*"
 dst_volume_pack          = ""
 dt_access                = I
 dt_stored                = I
 error_log                = "%R/Logs/%P.err"
 hostname                 = ""
 include_offline          = false
 job_type                 = "migrate"
 list_pn                  = "%R/Logs/%P.log"
 min_file_size            = 0
 max_file_size            = 18446744073709551615
 src_vault_name           = "*"
 src_media_name           = "*"
 src_storage_pool         = "Virtualization"
 src_media_speed          = Unspecified
 src_media_type           = Unspecified
 src_volume_format        = Unspecified
 src_volume_name          = "*"
 src_volume_pack          = ""
 volume_use               = Unspecified
 out_of_band              = true
 post_proc_cmd            = ""
 pre_proc_cmd             = ""
 repeat_interval          = 00:00:20
 src_high_water_mark      = 80
 src_low_water_mark       = 80
 stop_time                = R0000/00/00-00:00:00
 summary                  = true
 throttle                 = 100
 vcs                      = F500
]