Features - SharePoint Database iDataAgent




SharePoint Database iDataAgent - Table of Contents

Overview
System Requirements
Deployment
    Install the SharePoint Database iDataAgent
    Remote Installs - Windows Agents

Operations
Backup
    Backup - Microsoft SharePoint Portal
    Supported Data Types
    Supported Backup Types
    Full Backups
    Backup Options
    Set a Job Priority
    Start in Suspended State
    Start New Media
    Mark Media Full
Browse (SharePoint 2003 Database Only)
    Browse Data - Overview
    Browse Options
    Control the Browse Time Interval
    Browse from Copies
Restore
    Restore Data - SharePoint Portal
    Restore Options
    Basic Restore (SharePoint 2001 Database Only)
    Browse and Restore (SharePoint 2003 Database Only)
    Restore from Copies
    Restore Data Using a Specific MediaAgent, Library or DrivePool
    Set a Job Priority
    Start in Suspended State
    Related Topics
        List Media (Media Prediction)
        Restore From Anywhere
Full System Restore
Command Line Interface
    Overview
    Save a Job as Script

Configuration
Agents
    Agents - Microsoft SharePoint Portal
Subclients
    Subclients - SharePoint Portal Server
Pre/Post Process
Storage Policies
    Overview
    iDataAgent Backup (Standard) storage policy
    Incremental Storage Policy
Storage Policy Copies
    Overview
    Alternate Data Paths (GridStor)
    Selective Copy
    Manual Retention
    Job Based Pruning
    Mark Job Disabled
    Data Multiplexing
    Data Verification

Administration
Scheduling
Schedule Policy
Data Encryption
Data Compression
Auxiliary Copy
Data Aging
User Accounts and Passwords

Management
Job Management
    Job Controller
    Operation Window
    Activity Control
    Job Preemption Control
    Job Priorities and Priority Precedence
    Job Alive Check Interval
    Control Jobs Through Job Queuing
    Job Update Interval
    Job Running Time
    Job Restart
    Data Protection Operations
    Data Recovery Operations
Job History
    Backup Job History
    Restore Job History
Hardware Specific Issues








Overview - SharePoint Server 2001 iDataAgents

Choose from the following topics:
   Introduction
   Client Terminology
   Supported Data Types
   Tree Levels in the SharePoint iDataAgents
   License Requirement
   Disaster Recovery Considerations


Introduction
The SharePoint 2001 Database iDataAgent is the backup and restore vehicle for Microsoft SharePoint Portal workspaces
and databases; this iDataAgent secures only the SharePoint Portal Server data that resides on a SharePoint Portal Server.
The SharePoint 2001 Document iDataAgent is the backup and restore vehicle for documents and categories.
The following figure shows the iDataAgents needed to fully secure a heterogeneous computing environment.




Back to Top



Client Terminology
The following definitions are applicable to the SharePoint iDataAgents.
   A client is a computer whose data is backed up via an iDataAgent. This may include SharePoint, File System, etc.
   SharePoint Portal Clients are computers that access SharePoint data on the SharePoint Portal Server.








Back to Top



Supported Data Types
A SharePoint Portal Server 2001 database contains workspaces and is responsible for storing and retrieving data within
those workspace folders. The SharePoint Portal consists of several components, such as search capability, categories,
document library, subscriptions, and management. The SharePoint Portal Server supports indexing of network file shares,
Web shares, Lotus Notes databases, and other SharePoint Servers. This information is stored in a logical view of
information as needed for various user types, as opposed to a folder view. This information is presented to the user via the
digital dashboard web page serviced by the IIS web server. Users will have specific permissions to access the content of
workspace folders.
To secure the SharePoint Portal application files that reside on the file system of the SharePoint Portal server, you must
use the File System iDataAgent.
The SharePoint Database iDataAgent and SharePoint Document iDataAgent provide comprehensive backup and restore
solutions for the data available in the SharePoint Portal server.
Data Secured by the SharePoint Database iDataAgent
The SharePoint Database iDataAgent backs up and restores the database components of the system. Note that
Exchange and Lotus Notes data is not secured by the SharePoint Database iDataAgent.
Data Secured by the SharePoint Document iDataAgent
The SharePoint Document iDataAgent backs up and restores the category folders and document libraries. In addition, the
SharePoint Document iDataAgent also backs up the properties of the workspace.
SharePoint Data Secured by the File System iDataAgent
In addition to the data stored on the SharePoint Server, there may be data stored on a Web Server, such as IIS. Such data
is not backed up by the SharePoint iDataAgents. To secure this data you must back it up using the File System iDataAgent
on the Web Server computer.
Back to Top



Tree Levels in the SharePoint iDataAgents
When the SharePoint iDataAgent is installed, the following levels are automatically created in the CommCell Browser:




Client (for example, metal)
   Agents: SharePoint Database, SharePoint Document
      Backup Sets: defaultBackupSet
         Subclients: default








Back to Top



License Requirement
To perform a data protection operation using this Agent, a specific Product License must be available in the CommServe.
Review general license requirements included in License Administration. Also, View All Licenses provides step-by-step
instructions on how to view the license information.
Back to Top



Disaster Recovery Considerations
Before you use your agent, be sure to review and understand the associated full system restore (or disaster recovery)
procedure. The procedure for some agents may require that you plan specific actions or consider certain items before an
emergency occurs. See Disaster Recovery for more information regarding your agent.
Back to Top








Overview - SharePoint Server 2003 iDataAgents

Choose from the following topics:
   Introduction
      SharePoint Portal Servers, SharePoint Portal Clients, and Clients
   Supported Data Types
   Tree Levels in the SharePoint iDataAgent
   License Requirement
   Disaster Recovery Considerations


Introduction
Microsoft SharePoint Portal Server 2003 includes several components that are backed up by the SharePoint Server 2003
iDataAgents, as well as data which must be backed up using the File System iDataAgent. SharePoint database files can also
reside on separate SQL servers within a server farm; to secure this data, you must back up these files using the
appropriate SQL iDataAgent. The SharePoint entities that can be backed up by the system are described in detail in the
following sections.
The following figure shows the iDataAgents needed to fully secure a SharePoint Portal all-in-one server:




The following figure shows the iDataAgents needed to fully secure an example of a small SharePoint server farm:








SharePoint Portal Servers, SharePoint Portal Clients, and Clients
The following definitions are offered to prevent any confusion that may arise from the use of the term client:
   A SharePoint Portal Server is a computer on which the Windows 2003 Server software and Microsoft SharePoint Portal
   Server 2003 software have been installed. You can secure file system data on a SharePoint Portal server using the
   Windows File System iDataAgent for the server's file system.
   SharePoint Portal Clients are computers that access SharePoint data through the SharePoint Portal Server.
   A client is a computer whose data is backed up via an iDataAgent.
Back to Top



Supported Data Types
A SharePoint Portal Server 2003 consists of many components, listed in more detail below, and supports indexing of
network file shares, Web shares, Lotus Notes databases, and other SharePoint Servers. This information is stored in a
logical view of information as needed for various user types, as opposed to a folder view. This information is presented to
the user via the digital dashboard web page serviced by the IIS web server. Users will have specific permissions to access
the content of workspace folders.
In addition to the database on the SharePoint Portal Server, there may be SharePoint Portal Server entities or application
data that is not backed up by the SharePoint Portal iDataAgents. To secure this data you must back it up using the File
System iDataAgent, and in the case of a server farm, the SQL Server iDataAgent.
Data Secured by the SharePoint Database 2003 iDataAgent
The SharePoint Database 2003 iDataAgent backs up and restores the database components of the system. Note that
Exchange and Lotus Notes data is not secured by the SharePoint Database iDataAgent.
   All-in-one Server
       Portal Sites, including Service, Profile, and Site Databases
       Portal Site Indexes (Content Sources)
       Content Databases
      Team Site Databases
      Site Collections (top level sites)
      Single Sign-on (Database and Encryption Key)
      Backward Compatible Library
   Server Farm (SQL Databases are not backed up using the SharePoint iDataAgent)
      Portal Site Indexes
      Single Sign-on (Encryption Key)
      Site Collections
Data Secured by the SharePoint Document 2003 iDataAgent
The SharePoint Document 2003 iDataAgent backs up and restores Site Collections, sub-sites, Libraries, and Lists.
   For Document Libraries, Forms Libraries, and Picture Libraries:
      Documents and items in the Forms Folder
      User-defined columns:
          Number
          Single line of text
          Multiple lines of text
          Yes/No
          Calculated
          Date and Time
   Lists, including Columns
   Alerts associated with Documents and List Items (requires creating the dwBackupAlerts registry key; see the sketch below)
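
The documentation identifies only the value name dwBackupAlerts; it does not state the registry key under which it must be
created. The following minimal Python sketch shows one way to create a DWORD value of that name with the standard winreg
module, assuming a placeholder key path that you would replace with the location given for your installation.

    # Minimal sketch: create a DWORD registry value named dwBackupAlerts.
    # The key path below is a placeholder assumption; substitute the path
    # documented for your installation. Only the value name comes from this document.
    import winreg

    KEY_PATH = r"SOFTWARE\YourBackupVendor\SharePointAgent"   # hypothetical path

    def create_dw_backup_alerts(enabled=True):
        """Create (or update) the dwBackupAlerts DWORD value under KEY_PATH."""
        with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                                winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "dwBackupAlerts", 0, winreg.REG_DWORD,
                              1 if enabled else 0)

    if __name__ == "__main__":
        create_dw_backup_alerts()

Run the script from an elevated prompt, since writing under HKEY_LOCAL_MACHINE requires administrative rights.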
SharePoint Data Secured by the File System iDataAgent
In addition to the data stored on the SharePoint Server, there may be data stored on a Web Server. Such data is not backed
up by the SharePoint iDataAgents. To secure this data you must back it up using the File System iDataAgent on the Web
Server computer.
   Root Directories of SharePoint Services - Extended Virtual Servers
   Custom Web Part Assemblies
   Custom Templates
   IIS
   IIS Root Directories (including Portal Site web.config files)
   Add-in software:
     Language Template Packs
     Web Part page solutions (3rd-party developed aspx pages that include Web Parts)
     Templates that work with Microsoft Office
     Microsoft Office Web Parts and Components, which is a collection of Web Parts
SharePoint Data Secured by the SQL iDataAgent in a Server Farm
   SQL Database on a remote server


Data that is not Secured by the SharePoint 2003 iDataAgents
Record this information and store it in a safe place, as part of your disaster recovery planning. For more information about
Full System Restore, refer to Full System Restores for SharePoint Portal iDataAgents.
Data that is not Secured by the SharePoint Database iDataAgent
The following information is not backed up by the system, and must be recorded for use in the event a Full System Restore
is required:
   Configuration Database:
     E-Mail Server Settings
     Anti-virus Settings
     Blocked File Types
     Logging Settings
     HTML Viewer
     Usage Analysis Processing
     Shared Services
Data that is not Secured by the SharePoint Document 2003 iDataAgent
The following information is not backed up by the system, and must be recorded for use in the event a Full System Restore
is required:
  For Sites: Alerts and Registry Keys
  For Meeting Workspaces: the Pages list in multiple-page Meeting Workspaces, the Meeting Series list, and Recurring
  Meeting Workspaces (these are not restored as recurring, and only the List items for the current meeting are backed up)
  For Lists: the Issues List (Issue Items are restored, but the Issue History is not)
  Portal and Area listings
  Individual Portal Areas
  Web Discussions
  Areas - The content on Areas can be backed up and restored, but the Area itself cannot be restored with the SharePoint
  Document 2003 iDataAgent.
Back to Top



Tree Levels in the SharePoint iDataAgent
When the SharePoint iDataAgent is installed, the following levels are automatically created in the CommCell Browser:




Client (for example, metal)
   Agents: SharePoint Database (MS SharePoint Server 2003 Database), SharePoint Document (MS SharePoint Server 2003 Document)
      Backup Sets: defaultBackupSet
         Subclients: default

Back to Top



License Requirement
To perform a data protection operation using this Agent, a specific Product License must be available in the CommServe.
Review general license requirements included in License Administration. Also, View All Licenses provides step-by-step
instructions on how to view the license information.
Back to Top







Disaster Recovery Considerations
Before you use your agent, be sure to review and understand the associated full system restore (or disaster recovery)
procedure. The procedure for some agents may require that you plan specific actions or consider certain items before an
emergency occurs. See Disaster Recovery for more information regarding your agent.
Back to Top








System Requirements - Microsoft SharePoint Portal iDataAgents

The following requirements are for the SharePoint Database and SharePoint Document iDataAgents:


Application/Operating System

Microsoft SharePoint Portal 2001 Server 32-bit up to Service Pack 3 (OBSOLETE) on:
   Microsoft Windows 2000 Server with Service Pack 2, 3 or 4
   Microsoft Windows 2000 Advanced Server with Service Pack 2, 3 or 4
   Microsoft Windows Server 2003 up to Service Pack 1 (SharePoint Document iDataAgent only)
   Processor: Intel Pentium or compatible minimum required

   For SharePoint 2001 Document iDataAgent - Microsoft Office SharePoint Portal Server 2003 Backwards Compatible
   Library Server Component

Microsoft SharePoint Portal 2003 Server 32-bit up to Service Pack 2 on:
   Microsoft Windows Server 2003 Standard Edition up to Service Pack 1
   Microsoft Windows Server 2003 Enterprise Edition up to Service Pack 1
   Microsoft Windows Server 2003 Web Server Edition up to Service Pack 1
   Microsoft Windows Server 2003 Datacenter Edition up to Service Pack 1
   Processor: Intel Pentium or compatible minimum required

   Microsoft Windows Server 2003 Enterprise x64* Edition up to Service Pack 1
   Processor: x64
   * Special configuration considerations apply. See Installing 32-bit Agents on a Microsoft Windows x64 Platform for
   more information.

   For SharePoint 2003 Database iDataAgent:
      Microsoft SharePoint Portal 2003 Server 32-bit up to Service Pack 2 and Microsoft Windows SharePoint Services
      up to Service Pack 2
   For SharePoint 2003 Document iDataAgent:
      Microsoft Windows SharePoint Services up to Service Pack 2
      OR
      Microsoft SharePoint Portal 2003 Server 32-bit up to Service Pack 2 and Microsoft Windows SharePoint Services
      up to Service Pack 2

Microsoft SharePoint Portal 2007 Server 32-bit on:
   Microsoft Windows Server 2003 Standard Edition up to Service Pack 1
   Microsoft Windows Server 2003 Enterprise Edition up to Service Pack 1
   Microsoft Windows Server 2003 Web Server Edition up to Service Pack 1
   Microsoft Windows Server 2003 Datacenter Edition up to Service Pack 1
   Processor: Intel Pentium or compatible minimum required

   For SharePoint 2007 Document iDataAgent:
      Microsoft SharePoint Services 3.0
      OR
      Microsoft Office SharePoint Server 2007


Memory
32 MB RAM minimum required beyond the requirements of the operating system and running applications


Hard Disk
50 MB minimum of hard disk space for software
50 MB of additional hard disk space for log file growth
10 MB of temp space required for install or upgrade (where the temp folder resides)


Peripherals
CD-ROM drive
Network Interface Card


Miscellaneous
TCP/IP Services configured on the computer
The File System iDataAgent will be automatically installed during installation of the Microsoft SharePoint Portal
iDataAgents if it is not already installed. For System Requirements and install information specific to the File System
iDataAgent, refer to System Requirements - Microsoft Windows File System iDataAgent.








Install the SharePoint Database iDataAgent

Click on a link below to go to a specific section of the software installation:
   Install Requirements
   Install Checklist
   Before You Begin
   Install Procedure
      Getting Started
      Select Components for Installation
      Set Up the Required Privileges
      Configuration of Other Installation Options
      Firewall Configuration
      Interface Name and Job Results Location
      SharePoint Administration Account
      Verify Summary of Install Options
      Storage Policy Selection
      SharePoint Portal Server Information
      Schedule Automatic Install of Updates
      Setup Complete
   Post-Install Considerations


Install Requirements

The SharePoint Database iDataAgent is installed on the SharePoint Portal Server 2001 computer. This computer is
referred to as the Client computer in this install procedure.
The MS SharePoint Server 2003 Database iDataAgent is installed on the Microsoft Office SharePoint Portal Server 2003.
This computer is referred to as the Client computer in this install procedure.
Verify that the computer on which you wish to install the software satisfies the minimum system requirements; refer to
System Requirements - Microsoft SharePoint Portal iDataAgents and System Requirements - Microsoft Windows File
System iDataAgent.
The following procedure describes the steps involved in installing the Windows File System and SharePoint Database
iDataAgents. If you choose to install additional components simultaneously, refer to the appropriate procedures for
installation requirements and steps specific to the component.
Review the following Install Requirements before installing the software:
General
  Agents should only be installed after the CommServe and at least one MediaAgent have already been installed in the
  CommCell. Also, keep in mind that the CommServe and MediaAgent must be installed and running (but not necessarily
  on the same computer), before you can install the Agent.
  You must also install the File System iDataAgent on the computer on which you plan to install any application
  iDataAgents, Quick Recovery Agents, DataMigrator Agents, DataArchiver Agents, Recovery Director, or 1-Touch
  Server.
  This version of the software is intended to be installed in a CommCell where the CommServe and MediaAgent(s)
  version is 6.1.0.
  Close all applications and disable any programs that run automatically, including anti-virus, screen savers and
  operating system utilities. Some of the programs, including many anti-virus programs, may be running as a service.
  Stop and disable such services before you begin. You can re-enable them after the installation.
  Ensure there is an available license on the CommServe for the Agent.
  Verify that you have the software CD-ROM that is appropriate to the destination computer’s operating system. See
  Software Installation Discs for a list of available CD-ROMs.
  Make sure that you have the latest CD-ROM for the software version before you start to install the software. If you are
  not sure, contact your software provider.







Firewall
   If the CommServe, MediaAgent and/or Clients communicate across two-way firewall(s):
       Ensure port 8400 is allowed connections through the firewall (a quick reachability check is sketched after this list).
       In addition, a range of bi-directional ports (consecutive or discrete) must also be allowed connections through the
       firewall.
   If the CommServe, MediaAgent and/or Clients communicate across one-way firewall(s):
       Identify a range of outbound ports (consecutive or discrete) for use by the software.
     For information on configuring the range of ports see Port Requirements.
     If the MediaAgent/Client communicates with the CommServe across a one-way firewall, you must add the
     MediaAgent/Client host name (or the IP address) in the CommServe computer before installing the necessary software
     on the MediaAgent/Client computer.
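
As a quick sanity check of the firewall requirements above, the following minimal Python sketch attempts a TCP connection
to port 8400 on the CommServe; the host name shown is an assumption and should be replaced with your own. It only confirms
that the port accepts connections through the firewall from this computer; it does not validate the additional
bi-directional port range.

    # Minimal sketch: verify that TCP port 8400 on the CommServe is reachable
    # through the firewall. The host name is an assumed placeholder.
    import socket

    def port_is_open(host, port, timeout=5.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        commserve = "commserve.company.com"   # replace with your CommServe host name
        print(commserve, "port 8400 reachable:", port_is_open(commserve, 8400))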
SharePoint Portal 2003 Server
  The SharePoint iDataAgent should be installed only on a Front End Web Server. If the farm contains more than one
  Front End Web Server, each with a unique portal, then the SharePoint iDataAgent must be installed on all of them.
     The configuration and topology of any server farm must be one that is supported by SharePoint Portal 2003 Server.
     In a SharePoint server farm (i.e., where SQL Server runs on a remote machine), the SQL Server iDataAgent must be
     installed on the SQL Server machine.
     If the Single Sign-on Service is configured in the SharePoint server farm then the SharePoint iDataAgent can be
     installed on the job server.


Install Checklist
Collect the following information before installing the software. Use the space provided to record the information, and
retain this information in your Disaster Recovery binder.

1.      Install folder location:________________________________________________________________
        See Step 9 for more information.

2.      If the CommServe and the client computer communicate across a firewall:
        Firewall ports: ______________________________________________________________________
        Names of CommServe and MediaAgent computers on the other side of the firewall:______________
        Keep Alive Interval minutes:____________________________________________________________
        See Firewall Configuration for more information.

3.      CommServe Host Name or the CommServe IP address:______________________________________
        See Step 14 for more information.

4.      Local computer's Host name (NetBIOS name) or IP address____________________________________
        See Interface Name and Job Results Location for more information.

5.      Job result folder location:________________________________________________________________
        See Interface Name and Job Results Location for more information.

6.      (SharePoint 2003 Database only)
        Administrative Group Account / iDataAgent Account:_________________________________________
        Account Password:______________________________________________________________________
        See SharePoint Administration Account for more information.

7.      Storage Policy used by the default subclient:________________________________________________
        See Storage Policy Selection for more information.

8.      SharePoint Portal Server Name:_____________________________________________________________
        (The following are for SharePoint 2003 Database only)
        SSO Account:__________________________________________________________________________
        Account Password:______________________________________________________________________
        See SharePoint Portal Server Information for more information.

9.      (SharePoint 2003 Database only)
        Index Backup/Restore path:_______________________________________________________________
        See Post-Install Considerations for more information.

10.     Start Date and Start Time for Automatic Updates Schedule:_________________________________________
        See Schedule Automatic Install of Updates for more information.


Before You Begin

     Log on to the client as local Administrator or as a member of the Administrators group on that computer.


Install Procedure

Getting Started

1. Place the software CD-ROM for the appropriate platform into
   the CD-ROM drive. (See Software Installation Discs for specific
   information about which CD-ROM to use for your operating
   system and platform.)
      After a few seconds, the installation program is launched.
      If the installation program does not launch automatically:
         Click the Start button on the Windows task bar, and then
         click Run.
         Browse to the CD-ROM drive, select Setup.exe, click Open,
         then click OK.

2. Select the desired language and click Next to continue.




3. Click Install QiNetix on this computer.
      NOTES
         The options in the installation menu depend on the
         computer in which the software is being installed, and may
         look different from the example shown.








4. Read the Welcome screen.
     Click Next to continue, if no other applications are running.




5. Read the virus scanning software warning.
     Click OK to continue, if virus scanning software is disabled.




6. Read the license agreement, then select I accept the terms
   in the license agreement.
     Click Next to continue.




Select Components for Installation

7.    Select the component(s) to install.
      NOTES
        Your screen may look different from the example shown.
        Components that either have already been installed, or
        which cannot be installed, will be dimmed.
     Click Next to continue.
     To install the Microsoft SharePoint Database iDataAgent,
     expand the Client Modules folder, the iDataAgents folder,
     and SharePoint iDataAgents folder, and select one of the
     following:
        MS SharePoint Server 2003 Database
        iDA for SharePoint Database

     When you select the SharePoint Database iDataAgent for
     install, the appropriate Windows File System iDataAgent is
     automatically selected for install.


Set Up the Required Privileges

8.   Click Yes to set up the required privileges for the local
     administrators group.
     NOTES
        This option will only appear if the Windows user account
        used to install the software does not have the required
        administrator rights (e.g., if the operating system was
        newly installed).
        If you choose to click Yes, the install program will
        automatically assign the required rights to your account.
        You may be prompted to log off and log back on to
        continue the installation.
        If you choose to click No, the installation will be aborted.
        You will be prompted at the end of the installation to
        decide if you want these privileges to be revoked.

     ADDITIONAL NOTES
     The install program checks your Windows user account for the following necessary operating system rights:
        Right to increase quotas (this is referred to as adjust memory quotas for a process on Windows Server 2003).
        Right to act as a part of the operating system.
        Right to replace a process level token.

Configuration of Other Installation Options

9.   Specify the location where you want to install the software.
     NOTES
        Do not install the software to a mapped network drive.
        Do not use the following characters when specifying the
        destination path:
         /:*?"<>|
        It is recommended that you use alphanumeric characters
        only.
        If you intend to install other components on this computer,
        the selected installation directory will be automatically
        used for that software as well.
        If a component has already been installed, this screen may
        not be displayed if the installer can use the same install
         location as previously used.
     Click Browse to change directories.
     Click Next to continue.




Firewall Configuration

10. Select from the following:
       If this Client communicates with the CommServe and/or
       MediaAgent across a firewall, select Yes, configure
       Galaxy firewall services, and then click Next to
       continue. Proceed to the next Step.
       If firewall configuration is not required, click No, do not
       configure Galaxy firewall services and then click Next
       to continue. Proceed to the next section.




11. Perform the following:
       Enter the host name(s) of the MediaAgents/Clients that will
       need to be contacted through a firewall. Type the host
       name or the IP address and click Add to place it in the
       Host Name/IP Address List. Consider the following in a
       one-way firewall:
           On the CommServe, this list should include all the
           MediaAgents and Clients that are on the other side of
           the firewall.
           On the MediaAgents/Clients this should include the
           CommServe computer, if it is on the other side of the
           firewall, and any other Clients/MediaAgents on the
           other side of the firewall with which communications will
           be established.
       Choose the type of firewall configuration based on the
       firewall setup in your environment. Choose from the
       following options:
           Click on 2-way firewall if you can open certain ports
           as bi-directional ports.
           Click 1-way firewall; host is reachable from this
           machine on the CommServe in a one-way firewall. This
           option is also applicable if a MediaAgent/Client is on the
            same side of the firewall as the CommServe and
           communicates with a MediaAgent/Client on the other
           side of the firewall.
           Click 1-way firewall; host is NOT reachable from
           this machine in a one-way firewall, when a
           MediaAgent or Client communicates with a CommServe
           (and any Clients/MediaAgents) on the other side of the
           firewall.
     Click Next to continue.

12. Enter the starting and ending port ranges, and click Add to
    place it in the Open Port List. Repeat as needed.
     NOTES
        Specify the range of ports that must be used for
        communication between the Client and CommServe and/or
        MediaAgent computers. For more information on the port
        requirements, see Port Requirements in Firewall
        Requirements.
     Click Next to continue.




13. If desired, modify the Keep Alive interval.
     Click Next to continue.
     This concludes the firewall configuration process.




Select Components for Installation (continued)

14. Enter the fully qualified domain name of the CommServe
    computer. (TCP/IP network name, e.g.,
    computer.company.com)
     NOTES
        If a component has already been installed, this screen will
        not be displayed; instead, the installer will use the same
        Server Name as previously specified.
     Click Next to continue.








Interface Name and Job Results Location

15. Enter the following:
       The local (NetBIOS) name of the client computer.
        The TCP/IP host name of the NIC that the client
       computer must use to communicate with the CommServe.
     NOTES
        The default network interface name of the client computer
         is displayed if the computer has only one network
         interface. If the computer has multiple network interfaces,
         enter the interface name that is preferred for
         communication with the CommServe. (A quick host name and
         IP lookup sketch follows this step.)
        If a component has already been installed, this screen will
        not be displayed; instead, the install program will use the
        same name as previously specified.
     Click Next to continue.
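
If the client has more than one network interface and you are unsure which interface name to enter, the following minimal
Python sketch prints the host name and the IP addresses that local name resolution reports for it. It assumes standard name
resolution is configured and is only a convenience check; it is not part of the installer.

    # Minimal sketch: list the local host name, fully qualified name, and the
    # IP addresses registered for it, to help choose the interface name.
    import socket

    hostname = socket.gethostname()
    try:
        _, aliases, addresses = socket.gethostbyname_ex(hostname)
    except socket.gaierror as exc:
        raise SystemExit("Could not resolve %s: %s" % (hostname, exc))

    print("Host name       :", hostname)
    print("Fully qualified :", socket.getfqdn())
    print("Aliases         :", aliases or "(none)")
    print("IP addresses    :", addresses)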

16. Specify the location of the client’s job results directory.
     NOTES
        The Agent uses the job results directory to store the
        client’s backup and restore job results.
     Click Browse to change directories.
     Click Next to continue.




SharePoint Administration Account

17. Enter the User Name and Password for the SharePoint
    Administration Account.
      Click Next to continue.
     NOTES
        This screen will not be displayed for SharePoint Server
        2001.
        When installing both the SharePoint Document 2003 and
        SharePoint Database 2003 iDataAgents at the same time,
        this screen will only be shown once.
        During installation, the Base Services of the client are
        configured to run as the user account entered through this
        screen.
        For both the SharePoint Server 2003 Document and
        SharePoint Server 2003 Database iDataAgents, run Base
        Services on the Client using an account that meets the
        following criteria:
            member of the local Administrators Group (a quick membership check is sketched after these notes)
           member of the SharePoint Portal Administrator Group
           System Administrator role on the SQL Server Instance
        Refer to the article, Galaxy Service Account User
        Information for Windows 2003 and Windows Server 2003
        clients available from the Maintenance Advantage web site.
        In addition, for the SharePoint 2003 iDataAgents, this
        account must have "Log on as Service" permissions to
        ensure the CVD service will start.
        When installing the SharePoint Database 2003 iDataAgent
        on a job server, the user account entered through this
        screen must have administrative privileges to the Single
        Sign-On Service.
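
As an optional pre-flight check of the first criterion above (membership in the local Administrators group), the following
minimal Python sketch shells out to the built-in net localgroup command and looks for the account name in its output. The
account name shown is an assumption; membership in the SharePoint Portal Administrator group and the System Administrator
role on the SQL Server instance still have to be confirmed in their own administration tools.

    # Minimal sketch: check whether an account is listed in the local
    # Administrators group using the built-in "net localgroup" command.
    # DOMAIN\spadmin is an assumed placeholder account name.
    import subprocess

    def in_local_administrators(account):
        output = subprocess.run(
            ["net", "localgroup", "Administrators"],
            capture_output=True, text=True, check=True,
        ).stdout
        return account.lower() in output.lower()

    if __name__ == "__main__":
        print(in_local_administrators(r"DOMAIN\spadmin"))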

Verify Summary of Install Options

18. Verify the summary of selected options.
     NOTES
        The Summary on your screen should reflect the
        components you selected for install, and may look different
        from the example shown.
     Click Next to continue or Back to change any of the options.
     The install program now starts copying the software to the
     computer. This step may take several minutes to complete.




19. The System Reboot message may be displayed. If so, select
    one of the following:
      Skip Reboot
      This option will be displayed if the install program finds
       any files belonging to other applications that need to be
      replaced. As these files are not critical for this installation,
      you may skip the reboot and continue the installation and
      reboot the computer at a later time.
       Reboot Now
        If this option is displayed without the Skip Reboot option,
        the install program has found files required by the software
        that are in use and need to be replaced. If Reboot Now is
        displayed without the Skip Reboot option, reboot the
        computer at this point. The install program will
        automatically continue after the reboot.
        Exit Setup
        If you want to exit the install program, click Exit Setup.




Storage Policy Selection

20. Select the storage policy through which you want to back
    up/migrate/archive the indicated component (subclient,
    instance, etc.)
     NOTES
        A storage policy directs backup data to a media library.
        Each library has a default storage policy.
        When you install an Agent, the install program creates a
        default subclient for most Agents.
        If desired, you can change your storage policy selection at
        any time after you have installed the client software.
        If this screen appears more than once, it is because you
        have selected multiple agents for installation and are
        configuring storage policy association for each of the
        installed agents.
     Click Next to continue.

SharePoint Portal Server Information

21. You are prompted for the Microsoft SharePoint Portal Server
    name. The computer name is displayed by default; if this is
    not correct, enter the correct name.
     Click Next to continue.




Schedule Automatic Install of Updates

22. If necessary, select this option to schedule an automatic
    installation of software updates.
      NOTES
        Install Updates Schedule allows a one-time automatic
        installation of the necessary software updates on the
        computer. If you do not select this option, you can
        schedule these updates later from the CommCell Console.
        To avoid conflict, do not schedule the automatic
        installation of software updates to occur at the same time
        as the automatic FTP downloading of software updates.
     Click Next to continue.




Setup Complete

23. Click Next to continue.
     NOTES
        Schedules help ensure that the data protection operations
        for the Agent are automatically performed on a regular
        basis without user intervention. For more information, see
        Scheduling.




24. Click Yes to remove the privileges that were assigned earlier
    by the install program. If you do not wish to remove them,
    click No.
     NOTES
        This option will only be displayed if you were prompted to
        assign the privileges earlier in the installation.




25. Setup displays the successfully installed components.
     NOTES
        The Setup Complete message displayed on your screen
        will reflect the components you installed, and may look
        different from the example shown.
        If you install an Agent with the CommCell Console open,
        you need to refresh the CommCell Console (F5) to see the
        new Agents.
     Click Finish to close the install program.
     The installation is now complete.







Post-Install Considerations


SharePoint Portal 2003
  If the SharePoint Portal 2003 Server is configured to use the Single Sign-on Service, then the administrator account for
  this service must be entered in the SharePoint Server 2003 Database iDataAgent properties before any backups are
  run. Perform the following steps:

  1.   From the CommCell Console, right-click the SharePoint Database iDataAgent, and select Properties.
  2.   In the iDataAgent Properties, click the Change Account button for the SSO Account, and enter the Single Sign-
       on Account name and password.
  The following information is not backed up by the system, and must be recorded for use in the event a Full System
  Restore is required:
    E-Mail Server Settings
    Anti-virus Settings
    Blocked File Types
    Logging Settings
    HTML Viewer
    Usage Analysis Processing
    Shared Services
  Record this information and store it in a safe place, as part of your disaster recovery planning. For more information
  about Full System Restore, refer to Full System Restores for SharePoint Portal iDataAgents.
SharePoint Server Farms
  For SharePoint server farms only (i.e., where SQL Server runs on a remote machine), perform the following steps.

  1.   From the CommCell Console, right-click the SharePoint Database iDataAgent, and select Properties.
  2.   In the General tab of the iDataAgent Properties, select SQL Databases hosted on a remote SQL Server.
  You must install the Windows and SQL iDataAgents on the remote SQL Server to protect its data; the SharePoint
  iDataAgent will not do so. Ensure that backups of the SharePoint Portal Server and the remote SQL Server are
  scheduled to run at about the same time.
  For medium to large SharePoint server farms, create a folder on a machine with enough disk space to temporarily
  store the Portal Site Index for the farm during backup and restore operations, as well as job results information, and
  share this folder on the network. (The Services account used by the software, the Front End Web Server, and the
   Portal Site Index server must all be able to access this folder.) Perform the following steps; a sketch of the folder
   preparation appears after them:

  1.   From the CommCell Console, right-click the SharePoint Database iDataAgent, and select Properties.
  2.   In the General tab of the iDataAgent Properties, in the Index Backup/Restore field, type the path to the folder
       you just created.
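
  The following minimal Python sketch illustrates the folder preparation described above, assuming a local path of
  D:\SPIndexTemp and a share name of SPIndexTemp (both placeholders). It creates the folder and publishes it with the
  built-in net share command; granting access to the Services account, the Front End Web Server, and the Portal Site Index
  server is still done through the folder's security settings.

    # Minimal sketch: create and share the temporary Portal Site Index folder.
    # D:\SPIndexTemp and the share name SPIndexTemp are placeholder assumptions.
    import os
    import subprocess

    FOLDER = r"D:\SPIndexTemp"
    SHARE_NAME = "SPIndexTemp"

    os.makedirs(FOLDER, exist_ok=True)
    subprocess.run(["net", "share", "%s=%s" % (SHARE_NAME, FOLDER)], check=True)
    print("Shared %s as \\\\%s\\%s" % (FOLDER, os.environ.get("COMPUTERNAME", "localhost"), SHARE_NAME))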







General
  Install any post-release updates or Service Packs that may have become available after the release of the software. If you are
  installing a Service Pack, verify and ensure that it is the same version as the one installed in the CommServe.
  Alternatively, you can enable Automatic Updates for quick and easy installation of updates in the CommCell.
  After installing the Agent, you may want to configure the Agent before running a data protection operation. The
  following list includes some of the most common features that can be configured:
      Configure your subclients - see Subclients for more information.
      Schedule your data protection operations - see Scheduling for more information.
      Configure Alerts - See Alerts and Monitoring for more information.
      Schedule Reports - See Reports for more information.
The software provides many more features that you will find useful. See the Index for a complete list of supported
features.
NOTES
Before you use your agent, be sure to review and understand the associated full system restore (or disaster recovery)
procedure. The procedure for some agents may require that you plan specific actions or consider certain items before an
emergency occurs. See Disaster Recovery for more information regarding your agent.








Remote Installs - Windows Agents

Click on a link below to go to a specific section of the software installation:
   Install Requirements
   Install Checklist
   Before You Begin
   Install Procedure
      Getting Started
      Select Components for Installation
      Firewall Configuration
      Schedule Data Classification Service
      SharePoint Administration Account
      Configure the Storage Policy for the Default Subclient
      Configure DataMigrator for Exchange Options
      Select Remote Computers and Configure Access Credentials
      Verify Summary of Install Options
      Configure Advanced Settings
      Setup Complete
   Post-Install Considerations


Install Requirements

Use this procedure to remotely install MediaAgents, iDataAgents, DataArchiver Agents, DataMigrator Agents, Quick
Recovery Agents and related software components — such as the Open File Handler (OFH) or QSnap snapshot enablers.
See Support Information - Installation for a list of all of the components that support remote installs and can be installed
using this procedure.
Remote installs allow you to install system components on multiple Windows computers at the same time. When
performing a remote installation or upgrade, the install program is launched on one computer in the network, referred to
as the local computer, but the software is installed over the network to other selected computers, referred to as remote
computers.
Also note that remote install is not supported in the following environments:
    To or from 64-bit computers running Windows XP or Windows Server 2003
    To or from Windows NT Computers
    To or from Clustered computers
    Workgroup only environments
The File System iDataAgent will be automatically installed during remote installation if you select an agent that requires
the File System iDataAgent.
Verify that the computers to which you wish to install the software satisfy the minimum requirements specified in System
Requirements.
Review the following Install Requirements before installing the software:
General
  Agents should only be installed after the CommServe and at least one MediaAgent have already been installed in the
  CommCell. Also, keep in mind that the CommServe and MediaAgent must be installed and running (but not necessarily
  on the same computer), before you can install the Agent.
  You must also install the File System iDataAgent on the computer on which you plan to install any application
  iDataAgents, Quick Recovery Agents, DataMigrator Agents, DataArchiver Agents, Recovery Director, or 1-Touch
  Server.
  This version of the software is intended to be installed in a CommCell where the CommServe and MediaAgent(s)
  version is 6.1.0.
  Close all applications and disable any programs that run automatically, including anti-virus, screen savers and
     operating system utilities. Some of the programs, including many anti-virus programs, may be running as a service.
     Stop and disable such services before you begin. You can re-enable them after the installation.
     Ensure there is an available license on the CommServe for the Agent.
     Verify that you have the software CD-ROM that is appropriate to the destination computer’s operating system. See
     Software Installation Discs for a list of available CD-ROMs.
     Make sure that you have the latest CD-ROM for the software version before you start to install the software. If you are
     not sure, contact your software provider.
Firewall
   If the CommServe, MediaAgent and/or Clients communicate across two-way firewall(s):
       Ensure port 8400 is allowed connections through the firewall.
        In addition, a range of bi-directional ports (consecutive or discrete) must also be allowed connections through the
       firewall.
   If the CommServe, MediaAgent and/or Clients communicate across one-way firewall(s):
       Identify a range of outbound ports (consecutive or discrete) for use by the software.
     For information on configuring the range of ports see Port Requirements.
     If the MediaAgent/Client communicates with the CommServe across a one-way firewall, you must add the
     MediaAgent/Client host name (or the IP address) in the CommServe computer before installing the necessary software
     on the MediaAgent/Client computer.
     For installation using the software CD-ROM, the local computer's CD-ROM drive must be shared and accessible by all
     the remote computers.
     For an installation from a mapped network drive, all computers must be able to access the share that contains the
     install program.
     The Task Scheduler service must be installed and enabled on the local and remote computers, and you must be able
     to schedule tasks (a quick service check is sketched after this list).
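
As an optional check of the Task Scheduler requirement above, the following minimal Python sketch queries the state of the
Task Scheduler service (service name Schedule) on a list of computers using the built-in sc command. The computer names are
placeholders, and the account running the check needs administrative access to the remote machines.

    # Minimal sketch: query the Task Scheduler service ("Schedule") on local and
    # remote computers with the built-in "sc" command. Names are placeholders.
    import subprocess

    COMPUTERS = ["remoteclient1", "remoteclient2"]   # assumed computer names

    for computer in COMPUTERS:
        result = subprocess.run(
            ["sc", "\\\\" + computer, "query", "Schedule"],
            capture_output=True, text=True,
        )
        state = "RUNNING" if "RUNNING" in result.stdout else "not running or unreachable"
        print("%s: Task Scheduler is %s" % (computer, state))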
ContinuousDataReplicator
  If any of the computers in which you are installing this software have multiple Network Interface Cards (NIC) you must
  configure them so that the source and destination machines can communicate for replication activities. For more
  information, see Data Interface Pairs.


Install Checklist
Collect the following information before installing the software. Use the space provided to record the information, and
retain this information in your Disaster Recovery binder.

1.      Install folder location:________________________________________________________________
        See Step 9 for more information.

2.      If the CommServe and the client computer communicate across a firewall:
        Firewall ports: ______________________________________________________________________
        Names of CommServe and MediaAgent computers on the other side of the firewall:______________
        Keep Alive Interval minutes:____________________________________________________________
        See Firewall Configuration for more information.

3.      CommServe Host Name or the CommServe IP address:______________________________________
        See Step 14 for more information.

4.      (SharePoint 2003 Database and Document only)
        Administrative Group Account / iDataAgent Account:_________________________________________
        Account Password:______________________________________________________________________
        See SharePoint Administration Account for more information.

5.      Data Classification Service Start Date and Time:___________________________________________
        See Step 15 for more information.

6.      Storage Policy used by the default subclient:________________________________________________
        See Configure the Storage Policy for the Default Subclient for more information.

7.      DataMigrator for Exchange Agents selected to support Outlook Web Access (OWA): Outlook Web Access (Enabled or
        Disabled):_______________________________________________
        Outlook Web Access Alias:______________________________________________________________
        See Step 18 and Step 19, respectively, for more information.

8.      DataMigrator for Exchange Agents selected to support WebProxy Agent for Exchange: URL and Port number for the
        WebProxy server:_______________________________________________
        See Step 24 for more information.

9.      Remote access folders location:
        UNC location of installation CD:__________________________________________________________
        UNC path to store remote install output log files:______________________________________________
        See Step 27 for more information.


Before You Begin

     One account must be used for the entire remote install.
     This account needs to be a domain administrator. The domain administrator, by default, should have all the rights
     needed to perform the remote installation, including the right to schedule tasks for all remote client computers.
     This domain administrator account needs to be in the local administrators group on all target remote machines.
     This domain administrator account needs to be in the local administrators group on the machine from which the
     remote install is launched.
     A Windows two-way trust must exist for all cross domain remote installs.
     For a database Agent, you may be required or be given the option to shut down the instances/databases for the agent
     before or during the following procedure. See Shut Down Instances for more information.


Install Procedure

Getting Started

1.     Place the software CD-ROM for the appropriate platform into
       the CD-ROM drive. (See Software Installation Discs for
       specific information about which CD-ROM to use for your
       operating system and platform.)
       After a few seconds, the installation program is launched.
       If the installation program does not launch automatically:
          Click the Start button on the Windows task bar, and then
          click Run.
          Browse to the CD-ROM drive, select Setup.exe, click
          Open, then click OK.

2.     Select the desired language and click Next to continue.








3.   Click Install QiNetix on this computer.
     NOTES
        The options in the installation menu depend on the
        computer in which the software is being installed, and may
        look different from the example shown.




4.   Read the Welcome screen.
     Click Next to continue, if no other applications are running.




5.   Read the virus scanning software warning.
     Click OK to continue, if virus scanning software is disabled.




6.   Read the license agreement, then select I accept the terms
     in the license agreement.
     Click Next to continue.








7.   Select the type of install you want to perform.
     For remote installs, select New Install.
     For remote upgrades, select Upgrade.
     Click Next to continue.




Select Components for Installation

8.   Select the component(s) to install.
     NOTES
        Your screen may look different from the example shown.
        Components that either have already been installed, or
        which cannot be installed, will be dimmed.
     Click Next to continue.




9.   Specify the location where you want to install the software.
     NOTES
        Do not install the software to a mapped network drive.
        Do not use the following characters when specifying the
        destination path: / : * ? " < > |
         It is recommended that you use alphanumeric characters
        only. If you intend to install other components on this
        computer, the selected installation directory will be
        automatically used for that software as well.
     Click Next to continue.
     If you are installing only the DataMigrator Outlook Add-In
     software, skip to Step 20.




Firewall Configuration

10. Select from the following:
       If this Client communicates with the CommServe and/or
       MediaAgent across a firewall, select Yes, configure
       Galaxy firewall services, and then click Next to
       continue. Proceed to the next Step.
       If firewall configuration is not required, click No, do not
       configure Galaxy firewall services and then click Next
       to continue. Proceed to the next section.




11. Perform the following:
       Enter the host name(s) of the MediaAgents/Clients that will
       need to be contacted through a firewall. Type the host
       name or the IP address and click Add to place it in the
       Host Name/IP Address List. Consider the following in a
       one-way firewall:
           On the CommServe, this list should include all the
           MediaAgents and Clients that are on the other side of
           the firewall.
           On the MediaAgents/Clients this should include the
           CommServe computer, if it is on the other side of the
           firewall, and any other Clients/MediaAgents on the
           other side of the firewall with which communications will
           be established.
       Choose the type of firewall configuration based on the
       firewall setup in your environment. Choose from the
       following options:
           Click on 2-way firewall if you can open certain ports
           as bi-directional ports.
           Click 1-way firewall; host is reachable from this
           machine on the CommServe in a one-way firewall. This
           option is also applicable if a MediaAgent/Client is on the
            same side of the firewall as the CommServe and
            communicates with a MediaAgent/Client on the other
           side of the firewall.
           Click 1-way firewall; host is NOT reachable from
           this machine in a one-way firewall, when a
           MediaAgent or Client communicates with a CommServe
           (and any Clients/MediaAgents) on the other side of the
           firewall.
     Click Next to continue.

12. Enter the starting and ending port numbers of the range, and click Add to
    place the range in the Open Port List. Repeat as needed.
     NOTES
        Specify the range of ports that must be used for
        communication between the Client and CommServe and/or
        MediaAgent computers. For more information on the port
        requirements, see Port Requirements in Firewall
        Requirements.
     Click Next to continue.
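
     The port range entered above can be spot-checked from the client before continuing. The following Python sketch
     assumes a hypothetical host name and port range; it simply attempts a TCP connection to each port and reports the
     result, which only proves reachability if a service is already listening on the far side.

```python
# Hedged sketch: probe a hypothetical port range against a hypothetical host.
# A refused or timed-out connection may simply mean nothing is listening yet.
import socket

HOST = "commserve.company.com"      # placeholder CommServe/MediaAgent host
START_PORT, END_PORT = 8400, 8403   # placeholder port range

for port in range(START_PORT, END_PORT + 1):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(3)
        try:
            sock.connect((HOST, port))
            print(f"Port {port}: reachable")
        except OSError as exc:
            print(f"Port {port}: not reachable ({exc})")
```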




13. If desired, modify the Keep Alive interval.
     Click Next to continue.
     This concludes the firewall configuration process.




Select Components for Installation (continued)

14. Enter the fully qualified domain name of the CommServe
    computer. (TCP/IP network name. e.g.,
    computer.company.com)
     NOTES
        If a component has already been installed, this screen will
        not be displayed; instead, the installer will use the same
        Server Name as previously specified.
     Click Next to continue.
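
     Before typing the CommServe name, you may want to confirm that the fully qualified domain name resolves from this
     computer. A minimal Python sketch, using a placeholder FQDN rather than a value from this guide:

```python
# Minimal sketch: check that a placeholder CommServe FQDN resolves locally.
import socket

COMMSERVE_FQDN = "commserve.company.com"  # hypothetical name

try:
    address = socket.gethostbyname(COMMSERVE_FQDN)
    print(f"{COMMSERVE_FQDN} resolves to {address}")
except socket.gaierror as exc:
    print(f"Could not resolve {COMMSERVE_FQDN}: {exc}")
```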








Schedule Data Classification Service

15. This screen will only appear if you are remotely installing Data
    Classification. Specify when you prefer to start Data
    Classification Service.
     NOTES
        If this option is not selected, the Data Classification service
        will start scanning the system as soon as the installation is
        complete. This may cause increased I/O and CPU usage.
        Also, if there is activity on the system during the initial
        scan (e.g., keyboard use, mouse use), the initial scan will
        not run until 30 seconds after such activity stops.
        If this option is selected, you can schedule the service to
        start at a later date and time of your convenience and
        therefore avoid these issues. (Alternatively, you can avoid
        using the computer for some time depending on the
        amount of data in your system.)
     Click Next to continue.

SharePoint Administration Account

16. This screen will only appear if you are remotely installing
    SharePoint 2003 Database or Document iDataAgents. Enter
    the User Name and Password for the SharePoint
    Administration Account.
     Click Next to continue.
     NOTES
        This screen will not be displayed for SharePoint Server
        2001.
        When installing both the SharePoint Document 2003 and
        SharePoint Database 2003 iDataAgents at the same time,
        this screen will only be shown once.
        During installation, the Base Services of the client are
        configured to run as the user account entered through this
        screen.
        For both the SharePoint Server 2003 Document and
        SharePoint Server 2003 Database iDataAgents, run Base
        Services on the Client using an account that meets the
         following criteria:
            member of the local Administrator Group
           member of the SharePoint Portal Administrator Group
           System Administrator role on the SQL Server Instance
         Refer to the article, Galaxy Service Account User
         Information for Windows 2003 and Windows Server 2003
         clients, available from the Maintenance Advantage web site.
        In addition, for the SharePoint 2003 iDataAgents, this
        account must have "Log on as Service" permissions to
        ensure the CVD service will start.
        When installing the SharePoint Database 2003 iDataAgent
        on a job server, the user account entered through this
        screen must have administrative privileges to the Single
        Sign-On Service.
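
      The local Administrators requirement in the criteria above can be spot-checked on the client with a short script.
      The sketch below shells out to the standard net localgroup command; the account name is a placeholder, and the
      SharePoint Portal Administrator Group membership and SQL sysadmin role still need to be verified in their own
      administration tools.

```python
# Hedged sketch: check a placeholder account against the local Administrators
# group using "net localgroup". The substring match is deliberately simple.
import subprocess

ACCOUNT = r"MYDOMAIN\spadmin"  # hypothetical SharePoint administration account

members = subprocess.run(
    ["net", "localgroup", "Administrators"],
    capture_output=True, text=True, check=True
).stdout

if ACCOUNT.split("\\")[-1].lower() in members.lower():
    print(f"{ACCOUNT} appears in the local Administrators group.")
else:
    print(f"{ACCOUNT} was NOT found in the local Administrators group.")
```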

Configure the Storage Policy for the Default Subclient

17. This screen will only appear if you have selected an agent that
    uses a Storage Policy.
     If you are installing only the MediaAgent software, skip to
     Step 25.
     Select the storage policy through which you want to back
     up/migrate/archive the indicated component (subclient,
     instance, etc.)
     NOTES
        A storage policy directs backup data to a media library.
        Each library has a default storage policy.
        When you install an Agent, the install program creates a
        default subclient for most Agents.
        If desired, you can change your storage policy selection at
        any time after you have installed the client software.
        If this screen appears more than once, it is because you
        have selected multiple agents for installation and are
        configuring storage policy association for each of the
        installed agents.
     Click Next to continue.

Configure DataMigrator for Exchange Options

     ADDITIONAL NOTES
     The following steps will only appear if you have selected a DataMigrator for Exchange Agent and/or the Outlook add-
     in. Skip to Select Remote Computers and Configure Access Credentials if you have not selected one of the
     aforementioned components.

18. A dialog will ask whether you want to configure the agent for
    Web Access. If you would like to recover migrated messages
    from stubs using Outlook Web Access (OWA), click Yes and
    continue to the next step; otherwise, click No and skip the
    next step.

19. A dialog will ask you to enter an alias that you will use to
    connect to the system via your web browser. Type in the
    desired alias, then click Next to continue.








20. Select the option to Disable Outlook's AutoArchive /
    Personal Storage features (if applicable), then click Next
    to continue.
     NOTES
        This option allows you to disable Outlook’s AutoArchive
        and Personal Storage features. It is recommended that you
        disable these features, otherwise messages that are
        archived into Personal Storage (.PST) files by Outlook will
        not be migrated by the DataMigrator Agent.
        This option is only supported for use with the DataMigrator
        for Exchange Mailbox Agent.




21. Select any of the following options:
       Allow end-user to select/de-select messages for
       migration
       This option allows you to enable the ability of end-users to
       add or remove messages/items from the migration list.
       This option is enabled by default, and is only supported for
       use with the DataMigrator for Exchange Mailbox Agent.
       Prompt user before message recovery
       This option allows you to enable a confirmation prompt
       when recovering a message or item from Outlook.
       Allow end-user to browse/search migrated copies of
       their messages/folders
       This option allows you to enable the ability of end-users to
       browse and search for migrated or archived copies of their
       messages. This option is enabled by default, and is
       supported for use with the DataMigrator for Exchange
       Mailbox Agent and the DataArchiver for Exchange Agent.
       This option must be selected for DataArchiver to support
       Compliance Searches from Outlook Add-In.
       Allow end-user to erase migrated copies of their
       messages/folders
       This option allows you to enable the ability of end-users to
       erase migrated copies of their messages and folders,
       provided that the Erase Data feature license is present on
        the CommServe. This option is only supported for use with
        the DataMigrator for Exchange Mailbox Agent.
     Click Next to continue.

22. Configure the default recovery mode.
      Force Overwrite / Replace stub mode
      This option causes the system to overwrite the stub with a
      copy of the original message during a stub recovery.
      Force Append / Save stub mode
      This option causes the system to append a copy of the
      original message to the same folder from which it was
      migrated and will not delete the stub.
      Recover stubs from PST files to Recovered Items
      folder
      This option allows users to recover messages from stubs
      residing within migrated PST files into the Recovered Items
      folder of their mailbox.
     Click Next to continue.



23. Specify the Connection Type.
     NOTES
       Direct connectivity is the default method of
       communication to perform stub recovery.
       The HTTPS connectivity option routes communications
       for stub recovery through an IIS server on which the
       DataMigrator WebProxy Agent for Exchange has been
       installed.
     Click Next to continue.
     If you select Direct connectivity, skip to Step 25.




24. Specify the URL and Port number for the WebProxy server
    you have set up.
     NOTES
       The URL is determined when you install the DataMigrator
       WebProxy Agent for Exchange.
       Use the fully qualified domain name when specifying the
       URL. For example:
       https://sslserver.mycompany.com/DMproxy

     Click Next to continue.
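
      Before entering the URL, it may help to confirm that the WebProxy answers over HTTPS from this network. The
      sketch below uses a placeholder URL in the same form as the example above; a certificate or authentication error
      still tells you the server is reachable.

```python
# Minimal sketch: probe a hypothetical WebProxy URL over HTTPS.
import urllib.request

PROXY_URL = "https://sslserver.mycompany.com/DMproxy"  # placeholder URL

try:
    with urllib.request.urlopen(PROXY_URL, timeout=10) as response:
        print(f"{PROXY_URL} answered with HTTP {response.status}")
except Exception as exc:
    print(f"{PROXY_URL} did not answer cleanly: {exc}")
```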




Select Remote Computers and Configure Access Credentials

25. The install program also checks your Windows user account
    for the necessary operating system rights. These rights
     include:
         Right to increase quotas (this is referred to as adjust
        memory quotas for a process on Windows Server 2003).
        Right to act as a part of the operating system.
        Right to replace a process level token.
     If your account does not have the necessary rights (e.g., if
     the operating system was newly installed), the install
     program automatically assigns the required rights to your
     account, and then prompts you to log off and log back on.
     If your install requires the software to be installed on a
     domain controller, the install program checks your operating
     system for domain controller status. If your computer is not a
     domain controller, a message informs you of the need to
     abort this installation.

26. Select any number of clients in the Available clients list, then
    click the > button to move them to the Selected Clients list.
     NOTES
        The Available clients list includes all clients in the domain
        that are currently online and have the Windows operating
        system installed.
        To install to computers within another domain, click
        Domain Select and browse for the desired domain. When
        you click OK, the install program imports a list of client
        computers within the selected domain that have the
        Windows operating system installed.
        If you bring a computer online after you have selected a
        domain, you can manually add it to your selected clients
        list by entering the client name in the box next to the Add
        button, and clicking Add, or by reselecting the domain by
        clicking the Select button and choosing the domain. Note
        that refreshing the domain will clear any selections added
        to the Selected clients list.
        The local computer does not appear on this list. You cannot
        perform a remote install to the local computer.
     Click Next to continue.

27. Configure the remote access credentials, install source path
    and output log path.
     NOTES
        The username that will be used by the install program to
        login to the remote computers is displayed.
         If you do not want to use this user account, cancel the setup,
         log out of the computer, log back on with the desired
         username, and then run the setup program again to perform
         the remote install/upgrade.
        Remote Access Credentials
            The setup program requests a confirmation of the
            administrator’s password to schedule tasks on remote
            computers. Provide the password for the displayed
            username.
        Folder Locations
        All computers targeted for remote installation of the
        software must have access to the paths below:
            Enter or browse to the UNC path that contains the
             install program.
            Enter or browse to the UNC path of the directory in
           which you will store the remote installation logs.
        Silent reboot of remote clients, if required
           When selected, the install program will restart any
           remote computers that require a restart. When
           deselected, the install program is not permitted to
           restart the remote computers.
           If a remote computer requires a restart (silent or
           manual), remote install for these computers must be
           run again after the restart for the installation to
           complete successfully.
     Click Next to continue.
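
      The two UNC paths gathered in this step must be reachable from every target computer. The sketch below only
      checks them from the machine running setup, using placeholder share names, but it catches the most common
      typo-level mistakes before the remote install starts.

```python
# Minimal sketch: verify placeholder UNC paths are reachable from this machine.
# Each remote target must also be able to reach these shares.
import os

INSTALL_SOURCE = r"\\fileserver\install_cd"        # hypothetical UNC source path
LOG_FOLDER = r"\\fileserver\remote_install_logs"   # hypothetical UNC log path

for path in (INSTALL_SOURCE, LOG_FOLDER):
    status = "reachable" if os.path.isdir(path) else "NOT reachable"
    print(f"{path}: {status}")
```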

Verify Summary of Install Options

28. Verify the summary of selected options.
     NOTES
        The Summary on your screen should reflect the
        components you selected for install, and may look different
        from the example shown.
     Click Next to continue or Back to change any of the options.
     The install program now starts copying the software to the
     computer. This step may take several minutes to complete.
     You have the option of configuring Advanced Settings before
     you click the Next button or during the install process.
     If you have selected QSnap to be remotely installed, setup
     will ask to reboot the remote computer(s).
        If you select Yes the remote computer will reboot.
        If you select No the remote computer will not reboot, but
        you must reboot the computer at a later time to initialize
        the block-filter driver.

29. A progress screen displays during the install process. While
    monitoring the install process, you can also open the
    Advanced Settings screen and/or stop the install process.
     Clicking the Stop button halts the install process, but does
     not uninstall clients already installed.




Configure Advanced Settings

30. Select any of the following options:
       Number of simultaneous remote machines to run
       Setup
        This option allows you to select the number of remote
        clients that will install simultaneously. As this number
        increases, the overall install time decreases; however,
        because more system resources are required, this number
        must be balanced against the capabilities of the server.
        Time to wait for Setup to start on each remote
        machine
        This option allows you to set the number of minutes the
        system will wait for an initial response from a client before
        it skips the client and moves on to the next.
        Time to wait for Setup to complete on each remote
        machine
        This option allows you to set the number of minutes the
        system will wait for install results from a remote client.
        After the allotted number of minutes has passed with no
        response, the system notes this delay in the summary
        screen. This does not necessarily indicate an install failure
        on the client, just that there was a measured delay in
        responding.
     Click OK to continue.
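
      The three settings above interact in a straightforward way: the batch finishes in roughly (clients divided by
      simultaneous installs) multiplied by the per-client time, bounded by the two wait timers. A back-of-the-envelope
      estimate, with every number being an illustrative assumption:

```python
# Rough wall-clock estimate for a remote install batch; all values are assumed.
import math

clients = 40               # remote clients selected for install
simultaneous = 5           # "Number of simultaneous remote machines to run Setup"
minutes_per_client = 15    # assumed average install time per client

batches = math.ceil(clients / simultaneous)
print(f"Estimated wall-clock time: about {batches * minutes_per_client} minutes")
```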

Setup Complete

31. Setup displays the successfully installed components.
     NOTES
        The Remote Install Results lists the successfully and
        unsuccessfully installed clients, and may look different
        from the example shown.
        If any errors are reported in the Remote Install Results,
        more details can be found in the install log files.
        If you install an Agent with the CommCell Console open,
        you need to refresh the CommCell Console (F5) to see the
        new Agents.
     Click Finish to continue.




32. Click Close to exit the install program.
     The installation is now complete.
     NOTES
        Schedules help ensure that the data protection operations
        for the Agent are automatically performed on a regular
        basis without user intervention. For more information, see
        Scheduling.




Post-Install Considerations








All Agents
   Review the Post-Install Considerations specific to the components that were installed using this procedure. (See
   Installation for a list of all Install procedures.)
Remote Install
  The remote install output log files folder contains information that should be reviewed after performing a remote install
  or remote upgrade. This shared folder containing the log files was specified during the install. The setup program
  stores configuration and output log files in this folder within the following subfolders:
     RemoteInstallConfigs contains the configuration input provided during the install or upgrade.
      RemoteInstallLogs contains subfolders for each remote client that is installed or upgraded.

   If the install or upgrade finished with any errors, search for cvinstall.txt files under the RemoteInstallLogs folder. Any
   client that encounters an error during a remote installation will have a cvinstall.txt file within its folder.
   See Remote Install/Upgrade Error Messages for an explanation of any error messages you may encounter in the install
   logs.
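   A quick way to review the results is to scan the log share for cvinstall.txt files. The sketch below assumes a
   placeholder share path; each matching folder corresponds to a client that reported an error.

```python
# Minimal sketch: list clients whose RemoteInstallLogs subfolder contains a
# cvinstall.txt error log. The share path is a placeholder.
from pathlib import Path

LOG_ROOT = Path(r"\\fileserver\remote_install_logs\RemoteInstallLogs")

for error_log in sorted(LOG_ROOT.rglob("cvinstall.txt")):
    print(f"Error reported for client: {error_log.parent.name}")
    print(error_log.read_text(errors="replace")[:500])  # first 500 characters
```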
Active Directory
   After a remote installation of the Active Directory iDataAgent, you must configure an account with Administrator
   rights, which is used to verify the rights to back up and restore data. Additionally, for Active Directory servers with
   high security settings, you have the option of using NT LAN Manager Bind for NTLM Encrypted authentication. For step-
   by-step instructions on editing these agent properties, see Change Account for Accessing Application Servers/Filers.
Image Level ProxyHost iDataAgent - Windows
   Ensure that the Backup Host computer has the appropriate Image Level and Windows File System iDataAgents
   installed. See Deployment - Image Level iDataAgent and Deployment - Windows File System iDataAgent for more
   information.
Data Classification
   If you choose to schedule Data Classification services later, you must wait for the scheduled time to pass in order to
   allow volume monitoring to start. As a workaround, either change the schedule time from the scheduled task option in
   the Windows Control Panel, or start the Data Classification services manually or from the Service Control Manager.
Lotus Notes / Domino Server
   During a remote installation of the Lotus Domino Server iDataAgents, the system detects and configures all available
   Lotus Domino Server partitions. If necessary, after the install, you can remove any of the detected partitions using the
   CommCell Console. For step-by-step instructions on deleting a partition, see Delete an Instance.
Microsoft Exchange Agents
   You must create a mailbox that is associated with the Site Service Account, resides on the local Exchange Server, and
   has the proper permissions, if you have remotely installed any of the following agents (follow the links for instructions
   appropriate to your agent):
       Install the DataMigrator for Exchange 5.5 Mailbox Agent
       Install the Exchange 5.5 Mailbox iDataAgents
       Install the DataArchiver for Exchange 5.5
   You must create a mail-enabled user that has the proper permissions if you have remotely installed any of the
   following agents (Follow the links for instructions appropriate to your agent):
       Install the DataMigrator for Exchange 2000 or 2003 Agents
       Install the Exchange Mailbox 2000 or 2003
       Install the Exchange Public Folder 2000 or 2003
       Install the DataArchiver for Exchange 2000 or 2003
   Before starting any Data Protection operations on an Exchange iDataAgent, DataMigrator for Exchange Agent or
   DataArchiver for Exchange Agent, you must configure the Exchange Profile, Mailbox and Administrator account
   information. See Change the Exchange Administrator Account, Change the Exchange Site Service Account, Modify
   Exchange Server Name, Modify Mailbox Name and Modify Profile Name for instructions on entering this information.
   To use Outlook Web Access (OWA) for stub recovery after a DataMigrator for Exchange Agent remote install or
   upgrade, restart the World Wide Web Publishing (W3SVC) Service on the client computer.
   Before using the DataMigrator Outlook Add-In, ensure that the Organizational Forms Library has been set up and
   configured for special forms. See OFL Configuration for more information.
Microsoft SharePoint iDataAgents








  SharePoint Server 2003 Database iDataAgent
  After remotely installing the software, you must manually enter a SharePoint Administrative Group Account in the
  iDataAgent properties. For more information on entering the account, see the post-install instructions in Install the
  SharePoint Database iDataAgent.
  SharePoint 2001 and 2003 Document iDataAgents
  After remotely installing the software, you must manually enter a SharePoint Document iDataAgent Account in the
  iDataAgent properties. For more information on entering the account, see the post-install instructions in Install the
  SharePoint Document iDataAgent.
ProxyHost iDataAgent - Windows
  Ensure that the Backup Host computer has the appropriate Windows File System iDataAgent installed. See Deployment
  - Windows File System iDataAgent for more information.
QSnap for the Quick Recovery Agent

See the Configuration section of QSnap for the Quick Recovery Agent for information on configuring QSnap to work with
the Quick Recovery Agent.
QSnap or OFH for the Windows File System, Image Level, Image Level ProxyHost, and SDM iDataAgents
  See the Configuration section of QSnap/OFH for the Windows File System iDataAgent for information on configuring
  QSnap or OFH with the Windows File System iDataAgent.
  See the Configuration section of QSnap/OFH for the Image Level iDataAgent for information on configuring QSnap or
  OFH to work with the Image Level iDataAgent.
  See the Configuration section of QSnap/OFH for the Image Level ProxyHost iDataAgent for information on configuring
  QSnap or OFH to work with the Image Level ProxyHost iDataAgent.
  See the Configuration section of QSnap/OFH for the Serverless Data Manager iDataAgent for information on
  configuring QSnap or OFH to work with Serverless Data Manager iDataAgent.








Backup Data

   Overview
     What Gets Backed Up
     How Long is the Backup Data Retained
     When Does the Data get Backed Up
   Backup Options
     Full Backups
     Incremental Backups
     Differential Backups
     Synthetic Full Backups
     Pre-Selected Backup Type (Exchange Database iDataAgents)
     Transaction Log Backups (SQL iDataAgent)
     Delta Backups (DB2 iDataAgent)
   Comparing Backup Types
      When a Backup is Converted to a Full Backup
      Advanced Backup Options
      Agent specific Backup Overviews
   Pre/Post Processes
   Support Information - Backup Options
   Support Information - Advanced Backup Options
Related Topics:
   Command Line Interface - qoperation backup
   On Demand Data Protection Operations
   Save a Job as Script


Overview
The primary purpose of a backup is to secure your data to media, for recovery at a later time. Review the following topics
to understand the scope of backup operations:
   What Gets Backed Up
   How Long is the Backup Data Retained
   When Does the Data get Backed Up
What Gets Backed Up
The data that will be backed up is determined first by the agent, which is designed to handle one or more types of data.
Then, the subclient content configuration determines what specific data of the supported data type(s) will be backed up.
Data Types

Each agent is designed to back up one or more specific data types. For example, to secure Exchange 5.5 Database you
would use the iDataAgent for Exchange 5.5 Database. To find out what data types an agent is tailored to secure, read the
Product Overview for the agent. Some agents may overlap in what data types they can secure, and you should plan your
backups accordingly.
Subclient Content

Subclient content will determine what gets backed up by the Agent. If an agent is designed to back up an Exchange 5.5
Database, for example, the databases you want to back up must be included in the contents of a subclient. Subclients
provide a flexible way of managing what gets backed up.
See Subclients for information on subclients and assigning content to subclients.
For agents that support On Demand Data Protection operations, the content is specified via Content Files (in some cases in
conjunction with a Directive File) instead of through a Subclient Properties (Content) tab. See Defining Content for On
Demand Data Protection Operations for more information.







Excluding Data from Data Protection Operations

You may want to exclude files or subdirectories that are contained within a subclient content path from data protection
operations. This is useful to prevent the system from needlessly securing data that does not need to be protected. Also, you
can prevent the same data from being secured multiple times in cases where two agents are securing the same data, by
excluding the data from operations on one of the agents.
See Excluding Data from Data Protection Operations - Overview for information on methods of excluding data from data
protection operations.
Locked Files

Sometimes files cannot be backed up because they are locked or open. See Locked/Open Files for an overview of how
these files can be backed up by the system.
How Long is the Backup Data Retained
Each subclient is associated to a storage policy. How long the backup data will be retained on the backup media is
determined by the retention rules set in the Storage Policy Copy Properties dialog box. This will affect media usage, and
is an important consideration when planning your backups. A longer retention period, for example, could use more media
for securing the data over time.
If a retention period other than infinite is selected, the data will be pruned according to backup cycles in relation to the
retention rules you set in the Storage Policy Copy Properties dialog box. Pruned data can be overwritten on the backup
media.
The backup data from a subclient will be retained according to the rules set for the storage policy associated with it. The
ability to define data in subclients, and then associate them to specific storage policies allows you to prioritize exactly what
data is retained and for how long.
For example, a client is being backed up with the Windows File System iDataAgent using the default subclient (which backs
up the entire file system). It is associated to a storage policy that regularly ages the data. There is a critical folder on that
client that you would like retained longer than the rest of the file system. You could create a new subclient with that critical
folder as its content, and associate the new subclient with a storage policy that has the desired retention period.
See Full Backup Cycles and Retention Periods for a description of a typical backup cycle.
See Subclients for information on assigning a storage policy to a subclient.
See Data Aging for detailed information and advanced concepts on Data Aging and retention.
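As a simplified illustration of cycle-based retention (not the product's actual Data Aging algorithm), a full backup cycle
can be thought of as prunable only once it is older than the retention days and enough newer complete cycles exist. All
dates and rule values in the sketch below are invented for illustration.

```python
# Simplified sketch of cycle-based retention reasoning; dates are invented.
from datetime import date

RETENTION_DAYS = 30
RETENTION_CYCLES = 4

cycle_start_dates = [date(2005, 1, 1), date(2005, 1, 8), date(2005, 1, 15),
                     date(2005, 1, 22), date(2005, 1, 29), date(2005, 2, 5)]
today = date(2005, 2, 20)

for i, started in enumerate(cycle_start_dates):
    newer_cycles = len(cycle_start_dates) - i - 1
    age_ok = (today - started).days > RETENTION_DAYS
    status = "prunable" if (age_ok and newer_cycles >= RETENTION_CYCLES) else "retained"
    print(f"Cycle starting {started}: {status}")
```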
When Does the Data get Backed Up
The QiNetix system allows you to schedule or initiate backups at the subclient, instance and/or backup set level, depending
upon the agent. Scheduled data protection operations provide a convenient means of securing data without user
intervention. When scheduling data protection operations, you need to establish a schedule for each subclient. For example,
a backup schedule always contains a full backup and may contain one or more other backup operations. When combined for
a given subclient, these backups comprise a full backup cycle.
You can also schedule data protection operations using an Agent Specific Data Protection Schedule Policy or an All Agent
Types Data Protection Schedule Policy.
Almost all operations can be scheduled or performed immediately.
Back to Top



Backup Options
Backups for any client start with a full backup. For Agents that support non-full backups, the full backup becomes a
baseline to which subsequent backup types are applied. See the following for detailed information on each backup type:
   Full Backups
   Incremental Backups
   Incremental Backups - Lotus Notes
   Differential Backups
   Synthetic Full Backups
Not all agents support all backup types. Beyond the core backup types listed above, an agent may have additional backup
options. See Support Information - Backup Options for a list of supported backup types for each agent. Read the Agent
specific Backup Overview for more information on a specific agent.
Comparing Backup Types
To determine which combination of backup types best suits your data protection, performance and media usage needs, see
the following:
   Comparing Full, Incremental, and Differential Backups
   Combining Backup Types for the SQL Server iDataAgent
When a Backup is Converted to a Full Backup
In some cases, the system will automatically run a job as a full backup to ensure the integrity of your data, even if you
have selected a non-full backup option. An unplanned full backup could have the following effects:
   Increased backup size and therefore increased media usage.
   Increased time to complete the backup, possibly pushing the backup out of the Operation Window.
   Increased network bandwidth usage.
Therefore, conversions to full should be considered when planning your backups. You can avoid, or plan for, these situations
by familiarizing yourself with the general circumstances, and with the circumstances for each agent, in which an operation is
converted to a full backup.
See When a Non-Full Backup is Automatically Converted to a Full Backup for detailed information.
Advanced Backup Options
Once you have selected your backup type, you can choose to apply Advanced Backup options to your operation. The
advanced backup options provide media management tools at the operation level, as well as tools to optimize your backups
for specific circumstances.
For a description for each of the Advanced Backup options and why you would use them, see Advanced
Backup/Migrate/Archive Options.
To find out which Advanced Backup options are available for a given agent, see Support Information - Advanced
Backup/Migrate/Archive Options.
Back to Top



Agent Specific Backup Overviews
Given below is a list of supported Agents. The corresponding linked page provides information on the Agent specific backup
options and procedures.
Backup - Active Directory
Backup - DB2
Backup - EMC Centera
Backup - Image Level
Backup - Informix
Backup - Linux NetWare File Systems
Backup - Lotus Domino Server
   Lotus Notes Database iDataAgents
   Lotus Notes Document iDataAgents
Backup - Microsoft Exchange Server
   Exchange Database iDataAgents
   Exchange Mailbox iDataAgents
   Exchange Public/Web Folder iDataAgents
Backup - Microsoft SharePoint Portal
   SharePoint Database iDataAgents
   SharePoint Document iDataAgents







Backup - Microsoft SharePoint Portal

Choose the following topic:
   Overview
   Backup Considerations for This Agent
   Support Information - Backup Options
   Support Information - Advanced Backup Options
   How To


Overview
Plan your backup jobs for this agent by reviewing the following information:
   Backup Data for an overview of backup jobs.
   Overview - SharePoint Server 2001 iDataAgent or Overview - SharePoint Server 2003 iDataAgents for a list of supported
   data types for these agents.
   Subclients for information on subclients.
      Subclients - SharePoint Portal Server for information on configuring subclients for these agents.
      Excluding Data from Data Protection Operations for information on excluding data via subclients (SharePoint 2003
      Database only)
Supported Backup Types
The SharePoint Database iDataAgents support the following backup types:
   Full Backups
The SharePoint Document iDataAgents support the following backup types:
   Full Backups
   Incremental Backups
   Differential Backups
   Synthetic Full Backups


Backup Considerations for This Agent
Before performing any backup procedures for this agent, review the following information:
For both the SharePoint Server 2003 Document and SharePoint Server 2003 Database iDataAgents, run Base Services on
the Client using an account that meets the following criteria:
   member of the local Administrator Group
   member of the SharePoint Portal Administrator Group
   System Administrator role on the SQL Server Instance
Refer to the article, Galaxy Service Account User Information for Windows 2003 and Windows Server 2003 clients, available
from the Maintenance Advantage web site.
MS SharePoint Server 2003 Database iDataAgent
   When backing up a Site Collection, the data is moved from the server to a local file. If the Site Collection is larger than
   the available space on the drive on which the iDataAgent resides, then the backup will fail. If necessary, you can modify
   the path to another location that does have enough space through the SharePoint registry key value dHome (a simple
   free-space check is sketched after this list).
   When SharePoint is configured in a Server Farm configuration, SQL databases must be backed up using the SQL
   iDataAgent. See Backup - Microsoft SQL Server. Run backups using the SharePoint Database iDataAgent at
   approximately the same time as the SQL iDataAgent backups of the respective databases.
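
The free-space consideration above can be checked ahead of a backup with a short script. The path and the Site
Collection size below are both placeholder assumptions; the actual working path is whatever the dHome registry value
points to in your installation.

```python
# Hedged sketch: compare free space at an assumed dHome location against an
# assumed Site Collection size before running the backup.
import shutil

DHOME_PATH = r"C:\Program Files\SharePointAgent"  # hypothetical dHome location
SITE_COLLECTION_BYTES = 20 * 1024 ** 3            # assumed 20 GB Site Collection

free_bytes = shutil.disk_usage(DHOME_PATH).free
if free_bytes > SITE_COLLECTION_BYTES:
    print("Enough free space for the temporary backup file.")
else:
    print("Not enough free space; repoint dHome to a drive with more room.")
```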
MS SharePoint Server 2003 Document iDataAgent
   Filters can be used in conjunction with the "Items That Failed" list on the data protection Job History Report to
   eliminate backup or migration failures by excluding items that consistently fail and are not integral to the operation of
   the system or applications. Some items fail because they are locked by the operating system or application and cannot
   be opened at the time of the data protection operation. This often occurs with certain system-related files and
   database application files.
   Also, keep in mind that you will need to run a full backup after adding failed files to the filter in order to remove them.
  While .aspx files for a Basic Page from a document library in SharePoint Document 2003 can be backed up and restored,
  restoring the file will not restore the web part to the page. Modifications made to the page using the Content Editor Web
  Part will not be restored and cannot be made to the file after it has been restored.
MS SharePoint Server 2001 Document iDataAgent
  During backup, only the current version of the document is backed up.
Back to Top



How To
Basic Operations
  Start a Full/Incremental/Differential Backup
  Schedule Backups
Backup Options
  Start a Synthetic Full Backup
Advanced Backup Options
  Start a Backup in the Suspended State
  Start a Backup on New Media
  Start a Backup that Marks Media Full on Completion
  Start a Backup with a Set Job Priority
  Start a Backup with Vault Tracking enabled
Back to Top








Full Backups

Choose from the following topics:
   Overview
   Support Information - Backup Options
   How To


Overview
Backups for any client start with a full backup. For Agents that support non-full backups, the full backup becomes a
baseline to which subsequent backup types are applied. For example, if an agent supports incremental backups, a full
backup must be performed before an incremental backup can be initiated.
A full backup contains all the data associated to a subclient. If a client computer has multiple agents installed, then the
subclients of each agent require a full backup in order to secure all of the data on that client. Backups can also be
performed at the backup set or instance level, and will apply to all of the subclients within the selected backup set or
instance.
For Oracle and Oracle RAC iDataAgents, refer to Level 0 backups in the Oracle Backup and Recovery Guide from Oracle
Corporation. Also refer to Backup - Oracle for more information on this special type of full backup.
See Backup Data for an overview of all backup operations.



How To
   Start a Full Backup
   Schedule Backups
Back to Top








Advanced Backup/Migrate/Archive Options

Choose from the following topics:
   Overview
     Follow Mount Points
     Back up Data in Remote Storage
     Back up Files Protected by system file protection
     Stop DHCP Service when Backing up System State Data
     Stop WINS Service when Backing up System State Data
     HotFix Information
     Start Log Backup after Successful Backup
     Set a Job Priority
     Start in Suspended State
     Create New Index
     Start New Media
     Mark Media Full
     Allow other schedule to use media set
     Create Content Index
     Skip Metafile Creation
     Release Resources during meta-data collection phase
     Job should wait for inline copy resources
     Reserve Resources before Scan
     Job Retry Tab
     Vault Tracking Tab
   Support Information - Advanced Backup/Migrate/Archive Options


Overview
Once you have selected your backup type, you can choose to apply Advanced Backup options to your operation. The
advanced backup options provide media management tools at the operation level, as well as tools to optimize your backups
for specific circumstances. The advanced options are listed below.
Data Tab
Follow Mount Points
For Windows File System, when selected (default), specifies that the backup include both the mount point and the data
pointed to from that mount point. This data is backed up, even if it is included in another subclient; therefore, data can be
duplicated in the backup. Deselecting Follow Mount Points in the Advanced Backup Options dialog box causes the
configuration (i.e., the mount point) to be backed up without backing up the data on the mounted volume.
Mount points, which point from the directory to the target data, allow you to add new volumes to the existing file system
without using new drive letters. Backups follow mount points by default, backing up both the configuration and the data on
the mounted volume. These backups will not duplicate data in any of the following situations:
   if more than one mount point on a subclient identifies the same location
   if there are mount points to volumes that are not assigned drive letters
   if two mount points point to each other
A backup that follows a mount point will duplicate data, however, if the mount point points to a volume with an existing
drive letter. In this case, data is backed up via the mount point and by any subclient that scans the existing drive letter. To
avoid this duplication, filter the mount point or volume letter, or deselect Follow Mount Points.

Back up Data in Remote Storage
For Windows File System, specifies that the backup include the data pointed to in remote storage. By default, the system
backs up only the reparse points — pointers to the remotely stored data — and not the data itself. You must select the
Backup data in remote storage option in the Advanced Backup Options dialog box in order to back up the actual remote
storage data.
The Windows 2000 or Windows Server 2003 Remote Storage feature conserves disk space by moving infrequently used
data from the hard disk to directly-attached remote storage; it recalls the data when needed. You specify criteria for
moving files to remote storage during Windows installation.
Remotely stored data should not be included in every full backup for the following reasons:
   By definition, remotely stored data is data that is rarely accessed or modified. Therefore, it need not be backed up
   regularly.
   Backing up remotely stored data is less efficient than backing up data from the local disk. Including this data in backups
   decreases backup efficiency.
   During a full iDataAgent restore, all data is restored to the local hard drives. If data in remote storage exceeds the
   available space on the hard drive, the hard disk space will be inadequate for the restore. Plan for hard disk space
   accordingly.
Even if this option is not selected, reparse points can be restored if the local disk becomes corrupted. However, the
remotely stored data itself is not restored in the event of a disaster affecting the remote storage device.
It is not recommended that you include remotely stored data in every backup; however, you should deploy a backup
scheme that includes some backups of remotely stored data, in accordance with your deployment of the Remote Storage
Service. Before instructing the system to back up this data, ensure that the media in the remote storage device will be
available at the times of backup and restore.
Back up Files Protected by system file protection
For Windows File System, specifies that the backup include those files that are protected by the Windows System File
Protection feature. This feature protects shared files that may be overwritten by application installations, such as dynamic
link libraries (DLLs).
Stop DHCP Service when Backing up System State Data
For Windows File System, specifies that the system stop DHCP services on the client computer when System State data is
backed up. This option is provided for cases where you cannot use VSS to back up the System State. See VSS for Windows
File System iDataAgents for more information.
Stop WINS Service when Backing up System State Data
For Windows File System, specifies that the system stop WINS services on the client computer when System State data is
backed up. This option is provided for cases where you cannot use VSS to back up the System State. See VSS for Windows
File System iDataAgents for more information.
HotFix Information
For Windows File System, tells the system to collect information about all hot fixes for the operating system configuration.
This is useful when knowing the operating system configuration is vital to successfully restoring a system. It is also useful
when browsing backup data, and in disaster recovery scenarios. The browser can also show hotfix information for previous
backups. For step by step procedures, see Start a Backup with HotFix Information.
Start Log Backup after Successful Backup
For the SQL Server iDataAgent, specifies that a Transaction Log backup will start automatically after a successful Full or
Differential backup operation is completed. This is useful when you want to backup logs immediately after a data backup,
and allows you to do so without creating two scheduled jobs.
Startup Tab
Set a Job Priority
This option allows you to manually set a job priority. This is useful if you have jobs that are very important and must
complete, and/or jobs that can be moved to a lower priority. For more information, see Job Controller.
Start in Suspended State
Specifies that this job will start in the Job Controller in a suspended state and cannot run until the job is manually resumed
using the Resume option. This option can be used to add a level of manual control when a job is started. For example,
you could schedule jobs to start in the suspended state. An administrator could then choose which scheduled jobs complete
by resuming the operation started in the suspended state.
Media Tab
Create New Index







This option is selected by default for all full backup operations, except On Demand Data Protection Operations. For
scheduled backups, disable this option only if you need full backup transparent browse capabilities. See Index for more
information.
This selection will override the following settings for this job only:
   The Create new index on full backup option on the Agent Properties Index tab.
   For an On Demand Data Protection operation, the Set index cycle to every nn backup jobs setting in the default
   Subclient Properties General tab.
Start New Media
This option starts the backup/migration/archive operation on a new media, which causes the following to occur:
   If removable media is used, the current active media is marked as Appendable and a new media is used for the
   backup/migration/archive.
   If magnetic media is used, a new volume folder is created for the backup/migration/archive.
If cleared, the operation automatically uses the current active media or volume.
This media management feature provides a degree of control over where the data physically resides, for example helping
you to meet security or performance goals. This feature is useful in situations where you would like the data to reside on a
new media, not appended to a media already containing backup/migration/archive data.
Another form of Start New Media option is also available from the library properties. See Library Properties - Start New
Media for more information.
Mark Media Full
This option marks the media as full 2 minutes after the completion of the backup/migration/archive operation. Any jobs
initiated within those 2 minutes are still allowed to write to the media.
This media management feature provides a degree of control over where the data physically resides, for example helping
you to meet security or performance goals. This feature prevents any other data from being written to the media. If the
job was associated with the prior media, new media (such as a new tape) will be used for subsequent jobs. (Applies to all
backup/migration/archive types.)
Allow other schedule to use media set
This option allows jobs that are part of a schedule policy or schedule and using a specific storage policy to start a new
media and also prevent other jobs from writing to the set of media. This option is available only when the Start New
Media and Mark Media Full options are enabled. This option can be used in the following situations:
   When one scheduled job initiates several jobs and you only want to start new media on the first job
   When you want to target specific backups to a media, or a set of media if multiple streams are used.
For additional information on the Start New Media, Mark Media Full and Allow other schedule to use media set
options, see Creating an Exportable Media Set.
Create Content Index
Content indexing adds the ability to find backed up, migrated, or archived files and messages by searching their contents
for a keyword or phrase. Before you can search by content, a content index must be created. See Content Indexing for
more information. For DataArchiver, note that this option is accessible via the Archive Options dialog box instead of the
Media tab.
Skip Metafile Creation
The Image Level iDataAgents can restore at both the file and volume level. However, if you do not need the file level
restore capability, then you can increase performance by skipping metafile creation. This disables the file level restore, and
is most useful in situations where the file system is already backed up by the File System iDataAgent.
If selected:
   A metafile of the data on the specified volume will not be created as part of the backup, saving some time.
   The data can only be restored using Volume Level restore. (The metafile is required for File Level restore.)
If cleared, a metafile will be created, and the data can be restored using either File Level restore or Volume Level restore.
If File Level restores will never be used for this client, instead of manually selecting the Skip Metafile Creation option for
each job, edit the SkipMetaFileCreation registry key to automatically skip metafile creation for all backup jobs.







Release Resources during meta-data collection phase
For the Image Level iDataAgents, specifies that reserved media will be released while the system is collecting the metadata
for the data protection operation. This is useful in cases where the system is collecting data on a large volume and you
want the resources available for other jobs.
Job should wait for inline copy resources
If you are performing a data protection/archive operation on a subclient that uses a storage policy that has an inline copy
enabled, then this option specifies that the operation should wait until resources are also available for both the data
protection/archive operation and the inline copy.
Reserve Resources before Scan
Normally, media is reserved for a job before the operation phase (backup, migration, or archive). When selected, this
option will reserve the media before the scan phase. This reserves the media earlier because the scan phase precedes the
operation phase.
Job Retry Tab
Click this tab to access the Job Retry and Job Running Time options. See Job Restart and Job Running Time for more
information. (You can also specify the maximum number of allowed restart attempts and the interval between restart
attempts for all data protection jobs. For procedures, see Specify Job Restartability for the CommCell.)
Vault Tracking Tab
Click this tab to access the Vault Tracking options. See Vault Tracker for more information.
Back to Top








Browse Data

Choose from the following topics:
   Overview
   Control the Browse Time Interval
      The Time-of-Day Element
      Browse Data from Before the Most Recent Full Backup
   Image Browse
   No Image Browse
   Establish the Page Size for a Browse
   Browse Using a Specific MediaAgent
   Browse from Copies
   Browse Multiple Versions of a File or Object
   Browse and Restore/Recover/Retrieve Using The Exact Index
   Find a File/Directory/Object
   Full Backup Transparent Browse
   Support Information - Browse Options
   Support Information - Browse Features
   How To
Related Topics:
   List Media (Media Prediction)
   Browsing Data - DataArchiver Agents
   Browsing Data - DataMigrator Agents
   Browse QR Volumes
   Browse Data - ContinuousDataReplicator
   Content Indexing
   Assign Restore View Names to Newly-discovered Databases (Lotus Notes Document iDataAgent)


Overview
The option to browse the data obtained by data protection operations provides the facility to view and, if necessary,
selectively restore/recover the data objects (files/folders/directories/database objects, etc.) that were backed up. This option
is especially useful to search for specific data object(s), including a specific version.
The browse option can be invoked from either the client, agent, instance, backup set, or subclient level depending on the
functionality of a given agent. This helps to narrow the search to a specific part of the data.
Depending on the agent, there are several additional options to customize your browse, including:
   Capability to browse the most recent (latest) data.
   Capability to browse data between a specified time range.
   Capability to limit the browse to a specified path.
   Facility to specify the page size of the browse window.
   Ability to browse the image of the data as it existed at the specified browse time.
   Ability to browse from a specified storage policy copy.
   Ability to browse only volumes for Volume Level Restores.
   Ability to browse databases or file groups/files for these databases.
   Ability to browse contents defined by file attributes other than size and time.
   Ability to browse folders/files owned by specific users.
After selecting the necessary browse options, the browse window provides a list of data objects that meet the specified criteria. This window also provides the capability to select multiple or specific data object(s) that you wish to restore/recover.




Note that most Unicode / international characters are automatically displayed in the browse window. If these characters are not displayed, make sure that the necessary fonts or other software required to display these characters are installed on the computer on which the CommCell Console is displayed. For example, if you run the CommCell Console as an applet, make sure that the necessary fonts or other software required to display the characters are installed on the local computer from which the applet is run.
If necessary, you can also perform the following operations in the browse window, depending on whether the agent
supports the operation:
   Search the data using the Find option to restore/recover specific data objects.
   View and restore/recover multiple version(s) of data objects.
   Perform a full iDataAgent restore.

                              For the Lotus Notes iDataAgents, for a database that is reconfigured from a default subclient to a
                              user-defined subclient, the database would appear twice in the Backup Data dialog box under
                              the following conditions:
                              The default subclient had been backed up prior to the reconfiguration, and after the
                              reconfiguration only the user-defined subclient is backed up. In the default view of the Backup
                              Data dialog box, the second occurrence of the database reflects the more recent backup, and
                              you would be able to successfully restore either occurrence. With the next backup of the default
                              subclient, the duplication would be eliminated.

                              For the Microsoft Exchange 5.5 Server iDataAgent, although the private and public information
                              stores must be included in the same subclient during backups, they may be individually selected
                              and restored.

                              For the Microsoft Exchange 2000/2003 Database iDataAgents, you must dismount all information
                              stores that you will be selecting during the restore operation. If you are restoring the entire
                              Exchange Server, be sure to dismount all stores.

                              For information on browsing QR Volumes, see Browsing Available QR Volumes.
                              For information on browsing DataMigrator Agents, see Browsing Data - DataMigrator Agents.
                              For information on browsing DataArchiver Agents, see Browsing Data - DataArchiver Agents.

Back to Top



Control the Browse Time Interval
The following agents have different behavior due to the nature of their data and operations. See the appropriate topic for
more information.
   Control the Browse Time Interval for Informix
   Control the Browse Time Interval for Oracle, Oracle RAC and DB2
   Control the Browse Time Interval for Exchange 5.5 and Exchange 2000 database
   Control the Browse Time Interval for Image Level and Serverless Data Manager
   Control the Browse Time Interval for SQL
   Control the Browse Time Interval for Sybase
Except for the above agents, controlling the browse time interval works as follows:
The browse operation provides you with two options, Exclude Data Before and Browse Data Before, which allow you to
control the start and end points of the browse retrieval process. These features can be useful if you need to
restore/recover:
   Some previous version of data.
   Data that was deleted prior to the most recent data protection operation.


                          If you want to browse and restore previous versions of a data object only, you may
                          find it more convenient to use the Version option as described in Browsing Multiple
                          Versions of a File or Object.




Although both options have their uses, the Browse Data Before option is generally used far more often than the Exclude
Data Before option. Users are usually more interested in restoring the most recent data up to some date threshold than
they are in omitting data from before some given date.
The Exclude Data Before option identifies the starting point of the index search and the Browse Data Before option
identifies the ending point.
The following figure shows how the search process is affected by the Exclude Data Before and Browse Data Before
options.




As shown in the figure, the Browse Data Before date, when specified alone, causes the search process to begin with the most recent full backup and end with the backup that occurred just prior to the specified date. The Exclude Data Before date, when specified alone, causes the search process to begin with the backup that occurred just after the specified date and end with the most recent backup.
You can also use the Browse Data Before and Exclude Data Before options together to limit the search boundaries on
both ends.
The Time-of-Day Element
The specifications for both the Browse Data Before and Exclude Data Before options include not only the date, but the
time-of-day (i.e., hours and minutes) as well.
Specifying the time is necessary when isolating a backup on a date on which two or more backups occurred. (Note that this condition can occur even if backups are scheduled only once a day. For example, someone may have launched a backup using the Run Immediately option in addition to a scheduled backup. Also, depending on the size of a backup and the time it is scheduled to begin, a backup can start on one date and complete on the next if it spans 12:00 midnight.)



In determining whether to include a backup in a search, the system uses the time that a backup completes. The Exclude
Data Before option causes the system to begin its search on the backup that completed after the specified date and time,
unless it encounters a full backup first. The Browse Data Before option causes the system to end its browse search using
the most recent backup that completed before the specified date and time.
The following figure shows the minimum and maximum times that can be given for the Browse Data Before and Exclude
Data Before options in order to define the search boundaries as shown. Notice that the point of delineation is the backup
completion time, 2:25am in this case.
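
The boundary rules above lend themselves to a short illustration. The following Python fragment is a conceptual sketch only, not product code: the backup list and function name are invented for the example, and the sketch ignores the full-backup boundary that the default search applies. It simply shows how the two dates act on backup completion times.

from datetime import datetime

def backups_in_browse_window(backups, exclude_before=None, browse_before=None):
    """Return the labels of backups whose completion times fall inside the window.

    backups        -- list of (label, completion_time) tuples, oldest first
    exclude_before -- start boundary (Exclude Data Before), or None
    browse_before  -- end boundary (Browse Data Before), or None
    """
    selected = []
    for label, completed in backups:
        if exclude_before is not None and completed <= exclude_before:
            continue  # completed on or before the start boundary: excluded
        if browse_before is not None and completed >= browse_before:
            continue  # completed on or after the end boundary: excluded
        selected.append(label)
    return selected

# Two backups completed on 5/12, one at 2:25 AM and one at 11:00 PM.
backups = [
    ("full 5/10",         datetime(2005, 5, 10, 23, 30)),
    ("incremental 5/11",  datetime(2005, 5, 11, 23, 45)),
    ("incremental 5/12a", datetime(2005, 5, 12, 2, 25)),
    ("incremental 5/12b", datetime(2005, 5, 12, 23, 0)),
]
# A Browse Data Before of 5/12 2:26 AM ends the search with the 2:25 AM backup.
print(backups_in_browse_window(backups, browse_before=datetime(2005, 5, 12, 2, 26)))
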




Browse Data from Before the Most Recent Full Backup
In the browses described previously, the searches are bounded by the most recent full backup. There may be times,
however, when you want to browse data that is older than the most recent full backup. One way of accessing that data is to
specify a Browse Data Before date that pre-dates the full backup. Remember, the Browse Data Before date establishes
the ending point of the search. Consequently, using a Browse Data Before date that pre-dates the most recent full backup
starts the search in the previous full backup cycle. Of course, this is only valid if the data in that full backup cycle has not expired.
The following illustration demonstrates the use of the Browse Data Before option to access data that was backed up prior to the most recent full backup. Other searches, including the default, are shown for comparative purposes.




This figure shows that the:
   Default search is bounded by the most recent full backup. It has no access to data that was backed up prior to that time.
   Browse Data Before option can be used to restore data that was backed up prior to the most recent full backup. The illustration shows the search boundaries that would be in effect if the date and time specified preceded the completion of the 5/10 backup. The search starts with the 5/9 backup and is bounded by the next most recent full backup.
   Exclude Data Before option, when used alone, cannot access data that was backed up prior to the most recent full backup, regardless of the date/time that was specified. The end point of such a search is always bounded by either the most recent full backup or the Exclude Data Before date, whichever is more recent.
   Search can begin on a backup that occurred prior to the most recent full backup and end on a backup that occurred after the next most recent full backup.
Back to Top



Image Browse
Image browse displays the structure of the entity as it existed as of some specified time. An image restore operation
restores the data or some specified portion thereof. (Remember, the two operations - browse, restore - are the same
except that a restore returns the actual data while a browse displays only the structure.)
When you browse or restore data in the image mode, the system by default returns the requested data based on the latest
image available. This is usually the information that most users are interested in. The system does this by using the current
date and time as the effective date.
The following examples illustrate data retrieval:




Example 1 - Basic Example
Example 2 - Public/Web Folder and Mailbox Example
Example 3 - Directory/Container Example
Limitations of Image Browse
The image mode (i.e., searching through the current full backup cycle) may not meet your needs in all circumstances. It
can only restore the latest version of an entity (i.e. file/directory/database/public folder/mailbox/folder/message). Further,
if the requested entity was deleted before the most recent full backup, the default mode of operation cannot find the data.
In such situations, use either the no-image mode or the other browse capabilities provided by the system to control the
search and retrieval process.
Back to Top



No Image Browse
The no-image browse operation is useful for retrieving data that may have been deleted at some unknown time or
retrieving a previous backup version of a data object. Rather than returning an image of the specified data object, a no-
image browse returns all data objects that have existed, whether currently present or not, since the full backup of the
specified backup cycle.
The no-image browse/restore is more inclusive than the image form of the operation. However, it is not suitable for all
situations. Use the default image operation if you want to restore an entity to the state in which it existed as of a specified
time. Use the no-image option if you do not need to preserve the structure of an entity.
The following examples illustrate no-image browse:
   For database agents, see the example in Database Retrieval.
   For non-database agents, see the examples in Data Retrieval and Directory/Container Retrieval. A no-image restore of
   the same data would have returned all of the data as stated plus file D from the 5/10 backup.
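
The contrast between the two modes can also be sketched in a few lines of Python. This fragment is purely illustrative: the data model, dates, and function names are invented for the example and do not reflect the product's internal index format.

# One backup cycle: each backup maps object names to a version tag.
# An object missing from a later backup has been deleted since that backup.
cycle = [
    ("full 5/10",        {"A": "A-5/10", "B": "B-5/10", "D": "D-5/10"}),
    ("incremental 5/11", {"A": "A-5/11", "B": "B-5/10", "D": "D-5/10"}),
    ("incremental 5/13", {"A": "A-5/13", "B": "B-5/10"}),  # D was deleted
]

def image_browse(cycle):
    """Return the structure as of the latest image (the default behavior)."""
    return cycle[-1][1]

def no_image_browse(cycle):
    """Return every object seen since the full backup, flagging deleted ones."""
    seen = {}
    for _, contents in cycle:
        for name, version in contents.items():
            seen[name] = version                 # keep the latest known version
    present_now = set(cycle[-1][1])
    return {name: (version, name in present_now) for name, version in seen.items()}

print(image_browse(cycle))     # {'A': 'A-5/13', 'B': 'B-5/10'} -- D is absent
print(no_image_browse(cycle))  # includes D, flagged as no longer present
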

                          The Type column in the browse window indicates whether a data object has been
                          deleted. Note, however, that the type column is not supported by all Agents.



Back to Top



Browse Multiple Versions of a File or Object
As part of the default browse operation, the system allows you to browse and recover previously backed up/migrated
versions of a data object. You can access this feature by using the View All Versions option. The system responds by
displaying the date-stamped versions of the selected entities that are available for recovery. You then select the version
you want and recover it.
The following example demonstrates the use of this feature.




A default image browse of this entity returns:
   A from 5/13
   B from 5/15
   C from 5/13
In this example for iDataAgents, using the View All Versions option we can browse and restore any version of these files
back to the 5/10 full backup. For example, for File A, we can restore the 5/10, 5/11, and 5/13 versions. Note that this
feature is available only for individual data objects. It cannot be used to restore some previous version of a
directory/container. If you need to restore a directory or container to some prior state, use the Browse Data Before and
Exclude Data Before options as described in Controlling the Browse Time Interval.

When you restore all file versions simultaneously, the restore operation automatically appends a different number to the file name (for example: file,1.txt; file,2.txt; where "1" is the most recent version, "2" is the next most recent version, etc.) so that each version remains unique. If the destination volume does not have the long namespace installed, the naming of these files will vary according to the available namespace.
DataMigrator Agents for File System and Network Storage also support the View All Versions option to browse and
recover different versions of migrated files. For these agents, if all versions of a file are recovered then the original file will
remain a stub file and not be recovered. Upon recovery, the system appends a unique number to the file name (as
mentioned above) to ensure that different versions of a file residing in the same location have a unique name.
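
As a conceptual illustration of the numbering scheme described above (the helper below is invented for the example and assumes a simple stem/extension split; the actual names depend on the available namespace, as noted):

import os

def versioned_names(filename, version_count):
    """Build unique names such as 'file,1.txt', 'file,2.txt', where 1 is the
    most recent version, 2 the next most recent, and so on."""
    stem, ext = os.path.splitext(filename)
    return [f"{stem},{n}{ext}" for n in range(1, version_count + 1)]

print(versioned_names("file.txt", 3))  # ['file,1.txt', 'file,2.txt', 'file,3.txt']
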

                           For the SharePoint 2003 Document iDataAgent, the View All Versions window in the
                           Backup Data Browse has different functionality.
                              List Items and documents backed up from a Document Library with Versioning
                              disabled will only show the latest version of the object in the View All Versions
                              window. To restore a past version of an object, you must perform a browse back
                              in time.
                              Documents backed up from Document Libraries with Versioning enabled will
                              display all backed up versions in the View All Versions window.

NOTES
   When the properties of a file (such as permissions) are changed but its contents are not, the file will be backed up with the next incremental or differential backup; however, the View All Versions option will only display the latest version of the file.

See Also:
   Differences between NetWare File System and NDS Versions
Back to Top




Find a File/Directory/Object
The find feature allows you to search the data protection archives for any file or directory name or name pattern. Because find can search multiple indexes (unlike browse), you can search beyond the last full backup (or new index) as long as the data resides on an index that exists within the retention period. The find feature is accessible as
a right-click option on the browse window, and for supported agents a non-browse find is available from the All Tasks
menu. Depending on the agent, the Find option is accessible from either the backup set or migration set level, and for
DataArchiver it is accessible from the agent level.
The following list provides details of supported wildcard characters:
   The Active Directory iDataAgent does not support the use of wildcard expressions.
   For Windows, NetWare, Unix, SharePoint Document and NAS NDMP iDataAgents and DataMigrator File System Agents
   see Wildcards (File System).
   For Lotus Notes Document iDataAgent, see Wildcards (Lotus Notes Document).
   For Exchange Mailbox, Public Folder and Web Folder iDataAgents, DataArchiver and DataMigrator for Exchange Agents,
   see Wildcards (Exchange).
Advanced Search
The Advanced Search tab is accessible only to clients that support Content Indexing, and have created a content index for
their archived, backed up, or migrated data. This tab allows you to search by content (keyword or phrase). For Exchange-
based agents, you can also search by To, From, CC, BCC, and attachment name. Advanced searches with multiple criteria
can be performed. In addition, Exchange-based agents that support Content Indexing can also narrow search results by
specifying a received time range for messages on the Find (Receive Time Range) tab.
For example, you can choose to exclude files from your search results based on the criteria entered.
Note that the values you have entered into the Find Data (Search) and Find (Advanced Search) tabs will be used to perform
your search (i.e., both tabs are active). If any checkboxes are checked on the regular Find Data (Search) tab, you will have
to enter values for the associated fields or uncheck the checkboxes. See Parsing Rules for Content Indexing for instructions
and example search strings.
Save Search Results to a File
Exchange-based agents provide the additional capability to save the search results to a file on the client computer. This
feature is useful for identifying high-level properties of Exchange objects (such as Subject, From, To, etc.) that match the
search criteria, which can be saved to a text file without the need to perform a recovery or retrieve operation.
Back to Top



How To
   Browse the Latest Data
   Browse Data Before a Specified Time
   Browse Data Between a Specified Time
   Browse using a Specified Path
   Perform an Image / No Image Browse
   Establish the Page Size for a Browse
   Browse Data from a Specific Copy
   Select Objects From the Browse Window for Restore-Recover
   Find a File/Directory/Object
   Find and Restore/Recover/Retrieve Exchange Objects
   Schedule a Find and Retrieve Operation
   Browse File Versions
Back to Top




Browse and Restore/Recover/Retrieve from Copies

Choose the following topic:
     Overview
     Restore/Recover/Retrieve Considerations
     Support Information - Restore Options
     How To


Overview
By default, when a browse and restore/recover/retrieve operation is requested, the software attempts to browse and
restore/recover/retrieve from the storage policy copy with the lowest copy precedence. If the data that you want to browse
and restore/recover/retrieve was already pruned from the primary copy, the software searches the other copies of the
storage policy in the following order:

1.   Lowest copy precedence to highest copy precedence among all synchronous copies.
2.   Lowest copy precedence to highest copy precedence among all selective copies (if your agent supports selective
     copies).

If the data that is to be browsed and restored/recovered/retrieved was secured through multiple storage policies, the
software will search for the requested data first from synchronous copies starting with the lowest copy precedence number
to the synchronous copy with the highest copy precedence number, and then from selective copies in the same order for
each of these storage policies.
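
The default search order can be sketched as follows. This Python fragment is illustrative only; the copy records and function name are invented and do not represent the product's internal data structures.

def default_copy_search_order(copies):
    """copies: list of dicts with 'name', 'precedence', and 'kind'
    ('synchronous' or 'selective'). Returns copy names in the order searched."""
    synchronous = sorted((c for c in copies if c["kind"] == "synchronous"),
                         key=lambda c: c["precedence"])
    selective = sorted((c for c in copies if c["kind"] == "selective"),
                       key=lambda c: c["precedence"])
    # Synchronous copies are searched first, then selective copies,
    # each from lowest to highest copy precedence.
    return [c["name"] for c in synchronous + selective]

copies = [
    {"name": "Primary",        "precedence": 1, "kind": "synchronous"},
    {"name": "Offsite Sync",   "precedence": 2, "kind": "synchronous"},
    {"name": "Monthly Select", "precedence": 3, "kind": "selective"},
]
print(default_copy_search_order(copies))  # ['Primary', 'Offsite Sync', 'Monthly Select']
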
You can choose to browse and restore/recover/retrieve from synchronous or selective copies other than the primary copies
by using an agent's Advanced Browse/Restore/Recover/Retrieve Options or Select Copy Precedence dialog box to
specify a copy precedence number at browse and restore/recover/retrieve time. This feature can be useful in a variety of circumstances, including the following:
     You know that the media containing the protected data for a particular copy has been removed from the storage library.
     In this case, you can choose to browse and restore/recover/retrieve from a copy whose media are inside the library.
     You want to browse and restore/recover/retrieve from a copy that accesses faster magnetic disk media rather than
     slower tape media.
     You know that the media drives used by a particular copy are busy with another operation and want to browse and
     restore/recover/retrieve from a different copy to avoid resource conflicts.
If you specify a copy precedence number for a browse and restore/recover/retrieve operation, the software searches only
the storage policy copy with that precedence number in each of the storage policies through which the data was secured. If
data does not exist in the specified copy, the browse and restore/recover/retrieve operation fails even if the data exists in
another copy of the same storage policy.
You can also use the copy precedence feature to browse and restore/recover/retrieve from specific copies as follows:

1.   Go through each storage policy that will be accessed by the browse and restore/recover/retrieve operation.
2.   Use the Storage Policy Properties - Copy Precedence tab to assign a particular copy precedence to the copy that
     you want checked for that particular storage policy.
3.   Specify that copy precedence number in the agent's Advanced Browse/Restore/Recover/Retrieve Options or
     Select Copy Precedence at browse/restore time.

For a more detailed discussion, see Recovering Data From Copies.



Restore/Recover/Retrieve Considerations
When you are directly performing a restore/recover/retrieve operation, as discussed in Basic Restore, you can select the
necessary copy from which to perform the operation from the agent's Advanced Restore/Recover/Retrieve Options dialog
box. Note that data should be restored/recovered/retrieved from the same copy from which you have performed the
browse operation.
Back to Top




Restore Backup Data

Choose from the following topics:
   Overview
   What You Need to Know Before Performing a Restore
   Restore Options
   Basic Restore
   Browse and Restore
   Restore Data Using Wildcard Expressions
   Rename/Redirect Files on Restore
   Automatic Detection of Regular Expressions
   Restore from Copies
   Restore Data Using a Specific MediaAgent, Library or Drive Pool
   Restore Data Using the Exact Index
   Pre/Post Processes
   Filter Data From Restore/Recovery Operations
   Restore Destinations
   Restore Data Using a Map File
   Support Information - Restore Options
   Support Information - Restore Options - Restore Destinations
   Support Information - Restore Options - Others
Related Topics:
   List Media (Media Prediction)
   Browsing Data
   Restore From Anywhere
   Job Management - Data Recovery Operations
   Command Line Interface - Qoperation restore


Overview
The QiNetix system supports a variety of restore options to restore the data in the desired manner. Because of the wide variety of restore options, this discussion is organized around the operations supported by each Agent.
Back to Top



What You Need to Know Before Performing a Restore
To avoid common problems, review the following before starting a restore operation:
   Verify that the CommServe, MediaAgent and media library are powered on.
   Verify that you have successfully backed up the data you are attempting to restore.
   Verify that the media from which you wish to perform the restore is available in the CommCell. Use the List Media
   feature to identify the media required by the restore operation.
   Always ensure that sufficient space has been allocated for the restore.
   When you restore data, the system restores data as follows (a conceptual sketch follows this list):
      If the folders and files that you wish to restore are not available, the necessary folders and files are automatically
      created by the restore operation.
      If the folders and files are available, and if you do not select the Unconditional Overwrite option in the Restore
      Options dialog box, the existing folders and files will not be overwritten. Only the non-existent entities will be
      restored by the restore operation.




      If the folders and files are available, and if you select the Unconditional Overwrite option in the Restore Options
      dialog box, the existing folders and files will be overwritten with the folders and files with the most recent time-stamp
      in the backup data.
   After restoring data, check the Restore Job History to view the list of files that were successfully restored.
   If you wish to restore data that has been aged, see Accessing Aged Data for information on restoring such data and
   saving the media containing the data for future use.
   When performing a restore to a NAS NDMP file server with LAN free data paths, it is important to import all of the
   required tapes for the restore job into the same library. For information about determining which tapes will be required
   for a given restore job, see List Media (Media Prediction).
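
The create/skip/overwrite behavior described in the list above can be summarized with a small sketch. This Python helper is illustrative only: its name and arguments are invented, and it simplifies the overwrite case (the product overwrites with the folders and files carrying the most recent time-stamp in the backup data).

import os
import shutil

def place_restored_file(source_path, destination_path, unconditional_overwrite=False):
    """Decide how one restored file is placed at its destination."""
    dest_dir = os.path.dirname(destination_path)
    if dest_dir:
        os.makedirs(dest_dir, exist_ok=True)  # missing folders are created automatically
    if os.path.exists(destination_path) and not unconditional_overwrite:
        return "skipped"                      # existing file is left untouched
    shutil.copy2(source_path, destination_path)
    return "restored"                         # new file, or overwritten from the backup
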
Given below is a list of supported Agents. The corresponding linked page provides information on the Agent-specific restore
options and procedures.
   Active Directory
   BlueArc NAS NDMP
   DB2
   EMC Celerra NAS NDMP
   EMC Centera
   Image Level
   Image Level ProxyHost
   Informix
   Linux NetWare File Systems
   Lotus Notes/ Domino Server
   Macintosh File System
   Microsoft Exchange Server
   Microsoft SharePoint Portal
   Microsoft SQL Server
   Microsoft Windows File Systems
   NetApp NAS NDMP
   NetWare Server
   Oracle
   Oracle RAC
   ProxyHost
   SAP
   Serverless Data Manager
   SnapVault
   Sybase
   Unix File Systems


Restore Options
The following options are available for controlling restore jobs.
Set a Job Priority
This option allows you to manually set a job priority. This is useful if you have jobs that are very important and must
complete, and/or jobs that can be moved to a lower priority. For more information, see Job Controller.
Start in Suspended State
Specifies that this job will start in the Job Controller in a suspended state and cannot run until the job is manually resumed
using the Resume option. This option can be used to add a level of manual control when a job is started. For example,
you could schedule jobs to start in the suspended state. An administrator could then choose which scheduled jobs complete
by resuming the operation started in the suspended state.
Job Restarts and Job Running Time
For indexing-based, file system-like agents, you can click the Job Retry tab in the Advanced Restore Options dialog box to access the Job Retries and Job Running Time options when you perform a data recovery operation.
You can also specify the maximum number of allowed restart attempts and the interval between restart attempts for all
data recovery jobs. For procedures, see Specify Job Restartability for the CommCell.
For more information on these subjects, see Job Restart and Job Running Time.
Back to Top




Restore Data - SharePoint Portal

Choose the following topic:
   Overview
   Basic Restore (SharePoint 2001 Database only)
   Browse and Restore (not for SharePoint 2001 Database)
   Restore from Copies
   Restore Data Using a Specific MediaAgent, Library or Drive Pool
   Restore Destinations
      In-Place Restore
      Out-of-place Restore
   Restore Considerations for This Agent
   How To
See Also:
   Restore Backup Data


Overview
SharePoint Database iDataAgent
The SharePoint Database iDataAgent allows you to restore SharePoint Server 2001 Databases and SharePoint 2003
Databases.
When you restore SharePoint data, you are overwriting your existing data. Data from after the date of that backup will be lost; the restore operation only provides data from the point in time when it was backed up.
If you configure Shared Services, the Portal Site that provides the Shared Services must exist or be restored before restoring any other Portal Sites.
If the SQL databases for the SharePoint Portal Server 2003 were not backed up by the SharePoint iDataAgent, then they must first be restored on the SQL Server using the SQL Server iDataAgent before performing the SharePoint Server 2003 Database Restore Procedure.
Restoring a SharePoint Server 2001 Database

You can restore all workspace elements as one unit:
   At the default subclient level, you are automatically restoring all of the database data.
   At the client browse level, you select the SharePoint option which restores all of the data that has been backed up for a
   SharePoint Database iDataAgent.
The restore operation then retrieves the workspaces from the backup media and restores them.

                         If a SharePoint 2001 Database restore is killed, this may result in an unstable state
                         and require that the server be rebuilt. The IKnowledgeServer Programming Interface
                          is used to back up and restore the SharePoint Database. The Microsoft Knowledge
                         Base contains an article addressing this issue, SPS: Workspace Is Unusable When
                         the Restore Operation that Uses the IKnowledgeServer Programming Interface Quits
                         Unexpectedly (287350).

Restoring a SharePoint Server 2003 Database

You can perform the following types of restores:
   Single Sign-on Database restore
   Site Content Database restore
   Site Collection Database restore
   Web Storage Database restore
   Teamsite Database restore




   Portal Site restore
   Site Index restore
SharePoint Document iDataAgent
The SharePoint Document iDataAgent restores the category folders, document libraries, and the properties of the
workspace.
Restoring a SharePoint Server 2001 Document

When an item selected for restore already exists, the system offers the following options, which are available in the Restore Options dialog box:
   Unconditional Overwrite - If an item that is selected for restore already exists, that item will be overwritten by the
   restored item.
   Skip - If an item is selected for restore and that item already exists, it will not be restored.
The following Restore States are supported in the Restore Options dialog box for SharePoint Server 2001 documents:
   Created State - Specifies that the document will be created and left in the control of the coordinator account specified
   on the iDataAgent properties only if the document has been deleted or when restoring a version out of place.
   Checked In State - Specifies that the document will be restored in the Checked In State, and available for editing.
   Published State - Specifies that the document will be restored in the Published State and available to all users. If the
   document is restored to a folder requiring Approval, then the document will be in a “Pending Approval” state until it has
   been approved by a user.
Restoring a SharePoint Server 2003 Document

When an item selected for restore already exists, the system offers the following options, which are available in the Restore Options dialog box:
   Unconditional Overwrite - If an item that is selected for restore already exists, then that item will be overwritten by
   the restored item.
   Skip - If an item is selected for restore and that item already exists, it will not be restored.
   Restore All Versions - Specifies that all versions of a multi-versioned document will be restored to a Version Enabled
   Library.
NOTES (summarized in a sketch after this list)
   Documents restored to Version Enabled libraries will be restored with Unconditional Overwrite regardless of the
   option selected.
   Documents restored to Version Disabled libraries will be restored with Skip regardless of the option selected.
   List Items are always restored as new items.
   The .aspx files for any Web Part or basic web pages that are saved in a document library on a SharePoint site can be
   backed up and restored; however, the Web Part configuration or any other modifications for the pages will not be
   restored. This is a Microsoft limitation.
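
The effective behavior described in these notes can be condensed into a small decision sketch. The function below is invented for illustration only; the selected option is accepted as an argument simply to show that it does not change the outcome for these cases.

def effective_restore_behavior(item_type, library_versioning_enabled, selected_option):
    """item_type: 'document' or 'list_item';
    selected_option: 'overwrite', 'skip', or 'restore_all_versions' (ignored here)."""
    if item_type == "list_item":
        return "restored as a new item"              # overwrite/skip are ignored
    if library_versioning_enabled:
        return "restored with Unconditional Overwrite"
    return "skipped if the document already exists"

print(effective_restore_behavior("document", True, "skip"))
print(effective_restore_behavior("list_item", False, "overwrite"))
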


Restore Destinations
By default, the SharePoint Portal iDataAgents restore data to the client computer from which it originated; this is referred
to as an in-place restore. You can also restore the data to another Client computer in the CommCell. Keep in mind the
following considerations when performing such restores:
   The destination client must reside in the same CommCell as the client whose data was backed up.
   Note that when you perform restores other than in-place restores, the restored data assumes the rights (i.e.,
   permissions) of the parent directory.
   The restore destination must be on another SharePoint Portal Server with the SharePoint iDataAgents installed and
   operational.
   The destination server must either be in the same domain as the original server, or be in a domain having the proper
   trust relationship established both with the original domain, as well as all of the domains with which the original server’s
   domain had trust relationships.
   To restore a Top Level Site to a different computer:
      create the site on the destination computer before starting the restore
      use the SharePoint Database iDataAgent, if available



      browse and select the Virtual Server for restore
SharePoint Server 2001 Database
  The restore destination drives must have the same drive letter configuration and have at least the capacity of the
  original drives that were backed up (i.e., if the original drives were a 20GB C: drive and a 40GB D: drive, the destination
  drives must be at minimum a 20GB C: and 40GB D: drive).
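
As a conceptual illustration of this drive requirement (the function and the sample capacities below are invented for the example):

def destination_drives_ok(original_drives, destination_drives):
    """Both arguments map drive letters to capacities in GB. The destination must
    offer every original drive letter with at least the original capacity."""
    return all(
        letter in destination_drives and destination_drives[letter] >= capacity
        for letter, capacity in original_drives.items()
    )

print(destination_drives_ok({"C": 20, "D": 40}, {"C": 40, "D": 40}))  # True
print(destination_drives_ok({"C": 20, "D": 40}, {"C": 20}))           # False -- no D: drive
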
SharePoint Server 2001 Document
   It is possible to perform an out-of-place or cross-machine restore of SharePoint Document data to a client computer other than the one that was backed up, as long as the other client has the same configuration and registry key settings as the original client.
SharePoint Server 2003 Document
  To restore a site created by a user-defined Site Template to a server in a different farm, the Site Template must exist
  with the same ID in the destination computer. To change the configuration ID for user defined templates, open the
  CustomTemplates table in the Configuration database and modify the ID column for the row for the custom template
  used.
  To restore a deleted Sub-site, browse and select the Site Collection level as the destination.
The following section enumerates the types of restore destinations that are supported by the SharePoint Portal iDataAgents.
See Support Information - Restore Options - Restore Destinations for a list of Agents supporting each restore destination
type.
In-Place Restore
   Same path/destination
Out-of-Place Restore
  Same path/destination
  Different path/destination
Back to Top



Restore Considerations for This Agent
Before performing any restore procedures for this agent, review the following information:
SharePoint Server 2003 - General
If you have not already done so for the Client to which you are restoring data, ensure the Client is configured as follows:
For both the SharePoint Server 2003 Document and SharePoint Server 2003 Database iDataAgents, run Base Services on
the Client using an account that meets the following criteria:
   member of the local Administrator Group
   member of the SharePoint Portal Administrator Group
   System Administrator role on the SQL Server Instance
Refer to the article, Galaxy Service Account User Information for Windows 2003 and Windows Server 2003 clients, available from the Maintenance Advantage web site.
SharePoint Server 2003 Database
   When restoring a Site Collection, the data is moved from the backup media to a local file. If the Site Collection is larger
   than the available space on the drive on which SharePoint iDataAgent resides, then the restore will fail. If necessary, you
   can modify the path to another site that does have enough space through the MSSharePoint registry key dHome.
   After restore, a Site Collection will have a new SiteID and will show up as new content during a discover/backup
   operation. You should manually find the restored site collection and reassign it to the previously assigned subclient.
SharePoint Server 2003 Document
General
   A restored document will not be available with the same version number it had when it was backed up.
   During a document library restore, the library's document template is set to "template.doc." If the library used a
   different template file, or none at all, you must change the document template properties to the desired value through
   the General Settings for the library.




   For Survey Lists, the "Allow Multiple Responses" option must not be selected in SharePoint List Settings, or the restore
   will fail.
   For site restores using the SharePoint 2003 Document iDataAgent, we recommend that the destination site(s) already
   exist before starting the restore.
   While .aspx files for a Basic Page from a document library in SharePoint Document 2003 can be backed up and restored,
   restoring the file will not restore the web part to the page. Modifications made to the page using the Content Editor Web
   Part will not be restored and cannot be made to the file after it has been restored.
   For a restored document or List item, the value of the "Modified By" column will be the name of the account running
   Galaxy Services, and the value of the "Modified Time" column will be the current date and time.
List Items
   Can only be restored to lists of the same type.
   Will not have the same List ID after restore.
   The overwrite and skip options are ignored during restore.
   List Items are restored using the same account as Client Base Services and will indicate that they were created by that user; because the Admin account was used for the restore, users will not be able to edit the restored list items.
List Types
   Any deleted lists linked to by Outlook, Excel, Access, etc. must be relinked after the restore.
   Out-of-place restore of a list may fail if it is restored to a site that does not have that list type. For example, some Meeting Workspaces have lists that are specific to the Meeting Workspace, so these cannot be restored to a Doc Workspace or a Team Site.
   Delete any unwanted views after the restore (they will appear numerous times).
Portal Sites
   A Portal Site must be restored using the SharePoint Database iDataAgent.
   A Portal Area is not restored, and must be recreated before beginning the restore of a Portal Site.
   After restoring a Top Level Site, re-configure the connection to the Portal in the Site Settings.
Back to Top



How To
   Restore Destinations:
     Restore Out-of-Place
   Restore a SharePoint Server 2001 Database
   Restore a SharePoint Server Document (2001/2003)
   Restore a SharePoint Server 2003 Portal Site
   Restore a SharePoint Server 2003 Site Content Database
   Restore a SharePoint Server 2003 Site Index
   Restore a SharePoint Server 2003 Single Sign-on Database
   Restore a SharePoint Server 2003 Site Collection
   Restore a SharePoint Server 2003 Teamsite Database
   Restore a SharePoint Server 2003 Webstorage System Database
Back to Top




Basic Restore

Choose the following topic:
   Overview
   What You Need to Know Before Performing a Restore
   Support Information - Restore/Recover/Retrieve Options
   How To


Overview
There are two functions that help you retrieve backed up data from the backup media: Browse and Restore. Browse
operations allow you to view the data that has been backed up for a client computer without actually restoring the data.
Restore operations retrieve the data from the backup media and restore it to the desired location.
In the CommCell Browser, the Browse and various Restore commands appear in the right-click menus at the agent,
instance and/or backup set levels, depending on agent.
Using the Restore commands, i.e., restoring without browsing, is most appropriate when you want to restore the latest
backup job for an agent, instance or backup set and want to retain the current file structure.
In certain situations and for supported agents, Restore operations can run without utilizing the Browse feature. For
example, if you know the path/name of the volume of the data that you want to restore, you can restore it without
browsing. In these agents, this procedure is most appropriate when the number of paths for the data that you want to restore is small or when the data that you want to restore resides on a single volume. If you want to restore data from many different paths or volumes, you should probably select the data from the Browse window.



How To
   Perform a Basic Restore
Back to Top




Browse and Restore

Choose the following topic:
   Overview
   What You Need to Know Before Performing a Restore
   Support Information - Restore/Recover/Retrieve Options
   How To
Related Topics:
   Command Line Interface - qoperation restore


Overview
There are two functions that help you retrieve backed up data from the backup media: browse and restore. In the
CommCell Browser, the browse and variously-named restore commands appear, depending on agent, in the right-click
menus at the agent, instance and/or backup set levels.
Browse operations allow you to view data that has been backed up by the agent on the client computer and select all or
some of that data. Depending on the agent, there are several options available to customize your browse. See Browsing
Data for comprehensive information on Browse operations.
Restore operations allow you to retrieve data from backup media and restore it to the desired location. Restoring without
browsing is most appropriate when you want to restore the latest backup job for an agent, instance or backup set and want
to retain the current file structure. See Basic Restore for more information on restoring without using browse.
Browse and Restore
The Browse and Restore procedure combines the two operations in sequence. When you select a Browse
command from the CommCell Browser, you can define and run one of many potential browse sequences. At the end of the
browse, when you are looking at the resulting information presented in the Browse window, you can continue with a restore
procedure simply by selecting data and clicking the Recover All Data button. As with the browse, depending on the agent,
there are several options available to customize your restore.
Perform a browse and restore operation when you want to:
   restore from an earlier backup
   restore only selected files/objects
   restore deleted files/objects
   restore without retaining the current file structure
   utilize browse options
Back to Top



How To
   Browse and Restore
Back to Top




Browse and Restore/Recover/Retrieve Data Using a Specific MediaAgent,
Library or Drive Pool

Choose the following topic:
   Overview
     Considerations for Browsing Data
     Considerations for Restoring NAS Data
     Considerations for File Level Restores with Image Level or Image Level ProxyHost
     Considerations for Browsing Data for Agents that do not have a Browseable Index
     Other Considerations
   Support Information - Browse Features
   Support Information - Restore/Recover/Retrieve Options
   How To


Overview
Data can be restored/recovered from any compatible library and drive type in the CommCell. By default the system
automatically identifies and restores/recovers data from any configured library in the CommCell, even if the media is not
available in the original library in which the data protection operation was performed. (This is described in
Restore/Recover/Retrieve From Anywhere.) Data Recovery operations use a specific data path - MediaAgent, Library and
Drive Pool - to perform the restore operations. When the default options are selected, the system automatically chooses the
most appropriate data path, as described in Data Recovery Operations using Alternate Data Paths.
In some situations you may want to use another data path to perform a data recovery operation. In such a situation, you
can specify the MediaAgent, Library and/or the Drive Pool.
Consider the following examples:
   You may want to use a specific MediaAgent to perform the browse and restore operation instead of the system selected
   default MediaAgent. For example, the default MediaAgent may be busy and you wish to use another MediaAgent which is
   idle, or you know the library attached to a specific MediaAgent contains the media associated with the data you wish to
   restore.
   The default MediaAgent may have a problem accessing the devices (library, drive) and hence you wish to use another
   MediaAgent sharing the library to perform the browse and restore operation.
                         This feature is applicable only for tape/optical libraries. The operation will fail if the requested data
                         is not available in the magnetic library attached to the specified MediaAgent.

Considerations for Browsing Data
When you perform a browse operation, the system returns the list of files requested by the browse by reading the most recent version of the index on the MediaAgent. If the index is either not available in the MediaAgent's index cache or not accessible to the browse/restore operation, it is restored to the selected MediaAgent from any available MediaAgent. If the MediaAgent used for the browse has visibility to the media, that MediaAgent is used for the browse operation. The same MediaAgent will also be selected for the subsequent restore operation; if necessary, you can change the MediaAgent in the subsequent restore operation.
Considerations for Browsing Data for Agents that do not have a Browseable Index
Some Agents (e.g., database Agents) do not have a browseable index. Browse operations on such Agents do not provide the option to select a MediaAgent, as an index is not used to display the data in the Browse window.
However, you can restore the data using a specific MediaAgent, Library or Drive Pool.
Considerations for Restoring NAS Data
Consider the following while selecting an alternate path for restoring NAS data.
   If the data protection operation was performed using a MediaAgent with NDMP Remote Server (library attached to the
   MediaAgent), use another MediaAgent with the NDMP Remote Server configuration to perform the data recovery
   operation.




   If the data protection operation was performed using a MediaAgent configured to use a library attached to a NAS file
   server, then use a MediaAgent with a similar configuration to perform the data recovery operation.
If the appropriate MediaAgent is not selected, the restore operation will fail.
Considerations for File Level Restores with the Image Level or Image Level ProxyHost iDataAgents
Consider the following while selecting an alternate MediaAgent for File Level Restores using the Image Level or Image Level
ProxyHost iDataAgents on Windows:
   The specified alternate MediaAgent must be on Windows, and have its index cache on an NTFS partition, or the restore
   operation will fail.
   For large scale file systems, the time required to rebuild the index on an alternate MediaAgent should be considered.
   This can be avoided through the use of a Shared Index Cache.
Other Considerations
If the media is used in another compatible library to perform the restore, the library may read the barcode differently. In
such a situation, update the media barcodes and then perform the restore/recover operation.



How To
   Browse Using a Specific MediaAgent
   Restore/Recover/Retrieve Using a Specific MediaAgent, Library or Drive Pool
Back to Top




List Media (Media Prediction)

Choose from the following topics:
   Overview
   How to Perform a List Media Operation
     List Media Associated with a Specific Backup Set, Instance or Subclient
     List Media Associated with Index
     List Media (Precise) Associated with a Specific File and/or Folder
     List Media Associated with a Specific Job
   General Information
   Support Information - List Media
   How To
Related Topics:
   Command Line Interface - qlist media
   Command Line Interface - qlist quickmedia
   Command Line Interface - qlist mediaagent


Overview
The list media option is useful for predicting the media required for the following operations:
   To restore data associated with a specific backup set, subclient or instance
   To restore the index required to browse data associated with a specific backup set or subclient
   To restore a specific file (or specific files and folders)
   To restore data associated with a specific job
Media prediction is useful in a variety of circumstances, including the following:
   To ensure that media required by an operation is available in the library, especially if you are restoring/recovering data
   across a firewall.
   In cases where data spans across several media, to identify the exact media necessary to restore/recover a
   file/folder/sub-section of the data.
   To identify and restore/recover from a copy that accesses a faster magnetic disk media rather than slower tape/optical
   media.
   To identify media associated with an alternate copy, when the media containing data associated with a specific copy is
   not readily available due to the following reasons:
       When the media is exported from the library
       When the media is used by another operation


How to Perform a List Media Operation
The list media operation can be performed in several different ways, depending on the requirement. The following sections
describe each of these methods.


List Media Associated with a Specific Backup Set, Instance or Subclient
This operation is referred to as List Media in the CommCell Console and provides the following options:
   Search media associated with the latest data protection cycle, starting from the latest full backup. (This is the default
   option.)
   Search media associated with data protection operations performed between a specified time range.
   Search for media associated with a specific storage policy copy, synchronous or selective copies, with the specified copy
   precedence number.
Keep in mind that when you search media from a secondary copy, the listed media may not reflect the entire instance or
backup set data, unless all the storage policies associated with all the subclients have been configured for secondary copies.




The List Media option is available as a right-click option at the subclient level and in the Browse Options dialog box from the Backup Set/Instance level. See Perform List Media for a Subclient and Perform List Media for a Backup Set or Instance for step-by-step instructions.


List Media Associated with Index
When a browse operation is performed, the system automatically restores the index from the appropriate media if the index for the data is not available in the index cache (for Agents that support an index). In such situations, this option is useful to verify the following:
   Whether the index is available in the index cache or must be restored from media
   If the index must be restored from media, whether the appropriate media is available
The List Media option for index restore is available in the Browse Options dialog box. See List Media Associated with Index
for step-by-step instructions.
Related Topics: Index


List Media (Precise) Associated with Specific Files and/or Folders
This operation is referred to as List Media (Precise) in the CommCell Console and is useful to precisely predict media in
which specific files or folders reside. For example:
   When a data protection operation spans across multiple media and you would like to know the exact media in which the
   files you wish to restore reside.
   You have a specific set of files (either a random set or a specific set, such as *.doc or *.txt) that you wish to restore and would like to know all the media on which the files reside.
   You wish to restore a specific version of the file and would like to know the specific media on which the version resides.
The List Media option for specific files and/or folders can be accessed from the Browse window, after selecting the appropriate files/folders for restore. See List Media (Precise) for Specific Files and/or Folders for step-by-step instructions.
The precise media prediction is also available when you view different versions of the file (See Browse Multiple Versions of a
File or Object) or when you use the find operation (see Find a File / Directory / Object) to locate a file.


List Media Associated with a Specific Job
The Restore by Jobs feature provides the facility to restore data from a specific data protection operation. This option also
includes the facility to list media associated with the job.
See List Media for specific Jobs for step-by-step instructions.



General Information
Other notable features provided by the list media operation are:
   Facility to Print or Save the prediction results. The files can be saved as either a tab-separated (.xls) or comma-separated (.csv) file.
   The List Media (Precise) operation can be run immediately or scheduled. When run immediately, the results are displayed in the CommCell Console; when scheduled, the results are saved in a specified file.
   Note that in both cases the result provides information on the total space required to restore the selected data.
   The List Media (Precise) operation can also email the prediction results by generating an alert (if configured) that contains them.
   The command line interface provides commands for some of the list media operations. See Command Line Interface - qlist for more information.
   The list media operation is displayed as a job (with appropriate controls, such as Suspend, Resume, and Kill) in the Job Controller. Appropriate event messages are also populated in the Event Viewer.
   The List Media (Precise) operation will not be supported if the MediaAgent used for the operation is not upgraded to the
   current software version.




Restore/Recover/Retrieve From Anywhere

The software automatically identifies and restores/recovers data from any configured library in the CommCell, even if the
media is not available in the original library in which the data protection operation was performed.
Consider the following example:
Client A is protected using MediaAgent A. For some reason, Client A's data must be restored to Client B using
MediaAgent B, with a compatible library. The following steps are required in such a situation:

1.   Export the media from MediaAgent A.
2.   Import the media in MediaAgent B.
3.   Select the files to be restored/recovered by performing a Browse on Client A's data.
4.   Restore/recover the files to Client B, by selecting the Destination as Client B in the Restore Options or Recovery
     Options.

The system automatically identifies and restores/recovers the data from the appropriate media.
If the media is used in another compatible library to perform the restore, the library may read the barcode differently. In
such a situation, update the media barcodes and then perform the restore/recover operation.
Note that when a media is imported into another library, it is displayed as a Media from a different library in the
CommCell Console; however, Data Recovery operations can still be performed from the media. (The media will not be
used for Data Protection operations.)
For File Level restores using Image Level or Image Level ProxyHost on Windows, see Considerations for File Level Restores
with the Image Level or Image Level ProxyHost iDataAgents.


                 All libraries (with compatible drive types), including stand-alone drives, support this feature.



See Also:
     Data Recovery Operations using Alternate Data Paths








Full System Restore - SharePoint Portal iDataAgents

Select the desired topic:
     Overview
     Perform a Full System Restore for SharePoint 2001
     Perform a Full System Restore for SharePoint 2003
SharePoint Portal Server
     All-In-One Farm
     Small Server Farm
     Medium Server Farm
Windows SharePoint Services
     All-in-One/Single Server
     Server Farm


Overview
The difference between a normal restore and a full system restore is the severity of the problem. Normally, if data is lost or
removed, it is recovered from the archives using the normal restore procedures. However, when a normal restore operation
cannot correct a software and/or hardware corruption problem, some level of full system restore is required.
When the client system (software, hardware, hard drives, etc.) is damaged or destroyed, a full server restore may be
required. If a System State or File System restoration is necessary on the system where the SharePoint Portal server
resides, refer to the appropriate File System's Disaster Recovery procedures. Using the SharePoint Document iDataAgent
for system recovery does not fully recover every part of the SharePoint Document server. We recommend using the
SharePoint Database iDataAgent to perform the recovery. Therefore, the SharePoint Database iDataAgent should also be
installed and full backups regularly scheduled.




Perform a Full System Restore for SharePoint 2001
The general procedure for a full system rebuild is described below. Refer to the documentation on each of the components
when rebuilding the system. For example, be sure to perform the File System full iDataAgent restore procedure prior to
restoring the data using the SharePoint Database iDataAgent.
Valid backups of the SharePoint Database and Windows File System must exist in order to perform these operations.
     To perform a full system restore for SharePoint:

1.   If necessary, rebuild the hardware as it existed before, using the same settings and networking configuration.
2.   Refer to QiNetix Component Recovery for the Windows File System to perform a full restore of the client computer.
3.   Uninstall the SharePoint Portal Server software. This is necessary because the restored SharePoint Portal Server
     services will appear to be operational; however, you will not be able to access any workspaces.
4.   Install the SharePoint Portal Server software exactly as it was before, using the same networking parameters and
     passwords that were previously set.
5.   Install the SharePoint Portal Database iDataAgent on the same drive on which it was previously installed. Use the same
     installation parameters and passwords.
6.   If software that may attempt to access the SharePoint database (such as virus scanning software) is currently running
     on the server, disable it at this time.
7.   Perform a full restore of the SharePoint database using the Restore Procedures documented for the SharePoint
     Database iDataAgent.




Perform a Full System Restore for SharePoint 2003







The general procedure for a full system rebuild is described below. Refer to the documentation on each of the components
when rebuilding the system. For example, be sure to perform the appropriate File System's Disaster Recovery procedures
prior to restoring the data using the SharePoint Database iDataAgent.
To perform these operations, valid backups of the SharePoint Database and Windows File System must exist.
All-In-One Farm

1.   From SharePoint Central Administration, disconnect from the Configuration Database.
2.   Follow the procedure for Full iDataAgent Restores for Windows 2000, XP and Server 2003. Full iDataAgent restore will
     bring back IIS and SharePoint.
        If reinstalling the OS, do not install IIS.
        If planning to overwrite the existing installation, uninstall IIS and SharePoint before beginning the restore.
3.   The SharePoint Services will be running but trying to access the Configuration Database.

     a.  From SharePoint Central Configuration, disconnect from the Configuration Database.
     b.  Create a new Configuration Database.
     c.  Configure the same component assignments for the server on the Configure Server Topology page.
     d.  On the Virtual Server list page, the IIS Virtual Servers for the Web Server may appear extended. From the Virtual
         Server Settings page for each extended virtual server, click Remove Windows SharePoint Services from
         virtual server. It might return an error that the virtual server is not in the Configuration Database but this will
         ensure the Virtual Server is available to be extended.
4.   Using the SharePoint 2003 Database iDataAgent:

     a. Restore the Portal Site databases.
     b. Restore any additional Content Databases.
     c. Restore the Portal Index.
5.   Run iisreset from the command line.
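
For reference, the iisreset step in these procedures is run from a command prompt on the server whose virtual servers were restored; with no arguments, iisreset stops and then restarts the IIS services. A minimal sketch:
C:\>iisreset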


Small Server Farm
Web Server

1.   From SharePoint Central Administration, disconnect from the Configuration Database.
2.   Follow the procedure for Full iDataAgent Restores for Windows 2000, XP and Server 2003. Full iDataAgent restore will
     bring back IIS and SharePoint.
        If reinstalling the OS, do not install IIS.
        If planning to overwrite the existing installation, uninstall IIS and SharePoint before beginning the restore.
3.   The SharePoint Services will be running but trying to access the Configuration Database.

     a.  From SharePoint Central Configuration, disconnect from the Configuration Database.
     b.  Create a new Configuration Database.
     c.  Configure the same component assignments for the server on the Configure Server Topology page.
     d.  On the Virtual Server list page, the IIS Virtual Servers for the Web Server may appear extended. From the Virtual
         Server Settings page for each extended virtual server, click Remove Windows SharePoint Services from
         virtual server. It might return an error that the virtual server is not in the Configuration Database but this will
         ensure the Virtual Server is available to be extended.
4.   Using the SQL iDataAgent, restore any databases used by the Portal Site (except for the Configuration Database).
5.   Using the SharePoint 2003 Database iDataAgent:

     a. Restore the Portal Site databases.
     b. Restore any additional Content Databases.
     c. Restore the Portal Index.
6.   Run iisreset from the command line.

Database Server

1.   From SharePoint Central Administration, unextend any virtual servers and disconnect from the Configuration Database.
2.   Follow procedure for Full System Restores for SQL Server iDataAgents.
     NOTE: When restoring the SQL Databases, do not restore the Configuration Database.
3.   From SharePoint Central Administration, create a new Configuration Database. Configure the same component
     assignments for the server on the Configure Server Topology page.
4.   Using the SharePoint 2003 Database iDataAgent on the Web Server:








     a. Restore the Portal Site databases.
     b. Restore any additional Content Databases.
     c. Restore the Portal Index.
5.   From the Web Server, run iisreset from the command line.

Entire Farm (Both Servers)

1.   Web Server:
     From SharePoint Central Administration, unextend any virtual servers and disconnect from the Configuration Database.
2.   Database Server:
     Follow procedure for Full System Restores for SQL Server iDataAgents.
     NOTE: When restoring the SQL Databases, do not restore the Configuration Database.
3.   Web Server:
     Follow procedure for Full iDataAgent Restores for Windows 2000, XP and Server 2003. Full iDataAgent restore will bring
     back IIS and SharePoint.
        If reinstalling the OS, do not install IIS.
        If planning to overwrite the existing installation, uninstall IIS and SharePoint before beginning the restore.
4.   Web Server:
     The SharePoint Services will be running but trying to access the Configuration Database.

     a.  From SharePoint Central Configuration, disconnect from the Configuration Database.
     b.  Create a new Configuration Database.
     c.  Configure the same component assignments for the server on the Configure Server Topology page.
     d.  On the Virtual Server list page, the IIS Virtual Servers for the Web Server may appear extended. From the Virtual
         Server Settings page for each extended virtual server, click Remove Windows SharePoint Services from
         virtual server. It might return an error that the virtual server is not in the Configuration Database but this will
         ensure the Virtual Server is available to be extended.
5.   Web Server:
     Using the SharePoint 2003 Database iDataAgent:

     a. Restore the Portal Site databases.
     b. Restore any additional Content Databases.
     c. Restore the Portal Index.
6.   Web Server:
     Run iisreset from the command line.


Medium Server Farm
Web Server

1.   From SharePoint Central Administration, disconnect from the Configuration Database.
2.   Follow the procedure for Full iDataAgent Restores for Windows 2000, XP and Server 2003. Full iDataAgent restore will
     bring back IIS and SharePoint.
        If reinstalling the OS, do not install IIS.
        If planning to overwrite the existing installation, uninstall IIS and SharePoint before beginning the restore.
3.   The SharePoint Services will be running but trying to access the Configuration Database.

     a.  From SharePoint Central Configuration, disconnect from the Configuration Database.
     b.  Create a new Configuration Database.
     c.  Configure the same component assignments for the server on the Configure Server Topology page.
     d.  On the Virtual Server list page, the IIS Virtual Servers for the Web Server may appear extended. From the Virtual
         Server Settings page for each extended virtual server, click Remove Windows SharePoint Services from
         virtual server. It might return an error that the virtual server is not in the Configuration Database but this will
         ensure the Virtual Server is available to be extended.
4.   Using the SQL iDataAgent, restore any databases used by the Portal Site (except for the Configuration Database).
5.   Using the SharePoint 2003 Database iDataAgent:

     a. Restore the Portal Site databases.
     b. Restore any additional Content Databases.
     c. Restore the Portal Index.
6.   Run iisreset from the command line.







Indexing Server

1.   From SharePoint Central Administration, disconnect from the Configuration Database.
2.   Follow the procedure for Full iDataAgent Restores for Windows 2000, XP and Server 2003. Full iDataAgent restore will
     bring back IIS and SharePoint.
        If reinstalling the OS, do not install IIS.
        If planning to overwrite the existing installation, uninstall IIS and SharePoint before beginning the restore.
3.   The SharePoint Services will be running but trying to access the Configuration Database.

     a. From SharePoint Central Configuration, disconnect from the Configuration Database.
     b. Reconnect to the existing Configuration Database.
     c. Configure the same component assignments for the server on the Configure Server Topology page.
4.   From the Web Server(s), using the SharePoint 2003 Database iDataAgent, restore the Portal Index.
5.   Force Propagate the Index(es) to the Search Server(s).

Database Server

1.   From SharePoint Central Administration, unextend any virtual servers and disconnect all servers from the Configuration
     Database.
2.   Follow procedure for Full System Restores for SQL Server iDataAgents.
     NOTE: When restoring the SQL Databases, do not restore the Configuration Database.
3.   From SharePoint Central Administration, create a new Configuration Database.
4.   Connect all previous servers to the new Configuration Database. Configure the same component assignments for the
     servers on the Configure Server Topology page.
5.   Using the SharePoint 2003 Database iDataAgent on the Web Server:

     a. Restore the Portal Site databases.
     b. Restore any additional Content Databases.
     c. Restore the Portal Index.
6.   From the Web Server(s), run iisreset from the command line.

Entire Farm (All Servers)

1.   From SharePoint Central Administration, unextend any virtual servers and disconnect all servers from the Configuration
     Database.
2.   Database Server:
     Follow procedure for Full System Restores for SQL Server iDataAgents.
     NOTE: When restoring the SQL Databases do not restore the Configuration Database.
3.   Web Server:
     Follow procedure for Full iDataAgent Restores for Windows 2000, XP and Server 2003. Full iDataAgent restore will bring
     back IIS and SharePoint.
        If reinstalling the OS, do not install IIS.
        If planning to overwrite the existing installation, uninstall IIS and SharePoint before beginning the restore.
4.   Web Server:
     The SharePoint Services will be running but trying to access the Configuration Database.

     a.  From SharePoint Central Configuration, disconnect from the Configuration Database.
     b.  Create a new Configuration Database.
     c.  Configure the same component assignments for the server on the Configure Server Topology page.
     d.  On the Virtual Server list page, the IIS Virtual Servers for the Web Server may appear extended. From the Virtual
         Server Settings page for each extended virtual server, click Remove Windows SharePoint Services from
         virtual server. It might return an error that the virtual server is not in the Configuration Database but this will
         ensure the Virtual Server is available to be extended.
5.   Web Server:
     Using the SharePoint 2003 Database iDataAgent:

     a. Restore the Portal Site databases.
     b. Restore any additional Content Databases.
     c. Restore the Portal Index.
6.   Web Server:
     Run iisreset from the command line.








All-in-One/Single Server

1.   Follow the procedure for Full System Restores for SQL Server iDataAgents.
     NOTE: Restore the SQL Databases but do not restore the SharePoint Configuration Database.
2.   From SharePoint Central Administration:

     a. Create a new Configuration Database.
     b. On the Virtual Server list page, the IIS Virtual Servers for the Web Server may appear extended. From the Virtual
        Server Settings page for each extended virtual server, click Remove Windows SharePoint Services from virtual
        server. It might return an error that the virtual server is not in the Configuration Database but this will ensure the
        Virtual Server is available to be extended.
     c. Extend the virtual server and create a new Content Database with a specified database name so you can delete that database later.
     d. Delete Top-Level Site Collection.
     e. Add existing Content Databases for that virtual server.
     f. Delete Content Database with specified Database name.
     g. Repeat Steps b. through f. for all virtual servers.
3.   From the Web Server(s), run iisreset from the command line.



Server Farm
Web Server

1.   Follow the procedure for Full iDataAgent Restores for Windows 2000, XP and Server 2003. Full iDataAgent restore will
     bring back IIS and SharePoint.
        If reinstalling the OS, do not install IIS.
        If planning to overwrite the existing installation, uninstall IIS and SharePoint before beginning the restore.
2.   From SharePoint Central Administration:

     a. Create a new Configuration Database.
     b. On the Virtual Server list page, the IIS Virtual Servers for the Web Server may appear extended. From the Virtual
        Server Settings page for each extended virtual server, click Remove Windows SharePoint Services from
        virtual server. It might return an error that the virtual server is not in the Configuration Database but this will
        ensure the Virtual Server is available to be extended.
     c. Extend the virtual server and create a new Content Database with a specified database name so you can delete that database later.
     d. Delete Top-Level Site Collection.
     e. Add existing Content Databases for that virtual server.
     f. Delete Content Database with specified Database name.
     g. Repeat Steps b. through f. for all virtual servers.
3.   Run iisreset from the command line.

Database Server

1.   Follow the procedure for Full System Restores for SQL Server iDataAgents.
     NOTE: Restore the SQL Databases but do not restore the SharePoint Configuration Database.
2.   From SharePoint Central Administration:

     a. Create a new Configuration Database.
     b. On the Virtual Server list page, the IIS Virtual Servers for the Web Server may appear extended. From the Virtual
        Server Settings page for each extended virtual server, click Remove Windows SharePoint Services from virtual
        server. It might return an error that the virtual server is not in the Configuration Database but this will ensure the
        Virtual Server is available to be extended.
     c. Extend the virtual server and create a new Content Database with a specified database name so you can delete that database later.
     d. Delete Top-Level Site Collection.
     e. Add existing Content Databases for that virtual server.
     f. Delete Content Database with specified Database name.
     g. Repeat Steps b. through f. for all virtual servers.
3.   From the Web Server(s), run iisreset from the command line.



Back To Top








Command Line Interface

Select the desired topic:
   Overview
      Configuration
      Log In Sessions
      Argument Files
      Date/Time Format
   Saving a Job as a Script
      Script Considerations
   Writing and Editing Scripts
   QCommands
      qcreate
         backupset
         sp (storage policy)
         spcopy (storage policy copy)
         subclient
         user
      qdelete
         backupset
         client
         dataagent
         sp (storage policy)
         spcopy (storage policy copy)
         subclient
      qgeterrorstring
      qinfo
         backupset
         instance
         subclient
      qlist
         alert
         backupset
         client
         dataagent
         drivepool
         instance
         job
         jobsummary
         library
         location
         masterdrivepool
         media
         mediaagent
         quickmedia
         sp (storage policy)
         spcopy (storage policy copy)
         subclient
         vtp (VaultTracker policy)
      qlogin
      qlogout
      qmodify
         galaxypassword
         instance
         password
         subclient
      qoperation
         adduser
         agedata
         automaticupdate
         auxcopy
         backup
         capture (CommCell Migration)
         erbackup
         jobcontrol
         jobretention
         media
         merge (CommCell Migration)
         move
         restore
         vaulttracker
   Support Information - Command Line Interface
   Considerations for Running GxCmd Commands
   How To
Related Topics:
   Error Codes
   Capabilities and Permitted Actions
   Running RMAN Scripts using the Command Line Interface
   Troubleshooting
      Thread Libraries (FreeBSD)








Overview
QCommands provide access to several basic functions through the command line, and can be integrated into your own
scripts and scheduling programs. You can write scripts using the commands listed below. Note that scripts can also be
generated through the CommCell Console for some features using the Save as Script option. All commands have consistent
output in all situations to facilitate easier script writing.
Although several command line options are supported, note that the CommCell Console is the recommended
method for managing your CommCell, as it provides comprehensive support for all options available in the software. See
CommCell Console for more information.
Configuration
No special configuration is required to use the command line interface. The commands are integrated with the Base
package, and are therefore available on all computers which have any CommServe, MediaAgent, or Agent software
installed.
In order for the commands to function, the Galaxy Commands Manager service should be up and running on the
CommServe. The Galaxy Commands Manager is a service that is installed with the CommServe, and is responsible for
handling command line requests and forwarding them to the Event Management Service of the target CommServe. See
Services for more information.
Log In Sessions
Using the qlogin command, you can start a user login session, removing the need to log in for every command. Once the
qlogin command is successful, the login session remains valid for all computers until you explicitly log out using the
qlogout command. This is true for all command sessions, not just the one for which the login took place.
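A typical session is sketched below. The -p option is the password argument described under Username and encrypted password later in this section; the -u option shown for the user name is an assumption (it mirrors the -u option of qcreate user), so refer to the qlogin command description for the exact syntax.
E:\commandline>qlogin -u user1 -p password1       (log in once; -u is assumed syntax)
E:\commandline>qlist client                       (subsequent commands reuse the session)
E:\commandline>qlogout                            (end the session explicitly)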

Argument Files
Some commands can read command line arguments from a file. This is useful for complex commands that require many
arguments. If a command can accept an argument file, an Argument File section will appear under Options in the
command documentation below.
You can provide this file as an input to the command through the "-af <filename>" option. Commands can accept both
an input file and command line arguments. However, when both are supplied, arguments provided through the input file
override the command line options.
Commands which support file inputs
    qoperation backup
    qoperation restore
    qoperation auxcopy
    qcreate backupset
    qcreate subclient
    qmodify subclient
    qmodify instance
    qdelete backupset
    qdelete subclient
Format

The input file contains a list of options and their value(s). Both options and values should always start on a new line (white
space at the beginning and end of each line is ignored). Options are surrounded by "[" and "]". Any line that begins with "["
and ends with a "]" is treated as an option. An option can take more than one value, and each value should be specified on
a separate line. All lines that start with "#" are treated as comments and are ignored.
Sample input file
#   Sample input file which has three options: option1, option2, and option3.
#   option1 takes one value "value1"
#   option2 takes two values "value2" and "value3"
#   option3 takes one value "value4"
[option1]
value1
[option2]
value2
value3
[option3]
value4
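
As a concrete sketch, the following hypothetical argument file (the file name and values are illustrative only) uses the argument file keys documented for qcreate backupset later in this section, and is passed to the command through the -af option:
# bs1_args.txt - hypothetical argument file for qcreate backupset
[client]
client1
[dataagent]
Q_WIN2K_FS
[backupset]
bs1
[sp]
sp1

E:\commandline>qcreate backupset -af bs1_args.txt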

Date/Time Format
For commands in which a time must be specified, the time must be entered in the following format:
mm/dd/yyyy hh:mm[:ss]
Example:
08/30/2005 05:58
Or
08/30/2005 05:58:15

                          The commands do not support time zones. Any time specified using the command line is
                          assumed to be in the CommServe time zone.


Back to Top



Saving a Job as a Script
Data Protection, Recovery, Auxiliary Copy, Disaster Recovery Backup, and Data Aging operations and their selected options
can be saved as script files using the CommCell Console, which can later be executed from the command line. See
Command Line Interface - Save a Job as a Script for more information.
Script Considerations
Consider the following before you save a job as a script:
   When you use the CommCell Console to save a script, only the options that are available for the corresponding
   QCommand are saved. For example, the Job Retry option is available in the CommCell Console for several operations
   such as backup and restore. However, this option is not available in the corresponding QCommand. Therefore, if you
   Save a Script with Job Retry options selected, they will be ignored when the script is generated.
   Saving an operation as a script through the CommCell Console creates the script and an input file on the CommServe
   computer. After they have been saved, the script can be run on the CommServe, or you can move it to any other client
   computer and run it there. If you move the script, you must move both the script and the input file to the client
   computer from which the script will be executed.
      On Windows, add a rem comment to the line that sets the GALAXY_BASE variable to the CommServe path and remove
      the rem comment from the line that sets it to the client's path. The batch file contains instructions on how to do this.
      On Unix, the script must be edited to reflect the absolute path for the GALAXY_BASE variable only in cases where you
      have added a node to a cluster beyond the first physical and virtual node. The batch file contains instructions on how
      to do this.
   When you use the CommCell Console to save a script, certain characters (for example, the left bracket [ and number
   sign #) in front of an object name (for example, #csmacs as the subclient name) may cause the script to not work
   correctly when it is run from the command line. If your script does not work from the command line, check it for these
   characters. You may need to rename the object by removing the characters, and then rerun your script.
   When you use the CommCell Console to save a backup script for a Microsoft SharePoint 2003 Document iDataAgent and
   run it from the command line, the script will only back up the latest version of a document. This will happen even when
   you choose the "Backup All Versions" option when generating the script.
Back to Top



Writing and Editing Scripts
When writing or editing scripts, note the following:
   Character Limitations
     Commands and their options are all contained on a single line. Check your operating system guidelines to ensure you
     do not exceed any line length limitations.
    Unicode is not supported by the command line interface.
  DataMigrator Agent for File System with Data Classification

                           For the DataMigrator Agent for File System with Data Classification, the DataClassSet is
                           represented internally as a migration set, and the DataClassSet subclient is represented
                           internally as a subclient

     When executing a Data Classification-enabled DataMigrator script that requires both the DataClassSet and the
     DataClassSet subclient, provide the DataClassSet with the same name as that for the DataClassSet subclient. For
     Data Classification, the DataClassSet name is the same as the DataClassSet subclient name. For example, consider
     the following command:
     E:\commandline>qoperation backup -c COMPUTERNAME -a Q_WINFS_MIG -b BACKUPSET -s SUBCLIENT -t Q_FULL

     To migrate data for a DataClassSet and a DataClassSet subclient, the command might appear as follows:
     E:\commandline>qoperation backup -c BERRY -a Q_WINFS_MIG -b department01 -s department01 -t Q_FULL

     Since Data Classification is an enabler, any command for the DataMigrator Agent for Windows File System with Data
     Classification always requires that you use the DataMigrator Agent for Windows File System parameter (i.e.,
     Q_WINFS_MIG).
  Username and encrypted password
    Before using any of the commands, the user must log into the CommServe using the qlogin command. The -p
    argument of the qlogin command provides a password for this purpose. You can obtain this encrypted value by
    saving any supported operation (i.e., a backup or restore job) as a script through the CommCell Console. This creates
    the qlogin string and encrypted password for the user that is currently logged on to the CommCell Console. You can
    then copy and reuse the qlogin string from that script in other scripts. For more information, see the description for
    the qlogin command.
  Argument Values
    When specifying paths in command line arguments, always use the full path. In most cases, relative paths are not
    supported. However, they can be used if arguments are supplied using an input file.
    Avoid using the following characters when specifying argument values:
    "%*?-\$
     On Windows, when entering argument values that contain a space or special character, be sure to put the argument
     value in quotes (e.g., -argument "argument value"). On the Unix platform, use "\"\"" notation (e.g., -argument
     "\"argument value\""). If the argument value is an empty string, enter "" on Windows or "\"\"" on Unix. A short
     illustration of these quoting rules appears after this list.

                           Argument values that you use in a command string or arguments file are case-sensitive.



  Miscellaneous
     If you are performing a backup of multiple subclients (for example, a backup at the backup set level), the command
     line will default to asynchronous mode.
  Platforms
     QCommands cannot be executed on the NetWare platform. However, QCommand scripts for the NetWare agents can
     be run on the CommServe, or another Windows or Unix client in the CommCell.
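
The following sketch illustrates the quoting rules described under Argument Values, using the documented qcreate backupset options and a hypothetical backup set name that contains a space:
On Windows:
E:\commandline>qcreate backupset -c client1 -a Q_WIN2K_FS -n "backup set 1"
On Unix:
qcreate backupset -c client1 -a Q_WIN2K_FS -n "\"backup set 1\""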
Back to Top



qcreate
This command creates an entity based on the specified command.
Usage: qcreate command <command-options-arguments> [-h]

Supported Commands:
  backupset
  sp (storage policy)
   spcopy (storage policy copy)
   subclient
     user


backupset
This command creates a backup set under the given data agent or instance with the specified name.
Usage: qcreate backupset -c client -a dataagenttype -i instance -n backupset [-sp storagepolicy] [-t
NORMAL|DEFAULT|ONDEMAND] [-af Argument Filepath] [-h]

Description: This command creates a new backup set with the given backup set name under the specified client and
agent. An instance name is required for those agents that allow backup sets under instances. This command also allows the
user to create default backup sets and on-demand backup sets. The default subclient created under the backup set can
also be associated with a storage policy (the subclient will remain unassigned if the storage policy name is not passed).
Upon successful completion, qcreate backupset displays the message "Created backupset successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "backupset:
Error errorcode: errordescription"
Options:
Command Line
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-i Instance name, required for a few agents
-n Name of the backup set to be created
-t Type of backup set (NORMAL, DEFAULT, or ONDEMAND)
-sp Storage policy name to which default subclients are to be associated
-af Reads arguments from a file
-h Displays help

Argument File
client           Client computer name
dataagent        Agent type installed on client computer (see Argument Values - Agent Types)
backupset        Backup set name to be created
sp               Storage policy name to be associated with all subclients in the backup set
backupsettype    Backup set type (NORMAL, DEFAULT, or ONDEMAND)
compressionat    Compression at client or MediaAgent (CLIENT|MA)
networkagents    Number of network agents
prebackup        Prebackup command name
postbackup       Postbackup command name
prescan          Prescan command name
postscan         Postscan command name

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Create a backup set with name bs1 under client client1 and Agent Q_WIN2K_FS.
E:\commandline>qcreate backupset -c client1 -a Q_WIN2K_FS -n bs1

Back to Top


sp
This command creates a new storage policy (for data protection).
Usage: qcreate sp -sp storagepolicy -m mediaagent -l library -d drivepool -srp scratchpool [-isp
incstoragepolicy] [-h]

Description: This command creates a new storage policy for data protection. The drive pool name to which the backup
data of Primary copy is directed, the incremental storage policy name, and the scratch pool from which the copy obtains
new media can also be specified.
Upon successful completion, qcreate sp displays the message "Created storage policy successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "sp: Error
errorcode: errordescription"
Options:
-sp   Storage policy name to be created
-m    MediaAgent name
-l    Media library name
-d    Drive pool name to which backup data of Primary copy is directed
-srp Name of the scratch pool from which the copy obtains new media
-isp Incremental storage policy name to be associated with the new storage policy name
-h    Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Create a storage policy with name sp1 using MediaAgent magent1 and library maglib1.
E:\commandline>qcreate sp -sp sp1 -m magent1 -l maglib1

Back to Top


spcopy
This command creates a new secondary copy (synchronous) for the specified storage policy.
Usage: qcreate spcopy -sp storagepolicy -n copy -m mediaagent -l library -d drivepool -srp scratchpool
[-h]

Description: This command creates a new secondary copy of the specified storage policy. The new copy will be a
synchronous copy, which means that everything from the primary copy (including full, differential, and incremental
backups) is copied to the secondary copy. By default, the new copy is enabled for media reads/writes. The drive pool
name to which the backup data of the Primary copy is directed, as well as the scratch pool from which the copy obtains new
media, can also be specified.
Upon successful completion, qcreate spcopy displays the message "Created storage policy copy successfully" in the
CommCell Console. In case of an error, an error code and the error description are displayed in the following format:
"qcreate spcopy: Error errorcode: errordescription"
Options:
-sp Storage policy name for which a new copy is to be created
-n   Secondary copy name
-m   MediaAgent name
-l   Media library name
-d   Drive pool name to which backup data of the Primary copy is directed
-srp Name of the scratch pool from which the copy obtains new media
-h   Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Create a copy with name copy1 for storage policy sp1, MediaAgent magent1, library maglib1.
E:\commandline>qcreate spcopy -n copy1 -sp sp1 -m magent1 -l maglib1

Back to Top


subclient
This command creates a subclient under the given backup set or instance with the specified storage policy and content.
Usage: qcreate subclient -c client -a dataagenttype -b backupset -i instance -n subclient -sp
storagepolicy -f content [-t SYSTEMSTATE] [-af argsfilepath] [-h]








Description: This command creates a new subclient with the given subclient name and content under the specified client
and agent. The instance name and backup set name are required for certain agents, and should be specified when
applicable. The new subclient will be associated with the specified storage policy. A system state subclient can also be
created using this command.
Upon successful completion, qcreate subclient displays the message "Created subclient successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "subclient: Error
errorcode: errordescription"
Options:
Command Line
-c  Client computer name
-a  Agent type installed on client computer (see Argument Values - Agent Types)
-b  Backup set name, required for certain agents
-i  Instance name, required for certain agents
-n  Name of the subclient to be created
-sp Name of the storage policy to be associated with the subclient
-f  File/Directory Path to be included as content in the new subclient
-t  Subclient type to create system state subclient (SYSTEMSTATE)
-h  Displays help

Argument File
client            Client computer name
dataagent         Agent type installed on client computer (see Argument Values - Agent Types)
backupset         Backup set name
subclient         Subclient name to be created
sp                Storage policy name to be associated with the subclient.
content           Subclient content (list of files/directories)
subclienttype     Subclient type to create system state subclient (SYSTEMSTATE)
compressionat     Compression at client or MediaAgent (CLIENT|MA)
networkagents     Number of Network Agents
prebackup         PreBackup command name
postbackup        PostBackup command name
prescan           Prescan command name
postscan          PostScan command name

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Create a subclient with name sc1 under client client1 and Agent Q_WIN_FS and backup set bs1 with content c:\ and
storage policy sp1.
E:\commandline>qcreate subclient -n sc1 -c client1 -a Q_WIN_FS -b bs1 -f "c:\" -sp sp1

Back to Top


user
This command creates a user account.
Usage: qcreate user -u username [-p password] [-h]

Description: This command creates a new user account with the supplied user name and password. If the password is not
specified at the command line, you will be prompted for it.
Upon successful completion, qcreate user displays the message "Created a new user successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "user: Error
errorcode: errordescription"
Options:
-u   User name
-p   Password
-h   Displays help

Diagnostics: Possible exit status values are:






0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Create a user user1 with password password1.
E:\commandline>qcreate user -u user1 -p password1

Back to Top



qdelete
This command deletes an entity based on the specified command.
Usage: qdelete command <command-options-arguments> [-h]

Supported Commands:
  backupset
  client
  dataagent
  sp (storage policy)
  spcopy (storage policy copy)
  subclient


backupset
This command deletes a given backup set.
Usage: qdelete backupset -c client -a dataagenttype -b backupset [-af ArgumentFilepath] [-h]

Description: This command allows you to delete a given backup set.
Upon successful completion, qdelete backupset displays the message "Deleted backupset successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "backupset:
Error errorcode: errordescription"
Options:
Command Line
-c  Client computer name
-a  Agent type installed on client computer (see Argument Values - Agent Types)
-b  Name of the backup set to be deleted
-af Reads arguments from a file
-h  Displays help

Argument File
client       Client computer name
dataagent    Agent type installed on client computer (see Argument Values - Agent Types)
backupset    Backup set name to be deleted

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Delete a backupset with name bs1 under client client1 and Agent Q_WIN2K_FS.
E:\commandline>qdelete backupset -c client1 -a Q_WIN2K_FS -b bs1

Back to Top


client
This command deconfigures and deletes a given client.







Usage: qdelete client -c client [-y] [-h]

Description: This command deconfigures and deletes a given client. All the iDataAgents configured under the client are
also deleted automatically. By default, this command asks for confirmation before deletion but this prompting can be
suppressed by using the -y option.

Upon successful completion, qdelete client displays the message "Deleted the client successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "client: Error
errorcode: errordescription"
Options:
-c   Client computer name
-y   Delete without confirmation
-h   Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Delete a client with name client1.
E:\commandline>qdelete client -c client1



Back to Top


dataagent
This command deconfigures and deletes an iDataAgent.
Usage: qdelete dataagent -c client -a iDataAgent [-y] [-h]

Description: This command deconfigures and deletes an iDataAgent. By default, this command asks for confirmation
before deletion but this prompting can be suppressed by using the "-y" option.
Upon successful completion, qdelete dataagent displays the message "Deleted the iDataAgent successfully" in the
CommCell Console. In case of an error, an error code and the error description are displayed in the following format:
"dataagent: Error errorcode: errordescription"
Options:
-c   Client computer name
-a   iDataAgent to be deleted (see Argument Values - Agent Types)
-y   Delete without confirmation
-h   Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Delete an iDataAgent Q_ORACLE under client client1.
E:\commandline>qdelete dataagent -c client1 -a Q_ORACLE



Back to Top


sp
This command deletes a given storage policy.
Usage: qdelete sp -sp storagepolicy [-h]








Description: This command allows you to delete a given storage policy.
Upon successful completion, qdelete sp displays the message "Deleted storage policy successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "sp: Error
errorcode: errordescription"
Options:
-sp Storage policy name to be deleted
-h   Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Delete a storage policy with name sp1.
E:\commandline>qdelete sp -sp sp1

Back to Top


spcopy
This command deletes a given storage policy copy.
Usage: qdelete spcopy -sp storagepolicy -spc copy [-h]

Description: This command allows you to delete a given storage policy copy.
Upon successful completion, qdelete spcopy displays the message "Deleted storage policy successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "spcopy: Error
errorcode: errordescription"
Options:
-sp   Storage policy name
-spc Storage policy copy name to be deleted
-h    Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Delete a storage policy copy with name spc1 under storage policy sp1.
E:\commandline>qdelete spcopy -sp sp1 -spc spc1

Back to Top


subclient
This command deletes a given subclient.
Usage: qdelete subclient -c client -a dataagenttype -b backupset -s subclient [-af ArgumentFilepath] [-
h]

Description: This command allows you to delete a given subclient.
Upon successful completion, qdelete subclient displays the message "Deleted subclient successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "subclient: Error
errorcode: errordescription"
Options:
Command Line
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-b Name of the backup set
-s Name of the subclient to be deleted
-af Reads arguments from a file
-h Displays help

Argument File
client    Client computer name
dataagent Agent type installed on client computer (see Argument Values - Agent Types)
backupset Backup set name
subclient Subclient name to be deleted

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Delete a subclient with name sc1 under client client1 and Agent Q_WIN2K_FS and backupset bs1.
E:\commandline>qdelete subclient -c client1 -a Q_WIN2K_FS -b bs1 -s sc1

Back to Top



qgeterrorstring
This command displays the error description of a given error number.
Usage: qgeterrorstring error number [-h]

Description: If the specified error number is valid, the error description for the error code is displayed. Otherwise, an error
code and the error description are displayed in the following format: "qgeterrorstring: Error errorcode: errordescription"
Options:
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:

1.   Display error description of error number 10.
     E:\commandline>qgeterrorstring 10
     Connection to the CommServe terminated, Relogin required

2.   Display error description of invalid error number.
     E:\commandline>qgeterrorstring 4543584
     qgeterrorstring: Error 0x771: Invalid error no 4543584


Back to Top



qinfo
This command displays information about an entity.
Usage: qinfo command <command-options-arguments> [-h]

Supported Commands:
     backupset
     instance
     subclient


backupset
This command displays information about a backup set.







Usage: qinfo backupset -c client -a dataagenttype -i instance -b backupset [-h]

Description: This command displays information about a given backup set in the specified client and iDataAgent. If the
iDataAgent supports instances above backup sets, only those backup sets under the specified instance are listed.
In case of an error, an error code and the error description are displayed in the following format: "backupset: Error
errorcode: errordescription"
Options:
-c   Client computer name
-a   Agent type installed on client computer (see Argument Values - Agent Types)
-i   Instance name, required for a few agents
-b   Backup set name
-h   Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Display information about a backup set defaultBackupSet in client client1 and Agent Q_WIN_FS.
E:\commandline>qinfo backupset -c client1 -a Q_WIN_FS -b defaultBackupSet
Name       : defaultBackupSet
Default : Yes
Ondemand : No

Back to Top


instance
This command displays information about an instance under the given client and iDataAgent.
Usage: qinfo instance -c client -a dataagenttype -i instancename [-h]

Description: This command displays information about a given instance in the specified client and iDataAgent. If the
iDataAgent does not support instances, the default instance name is returned. This command is mainly used for instances
in Oracle agents, where the command line and log storage policies are displayed. For other instances, just the name is displayed.
In case of an error, an error code and the error description are displayed in the following format: "instance: Error
errorcode: errordescription"
Options:
-c   Client computer name
-a   Agent type installed on client computer (see Argument Values - Agent Types)
-i   Instance name, required for a few agents
-h   Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Display information about an instance OEMREP in client client1 and Agent Q_ORACLE.
E:\commandline>qinfo instance -c client1 -a Q_ORACLE -i OEMREP
Name                              : OEMREP
Commandline storage policy        : battra_t(battra)-DP(2)
Log storage policy                : battra_t(battra)-DP(2)

Back to Top


subclient
This command displays information about a subclient under the given instance or backup set.







Usage: qinfo subclient -c client -a dataagenttype -i instance -b backupset -s subclient [-h]

Description: This command displays information about a subclient in the specified client, iDataAgent, and backup
set/instance. The backup set name and instance name should be specified based on the iDataAgent, as certain agents place
subclients under backup sets, and others place them under the instance.
In case of an error, an error code and the error description are displayed in the following format: "subclient: Error
errorcode: errordescription"
Options:
-c    Client computer name
-a    Agent type installed on client computer (see Argument Values - Agent Types)
-i    Instance name, required for a few agents
-b    Backup set name, required for a few agents
-s    Subclient name
-h    Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Display information about a subclient sc1 in the client client1, Agent Q_WIN_FS, and backup set bs1.
E:\commandline>qinfo subclient -c client1 -a Q_WIN_FS -b bs1 -s sc1
Name               : sc1
Default            : Yes
Storage policy : battra_t(battra)-DP(2)

Back to Top



qlist
This command lists various entities based on the specified command.
Usage: qlist command <command-options-arguments> [-h]

Supported Commands:
     alert
     backupset
     client
     dataagent
     drivepool
     instance
     job
     jobsummary
     library
     location
     masterdrivepool
     media
     mediaagent
     quickmedia
     subclient
     sp (storage policy)
     spcopy (storage policy copy)
     vtp (VaultTracker policy)


alert
This command lists all of the alerts configured in the CommCell.







Usage: qlist alert [-h]

Description: This command lists the names of all the configured alerts. When more than one alert is found, the alerts are
listed one alert per line, and are displayed in the CommCell Console. No message is displayed in the CommCell Console
when alerts are not found.
In case of an error, an error code and the error description are displayed in the following format: "alert: Error errorcode:
errordescription"
Options:
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the alerts configured.
E:\commandline>qlist alert
NAME           ALERTCATEGORY      ALERTTYPE              CREATOR
----           -------------      ---------              -------
Backup         Job Management     Data Protection        admin
Aux Copy       Job Management     Auxiliary Copy         admin

Back to Top


backupset
This command lists all of the backup sets under the given client, agent and instance (if applicable).
Usage: qlist backupset -c client -a dataagenttype -i instance [-h]

Description: This command lists the names of all the backup sets in the specified client and agent. If the agent supports
instances above the backup set level, then only those backup sets under the specified instance are listed. When more than
one backup set is found, the backup sets are listed one backup set per line, and are displayed in the CommCell Console. No
message is displayed when backup sets are not found.
In case of an error, an error code and the error description are displayed in the following format: "backupset: Error
errorcode: errordescription"
Options:
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-i Instance name, required for certain agents
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the Backup sets in the client client1, Agent Q_WIN2K_FS.
E:\commandline>qlist backupset –c client1 –a Q_WIN2K_FS
defaultBackupSet
bs1

Back to Top


client
This command lists all of the client computers configured in the CommCell.
Usage: qlist client [-h]




Description: This command lists names of all the client computers configured in the CommCell. In addition to the client
name, the operating system and installation state are also displayed. When more than one client is found, the clients are
listed one client per line, and are displayed in the CommCell Console. No message is displayed when clients are not found.
In case of an error, an error code and the error description are displayed in the following format: "client: Error errorcode:
errordescription"
Options:
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the clients configured in the CommCell.
E:\commandline>qlist client
NAME                 OS                         ACTIVE
----                 --                         ------
spock                Windows Server 2003        Yes
caput                AIX                        Yes
ursa_major           Windows XP                 Yes
metal                Windows 2000               Yes
virgo                SunOS                      Yes
aries                HP-UX                      Yes
chara                NetWare                    Yes
copper-Irina         Windows Server 2003        No

Back to Top


dataagent
This command lists all of the agents installed on the given client computer.
Usage: qlist dataagent -c client [-h]

Description: This command lists all of the agents installed on the specified client computer. For a list of possible agents
that qlist dataagent can return, see Argument Values - Agent Types. When more than one agent is found, the agents are
listed one agent per line, and are displayed in the CommCell Console. No message is displayed in the CommCell Console
when no agents are found.
In case of an error, an error code and the error description are displayed in the following format: "dataagent: Error
errorcode: errordescription"
Options:
-c Client computer name
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the agents configured in the client client1.
E:\commandline>qlist dataagent –c client1
NAME           DESCRIPTION        ACTIVE
----           --                 ------
Q_HPUX_FS      File System        Yes

Back to Top


drivepool




This command lists all of the drive pools under the specified master drive pool.
Usage: qlist drivepool -m mediaagent -l library -md masterdrivepool [-h]

Description: This command lists the names of all the drive pools available under the specified MediaAgent, library and
master drive pool. When more than one drive pool is found, the drive pools are listed as one drive pool per line, and are
displayed in the CommCell Console. No message is displayed in the CommCell Console when drive pools are not found.
In case of an error, an error code and the error description are displayed in the following format: "drivepool: Error
errorcode: errordescription"
Options:
-m MediaAgent name
-l Media library name
-md Master drive pool name
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the drive pools available in the MediaAgent client1, media library maglib1 and master drive pool
MasterPool_Magnetic_33.
E:\commandline>qlist drivepool –m client1 –l maglib1 –md MasterPool_Magnetic_33
DrivePool_Magnetic_33

Back to Top


instance
This command lists all of the instances under the given client and agent.
Usage: qlist instance -c client -a dataagenttype [-h]

Description: This command lists names of all the instances in the specified client and agent. If the agent does not support
instances, the default instance name is returned. When more than one instance is found, the instances are listed as one
instance per line, and are displayed in the CommCell Console. No message is displayed in the CommCell Console when
instances are not found.
In case of an error, an error code and the error description are displayed in the following format: "instance: Error
errorcode: errordescription"
Options:
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the Instances in the client client1, Agent Q_WIN2K_FS.
E:\commandline>qlist instance –c client1 –a Q_WIN2K_FS
defaultInstanceName

Back to Top


job
This command lists job details for a given job ID, or for all active jobs under a given client, agent, instance, backup set, or subclient.
Usage: qlist job [-co i|o|s|p|r] [-j jobid] [-jt jobtype] [-js Running|Suspended|Waiting|Pending] [-c client] [-a iDataAgent Type] [-i instance] [-b backupset] [-s subclient] [-h]

Description: This command displays job details such as job ID, type, phase, status, and failure reason. When a job ID is
specified, its details are displayed irrespective of the job status. Only active jobs are displayed when you query jobs based
on client, agent, instance, backup set, or subclient. This commands displays all active jobs in the CommServe when no
option is specified. Additionally, you can filter the jobs based on the job type or status. Only those jobs with the specified
type/status are displayed in the CommCell Console. When more than one job is found, the jobs are listed one job per line,
and are displayed in the CommCell Console. The system displays a message "No jobs to display" when there are no jobs.
In case of an error, an error code and the error description are displayed in the following format: "job: Error errorcode:
errordescription"
Options:
-co Columns to display (i|o|s|p|r)
-j Job ID
-jt Filter by job type
-js Filter by job status
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-i Instance name
-b Backup set name
-s Subclient name
-h Displays help

Column Codes:
Column codes are used to filter the columns that are displayed by the command. Only those columns that are set using the
"-co" option are displayed.
i    Job ID
o    Job type
s    Job status
p    Job phase
r    Failure/pending reason

By default, i, o, s, p columns are enabled.
Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:

1.   List job details of a given job ID.
     E:\commandline>qlist job –j 1
     JOBID   OPERATION   STATUS    PHASE
     -----   ---------   ------    -----
     1       Backup      Running   Scan

2.   List job details of a given job ID and display only the ID and Status.
     E:\commandline>qlist job –j 1 –co is
     JOBID         STATUS
     -----         ------
     1             Running

3.   Display failure/pending reason (ID and the message) for a job.
     E:\commandline>qlist job -j 1 -co r
     FAILUREREASON
     -------------
     13187897

     Message for job failure/pending reasons:

     1      13187897     ->   Job has been killed by the user

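4.   List all active Running jobs on a given client (an illustrative combination of the documented -c and -js options; the
     client name is hypothetical, and the default i, o, s, p columns are displayed).
     E:\commandline>qlist job -c client1 -js Running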

Back to Top




jobsummary
This command displays summary of the jobs in the CommServe.
Usage: qlist jobsummary [-c client] [-a iDataAgent] [-i instance] [-b backupset] [-s subclient] [-h]

Description: This command displays the summary of all the jobs currently active in the CommServe. Jobs are classified
into Running, Suspended, Pending, Queued and Waiting states. You can filter the jobs by limiting them to a client, agent,
instance, backup set or subclient.
In case of an error, an error code and the error description are displayed in the following format: "jobsummary: Error errorcode: errordescription"
Options:
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-i Instance name
-b Backup set name
-s Subclient name
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Display job summary of all jobs of client cl1.
E:\commandline>qlist jobsummary –c cl1
RUNNING   PENDING   WAITING   QUEUED   SUSPENDED   TOTAL
1         10        0         4        1           16

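Display the job summary for a single agent on the same client (an illustrative use of the documented -a option; the agent type shown is assumed to be installed on cl1).
E:\commandline>qlist jobsummary -c cl1 -a Q_WIN2K_FS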
Back to Top


library
This command lists all of the media libraries under the specified MediaAgent.
Usage: qlist library [-m mediaagent] [-h]

Description: This command lists the names of all types of media libraries configured in the specified MediaAgent. If a
MediaAgent is not specified, all libraries are listed. When more than one library is found, the libraries are listed one media
library per line, and are displayed in the CommCell Console. No message is displayed when media libraries are not found.
In case of an error, an error code and the error description are displayed in the following format: "library: Error errorcode:
errordescription"
Options:
-m MediaAgent name
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the media libraries configured in the MediaAgent client1.
E:\commandline>qlist library –m client1
maglib1

Back to Top


location
This command lists all of the locations (both transit and stationary) configured in the CommCell.




Usage: qlist location [-h]

Description: This command lists all stationary and transit locations configured in the CommCell. Location name and type
(stationary/transit) are displayed for each location. When more than one location is found, the locations are listed one
location per line, and are displayed in the CommCell Console. No message is displayed in the CommCell Console when no
locations are configured.
In case of an error, an error code and the error description are displayed in the following format: "location: Error errorcode:
errordescription"
Options:
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the locations.
E:\commandline>qlist location
NAME     TYPE
stat1    Stationary
trans1   Transit

Back to Top


masterdrivepool
This command lists all of the master drive pools under the specified MediaAgent and media library.
Usage: qlist masterdrivepool -m mediaagent -l library [-h]

Description: This command lists the names of all the master drive pools available under the specified MediaAgent and
library. When more than one master drive pool is found, the drive pools are listed one master drive pool per line, and are
displayed in the CommCell Console. No message is displayed in the CommCell Console when master drive pools are not
found.
In case of an error, an error code and the error description are displayed in the following format: "masterdrivepool: Error
errorcode: errordescription"
Options:
-m MediaAgent name
-l Media library name
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the master drive pools available in the MediaAgent client1 and media library maglib1.
E:\commandline>qlist masterdrivepool –m client1 –l maglib1
MasterPool_Magnetic_33

Back to Top


media
This command displays media used by a given job.
Usage: qlist media -j jobid [-sp storagepolicy] [-spc copy] -l library [-h]

Description: This command displays information about all the media on the basis of a given job ID, storage policy, storage
policy copy, and library. Barcode, location, library, storage policy name, and copy name are displayed for each media. Since
barcodes are not applicable for magnetic libraries, CV_MAGNETIC is displayed if the job uses a magnetic library. If more
than one media is used by a job, the media are listed one per line. A message "No media have been allocated for this job"
will be displayed in cases where there is no media used by the job.
In case of an error, an error code and the error description are displayed in the following format: "media: Error errorcode:
errordescription"
Options:
-j    Job ID
-sp   Storage policy name
-spc Storage policy copy name
-l    Library name
-h    Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:

1.   List all the media used by job 209.
     E:\commandline>qlist media –j 209
     BARCODE   LOCATION   LIBRARY           STORAGEPOLICY          COPYNAME
     -------   --------   -------           -------------          --------
     000064    slot 9     EXABYTE EXB-220   CommServeDR (purple)   Primary


Back to Top


mediaagent
This command lists all of the MediaAgents configured in the CommCell.
Usage: qlist mediaagent [-h]

Description: This command lists the names of all the MediaAgents configured in the CommCell. When more than one
MediaAgent is found, the MediaAgents are listed one MediaAgent per line, and are displayed in the CommCell Console. No
message is displayed in the CommCell Console when MediaAgents are not found.
In case of an error, an error code and the error description are displayed in the following format: "mediaagent: Error
errorcode: errordescription"
Options: -h Displays help
Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the MediaAgents configured in the CommCell.
E:\commandline>qlist mediaagent
client1

Back to Top


quickmedia
This command lists media required to restore data associated with a client, iDataAgent, backup set, or subclient.
Usage: qlist quickmedia [-co e|b|c|l|t|p|o|s|r] -c client [-a iDataAgent] [-i instance ] [-b backupset]
[-s subclient] [-bf browsefrom] [-bt browseto] [-syp syncprecedence] [-sep selprecedence] [-h]

Description: This command lists all the media and related information, based on the input provided. The following
information is displayed for each media:
1. export location
2.   barcode
3.   container name
4.   library name
5.   media type
6.   storage policy name
7.   copy name
8.   subclient name
9.   copy precedence number
If more than one media is listed, they are displayed one per line. The message “No media information to display” will be
displayed in cases where there is no media.
In case of an error, an error code and the error description are displayed in the following format: “quickmedia: Error
errorcode: errordescription”
Options:
-co Columns to display (e|b|c|l|t|p|o|s|r)
-c   Client computer name
-a   Agent type installed on the client computer (see Argument Values - Agent Types)
-i   Instance name
-b   Backup set name
-s   Subclient name
-bf Browse From
-bt Browse To
-syp Synchronous copy precedence
-sep Selective copy precedence
-h   Displays help

                          If -c (client computer name) is the only option specified, then all the media used by backups in
                          the last backup cycle (latest backup option) from all application types installed on the client will
                          be displayed.

Column Codes:
Column codes are used to filter the columns that are displayed by the command. Only those columns that are set using the
"-co" option are displayed.
b    Barcode of the media
c    Container
l    Library
p    Storage policy
o    Copy name
s    Subclient
e    Export location
t    Media type
r    Copy precedence

By default, b, c, l, p, o, s columns are enabled.
Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List the media on the client battra.
E:\commandline>qlist quickmedia -c battra
BARCODE   CONTAINER   LIBRARY    STORAGEPOLICY             COPYNAME   SUBCLIENT
-------   ---------   -------    -------------             --------   ---------
T:\       Dummy       battra_t   battra_t(battra)-DP(2)    Primary    Cursors
T:\       Dummy       battra_t   battra_t(battra)-DP(2)    Primary    default

Back to Top


sp
This command lists all storage policies (of all types) available in the CommCell.



Usage: qlist sp [-h]

Description: This command lists the names of all the storage policies available in the CommCell. All types of storage
policies (i.e., "iDataAgent Backup" or "Express Recovery Backup" or "Data Migrator/Archiver") are listed by this command.
When more than one storage policy is found, the policies are listed one Storage Policy per line, and are displayed in the
CommCell Console. No message is displayed in the CommCell Console when storage policies are not found.
In case of an error, an error code and the error description are displayed in the following format: "sp: Error errorcode:
errordescription"
Options:
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the storage policies configured in the CommCell.
E:\commandline>qlist sp
policy1

Back to Top


spcopy
This command lists all of the copies (primary and secondary), under the given storage policy.
Usage: qlist spcopy -sp storagepolicy [-h]

Description: This command lists the names of all the storage policy copies under the given storage policy. Both primary
and secondary copies are returned. When more than one storage policy copy is found, the copies are listed one copy per
line, and are displayed in the CommCell Console. No message is displayed in the CommCell Console when storage policy
copies are not found.
In case of an error, an error code and the error description are displayed in the following format: "spcopy: Error errorcode:
errordescription"
Options:
-sp Storage policy name
-h   Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the storage policy copies in the storage policy policy1.
E:\commandline>qlist spcopy –sp policy1
Primary
copy1

Back to Top


subclient
This command lists all of the subclients under the given instance or backup set.
Usage: qlist subclient -c client -a dataagenttype -i instance -b backupset [-h]

Description: This command lists the names of all the subclients in the specified client, agent, backup set or instance.
Backup set names and instance names should be specified for agents that place subclients under the backup set and/or
instance level. When more than one subclient is found, the subclients are listed one subclient per line, and are displayed in
the CommCell Console. No message is displayed in the CommCell Console when subclients are not found.
In case of an error, an error code and the error description are displayed in the following format: "subclient: Error
errorcode: errordescription"
Options:
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-i Instance name, required for certain agents
-b Backup set name, required for certain agents
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the Subclients in the client client1, Agent Q_WIN2K_FS and backup set bs1.
E:\commandline>qlist subclient –c client1 –a Q_WIN2K_FS –b bs1
default
System State
sc1

Back to Top


vtp
This command lists all of the VaultTracker policies configured in the CommCell.
Usage: qlist vtp [-h]

Description: This command lists the names of all the VaultTracker policies configured. When more than one policy is
found, the policies are listed one policy per line, and are displayed in the CommCell Console. No message is displayed in the
CommCell Console when VaultTracker policies are not found.
In case of an error, an error code and the error description are displayed in the following format: "vtp: Error errorcode:
errordescription"
Options:
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
List all the VaultTracker policies configured.
E:\commandline>qlist vtp
vtp1

Back to Top



qlogin
This command enables you to log on to the given CommServe by supplying a username and password of a CommCell user
with sufficient permissions to perform the desired operations.
Usage: qlogin [-cs CommServe] [-u username] [-p password] [-af ArgumentFilepath][-h]

Description: qlogin allows a user to log on to a CommServe. By default, if a CommServe name is not specified, this
command logs on to the CommServe of the current host. The user name and password can be specified in the appropriate
arguments (an encrypted password will be required in this case). If the arguments are not supplied in the command, the
user is prompted to enter a user name and password at the command line. Encrypted user login information is written into
a .qsession file in the user's home directory (and overwrites the file if it already exists), which can be used by other
commands until a qlogout command is executed. Upon successful completion, qlogin does not display any message in the
CommCell Console.
For more information on CLI scripting, see the Script Considerations or Writing and Editing Scripts sections in this document.
In case of an error, an error code and the error description are displayed in the following format: "qlogin: Error errorcode:
errordescription"

                          If you receive an error such as "qlogin: Error 0x203: HOME environment variable not set. Please
                          export it and retry the command", this indicates that the user's HOME environment variable is
                          not set. The QCommands depend on the HOME environment variable being set to find the home
                          directory of the user. On Windows, the variable is either HOMEDRIVE or HOMEPATH. On Unix,
                          the variable is HOME. These are system-set variables found by default on most machines.

Options:
Command Line
-cs CommServe host name
-u User name (prompted if not specified)
-p Password (prompted if not specified)
-af Input file containing the arguments
-h Displays help

Argument File
server                CommServe host name
user                  User name (prompted if not specified)
password              Password (prompted if not specified)

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Examples:

1.   Log on to CommServe server1 with user name user1.
     E:\commandline>qlogin –cs server1 –u user1
     Password:

2.   Log on to CommServe server1.
     E:\commandline>qlogin –cs server1
     Enter User Name:
     Password:

3.   Unsuccessful Log on to CommServe server1 with user name user1 due to Invalid Password.
     E:\commandline>qlogin –cs server1 –u user1
     Password:
     qlogin: Error 0x260: Invalid Login/Password


Back to Top



qlogout
This command enables logging off a user from a connected CommServe.
Usage: qlogout [-h]
Description: qlogout allows a user to log off the CommServe and terminates the secure connection with the CommServe
established by qlogin. Upon successful completion, qlogout does not display any message in the CommCell Console.
The .qsession file in the user's home directory that was created during qlogin is deleted.

In case of an error, an error code and the error description are displayed in the following format: "qlogout: Error errorcode:
errordescription"
Options:
-h Displays help




Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Examples:

1.   Logout command executed when connected to a CommServe.
     E:\commandline>qlogout

2.   Logout command executed when you are not logged on to any CommServe.
     E:\commandline>qlogout
     qlogout: Error 0x268: User not logged in

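3.   A minimal scripted session (illustrative only; the CommServe, client, backup set, and subclient names and the job ID
     shown are hypothetical). qlogin creates the .qsession file, the q-commands that follow reuse it, and qlogout deletes it.
     E:\commandline>qlogin -cs server1 -u user1
     Password:
     E:\commandline>qoperation backup -c client1 -a Q_WIN2K_FS -b bs1 -s sc1 -t Q_FULL
     172
     E:\commandline>qlogout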

Back to Top



qmodify
This command modifies an entity based on the specified command.
Usage: qmodify command <command-options-arguments> [-h]

Supported Commands:
     galaxypassword
     instance
     password
     subclient


galaxypassword
This command changes the password of a user for the CommCell Console.
Usage: qmodify galaxypassword –u username [-p password] [-h]

Description: This command allows you to change the password of a given user. The user whose password is to be changed
is provided as a command line option. The user is prompted for the current password (for security purposes) and then
prompted for the new password. This command also supports the option to provide the password at the command line for
scripting purposes, in which case it modifies the password silently without any user intervention.
Upon successful completion, qmodify galaxypassword displays the message "Changed user password successfully" in the
CommCell Console. In case of an error, an error code and the error description are displayed in the following format:
"galaxypassword: Error errorcode: errordescription"
Options:
-u User name
-p Password
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
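Example:
Change the password for the CommCell user user1 (an illustrative invocation using only the documented -u option; the
user name is hypothetical, and the command prompts for the current and new passwords as described above).
E:\commandline>qmodify galaxypassword -u user1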
Back to Top


instance
This command modifies command line and log storage policies of an Oracle instance.
Usage: qmodify instance -c client -a dataagenttype -i instance -csp cmdlinestoragepolicy -lsp
logstoragepolicy [-af ArgumentFilepath] [-h]

Description: This command allows you to modify an Oracle instance by changing its command line and log storage
policies.
Upon successful completion, qmodify instance displays the message "Modified instance successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "instance: Error
errorcode: errordescription"
Options:
Command Line
-c   Client computer name
-a   Agent type installed on client computer (see Argument Values - Agent Types)
-i   Instance name, required for certain agents
-csp Name of the command line storage policy
-lsp Name of the log storage policy
-af  Input file containing arguments
-h   Displays help

Argument File
client        Client computer name
dataagent     Agent type installed on client computer (see Argument Values - Agent Types)
instance      Instance name to be modified.
cmdlinesp     Command line storage policy name
logsp         Log storage policy name

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Modify an instance with the name instance1 under client client1 and Agent Q_ORACLE by changing its command line
storage policy to logsp2.
E:\commandline>qmodify instance –c client1 –a Q_ORACLE –i instance1 –csp logsp2

Back to Top


password
This command allows for password administration.
Usage: qmodify password -c client -a dataagenttype -i instance -b backupset -s subclient [-h]

Description: This command interactively allows password administration of File System, Oracle, SQL and Exchange
Agents.
   For SQL Server iDataAgents, you can change the user account and password used for authentication of the SQL connect
   string.
   For Windows File System iDataAgents, you can change the user account and password used for permissions for UNC
   paths.
   For Oracle iDataAgents on Windows, you can change the user account and password used for authentication of the
   connect string and catalog connect.
   For Oracle iDataAgents on Unix, you can change the user account and password used for authentication of the catalog
   connect.
   For Exchange iDataAgents, you can change the user account and password when the option is available in the CommCell
   Console.
In case of an error, an error code and the error description are displayed in the following format: "password: Error
errorcode: errordescription"
Options:
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-i Instance name, required for certain agents
-b Backup set name, required for certain agents
-s Name of the subclient
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
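Example:
Interactively update the stored user account and password for the Windows File System iDataAgent on client client1 (an
illustrative invocation; the client, backup set, and subclient names are hypothetical, and the command prompts for the
account details interactively as described above).
E:\commandline>qmodify password -c client1 -a Q_WIN2K_FS -b bs1 -s sc1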
Back to Top


subclient
This command modifies the content/storage policy of a given subclient.
Usage: qmodify subclient -c client -a dataagenttype -i instance -b backupset -s subclient -sp
storagepolicy -f content [-af ArgumentFilepath] [-h]

Description: This command modifies a subclient by changing the assigned storage policy or content. New content is added
to the existing content.

                          When using the -f content option, you must verify that the subclient content is in the correct
                          path format. This command does not enforce the content format, and any path (including an
                          incorrect path) can be specified. This may result in incorrect content being assigned to a file
                          system subclient and a "pending" subclient scan.

Upon successful completion, qmodify subclient displays the message "Modified subclient successfully" in the CommCell
Console. In case of an error, an error code and the error description are displayed in the following format: "subclient: Error
errorcode: errordescription"
Options:
Command Line
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-i Instance name, required for a few agents
-b Backup set name, required for a few agents
-s Name of the subclient to be modified
-sp Name of the storage policy to be associated with the subclient
-f File/Directory Path to be added to the existing content
-af Input file containing arguments
-h Displays help

Argument File
client       Client computer name
dataagent    Agent type installed on source client computer (see Argument Values - Agent Types)
instance     Instance name to which the subclient belongs, required for certain agents
backupset    Backup set name to which the subclient belongs, required for certain agents
subclient    Name of the subclient to be modified
sp           New storage policy for the subclient
content      List of files/directories to be added to the existing content

                          The content option is supported only for file system agents. However, changing the storage
                          policy association of a subclient is supported for all agents.


Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Modify a subclient with name sc1 under client client1 and Agent Q_WIN2K_FS and backup set bs1 by changing its storage
policy to sp2.
E:\commandline>qmodify subclient –c client1 –a Q_WIN2K_FS –b bs1 –s sc1 –sp sp2
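
Add a directory to the content of the same subclient (an illustrative use of the documented -f option; the path is
hypothetical and must follow the content format noted above).
E:\commandline>qmodify subclient -c client1 -a Q_WIN2K_FS -b bs1 -s sc1 -f C:\newdata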

Back to Top



qoperation
This command performs an operation based on the specified command.
Usage: qoperation command <command-options-arguments> [-h]

Supported Commands:
   adduser
   agedata
   automaticupdate
   auxcopy
   backup
   capture
   erbackup
   jobcontrol
   jobretention
   media
   merge
   move
   restore
   vaulttracker


adduser
This command adds a user to a user group.
Usage: qoperation adduser -u user -ug group [-h]

Description: This command adds a user to a user group.
Upon successful completion, qoperation adduser displays the message "Added user to the group successfully" in the
CommCell Console. In case of an error, an error code and the error description are displayed in the following format:
"adduser: Error errorcode: errordescription"
Options:
-u   User name
-ug User group name
-h   Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Add a user user1 to group group1.
E:\commandline>qoperation adduser –u user1 –ug group1

Back to Top


agedata
This command performs a data aging operation.
Usage: qoperation agedata [-af ArgumentFilepath][-h]

Description: This command starts a data aging operation. The Job ID is returned and displayed in the CommCell Console if
the data aging job was successfully initiated by the CommServe.
In case of an error, an error code and the error description are displayed in the following format: "agedata: Error
errorcode: errordescription"
Options:
Command Line
-af Reads arguments from a file
-h    Displays help

Argument File
alert       Alert name
retryno     Number of times to retry the job
retrytime   Number of hours to retry the job


Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Start Data aging.
E:\commandline>qoperation agedata
175

Back to Top


automaticupdate
This command performs an automatic update operation on the specified client from the CommServe.
This is not to be confused with the InstallUpdates command, which is used to manually install the updates from the
command line in a non-interactive mode from the client computer. For more information, see Automatic Updates - Perform
Silent Installs.
Usage: qoperation automaticupdate -c client [-h]

Description: This command starts an automatic update operation on a specified client, MediaAgent, or CommServe. The
Job ID is returned and displayed in the CommCell Console if the update job was successfully initiated by the CommServe.
In case of an error, an error code and the error description are displayed in the following format: "automaticupdate: Error
errorcode: errordescription"
Options:
-c Client name
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Start update on client client1.
E:\commandline>qoperation automaticupdate –c client1
175

Back to Top


auxcopy
This command performs an auxiliary copy operation on the specified storage policy and secondary copy.
Usage: qoperation auxcopy -sp storagepolicy [-spc copy] [-af ArgumentFilepath] [-h]

Description: This command starts an auxiliary copy job on the specified storage policy and secondary copy. The secondary
copy must be created before running auxcopy. The Job ID is returned and displayed in the CommCell Console if the
auxiliary copy job was successfully initiated by the CommServe.
In case of an error, an error code and the error description are displayed in the following format: "auxcopy: Error
errorcode: errordescription"
Options:
Command Line
-sp    Storage policy name
-spc   Secondary copy name
-af    Reads arguments from a file
-h     Displays help

Argument File
sp                  Storage policy name
spcopy              Storage policy copy name
priority            Job priority
exportlocation      Export Location
transitlocation     Transit Location
usevms              Use VMS Flag (0/1)
mediastatus         Media status
startnewmedia       Start new media flag (0/1)
alert               Alert name
startsuspended      Start job in suspended state (0/1)
retryno             Number of times to retry the job
retrytime           Number of hours to retry the job

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Start auxcopy for storage policy policy1 and copy copy1.
E:\commandline>qoperation auxcopy –sp policy1 –spc copy1
175

Back to Top


backup
This command performs an immediate backup operation (full, incremental, or differential) on a given subclient.
Usage: qoperation backup -c client -a dataagenttype -i instance -b backupset -s subclient -t
Q_FULL|Q_INC|Q_DIFF|Q_SYNTH [–af ArgumentFilepath] [-h]

Description: This command starts an immediate backup operation on the specified subclient. In addition to the subclient
name, this command needs a client computer name, agent type, instance name and backup set name to uniquely identify
the subclient. The instance name and backup set name are optional and not applicable for certain agents. The backup
type indicates whether it is a full, incremental, differential, or synthetic backup. The backup command can also accept
arguments from a file through the -af option. This file can accept all the arguments supported
through command line, and also accepts advanced backup options. The Job ID is returned and displayed in the CommCell
Console if the backup job was successfully initiated by the CommServe.
In case of an error, an error code and the error description are displayed in the following format: "backup: Error errorcode:
errordescription"
Options:
Command Line
-c Client computer name
-a Agent type installed on client computer (see Argument Values - Agent Types)
-i Instance name, required for a few agents
-b Backup set name, required for a few agents
-s Name of the subclient to be backed up
-t Type of backup
-af Input file containing the arguments
-h Displays help

Argument File
client               Client computer name
dataagent            Agent type installed on client computer (see Argument Values - Agent Types)
instance             Instance name, required for a few agents
backupset            Backup set name, required for a few agents
subclient            Name of the subclient to be backed up
backuptype           Type of backup (See Supported Backup Types)
options              Backup options (see Argument Values - Backup Options)
priority             Job priority
exportlocation       Export location
transitlocation      Transit location
usevms               Use VMS
mediastatus          Media status
inclevel             Incremental level
cumulative           Cumulative option for QR agents
qrexechostid         QRExec host name for QR agents
transportablesnap    Transportable snap option for QR agents
vssimportmap         VSS import Map for QR agents
retryno              Number of times to retry the job
retrytime            Number of hours to retry the job
ondemandinputfile    Input file for on-demand backups
alert                alert name
alertid              alert ID (used by save as script)
srctodest            source to destination pairs
instancescripts      Oracle instance name(s) and script name(s) for RMAN backup job
datatype             Type of Oracle data being backed up (DATA or LOG) for RMAN backup job
sp                   Storage policy used for RMAN backup job
streamcount          Number of streams to reserve for RMAN backup job


Supported Backup Types

Backup Type              Description
Q_FULL                   Full Backup
Q_INC                    Incremental Backup
Q_DIFF                   Differential Backup (use for DB2 delta backups)
Q_SYNTH                  Synthetic Backup
Q_PRESELECT              Preselected Backup (used for Exchange DB)
Q_ASR                    ASR Backup

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Start a Full backup on the subclient sc1 under client client1, agent Q_WIN_FS and backup set bs1.
E:\commandline>qoperation backup –c client1 –a Q_WIN_FS –b bs1 –s sc1 –t Q_FULL
171
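
Start an incremental backup on the same subclient (an illustrative variation using the documented Q_INC backup type; the
job ID shown is a sample value).
E:\commandline>qoperation backup -c client1 -a Q_WIN_FS -b bs1 -s sc1 -t Q_INC
172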

Back to Top


capture
This command captures the client information into a database.
Usage: qoperation capture -c client -dfn dumpfile [-dbn databasename] [-afn answerfile] [-st starttime] [-et endtime] [-u
username] [-p password] [-af argsfile] [-h]
Description: This command captures information from a single client or multiple clients and stores the information in the
form of a database on the CommServe. An answer file is also created, which can be edited by a user during the merge
operation. This answer file can be created either on the CommServe or on the local client machine. If the database file path
is a UNC path, the command line operation prompts the user for impersonation details. On successful completion,
qoperation capture displays the message “Capture Successful” in the CommCell Console.
In case of an error, an error code and the error description are displayed in the following format: "capture: Error errorcode:
errordescription"
Options:
Command Line
-c  Name of the client to capture
-dfn   Full path and the dump file name on the CommServe
-dbn   Name of the database from which you wish to capture data. (This is useful when you want to capture
       data from a database other than the current CommServe.)
-afn   Full path and the answer file name on the local Client
-st    Start date and time from which data will be captured
-et    End date and time until which data must be captured
-u     User name, if the database location is a UNC path
-p     Password, if the database location is a UNC path
-af    Reads arguments from a file
-h     Displays help

Argument File
clients         [list] List of clients to capture
dumpfilename    [string] Full path and the dump file name on the CommServe
databasename    [string] Name of the database from which you wish to capture data. (This is useful when
                you want to capture data from a database other than the current CommServe.)
answerfile      [string] Full path and the answer file name on the local Client
startime        [time] Start date and time from which data will be captured
endtime         [time] End date and time until which data must be captured
user            [string] User name, if the database location is a UNC path
password        [string] Password, if the database location is a UNC path

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Capture a client with the name client1, save the database with the name db at c:\capturedb on the CommServe, and save
the answer file at C:\afile on the local client.
E:\commandline>qoperation capture –c client1 –dfn c:\capturedb –dbn db –afn c:\afile


Back to Top


erbackup
This command performs an Enterprise recovery backup operation.
Usage: qoperation erbackup -t Q_FULL|Q_DIFF [-af ArgumentFilepath][-h]

Description: This command starts an Enterprise recovery backup operation. The Job ID is returned and displayed in the
CommCell Console if the ER backup job was successfully initiated by the CommServe.
In case of an error, an error code and the error description are displayed in the following format: "erbackup: Error
errorcode: errordescription"
Options:
Command Line
-t  Backup type (Q_FULL|Q_DIFF)
-af Reads arguments from a file
-h  Displays help

Argument File
backuptype   Backup type (Q_FULL|Q_DIFF)
alert        Alert name
retryno      Number of times to retry the job
retrytime    Number of hours to retry the job


Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Start a Full ER backup.
E:\commandline>qoperation erbackup –t Q_FULL
175

Back to Top


jobcontrol
This command allows you to control (kill/suspend/resume/changepriority) a single job or multiple jobs.
Usage: qoperation jobcontrol -o kill|suspend|resume|changepriority [-j jobid] [-m mediaagent] [-c
client] [-a dataagenttype] [-i instance] [-b backupset] [-s subclient] [-p priority] [-h]

Description: This command allows you to kill, resume, suspend, or change the priority of a given job(s). To operate on a
single job, you need to specify the Job ID. By specifying the client/MediaAgent name, the operation can be applied to more
than one job (e.g., on all jobs running on the specified client). To further filter the jobs, you can specify additional levels
such as agent, instance, backup set and/or subclient.
If the operation is successful, no message is displayed in the CommCell Console. In case of an error, an error code and the
error description are displayed in the following format: "jobcontrol: Error errorcode: errordescription"
Options:
-o   Operation to be performed on the job (kill, suspend, resume, or changepriority)
-j   Job ID
-m   Media agent name
-c   Client computer name
-a   Agent type installed on client computer (see Argument Values - Agent Types)
-i   Instance name
-b   Backup set name
-s   Subclient name
-p   Job priority
-h   Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Examples:

1.   Kill a job with job ID 175.
     E:\commandline>qoperation jobcontrol –j 175 –o kill

2.   Suspend all jobs under MediaAgent ma1.
     E:\commandline>qoperation jobcontrol –m ma1 –o suspend

3.   Resume all jobs under client cl1 and dataagent "Q_WIN2K_FS".
     E:\commandline>qoperation jobcontrol –c cl1 –a Q_WIN2K_FS –o resume

4.   Change priority of a job with job ID 175 to 100.
     E:\commandline>qoperation jobcontrol –j 175 -p 100 –o changepriority


Back to Top


jobretention
This command retains/does not retain a given job.
Usage: qoperation jobretention -o pin|unpin -j jobid -sp storagepolicy -spc copy [-h]

Description: This command retains/does not retain a given job. If the job is retained, the data will never be aged.
However, if the job is not retained, it will become eligible for data aging. Pin and unpin are the two supported operation
names. Along with the job ID, you must specify the storage policy and copy names.
In case of an error, an error code and the error description are displayed in the following format: "jobretention: Error
errorcode: errordescription"

Options:
-o    Operation to be performed on the job (pin or unpin)
-j    Job ID
-sp   Storage policy name
-spc Secondary copy name
-h    Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Retain a job with job ID 175, sp sp1 and spcopy copy1.
E:\commandline>qoperation jobretention –j 175 –o pin –sp sp1 –spc copy1
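
Release the same job so that it again becomes eligible for data aging (the documented unpin operation, using the same
hypothetical job and storage policy names).
E:\commandline>qoperation jobretention -j 175 -o unpin -sp sp1 -spc copy1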

Back to Top


media
This command allows you to export media from a library.
Usage: qoperation media -o export -b barcode(s) -l library -el exportlocation -m mediaagent [-v] [-h]

Description: This command exports one or more media from a library. When exporting multiple media, separate the
barcodes with commas.
In case of an error, an error code and the error description are displayed in the following format: "media: Error errorcode:
errordescription"
Options:
-o Operation to be performed on the media
-b Barcode list (separate each barcode with a comma)
-l Library name
-el Export location name
-m MediaAgent name
-v Verify OML before export (0/1)
-h Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Examples:

1.   Export media.
     E:\commandline>qoperation media -o export -b 127126 -l lib1 -el loc1 -m med1 -v 1

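2.   Export multiple media in a single operation by separating the barcodes with commas (the barcode values are
     hypothetical).
     E:\commandline>qoperation media -o export -b 127126,127127 -l lib1 -el loc1 -m med1 -v 1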

Back to Top


merge
This command merges a captured client database into a target CommCell.
Usage: qoperation merge -dfn dumpfile [-afn answerfile] [-u username] [-p password] [-cl y|n] [-rt y|n]
[-af argsfile] [-h]

Description: This command merges a given captured database into a target CommCell. The database should reside on the
CommServe and the user should supply the database name and the location.
This command also supports an option to read the answer file for the merge configuration. If the database file path is a
UNC path, the command line prompts the user to enter impersonation details. You can specify whether or not to
consume a license during this merge.
Upon successful completion, qoperation merge displays the message “Merge Successful” in the CommCell Console.
In case of an error, an error code and the error description are displayed in the following format: “merge: Error errorcode:
errordescription”
Options:
Command Line
-dfn Path and Dump file name on the CommServe
-afn Answer file name (with the path) on the client that was generated during capture
-u   User name, if database location is an UNC path
-p   Password, if database location is an UNC path
-cl  y or n - This option determines whether or not a license is consumed for the merged client.
-rt y or n - This option determines whether or not to mark the media as reusable in the new CommCell.
If the answer is set to 'Y' the media will be reused even if the library option to use media from a
different CommCell is disabled.
-af Reads arguments from a file
-h   Displays help

Argument File
dumpfilename        [string] Path and Dump file name on the CommServe
answerfile          [string] Answer file name (with the path) on the client
                    that was generated during capture
consumelicense      [y|n] This option determines whether or not a license is consumed for the merged client.
reusetapes          [y|n] This option determines whether or not to mark the media as reusable in the new CommCell.
                    If set to 'Y', the media will be reused even if the library option to use media from a different
                    CommCell is disabled.
user                [string] User name
password            [string] Password

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Merge using dump file c:\capturedump.dmp on the CommServe, and read the answer file at C:\afile on the local client.
E:\commandline>qoperation merge –dfn c:\capturedump.dmp –afn c:\afile

Back to Top


move
This command moves an Oracle instance from one client to another.
Usage: qoperation move -sc sourceclient -a iDataAgent -i instance -dc destinationclient [-af argsfile]
[-h]

Description: This command moves an Oracle instance from one or more clients to a single destination client.
   If the specified instance is found on any of the source clients, then it is deconfigured on the source client and a new
   instance with the same name appears on the destination client.
   If the instance is not found on any of the source clients, or if it is found on two or more source clients, then the command
   fails and returns an appropriate error message. Use the -sc option multiple times for specifying more than one source
   client from the command line.

                          Ensure that the source and destination clients have the same operating system before moving an
                          instance of Oracle. See the qlist client command.


Upon successful completion, qoperation move displays the message "Move operation successful" in the CommCell Console.
In case of an error, an error code and the error description are displayed in the following format: "move: Error errorcode:
errordescription"
Options:
Command Line
-sc Source client computer name
-a    Agent type installed on source client computer (see Argument Values - Agent Types)
-i    Instance name to be moved
-dc   Destination client name
-af   Reads arguments from a file
-h    Displays help

Argument File
sourceclient              List of source client computer names
dataagent                 Agent type installed on source client computer (see Argument Values - Agent Types)
instance                  Instance name to be moved
destinationclient         Destination client computer name

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Move an instance rman901 from source clients sc1 or sc2 to a destination client dc1.
E:\commandline>qoperation move -sc sc1 -sc sc2 -a Q_ORACLE -i rman901 -dc dc1
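If you prefer to keep the parameters in a file, the -af option reads them from an argument file containing the keys listed above (sourceclient, dataagent, instance, destinationclient). The exact file syntax is not reproduced in this section, so the sketch below assumes one key/value pair per line and uses a hypothetical file name; consult the Command Line Interface documentation for the precise format.
Contents of c:\moveargs.txt (assumed syntax):
sourceclient        sc1
sourceclient        sc2
dataagent           Q_ORACLE
instance            rman901
destinationclient   dc1
E:\commandline>qoperation move -af c:\moveargs.txt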

See Moving an Oracle Instance for complete information on the qoperation move command.

Back to Top


restore
This command enables a subclient level or file level recovery within or across client computers.
Usage: qoperation restore -sc sourceclient -a dataagenttype -i instance -b backupset -s subclient -spath
sourcepath -j jobid [-dc destinationclient] [-dpath destinationpath] [-af ArgumentFilepath][-h]

Description: This command starts a restore job immediately on the subclient/source path specified. In cases where the
subclient name is specified, the content of that subclient will be restored. If a file path is specified, the content of that
directory that has been backed up will be restored. Both Data and ACLs are restored, where appropriate. Out of place
restores can be achieved by specifying the destination path and destination client computer name. By default, data is
restored to the source path and source client computer. The restore command can also accept arguments from a file
through the -af option. This file can accept all the arguments supported through the command line, as well as restore options. For
the list of supported Restore options refer to Argument Values - Restore Options. The Job ID is returned and displayed in
the CommCell Console if the restore job was successfully initiated by the CommServe.
In case of an error, an error code and the error description are displayed in the following format: "restore: Error errorcode:
errordescription"
Options:
Command Line
-sc     Source client computer name from which data has to be restored
-a      Agent type installed on source client computer (see Argument Values - Agent Types)
-i      Instance name to which the subclient belongs; required for some agents. If an iDataAgent
supports instances, this option is mandatory.
-b      Backup set name to which the subclient belongs; required for some agents. If an iDataAgent
supports backup sets, this option is mandatory (see Support Information - Backup Set/Migration Set)
-s      Name of the Subclient to be restored
-spath File/Directory to be restored. For NAS restores on Unix, the path values must be surrounded by
single quotes (for example, -spath '/abc/def').
-dc     Destination client computer name, for cross-client restores (Optional)
        By default, data is restored to the source client computer
-dpath Destination path to which backed up data will be restored (Optional)
        By default, data is restored to the source paths
-af     Input file containing the arguments
-h      Displays help

Argument File (Options used by all iDataAgents except Oracle)
sourceclient          Source client computer name from which data has to be restored
dataagent             Agent type installed on source client computer (see Argument Values - Agent Types)
instance              Instance name to which the subclient belongs; required for some agents
backupset             Backup set name to which the subclient belongs; required for some agents
subclient              Name of the Subclient to be restored
sourcepaths            File/Directory to be restored
filterpaths            Filter paths
destinationclient      Destination client computer name, for cross-client restores (Optional)
                       By default, data is restored to the source client computer
destinationpath        Destination path to which backed up data will be restored (Optional)
                       By default, data is restored to the source paths
options                Restore options (see Argument Values - Restore Options)
jobid                  Job ID for Restore
level                  Number of source path levels to strip or preserve
                       Use QR_STRIP_LEVEL option (see Argument Values - Restore Options) to strip paths;
                       Use QR_PRESERVE_LEVEL option to preserve paths
currentuserid          Current User ID
copyprecedence         Copy precedence
streamcount            Stream count
fromtime               Restore from time (Captures the "relative" time for index-free restores)
totime                 Restore to time (Captures the "relative" time for index-free restores)
browsefrom             Browse from time
browseto               Browse to time
jobstatus              Job status
backuplevel            Backup Level
useageddata            Use aged data flag
devfileasreg           Treat device files as Regular
mediaagent             MediaAgent
drivepool              Drive pool name
suffix                 Rename restored file
destinationdataagent   Destination dataagent name
priority               Job priority
alert                  Alert name
ondemandinputfile      On demand restore input file
mapfile                Map file name
nomapdata              No map data flag
onetouch               Onetouch restore flag
fullida                Full iDA restore flag
ntuser                 NT user name for impersonation
ntpassword             NT password for impersonation
precmd                 PreRestore command
postcmd                PostRestore command
prepostuser            Pre-Post user name
prepostpassword        Pre-Post password
destinstancename       Destination instance name (required for SQL iDataAgent)
startsuspended         Starts the job in suspended state
sourcepathsfile        Source paths file (contains list of files/directories to restore)
retryno                Number of times to retry the job
retrytime              Number of hours to retry the job

Options used by the Oracle iDataAgent
sourceclient          Source client computer name from which data has to be restored
dataagent             Agent type installed on source client computer (see Argument Values - Agent Types)
instance              Instance name to which the subclient belongs; required for some agents
backupset             Backup set name to which the subclient belongs; required for some agents
subclient             Name of the subclient to be restored
destinationclient     Destination client computer name, for cross-client restores (optional) By default,
data is restored to the source client computer.
retryNo               Number of times to retry the restore job
retryHours            Number of hours to retry the restore job
precmd                Pre restore command
postcmd               Post restore command
prepostuser           User impersonation details for running prepost command
prepostpassword       User impersonation details for running prepost command
mediaagent            MediaAgent name
drivepool             Drive pool name
startsuspended        Start job in suspended state
priority              Priority of the job
alert                 Alert name
controlfilepath       Control file name
filetime              File time
startlsnno            Start LSN number
endlsnno                  End LSN number
restorefrom               Restore from time
restoretime               Restore to time
restoretag                Restore tag
recoverfrom               Recover from time
recovertime               Recover to time
recoverscn                Recover SCN
endlogtime                End log time
controlfiletime           Control file time
archivelog                Archive log
logtarget                 Log target
redirectpaths             Redirect paths
renameall                 Rename all
resetlogs                 Reset logs
duplicatetoname           Duplicate name
duplicatetopfile          Duplicate file
skiptablespace            Skip tablespace
duplicatelogfile          Duplicate log file
options                   Restore options
passphrase                Pass phrase
spfilepath                Server parameter (SP) file name
cataloguser               Catalog connect string, user name
catalogpassword           Catalog connect string, password
catalogservice            Catalog connect string, service name
oraclerestorescript       Restore script for RMAN script restore

                          For a scheduled data recovery operation of encrypted data to run successfully when the Client
                          encryption Restore Access property is set to With a Pass-Phrase, prior to the start of the
                          scheduled recovery you must have exported the file that contains the scrambled pass-phrase to
                          the destination client(s). See Export an Encryption Pass-Phrase for more information.

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Examples:

1.   Restore subclient sc1 in source client sclient to destination client dclient and path d:\.
     E:\commandline>qoperation restore -sc sclient -a Q_WIN_FS -b bs1 -s sc1 -dc dclient -dpath "d:\"
     172

2.   Restore source path c:\ in source client sclient to destination client dclient and path d:\.
     E:\commandline>qoperation restore -sc sclient -a Q_WIN_FS -b bs1 -spath "c:\" -dc dclient -dpath "d:\"
     173

3.   Restore subclient sc1 from source client sclient to the same client and path.
     E:\commandline>qoperation restore -sc sclient -a Q_WIN_FS -b bs1 -s sc1
     174
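The note for the -spath option indicates that NAS restores on Unix require the path value to be enclosed in single quotes. The following sketch is hypothetical: the client names and paths are placeholders, and <agenttype> stands for the correct token from Argument Values - Agent Types.
     E:\commandline>qoperation restore -sc nasclient -a <agenttype> -b bs1 -spath '/vol/vol1/data' -dc dclient -dpath '/vol/vol1/restored'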


Back to Top


vaulttracker
This command runs a given VaultTracker policy.
Usage: qoperation vaulttracker -vtp vaulttrackerpolicy [-h]

Description: This command runs a given VaultTracker policy.
In case of an error, an error code and the error description are displayed in the following format: "vaulttracker: Error
errorcode: errordescription"
Options:
-vtp Vault tracker policy name
-h    Displays help

Diagnostics: Possible exit status values are:
0 - Successful completion.
1 - CLI usage failures, due to the use of an unsupported option or missing argument.
2 - Any other failure.
Example:
Run vault tracker policy vtp1.
E:\commandline>qoperation vaulttracker -vtp vtp1

Back to Top



Considerations for Running GxCmd Commands
Users must begin migrating their command line scripts from the GxCmd commands to the new and improved
Qcommands. Beyond this release, there will be no support for GxCmd commands.
In this release, we provide limited support for the GxCmd command set, which includes support for the following
commands:
  Backup (for Oracle iDataAgent backups, the -orainc and -oracumu options are excluded)
  Jobquery
  Jobcontrol
  Change Storage Policy
Although all other commands may continue to work in this release, they are considered deprecated and should be
replaced with the appropriate Qcommand.
Note that scripts created from the CommCell Console in the current release will use the Qoperation commands instead of
GxCmd.
On Unix clients, if you continue to use the GxCmd commands, perform the following tasks after the upgrade (a command sketch follows the list):
  Create an empty file with the name of the CommServe in the following folder:
  /etc/CommVaultRegistry

  The filename should match the name of the CommServe exactly as it appears in the script. For example, if the
  CommServe name appears as ruby.company.com in the scripts, the following file must be created:

  /etc/CommVaultRegistry/ruby.company.com

  Change the modes of the following files to 777:
    /etc/CommVaultRegistry/.global.lock
     GxCmd.log (located in the log file directory specified during installation)
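  The commands below sketch these tasks for the ruby.company.com example above. They are illustrative only: the
  CommServe name must match your scripts, and <log file directory> is a placeholder for the log file directory
  specified during installation.

     touch /etc/CommVaultRegistry/ruby.company.com
     chmod 777 /etc/CommVaultRegistry/.global.lock
     chmod 777 <log file directory>/GxCmd.log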

If Unix Command Line Interface scripts created in a prior release are not executing properly after the upgrade, make sure
that they are using the correct notation for entering an empty string value in the arguments as follows:
  In the previous releases, an empty string could be specified as an argument value using "". However, in the current
  release, empty strings should be specified using "\"\"".

  For example, in a previous release, the following argument value would be correct:
  -sdpair ""

  However, after an upgrade, the correct value is:
  -sdpair "\"\"".




How To
  Save a Job as Script
Back to Top


Agents

Select the desired topic:

   Overview
   How To
   Support Information - Agents
   Support Information - Agent Properties
Related Topics:
   Command Line Interface - qlist dataagent

The following pages provide information on agent-specific operations:

   Active Directory                                                Microsoft SharePoint Portal
   ContinuousDataReplicator                                        Microsoft SQL Server
   DB2                                                             Microsoft Windows File System
   DataArchiver for Exchange                                       NAS NDMP
   DataMigrator for Exchange                                       NetWare Server
   DataMigrator for File System                                    Novell GroupWise
   DataMigrator for Network Storage                                Oracle
   EMC Centera                                                     ProxyHost
   Image Level                                                     Quick Recovery
   Image Level ProxyHost                                           Recovery Director
   Informix                                                        SAP
   Linux NetWare File System                                       Serverless Data Manager
   Lotus Notes/Domino Server                                       Sybase
   Macintosh File System                                           Unix File System
   Microsoft Exchange Server



Overview
Agent software is installed on a client computer to perform data protection and data recovery operations for specific
operating systems or applications. Multiple agents may be used to protect all types of data residing on a computer. For
example, to secure all the data on a computer where a Microsoft SQL Server resides, you would need the following
iDataAgents:
   One Windows File System iDataAgent to back up the computer’s file system.
   One Microsoft SQL Server iDataAgent to back up the database.
In the CommCell Console, such a configuration would appear as two iDataAgents on a client computer.
Most agents must be installed. Once installed, agents are displayed in the CommCell Browser as levels under the client
computer in which they are installed. Each agent has sub-levels in the CommCell Browser. These sub-levels are organized
into one or more of the following logical levels under the client computer. Depending on the agent, sub-levels can
appear as follows:
   instances
   backup sets
   subclients
These logical groups of data along with the agent itself are organized according to a parent-child scheme, which varies
depending on the agent. In effect, each scheme determines the order in which the logical groups of data must be created
per agent. Basically, a child cannot be created and used until a parent for that child is created. For most agents, the agent
is always the primary parent.
For example, suppose that an agent allows for the creation of instances, backup sets, and subclients and that this order
reflects the required creation dependency order for these logical group types. In such a case, a subclient for this agent
cannot be created until a backup set (i.e., the subclient's parent) for the subclient is created; similarly, the backup set
containing this subclient cannot be created until the instance (i.e., the backup set's parent) for the backup set is created.
Typical agent/logical information group schemes include the following:
   Agent   ===>   Instance(s) ===> Backup Set(s) ===> Subclient(s)
   Agent   ===>   Instance(s) ===> Subclient(s)
   Agent   ===>   Backup Set(s) ===> Subclient(s)
   Agent   ===>   Subclient(s)
                   The Oracle RAC iDataAgent deviates somewhat from this scheme. For example, this agent is
                   not installed. As such, no agent level is displayed for this agent; however, a RAC pseudo-
                   client must be created, and the corresponding level is displayed. For more information, see
                   Overview - Oracle RAC iDataAgent.
For most agents that support backup sets, a default backup set and a default subclient are automatically created once the
agent is installed. When created, these logical groups contain all of the available data for backup. For some agents that
support instances, an instance can be created during agent installation, while other agents require that you create instances
after agent installation. Also, for some agents, an instance may be restricted to contain just a single database. See Support
Information - Agents for more information.
Most agents allow you to create multiple logical data groups and also multiple children for these groups. These are regarded
as user-defined items, and they allow you to distribute the data for backup. For example, you may be able to create three
backup sets for an agent, and you may be able to create two subclients for each backup set. To facilitate the processing of
data, you can also create a series of backups and subclient groups for some agents.
Per agent, each logical group of data that is created appears as a level under the agent level. For agents where a logical
group of data is created by default after the agent is installed, the corresponding level also appears by default under the
agent and is assigned a name. For example, for agents for which a default backup set is automatically created, the name
defaultBackupSet is typically assigned to the backup set and therefore appears as the level name. In some cases, levels
appear only after you create the affected item. You can review Support Information - Agents to compare and contrast the
CommCell Browser level hierarchies across agents.
You can click or expand the agent level to view the level immediately below the agent as long as the appropriate logical
group of information for the agent has been created. For most agents, if you click the agent level, the level that lies
immediately under the agent level is also displayed in the right-hand pane of the CommCell Browser.
Back to Top



How To
   Deconfigure the Agent
   Delete the Agent
Back to Top


Agents - Microsoft SharePoint Portal

Choose the following topic:
     Overview
     Configurable Properties
     How To
     Support Information - Agents
     Support Information - Agent Properties


Overview
This page provides an overview of the options available for the Microsoft SharePoint Portal iDataAgents. See Agents for an
overview of all agents.



Configurable Properties
Once installed, the agent is configured and is therefore able to manage the data on the client computer. However, you can
change certain aspects of the configuration (e.g., client computer, backup sets, etc.) to manage the data in the manner
that best suits your needs.
Depending on the agent, you can view or change the agent configuration from the available tabs of the agent Properties
dialog box. See Support Information - Agent Properties for a summary of the property tabs supported by each agent.
Activity Control
You can enable or disable all operations for this CommCell object, and all objects below it. For more information, see
Activity Control.
Releasing the Topology Manager Lock for SharePoint Database Backups and Restores
For SharePoint 2003 Database, you may have to allow the system to release the topology manager lock so that SharePoint
Database backup and restore jobs can complete. Data protection operations require certain access to the
topology manager for some items. Without such access, jobs may be placed in a pending state until the topology manager
becomes available. See Release the Topology Manager Lock for SharePoint Database Backups and Restores for step-by-step
instructions.
Enabling Backups of SQL Server Databases Hosted on a Remote SQL Server for SharePoint 2003 Database
For SharePoint 2003 Database, SQL Server databases that are hosted on a remote SQL Server cannot be backed up using a
SharePoint 2003 iDataAgent; instead, such databases must be backed up using the SQL Server iDataAgent. Also, you
should back up these databases before you back up the SharePoint Server. Before you can back up the databases, you
must first do the following:

1.   From the CommCell Browser, right-click the SharePoint 2003 Database iDataAgent and select Properties.
2.   In the General tab of the iDataAgent Properties, select the SQL Database Hosted on a Remote SQL Server check
     box and click OK.
3.   Install the SQL Server iDataAgent on the remote SQL Server.

Enabling Backups of a Site Index from a Remote SharePoint Index
For medium to large SharePoint 2003 server farms, create a folder on a machine with enough disk space to temporarily
store the Portal Site Index for the farm during backup and restore operations, as well as job results information, and share
this folder on the network. (The administrator account, the Front End Web Server, and the Portal Site Index server must all
be able to access this folder.) Perform the following steps:

1.   From the CommCell Console, right-click the SharePoint 2003 Database iDataAgent, and select Properties.
2.   In the General tab of the iDataAgent Properties, in the Index Backup/Restore field, type the path to the folder you
     just created and click OK.

Changing Agent User Accounts
Agent user accounts are provided to allow the system to perform various tasks (e.g., access application databases/servers,
execute specific types of commands, etc.). Generally, you can change these accounts under the proper circumstances. For
some user accounts, the dialog boxes, fields, and buttons appear within or in conjunction with agent-level properties tabs
(see the next section); for similar or other user accounts, the same items appear at the agent level via an operation, or at
one or more other levels, either within or in conjunction with properties tabs or via operations at those levels. With
some restrictions, several user accounts in the latter category (i.e., accounts not accessed via Agent Properties) can be
accessed by using the agent level as a starting point, as follows:
   Accounts for accessing application databases (for DB2 and Oracle on Windows)
   Accounts for accessing application servers (for DataArchiver for Exchange, DataMigrator for Exchange Mailbox, and
   DataMigrator for Exchange Public Folder, Exchange 5.5 Database, Exchange Mailbox, and Exchange Public Folder)
   Accounts for executing pre/post commands for data protection (for DataArchiver for Exchange, DataMigrator for Network
   Storage, DataMigrator for Windows, Image Level on Windows, Image Level ProxyHost, and ProxyHost)
   Accounts for executing pre/post commands for data recovery (for Exchange Database, Oracle on Windows, SQL Server,
   SQL Server 2005 and Windows File System)
   Accounts for restoring/recovering to mapped/share network drives and restricted directories (for DataMigrator for
   NetWare, DataMigrator for Network Storage, DataMigrator for Windows, Image Level on Windows, Lotus Notes
   Database, Lotus Notes Document, NetWare File System, NDS, Image Level ProxyHost, ProxyHost, SharePoint 2003
   Document, and Windows File System)
See User Accounts and Passwords: Change Agent Accounts for an overview.
User Security
You can perform the following functions:
   Identify the user groups to which this CommCell object is associated.
   Associate this object with a user group.
   Disassociate this object from a user group.
For more information, see User Administration and Security.
Version
The Version tab displays the software version and post-release service packs and updates installed for the component. See
Version for an overview.



How To
   Back up the default backup set
   Convert to Full Backup on Indexing Failures
   Browse the default backup set
   Browse backup data
   Create a new backup set
   Define operation rules
   Deconfigure the agent
   View backup history
   View restore history
   Delete the agent
   Change Account for Restoring to Mapped/Shared Network Drives and Restricted Directories
   Release the Topology Manager Lock for SharePoint Database Backups and Restores
   Change Agent Accounts
   View Software Version
Back to Top


Subclients

Choose from the following topics:

   Overview
     Default Subclients
     System State Subclients
     User-Defined Subclients
     DataClassSet Subclients
     Creating and Configuring Subclients
     Establishing Parallel Data Protection Operations Using Subclients
   Subclient Operations
   Support Information - Subclient Properties - Part 1
   Support Information - Subclient Properties - Part 2
   How To
Related Topics:
  Command Line Interface - qcreate subclient
  Command Line Interface - qdelete subclient
  Command Line Interface - qinfo subclient
  Command Line Interface - qlist subclient
  Command Line Interface - qmodify subclient
  Content, Filters, and Regular Expressions
  Subclient Policies
The following pages provide information on Agent-specific subclient operations:

   Active Directory iDataAgent                                      Oracle iDataAgent
   DataArchiver for Exchange Agent                                  Oracle RAC iDataAgent
   DataMigrator Agents                                              Quick Recovery Agent
   DB2 iDataAgent                                                   Recovery Director (see Snapshot Volume Units)
   EMC Centera                                                      SAN iDataAgents (Image Level, Image Level ProxyHost,
   Exchange iDataAgents                                             ProxyHost, SDM)
   Informix iDataAgent                                              SharePoint iDataAgent
   Linux NetWare File System iDataAgents                            SQL Server iDataAgents
   Lotus Notes iDataAgents                                          Sybase iDataAgent
   NAS NDMP iDataAgents (BlueArc, EMC Celerra, Hitachi,             Windows/Unix/Macintosh File System iDataAgents
   NetApp)
   NetWare iDataAgents (NetWare File System, NDS, Novell
   GroupWise)



Overview
A subclient is a portion of a client, and can either contain all of the client's data or a designated subset thereof. There are
two main types of subclients, default and user-defined, which are described below. In addition, this section discusses
general subclient configuration requirements as well as how to utilize multiple subclients to perform data protection
operations in parallel.
Default Subclients
When the Agent software is installed, depending on your Agent, the install program automatically creates a Default
Subclient for that Agent. At that time, the Default Subclient comprises all fixed disk resources on the Agent. (CD-ROM
drives and mapped network drives are excluded by default.)
The Default Subclient contains all the data backed up/migrated by an Agent that is not allocated to other subclients (where
applicable). This means that, with a minimum of configuration, you can run data protection operations that include all of the
data that you want to secure. However, this definition may change depending on your Agent, as additional subclients are
defined. Although you can re-configure the content of Default Subclients to back up or migrate specific objects, we strongly
recommend against it because this would disable the capability of the Default Subclient to serve as a "catch-all" entity for
client data, thus increasing the likelihood that some data won't get backed up or scanned for migration.
On Demand Backup Sets contain a Default Subclient that functions differently from ordinary Default Subclients. For more
information, see Default Subclients for On Demand Backup Sets.
System State Subclients
System state components are always backed up together in full, but they can be restored individually or as a unit. By
default, the system state is backed up with the file system in the Default Subclient for a Windows iDataAgent. For Windows
2003, VSS is used by default to back up the system state part of the Default Subclient. You can deselect the Use VSS
option to the Backup System State option. As another option, you can omit system state backup from the Default
Subclient configuration and create a separate System State Subclient. (See Create a New Subclient Policy.) Whichever
method you choose, the system state should be backed up on a regular basis.
User-Defined Subclients
For most Agents you can create additional subclients, called user-defined subclients, with each subclient containing a
unique portion of the client data. The user must create and define subclient contents for clients that do not automatically
create a Default Subclient. In this case only the data specified by the user is protected, migrated or archived. For Agents
that have Default Subclients, user-defined subclients are optional and need not be defined provided the Default Subclient
implementation satisfies your data protection / data recovery requirements.
DataClassSet Subclients
DataClassSet subclients are used for the DataMigrator Agent for Windows with Data Classification. DataClassSet subclients
are created and administered from the DataClassSet level for this Agent. This involves creating and administering the
required migration rules. See Migrate - DataMigrator for File System and Network Storage for an overview.
Creating and Configuring Subclients
When you create a subclient, you need to configure the following information:
   Provide a subclient name.
   Define the content/databases of the subclient.
   Associate a storage policy/QR Policy to the subclient.
   In addition to the above, some Agents have additional configuration requirements.
You enter this information using the Subclient Properties dialog box. Although this information alone is sufficient to declare
a subclient, you can optionally establish other subclient properties as well (if supported for your Agent).
Due to the variation among Agents on the specifics regarding subclient creation and configuration, click on the desired link
at the top of this page for a more detailed discussion on your Agent.


                          The size of the subclient directly impacts data protection and, subsequently, recovery. The
                          larger the subclient, the longer the time required to back up or restore it. Additionally, extra
                          space is needed on the MediaAgent.

Establishing Parallel Data Protection Operations Using Subclients
Subclients fulfill two general purposes. They allow you to:
   Perform data protection or archiving on different parts of the client at different times.
   Perform data protection or archiving on multiple parts of the client in parallel.
You can perform a data protection operation on an entire backup set/instance/agent quicker by scheduling multiple
subclients simultaneously. This way, the data protection jobs proceed in parallel and take less time than if the backup
set/instance/agent was not divided into separate subclients.
Note that in order for the data protection jobs for most Agents to run in parallel, the backup set/instance/agent must be
configured to use either different storage policies or a storage policy that is configured to have at least one data stream for
each subclient. If the subclients are configured to use the same storage policy and that storage policy is not configured for
multiple data streams, then a media group resource contention will arise and the competing subclients will perform the data
protection operations one after the other, in a serial manner.

                          The Oracle RAC iDataAgent is designed to perform data protection jobs in parallel without using
                         multiple subclients and storage policies. See Overview - Oracle RAC iDataAgent for more
                         information.



See Storage Policies and Data Streams and Hardware-Specific Resource Issues for the respective overview.



Subclient Operations
You can perform subclient operations as long as the Agent is installed or the pseudo-client is created, any dependent nodes
(such as backup sets/databases/instances/partitions) are created/configured, and the subclient appears enabled in the
CommCell Browser. Subclient operations include the running of data protection and archive operations, viewing job history,
viewing job schedules, as well as configuring subclient properties and deleting a user-defined subclient.



How To
  Configure a Subclient for Pre/Post Processing of Data Protection Operations
  Configure Subclient Content
  Create a New Subclient
  Delete a User-Defined Subclient
  Enable or Disable Data Protection Operations
  Remove a Process from Pre/Post Processing of Data Protection Operations
  Rename a Subclient
  View Subclient Content
Back to Top


Subclients - SharePoint Portal Server

Choose from the following topics:
   Overview
   Configurable Properties
   Things to Consider when Creating and Configuring SharePoint Subclients
   How To
Related Topics:
   Subclients


Overview
The following shows subclient creation and configuration details specific to SharePoint Portal Server iDataAgents.

Agent: SharePoint 2001 Database iDataAgent
   Type of data: SharePoint databases
   Default subclient created during install of the Agent: Yes
   Supports default subclient: Yes
   Supports user-defined subclients: No
   Contents of the default subclient when user-defined subclients are present: N/A
   Other types of subclients supported by the Agent: None
   Notes: None

Agent: SharePoint 2003 Database iDataAgent
   Type of data: SharePoint databases
   Default subclient created during install of the Agent: Yes
   Supports default subclient: Yes
   Supports user-defined subclients: Yes
   Contents of the default subclient when user-defined subclients are present: portion of database(s) not assigned to other subclients, unless otherwise configured*
   Other types of subclients supported by the Agent: Do Not Backup
   Notes: *See Caution Against Re-configuring Default Subclient Content.

Agent: SharePoint Document iDataAgents
   Type of data: SharePoint documents
   Default subclient created during install of the Agent: Yes
   Supports default subclient: Yes
   Supports user-defined subclients: Yes
   Contents of the default subclient when user-defined subclients are present: portion of documents not assigned to other subclients, unless otherwise configured*
   Other types of subclients supported by the Agent: None
   Notes: *See Caution Against Re-configuring Default Subclient Content.

The following figure shows a simple subclient configuration for the SharePoint Portal iDataAgents.
For the SharePoint Database iDataAgent, the default subclient, when backed up or restored, establishes a logical data
channel through which data can travel to or from the backup media.
For the SharePoint Database iDataAgent, Subclient2 comprises a user-defined set of data. The default subclient consists
of all other data. Each subclient, when it is backed up or restored, establishes a logical data channel through which data can
travel to or from the backup media.
For the SharePoint Database iDataAgent, you can schedule the backups of the default subclient and subclient2 either at
different times or simultaneously. Splitting the backups into two time periods can be useful if you need to perform backups
on a large amount of data around a particularly busy time of network or client utilization.



Configurable Properties
Once installed, the agent is configured and is therefore able to manage the data or volumes on the client computer.
However, you can change certain aspects of the subclient configuration to manage the data in the manner that best suits
your needs.
You can view or change the subclient configuration from the Subclient Properties dialog box. The following information can
be configured.
Activity Control
You can enable or disable all operations for this CommCell object, and all objects below it. For more information, see
Activity Control.
Content/Databases
You can define the content of the subclient. Most agents include a configure button that displays a dialog where you can
add or modify the data included as subclient content. For step-by-step instructions, see Configure Subclient Content.
The Content tab is not available for the SharePoint 2001 Database iDataAgent.
Data Transfer Options
The subclient provides several configurable options for efficiently using the available resources when transferring data
secured by data protection operations. These include the following:
   Enable or disable Data Compression either on the client or the MediaAgent.
   Configure the transfer of data in the network using the options for Network Bandwidth Throttling and Network Agents.
Data Encryption
You can enable or disable the encryption of data for transmission over unsecure networks and for storage on media. For
more information, see Data Encryption.
Data Paths
You can view the data paths associated with the primary storage policy copy of the selected storage policy or incremental
storage policy. In addition you can also modify the data paths for the subclient, including its priority. For additional
information, see Configuring Alternate Data Paths for Subclients.
Data Protection Filters
You can perform the following functions:
   Define data protection filters to exclude specified subclient data from being backed up or migrated. For more
   information, see Filters.
   Perform in-place editing of subclient data protection filter exclusions and exceptions. See Editing Filters for more
   information.
The Filters tab is not available for the SharePoint Database iDataAgent.
Pre/Post Processes
You can add, modify or view Pre/Post processes for the subclient. These are batch files or shell scripts that you can run
before or after certain job phases. For more information, see Pre/Post Processes.
You can also change the user account for running Pre/Post processes for data protection operations of agents residing on
Windows platforms. These agents include Active Directory, DataArchiver, DataMigrator, DB2 on Windows, Exchange, Image
Level on Windows, Lotus Notes, NAS, Oracle on Windows, Image Level ProxyHost on Windows, ProxyHost, Quick Recovery
Agent, SDM on Windows, SharePoint, SQL Server, and Windows File System. For more information, see Change Agent
Accounts.
Storage Policies
You can associate the subclient to a storage policy. For more information, see Storage Policies.
Subclient Name
You can rename a subclient. For step-by-step instructions, see Rename a Subclient.
User Security
You can perform the following functions:
   Identify the user groups to which this CommCell object is associated.
   Associate this object with a user group.
   Disassociate this object from a user group.
For more information, see User Administration and Security.
The Security tab is not available for the SharePoint Database iDataAgents.



Things to Consider when Creating and Configuring SharePoint Subclients
When creating and configuring subclients for SharePoint Portal Server iDataAgents, keep in mind the following
considerations:
   Caution Against Re-configuring Default Subclient Content
   We recommend that you do not re-configure the content of a default subclient, because this would disable its capability
   to serve as a "catch-all" entity for client data, thus increasing the likelihood that some data won't get backed up or
   scanned for migration.



How To
   Add/Edit a Data Protection Filter for a Subclient (SharePoint Document)
   Assign Data Types to Another Subclient (SharePoint 2003 Database)
   Associate a Subclient to a Storage Policy
   Associate or Disassociate a User Group to a Subclient (SharePoint Document)
   Change Account for Executing Pre/Post Commands (Data Protection)
   Configure the Subclient for Data Encryption
   Delete a Data Protection Filter from a Subclient (SharePoint Document)
   Discover and Assign New Data Types (SharePoint 2003 Database)
   Enable Software Compression for a Subclient
   Set the Network Bandwidth and Network Agents for a Data Protection Operation
   View Data Paths
Back to Top


Pre/Post Processes

Choose from the following topics:
   Overview
      Pre/Post Processes for Data Protection Operations
      Pre/Post Processes for Data Recovery Operations
      Pre/Post Processes for Data Replication Activities
      Run Post Processes for All Attempts
   Pre/Post Impersonation
   Commands and Arguments
   Pre/Post Process Considerations
   Support Information - Subclient Properties - Part 1
   Support Information - Restore Options
   How To


Overview
Data Protection and Recovery jobs are comprised of one or more sequential phases. Each phase must complete before its
successor can begin. These phases can be used to trigger operations outside of the data protection or recovery process. For
example, copying files from a share to your local disk and then deleting the files after the data protection operation has
run, running a database integrity check prior to backing up your databases/workspaces, or deleting files to make room for
the data to be restored.
For each batch file/shell script that is associated with one of a job's Pre/Post processing phases, the system examines the
return code. A return code of zero indicates a successful run and the job continues. When a non-zero return code is
encountered, the job will wait and restart that phase.
For jobs that are run with Pre/Post Process commands, the CommServe sends additional arguments, appending them to
your Pre/Post commands. See Commands and Arguments for more information.
Pre/Post Processes for Data Protection Operations
Pre/Post Processes for Data Protection are configured in the Subclient Properties (Pre/Post Process) tab, which allows you to
use the job phases as triggers to start processes. For information on which agents support Pre/Post Processes for Data
Protection Operations, see Support Information - Subclient Properties - Part 1.
Pre/Post Processes for Data Recovery Operations
Pre/Post Processes for Data Recovery Operations are configured in the Advanced Restore options dialog box, which allows
you to use the restore phases as triggers to start processes. For more information on which agents support Pre/Post
Processes for Restore, see Support Information - Restore Options.
Pre/Post Processes for Data Replication Activities
Pre/Post Processes for Recovery Points for ContinuousDataReplicator are configured in the Pre/Post tab of the Replication
Set Properties; this allows you to execute processes either before or after the Recovery Point is created. See Create a
Replication Set.
Run Post Processes for All Attempts
Under normal circumstances a specified post process command is only executed upon the successful completion of the
respective job phase, or when the job is killed, or when the job fails. However, you may still want to run a post process
even if the job phase did not complete successfully, especially if your post process is used to bring a database online or
release a snapshot.
The system provides you with an option to run post data protection or restore processes when the respective phase of the
job is interrupted, suspended or fails--in addition to successful, killed or failed jobs. For data protection operations, the Run
Post <phase> Process for All Attempts option can be selected in the Subclient Properties (Pre/Post Process) tab. For
restores, the Run PostRestore Process for All Attempts option is selected on the Pre/Post Processes for Restore dialog
box.
Back to Top


Pre/Post Process Considerations
Consider the following when adding a Pre/Post Process to a job:
Pre/Post Processes for Data Protection Operations
  Windows Agents
     When batch files (or other script files) are attached to the PreScan phase of data protection operations and if the
      permissions for the file are set to run using a specific user account, make sure that the option Allow service to
     interact with desktop is not selected for the Galaxy Communications Service (GxCVD). (You can check this
     option by right-clicking the service and then choosing Properties in the Windows Services dialog box.) If this option
     is selected, de-select the option and then stop and restart the service.

    Note: If the Allow service to interact with desktop option is selected for GxCVD, the job will go into pending and
    display the exit code [128] message.
    If you are using the Windows File System iDataAgent with OFH, QSnap, or VSS, you can include pre-snap and post-
    snap commands around a snapshot. You do this by creating batch files that contain the commands. Pre/Post snap
    scripts operate at the subclient level, and must be placed in the ../<software installation path>/Base folder on
    the client running the backup. The system looks for and executes any batch file(s) that you created in these
    directories. Batch files are not provided with the Windows File System iDataAgent; they must be created manually or
    through a utility. See Configure Pre/Post Snapshot Scripts for the Windows File System iDataAgent.
  Image Level iDataAgent
  Pre/Post process commands are used for the PreScan and PostScan phases when performing Oracle BLI backups. Two
  sample scripts are provided in the <software installation path>\Base folder to serve as a template which you
  customize for your environment:
  PreOraScan.bat - used to quiesce the Oracle instance.

  PostOraScan.bat - used to unquiesce the Oracle instance and take an archive log volume snapshot of the Oracle
  instance.
  Copy and rename the sample files before customizing them for a given Oracle instance (e.g., copy PreOraScan.bat to
   PreOraScan_<instancename>.bat and copy PostOraScan.bat to PostOraScan_<instancename>.bat; for example, to quiesce
  the oltpdb Oracle instance, copy the PreOraScan.bat to PreOraScan_oltpdb.bat and copy PostOraScan.bat to
  PostOraScan_oltpdb.bat.)

  The provided scripts are samples only, and must be customized to quiesce and unquiesce the Oracle instances that you
  are backing up. Edit the scripts to have the following variables set properly: ORACLE_HOME, ORACLE_SID, GALAXY_BASE,
  SQL_CONNECT, GALAXY_SQL_TEMP_DIR.

  The PreOraScan.bat and PostOraScan.bat files are commented with instructions about setting each of these values for
  your environment.
  Lotus Notes Database iDataAgent
  The PreScan and PostScan phases are not used by Transaction Logs.
  NAS NDMP iDataAgents
  For NAS, the Pre/Post process commands are executed on the MediaAgent associated with the active drive pool for the
  Storage Policy that is assigned to the subclient. If this Storage Policy assignment changes, or the active drive pool for
  that Storage Policy changes, the machine where the Pre/Post commands will be executed will also change. Ensure that
  the Pre/Post commands that you have specified are available on the MediaAgent where they will be executed.
  Netware/NDS/GroupWise iDataAgents
  NetWare will not wait for pre/post processing commands to complete, unless a command delay is configured for that
  process. If a command delay is not configured, NetWare will immediately move to the next phase of the backup job after
  launching the command. Also, command failure will not prevent the next phase of the backup job. See Configuring a
  Command Delay for a Pre/Post Process for more information.
  Quick Recovery Agent
  In most cases the QR Agent will create your QR volumes seamlessly. However, the QR Agent must be able to lock the
  destination volume to complete a successful copy or incremental update. If there are other applications, services, or
  users preventing the QR Agent from locking the destination volume, you must create pre/post binding scripts to ensure
  the QR Agent can lock that volume.

  Pre/Post binding script files are not provided with the QR Agent; they are user-created. In most cases it is not necessary
  to create pre/post binding scripts. They should only be created by administrators to address potential errors that may be
  encountered when creating QR Volumes. Pre/Post scripts operate at the subclient level, and they must be manually
  created and placed in the ../<software installation path>/Base folder on the client controlling the destination
  volume. See Configure Pre/Post Binding Scripts for the Quick Recovery Agent.
  ProxyHost and Image Level ProxyHost iDataAgents
    When you create a ProxyHost subclient, you must associate it with a PreScan process. For BCV implementations this
    is often a batch file or shell script file containing commands to synchronize and split a BCV, which is then backed up.
    For specific configuration information on these batch files or shell scripts, see the Resource Pack for examples that
    apply to specific applications and hardware (e.g., Exchange databases in an EMC Symmetrix environment).
     Pre/Post Processing can be done on the Primary Host, on the Backup Host, or on both, with the Backup Host being the
     current default.
    A ProxyHost subclient must always have a PreScan process. If you delete the PreScan process, you must replace it
    with another process before saving your changes.
  Serverless Data Manager iDataAgent
    The Pre/Post processes can run on either a Primary (Production Server), or on a MediaAgent computer.
    When you create a Serverless Data Manager subclient, you must associate it with a PreScan process. For specific
    configuration information on these batch files or shell scripts, see the Resource Pack for examples that apply to
    specific applications and hardware.
    SDM executes the Pre/Post commands on the MediaAgent and the Client.
    A Serverless Data Manager subclient must always have a PreScan process. If you delete the PreScan process, you
    must replace it with another process before saving your changes unless the Use QSnap checkbox has been selected in
    the Subclient Properties (Contents) tab. If the checkbox is not selected, then a prescan process must be used to
    generate a third-party snapshot in the prescan phase.
Pre/Post Process for Data Protection and Recovery Operations
  Unix File System iDataAgents (and other agents running on a Unix platform)
     The first line of the shell script must execute a shell that is designed to call in the pre/post program. Therefore,
     ensure that you have included the appropriate command in the first line of the script. The following is an example of
     a typical pre/post shell script:

     #!/bin/bash
     # Record the script name and current date in a results file, then exit successfully.
     base=`basename $0`
     echo $0, `date` > /extra/aah/RESULTS/$base.out
     exit 0

     If you are using the Linux File System iDataAgent to back up VMware virtual machines, you must create one
     PreBackup process script and one PostBackup process script that will allow the system to snapshot the virtual
     machines. See Create Pre/Post Backup Process Scripts to Back up VMware Virtual Machines Using the Linux File
     System iDataAgent for step-by-step instructions.

  Windows File System iDataAgent
  On a Windows NT client, if you are executing a batch file that uses the exit command to produce a return code, the
  system will not detect the return code due to the way Windows NT handles return values. Instead, use the
  supplied \base\exitntbat.exe executable in your batch file in place of the exit command.
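  The fragment below is a minimal, hypothetical post-process batch file that illustrates this; the share and staging paths
  are assumptions, and <software installation path> follows the placeholder convention used elsewhere in this document.

     @echo off
     rem Hypothetical post-process step; replace with your own commands.
     copy "\\fileserver\logs\app.log" "d:\staging\"
     rem Report the result through exitntbat.exe instead of the exit command so that
     rem the return code is detected by the system.
     if errorlevel 1 <software installation path>\Base\exitntbat.exe 1
     <software installation path>\Base\exitntbat.exe 0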

Back to Top



How To
  Configure a Subclient for Pre/Post Processing of Data Protection/Archive Operations
  Remove a Process from Pre/Post Processing of Data Protection/Archive Operations
  Configure a Replication Set for Pre/Post Processes for Recovery Point Creation
  Configure a Command Delay for a Pre/Post Process on NetWare/NDS Computers
  Add Pre/Post Commands to Restore Jobs
  Create Pre/Post Backup Process Scripts to Back up VMware Virtual Machines Using the Linux File System iDataAgent








Storage Policies

Choose from the following topics:
   Overview
      Quick Recovery Agent
   Types of Storage Policies
      iDataAgent Backup (Standard) Storage Policy
      Disaster Recovery Backup Storage Policy
      DataMigrator/Archiver Storage Policy
   Incremental Storage Policy
   Storage Policy Operations
      Data Verification
      Other Operations
   Storage Policy Properties
   Considerations
      NAS-attached Libraries and Data Paths
      EMC Centera and NAS NDMP iDataAgents
      Subclients
      Migrating a Magnetic Library
      Incremental Storage Policy
      iDataAgent Backup or DataMigrator/Archiver Storage Policy
     Disaster Recovery Backup Storage Policy
     Synchronous Copies
     Selective Copies
     Data Streams
   Support Information - Storage Policy
   How To
Related Topics:
   Command Line Interface - qcreate sp
   Command Line Interface - qdelete sp
   Command Line Interface - qlist sp
   QR Policies
   Storage Policy Copies
   Auxiliary Copy on the iDataAgent
   Data Aging


Overview
Storage policies act as the primary channels through which data is included in data protection and data recovery
operations. A storage policy is the primary logical entity through which a subclient or instance is backed up. Its chief
function is to map data from its original location to physical media. The system provides a default iDataAgent storage
policy for each media library, stand-alone drive, or magnetic library once it is configured.
You can create storage policies to:
   Customize data retention periods for different subclients.
   For example, where it may be necessary to restore/recover old data, you may want to create a storage policy with
   longer retention periods when performing data protection operations on a server. On the other hand, if the data being
   protected is not as critical, you can set a shorter retention period in order to release the media more quickly.
   Define the number of streams available to run simultaneous data protection or data recovery operations for all subclients
   that use the same storage policy.
   For example, if you set a stream count to three in the storage policy:







      data protection operations from three subclients using the same storage policy can run simultaneously rather than in
      series.
      data protection operations from one subclient using multiple streams can run up to three streams simultaneously.
      (Streams are supported for certain database iDataAgents and File System Multi-Streaming is supported for certain
      non-database iDataAgents, allowing increased data protection and data recovery speed by splitting the data across
      multiple tapes simultaneously.)
   Direct data from data protection operations to another library.
A secondary copy of a storage policy provides a means of making an additional copy of backed up data and is used in
auxiliary copy operations, or data protection operations that create inline copies. For more information on storage policy
copies, see Storage Policy Copies.
Quick Recovery Agent

If you are using the Quick Recovery Agent, note that it uses QR Policies for the data included in the agent's Quick Recovery
Creation operations. For more information on QR Policies, see QR Policies.



Types of Storage Policies
Three types of storage policies can be defined. They are:
   iDataAgent Backup (Standard) storage policy
   Disaster Recovery Backup storage policy
   DataMigrator/Archiver storage policy
The following sections provide a brief description of each of these storage policies.
iDataAgent Backup (Standard) Storage Policy
An iDataAgent Backup (Standard) storage policy is used by subclients associated with iDataAgents to perform backup and
restore operations. Subclients can be configured to use storage policies using one of the following methods:
   One storage policy for all types of backups, including full and non-full backups.
   One storage policy for full backups and another storage policy for non-full backups (incrementals and differentials). The
   storage policy for non-full backups is referred to as the Incremental Storage Policy. For more information on Incremental
   Storage Policy, see Incremental Storage Policy.
   For SQL Server iDataAgents, it is possible to have one storage policy for full backups, another storage policy for
   differential backups (using incremental storage policy), and another storage policy for transaction log backups.
By default, the retention period for the iDataAgent Backup storage policy is set for an infinite period of time. This retention
period can be modified to better suit the data retention needs of your business. If the retention time is changed, it is
recommended that you keep this data for a minimum of 15 days and two cycles. For more information on changing
retention rules, see Retention.
Disaster Recovery Backup Storage Policy
As an extra protection to rebuild your CommCell in the event of a disaster, Disaster Recovery Backup storage policies are
used to store metadata to media. This metadata stores information about the CommCell and the backed up data. In case of
a system failure, you can get Disaster Recovery Backup data back from the media used by the Disaster Recovery Backup
storage policy.
A default Disaster Recovery Backup storage policy [CommServeDR (host name)] is automatically created when the first
library in the CommCell is configured. This type of storage policy is recommended because it only writes the Disaster
Recovery Backup data to the media, and its retention period is defined as infinite. The media being used for this storage
policy should be removable to prevent accidental data loss due to system failure. If the first library configured is a magnetic
library, change your Disaster Recovery Backup configuration to use a Disaster Recovery Backup storage policy associated
with a tape library as soon as the first tape library is configured. You can create as many Disaster Recovery Backup storage
policies as needed.
By default, the retention period for Disaster Recovery Backup data is set to be retained for an infinite period of time. If you
want to change the retention time for Disaster Recovery Backup data, it is recommended that you keep this data for a
minimum of 60 days and 60 cycles. For more information on Disaster Recovery Backups, see Disaster Recovery Backup.
DataMigrator/Archiver Storage Policy
A DataMigrator/Archiver storage policy is used to perform the archiving of data for the DataMigrator and DataArchiver
agents. The retention period for this storage policy can only be set by time, not cycles.







By default, the retention period for DataMigrator or DataArchiver data is set to be retained for an infinite period of time. If
you want to change this retention time, it is recommended that it be set for a minimum of 365 days.
For more information on the DataMigrator and DataArchiver agents, see the appropriate product features page.

                          Quick Recovery Agent uses QR policies for QR volume creation and QR volume recovery
                          operations.

Back to Top



Storage Policy Operations
Storage policy operations provide various options for customization and maintenance. These include features such as
optimizing tape speeds, verifying data validity for restoring and copying, and ensuring alternate data paths.
Data Verification
You can configure a copy for Data Verification so that all backups, all full backups, or backups occurring on or after a certain
date will be verified during a data verification operation.
Other Operations
Various storage policy operations dealing with creation and maintenance are available in the CommCell Browser at the
storage policy level. These options are discussed below.
Create a Storage Policy

You can create a storage policy in the CommCell Console from the Storage Policy level. A Storage Policy Wizard guides you
through the process of creating a storage policy. By default, the primary copy gets created when a new storage policy is
created.
Perform an Auxiliary Copy Operation

During an Auxiliary Copy operation, data is copied from the primary copy to the secondary (synchronous or selective)
copies that you have defined.
Create a Secondary Copy

When a storage policy is created, the software automatically creates a primary copy. All data from data protection
operations from the subclient(s) is channeled through the primary copy.
In addition, you can create any number of additional secondary copies to the same/different libraries and MediaAgents.
These copies are components of storage policies and are used in auxiliary copy operations, and can either be synchronous
or selective. A Synchronous Copy or a Selective Copy can be created. For more information about storage policy copies, see
Storage Policy Copies.
Cloning

The Cloning Policies feature allows you to duplicate a storage policy, retaining all of the options of the original storage
policy. A cloned storage policy is identical to the original policy, except that it does not retain the original associated
subclients. See Cloning Policies for an overview.
Delete

You may decide to delete a storage policy if:
   You determine that you do not need the data that was backed up through that storage policy.
   Data no longer exists on the storage policy and you have no plans to use it for future data protection operations.
View Media Not Copied

You can view the media that has data that has not yet been copied to all secondary copies within a storage policy. This
media can be viewed from the Media Not Copied dialog box.
Schedules

You can view the schedules of jobs associated with a storage policy copy.
Back to Top








Considerations
   It is recommended that you do not change any of the storage policy properties while the storage policy is being used by
   an operation (e.g., data protection, data recovery, data aging, auxiliary copy, synthetic full, etc.).
   A storage policy cannot be deleted if there are any data aging, data recovery operations, or auxiliary copy jobs running
   on the storage policy.
NAS-attached Libraries and Data Paths
In an environment with NAS NDMP file servers, consider the following when creating and using storage policies:
   For the BlueArc, EMC Celerra, and NetApp NAS NDMP iDataAgents, data can be backed up to a library directly attached
   to a NAS file server; however, a NAS client of one type (e.g., NetApp, Celerra, etc.) should never use a storage policy
   with a data path pointing to a library directly attached to a NAS file server of another type.
   If the default data path on the primary copy of a storage policy points to a drive pool attached to a NAS file server, then
   all additional data paths on the primary copy or any other copy must also point to a drive pool attached to the same
   type of NAS file server.
   For Storage Policies utilizing a drive pool directly attached to a NAS file server, any MediaAgent specified in a data path
   must have connectivity to that NAS file server.
   If the default data path on the primary copy of a storage policy points to a drive pool configured on a MediaAgent, then
   all additional data paths on the primary copy or any other copy must also point to a drive pool configured on a
   MediaAgent. If NDMP Remote Server (NRS) is installed on the MediaAgent in the default data path, then NRS must also
   be installed on the MediaAgent specified in any additional data paths.
   A DataMigrator/Archiver storage policy cannot be created for a NAS NDMP MediaAgent.
   Data from any Agent can be backed up to a library attached to a NAS NDMP file server, by selecting a Storage Policy for
   a NAS-attached library. All copies of the Storage Policy must utilize the NAS-attached library.
   Disaster Recovery Backup storage policies can not be created using NAS datapaths.
   Standard storage policies created using NAS datapaths cannot be used for Disaster Recovery Backup operations.
EMC Centera and NAS NDMP iDataAgents
  For all NAS NDMP iDataAgents as well as the EMC Centera iDataAgent, data can be backed up through a MediaAgent on
  which you have installed NDMP Remote Server (NRS). When creating or selecting a storage policy for use with NRS,
  ensure the specified MediaAgent has a Drive Pool configured, and all requisite software installed and configured, either
  for the NAS NDMP iDataAgents, or the EMC Centera iDataAgent. Such storage policies will be available for use by the
  subclient, when you Associate a Subclient to a Storage Policy. The data protection and recovery jobs for any NAS NDMP
  or EMC Centera client will run on the MediaAgent where the selected drive pool for that job is configured.
   NAS NDMP and EMC Centera Load-Balancing - The Drive Pools, Storage Policies, and Subclients for NAS NDMP and EMC
   Centera can be configured so that backup, restore, and auxiliary copy jobs are spread among different MediaAgents,
   and thus the processor load for these jobs occurs on different machines. For more information, see Subclients - NAS
   NDMP - Load-Balancing Considerations or Subclients - EMC Centera - Load-Balancing Considerations.
Subclients
  Whenever you create a subclient, you must associate that subclient with a data storage policy. The associated storage
  policy is the one through which data protection/archive operations and recovery/retrieve operations for the selected
  subclient are conducted. Some agents may also have a separate storage policy association for transaction logs.
  You must re-associate the subclients of a storage policy if you want to delete it. You can re-associate all the subclients of
  a storage policy at the same time.
  Re-associating the subclients of a storage policy automatically forces the next backup to be a full backup.
   When a user changes the storage policy association of a subclient, a subclient is deleted, or an agent or client is
   deconfigured, only the retention days must be exceeded for data to be aged. In these cases, retention cycles are set to
   zero (0).
   If you plan to change the associated data storage policy of a subclient that has been backed up/migrated/archived, keep
   in mind the following considerations:
       Verify that the retention period of the primary copy of the target storage policy satisfies your requirements. If
       necessary, either change the retention period of the primary copy or create a new storage policy with a primary copy
       that has the desired retention period.
       When assessing retention periods, be particularly careful in the case of multiple subclients. To maintain consistency,
       we recommend that the data for all subclients within an agent/backup set/instance/database/partition expire at the
       same time. Therefore, if you change storage policy associations, you should keep in mind the retention periods of any
       sibling subclient data as well.







        For the Image Level iDataAgent, if you have assigned volumes from a single database to multiple subclients, all of
        those subclients should have the same retention period.
     The data that was backed up/migrated/archived through the previous storage policy remains valid for the length of
     time expressed in the associated retention period. Since the data remains valid, you can still recover/retrieve it if
     necessary.
     All subclient data that was backed up through the previous storage policy will be aged based on its storage policy
     copy retention time (days) rule only. If you select to run a full backup after changing the storage policy, all subclient
     data on the new storage policy will be aged according to its retention time and cycle rules. If you select to run a non-
     full backup as the next backup operation, it is recommended that you run a full backup as soon as possible. All non-
     full backups run before a full backup will be retained as a partial cycle according to the new storage policy copy's
     retention cycle rule (even though not a full cycle). The non-full backups (partial cycle) will be aged when the new
     storage policy copy's retention time and cycle rules are met.
     For more information, see Data Aging.
     For most iDataAgents, the user is given a choice whether or not to automatically convert the next backup of that
     subclient into a full backup. However, for DataMigrator and DataArchiver Agents, the system automatically converts
     the next operation for that subclient into a new index job. You may want to verify that this does not result in conflicts
     with any operation rules that have been established.
     It is not possible to change a storage policy association for a subclient that is being backed up/migrated/archived. If
     the subclient is newly established and has no data protection/archive history, then you can change the associated
     storage policy.
     Oracle and Oracle RAC subclients use two storage policies, one for data which is set at the subclient level, and one for
     Archive Logs which is set at the instance level.
     For SDM subclients the Storage Policy must reside on a MediaAgent with snapshot/BCV support (e.g., TimeFinder),
     and the Storage Policy should be associated with a Drive Pool that has Copy Manager configured for SDM.
      For Informix subclients, you can specify the maximum number of streams used for database backup operations by
      setting the BAR_MAX_BACKUP parameter in the $ONCONFIG file on the Informix client. The number of streams
      specified by the storage policy must be greater than or equal to the number specified by the BAR_MAX_BACKUP
      parameter. (A quick check of this setting is sketched after this list.)
     For NAS subclients, you can select a Storage Policy associated with either of the following:
         drives configured on a MediaAgent with the following installed:
         - NDMP Remote Server
         - File System iDataAgent
         drives configured on a NAS file server
     For EMC Centera or NAS NDMP subclients using NRS, there are specific considerations for storage policies to be used.
     For more information, see EMC Centera and NAS NDMP iDataAgents using NRS.
     For Image Level subclients, the Storage Policy must reside on the client where the snapshot command will be
     executed.
     For SQL Server subclients:
        System database subclients only use one storage policy.
        Non-system database user-defined subclients can use two storage policies:
        - one for full and differential database backups
        - one for the storage policy's own Incremental backup.
        Non-system database default subclients can use up to three different storage policies:
        - one for full and differential database backups
        - one for the storage policy's own Incremental backup
        - one for transaction log backups.
        Changing the associated transaction log storage policy does not require conversion to a full backup.
        If you are using the auxiliary copy feature of combining streams, the Number of Transaction Log Backup
        Streams must be set to one.
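   As referenced in the Informix item above, the following sketch illustrates one way to confirm that BAR_MAX_BACKUP
   does not exceed the number of streams defined in the storage policy. The ONCONFIG location shown is the conventional
   $INFORMIXDIR/etc/$ONCONFIG path, and the stream count of 4 is only an assumed value for comparison.

      #!/bin/bash
      # Sketch: compare BAR_MAX_BACKUP in the ONCONFIG file against the number of
      # streams assumed to be defined in the storage policy (4 in this example).
      SP_STREAMS=4                                  # assumed storage policy stream count
      CONF="$INFORMIXDIR/etc/$ONCONFIG"             # conventional ONCONFIG location

      BAR_MAX=`awk '$1 == "BAR_MAX_BACKUP" {print $2}' "$CONF"`

      if [ -n "$BAR_MAX" ] && [ "$BAR_MAX" -gt "$SP_STREAMS" ]; then
          echo "BAR_MAX_BACKUP ($BAR_MAX) exceeds the storage policy stream count ($SP_STREAMS)"
          exit 1
      fi
      echo "BAR_MAX_BACKUP ($BAR_MAX) is within the $SP_STREAMS streams allowed by the storage policy"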
Migrating a Magnetic Library
  To migrate a magnetic library, the target MediaAgent must have access to the same mount path as the source
  MediaAgent. Therefore, it is recommended that the target MediaAgent be created using mirroring. Mirroring allows one
  MediaAgent to be set up identically to another MediaAgent.
  If mirroring is not an option, you must have mount paths that are accessible to the target MediaAgent. After
  migration, ensure that all mount paths are online or accessible.
  You cannot migrate the following magnetic libraries:
       Centera
      A shared magnetic library
Incremental Storage Policy
  You cannot enable a storage policy as an incremental storage policy if that storage policy already has an incremental
  storage policy enabled.
  The incremental storage policy option is available only for an iDataAgent Backup (Standard) storage policy.
   If you are using a different MediaAgent for an incremental storage policy than the MediaAgent used for a full storage
   policy, one of the following conditions must be met:
       The primary copies of both this storage policy and the selected incremental storage policy use a shared index cache.
       The primary copies of both this storage policy and the selected incremental storage policy are set to use preferred
       data paths.
       If it is a case of failover/round robin, the primary copies of both this storage policy and the selected incremental
       storage policy must have the same data paths.
   An incremental storage policy must be de-associated from a storage policy before that storage policy can be deleted.
   If an incremental storage policy is de-associated from a storage policy, the most recent incremental backup may be
   pruned before the next full backup occurs.
   If you want to perform a synthetic full backup using an alternate MediaAgent (one other than the MediaAgent used for
   the Primary backup), you must configure an Incremental Storage Policy.
iDataAgent Backup or DataMigrator/Archiver Storage Policy
   When a storage policy is deleted, all the data from data protection operations associated with the storage policy is
   removed from the CommServe database. Thus, once a storage policy is deleted, the data from data protection
   operations associated with the storage policy cannot be restored/recovered.
   Data backed up/archived using this storage policy will not be available for data recovery operations. The storage policy
   will be deleted even if magnetic volumes are not accessible.
   Verify that you will no longer need the data that was backed up/migrated/archived using the storage policy, and verify
   that the storage policy will not be needed for future data protection operations.
Disaster Recovery Backup Storage Policy
   A Disaster Recovery Backup storage policy cannot be deleted if there are any data aging, data recovery operations, or
   auxiliary copy jobs running on this storage policy.
   It is recommended that you do not delete a Disaster Recovery Backup storage policy before exporting the media
   containing your Disaster Recovery Backups. If you delete a Disaster Recovery Backup storage policy before exporting
   this media, the media may be used by another data protection operation and it can be overwritten. Once the media is
   exported, you can use Media Explorer to restore the metadata from this media.
   Disaster Recovery Backup storage policies can not be created using NAS datapaths.
   Standard storage policies created using NAS datapaths cannot be used for Disaster Recovery Backup operations.
Synchronous Copies
  It is recommended that synchronous copies be configured with a retention period that is greater than or equal to that of
  the primary copy.
  Synchronous copies should be configured using the same number of data streams as the primary, unless that secondary
  copy has the Combined to <n> Streams option enabled.
Selective Copies
  It is recommended that selective copies are configured with a retention period that is greater than or equal to that of the
  primary copy.
  Selective copies should be configured using the same number of data streams as the primary, unless the copy has the
  Combined to <n> Streams option enabled.
Data Streams
  The number of data streams is the same for all the copies in the storage policy; hence the number of data streams
  cannot be changed for individual copies. Only copies with the Combined to <n> Streams option can be changed.
   You can change the number of data streams if the storage policy does not have any data protection operation data
   associated with it. However, you cannot decrease the number of streams for a storage policy that contains data
   associated with a subclient that supports multiple streams (for example, a subclient associated with the SQL Server
   iDataAgent) when the number of streams defined in the subclient properties is more than the number you specified.
   In such a situation, an error message is displayed.
Back to Top







Incremental Storage Policy

Choose from the following topics:
   Overview
   Support Information - Storage Policy
   Incremental Storage Policy Considerations
   How To
Related Topics:
   Storage Policies
   Data Aging of an Incremental Storage Policy


Overview
By default, an iDataAgent Backup (Standard) storage policy directs subclient backup data from all types of backups,
including full and non-full (incremental and differential) backups, to the same set of storage resources. This may not
always be time- or cost-efficient.
To reduce your backup window and utilize storage resources effectively, you can define a storage policy for full backups and
another storage policy, called an Incremental Storage Policy, for non-full backups.
In such a configuration, it is recommended that the full storage policy use tape media, and the incremental storage policy
use magnetic media to effectively utilize the available resources.
In the following example, Subclient A is backed up using a storage policy which has the incremental storage policy option
enabled. Hence, the full backup uses a storage policy that backs up to a tape library. The incremental and differential
backups use a storage policy that backs up to magnetic media.




To configure your subclients to use this feature, the incremental
storage policy must be enabled in the Storage Policy Properties dialog
box.
In the sample image, SP_Mag_Library2_2 is enabled as an
Incremental Storage Policy for storage policy SP_Tape_Library1_1.

Multiple storage policies can enable the same storage policy as their
Incremental Storage Policy.
However, within a storage policy that is enabled as an Incremental
Storage Policy, the Incremental Storage Policy option itself is not
available.








In the sample image that follows, the Incremental Storage Policy
option is not available in SP_Mag_Library2_2, as it is already enabled
as an Incremental Storage Policy for SP_Tape_Library1_1.

For this reason, the list of incremental storage policies displayed in this
dialog box will include only those storage policies that do not have an
Incremental Storage Policy enabled.




In the sample image that follows, the list of incremental storage
policies in SP_Mag_Library3_3 does not include SP_Tape_Library1_1,
as SP_Tape_Library1_1 has an Incremental Storage Policy enabled.








Once this is done, your subclients that are configured to use the
storage policy that has the incremental storage policy option enabled
will use different storage policies for full and non-full backups. This
information can be viewed from the Storage Device tab of the
Subclient Properties dialog box.

Note that the name of the Incremental Storage Policy is also
displayed.




Back to Top



Storage Policy Support Diagram
The following table lists the types of storage policies and whether or not each supports an Incremental Storage Policy.

Storage Policy Type                             Enable Incremental Storage Policy
Disaster Recovery Backup                        No
iDataAgent Backup Storage Policy                Yes
DataMigrator/Archiver                           No
                         Incremental Storage Policies do not support Transaction Log backups in the following iDataAgents:
                            DB2
                            Informix
                            Lotus Notes Database
                           Microsoft SQL Server
                           Oracle
                           SAP

Back to Top



Incremental Storage Policy Considerations
  If you want to perform a synthetic full backup using an alternate MediaAgent (one other than the MediaAgent used for
  the Primary backup), you must configure an Incremental Storage Policy.
  You cannot enable a storage policy as an incremental storage policy if that storage policy already has an incremental
  storage policy enabled.
  The incremental storage policy option is available only for an iDataAgent Backup (Standard) storage policy.
  If you are using a different MediaAgent for an incremental storage policy than the MediaAgent used for a full storage
  policy, one of the following conditions must be met:
      The primary copies of both this storage policy and the selected incremental storage policy use a shared index cache.
      The primary copies of both this storage policy and the selected incremental storage policy are set to use preferred
      data paths.
      If it is a case of failover/round robin, the primary copies of both this storage policy and the selected incremental
      storage policy must have the same data paths.
  An incremental storage policy must be de-associated from a storage policy before that storage policy can be deleted.
  If an incremental storage policy is de-associated from a storage policy, the most recent incremental backup may be
  pruned before the next full backup occurs.


How To
  Enable an Incremental Storage Policy
  Disable an Incremental Storage Policy From a Storage Policy
  Delete a Storage Policy that has an Incremental Storage Policy Enabled
Back to Top








Storage Policy Copies

Choose from the following topics:
  Overview
      Primary Copy
      Secondary Copy
      Supported Copy Types per Storage Policy Type
  Automatic Copy
  Deferred Copy
  Inline Copy
  Jobs on a Storage Policy Copy
  Source Copy
  Spool Copy
  Subclient-Based Storage Policy Copy
  Storage Policy Copies Operations
      Mark Active Media Full
      Change Data Path
      Delete a Storage Policy Copy
      Schedules
      View Media Not Copied
      View Media
      View Jobs
      View Aged Media
      View Aged Jobs
  Storage Policy Copy Considerations
      Auxiliary Copy
      Backups On and After Date
      Data Path
      Delete a Storage Policy Copy
      Disable a Job
      Hardware Compression
      Inactive Storage Policy Copy
      Prune a Job
      Retain a Job
      Retention Rules
      Secondary Copy
      Spool Copy
  Storage Policy Copy Properties
  Support Information - Storage Policy Copy
  How To
Related Topics:
  Command Line Interface - qcreate spcopy
  Command Line Interface - qdelete spcopy
  Command Line Interface - qlist spcopy
  Auxiliary Copy
  Data Aging
  Storage Policies


Overview







A storage policy copy provides the means to make additional copies of the data. Copies can be created by performing an
auxiliary copy operation, or by performing a data protection operation that creates Inline copies.
Each storage policy consists of one or more storage policy copies. There are several types of storage policy copies. They
are:
Primary Copy
The primary copy is automatically created by the system when a storage policy is created. All data protection operations
that use a given storage policy use the primary copy. The primary copy carries all data that is directed to its parent storage
policy.
Secondary Copy
A secondary copy of a storage policy provides a means of making an additional copy of protected data and is used in
auxiliary copy operations, or data protection operations that create inline copies. An auxiliary copy operation or a data
protection operation with an inline copy replicates the data that has been protected through the primary copy to the
secondary copies within the same storage policy. While a secondary copy can use the same library as the primary, it is
recommended that a different library is used for a secondary copy.
When configured, each copy is assigned a set of attributes. These attributes define the data that is protected and/or
copied to the copy. Such attributes include the:
   Destination library of the data secured through the copy
   Data retention rules for all data to be secured through the copy
   Copy precedence
   Copy type (synchronous or selective)
There are two types of secondary copies: synchronous copies and selective copies. The following sections describe each.
Synchronous Copy

During an auxiliary copy operation or a data protection operation that creates an inline copy, all data protection operations
occurring on or after a selected date on the primary copy are copied to a synchronous copy. In case data is lost, you can
restore/recover the same data from a synchronous copy. Note that you can promote a synchronous copy to be the primary
copy.
Selective Copy

A selective copy allows you to copy full backup data selectively from a source copy to this copy, providing for better tape
rotation. Since only full backups can be copied to selective copies, selective copies cannot be promoted to the primary
copy; only synchronous copies can be promoted. Note that the data selection process does not have to be the same for all
auxiliary copies.
During an auxiliary copy operation or a data protection operation that creates an inline copy, only those full backups from
the primary copy that meet certain criteria will be copied to a selective copy. You can define a selective copy to copy full
backups based on time, on manual selection, on all full backups, or on the most recent full backup for each subclient that
occurs on the primary copy.
If the copy is defined as All Fulls, all full backups on the primary copy will be copied during an auxiliary copy operation or a
data protection operation that creates an inline copy. If the copy is defined as time-based, only the first or last full backup
that occurs within each selected weekly, monthly, quarterly, half-yearly, or yearly interval will be copied. You have the
option of creating and associating a custom calendar to the copy, so that the intervals can be further customized.

                          Only full backup jobs can be copied to selective copies. Note that full backup jobs for some
                          agents are not self-contained because they require data from subsequent jobs in order to be
                          successfully restored, and therefore cannot be copied to selective copies. Since Oracle online
                          and SQL File/File Group (FFG) full backup jobs are dependent upon corresponding transaction
                          logs for restorability, their data will not be copied to a selective copy.

See also:
   Auxiliary Copy With Synchronous and Selective Copies
   Grandfather-Father-Son (GFS) Tape Rotation
Supported Copy Types per Storage Policy Type
The following table lists the types of storage policies and the types of storage policy copies that each supports:

Storage Policy Type                           Storage Policy Copy Type
                                              Primary Copy
iDataAgent Backup Storage Policy              Synchronous Copy*
                                              Selective Copy*
                                              Primary Copy
Disaster Recovery Backup
                                              Synchronous Copy*
                                              Primary Copy
DataMigrator/Archiver
                                              Synchronous Copy*

*Can also be designated as an Inline Copy.
Back to Top



Storage Policy Copy Operations
Storage policy copy operations provide various options for media maintenance and data viewing. These include features
such as designating media as full, changing data paths for media, deleting storage policy copies, and viewing media
information within the system.
Mark Active Media Full
This option marks all active media (except magnetic media) within a storage policy copy as full. Subsequent data protection
operations or auxiliary copy operations that are directed to the copy will start on new media.
Change Data Path
You can change the data paths for:
   The media group used by a storage policy copy to a different library, master drive pool, drive pool and scratch pool
   within the CommCell.
   A magnetic library from one MediaAgent to another MediaAgent within the CommCell.
For comprehensive information, see Data Path considerations.
Delete a Storage Policy Copy
When a storage policy copy is deleted, the data associated with the copy cannot be restored/recovered.
The primary copy of a storage policy cannot be deleted. If you want to delete a primary copy then you must delete the
entire storage policy.
Secondary copies (synchronous, selective) can be deleted from the CommCell Browser if there are no data aging, data
recovery operations, or Auxiliary Copy jobs running. Verify that none are running before attempting to delete the copy. See
Delete a Secondary Copy.
Schedules
You can view the schedules of jobs associated with a storage policy copy. For more information on scheduling, see
Scheduling.
View Media Not Copied
You can view the media that has data that has not yet been copied to all secondary copies within a storage policy.
This media can be viewed from the Media Not Copied dialog box.
View Media
The software provides information about all the media or mount paths containing data associated with a storage policy
copy. This can be useful in various circumstances, including the following:
   You need to know the mount paths location of your data stored on magnetic media.
   You want to change the data paths for media from one library to another and need to know which media are associated
   with a given storage policy copy.
   You are scheduling operations and want to make sure that all of the media necessary for the operations are inside the
   library.
   You want to view the contents on the media.







A list of this media or mount paths can be viewed from the Media List dialog box.

Note: This option is available only if the library to which the storage policy copy directs its data is a tape or optical library.
View Jobs
You can view and perform operations on the jobs that reside on, or are scheduled to be copied to, a storage policy copy.
For more information on the View Jobs feature, see Jobs on a Storage Policy Copy.
View Aged Media
You can view the media that was used by a storage policy copy from the Media List dialog box. Data from data
protection operations has been pruned from this media, and the media has already been returned to the scratch volume
pool.
Note: This option is available only if the library to which the storage policy copy directs its data is a tape or optical library.
View Aged Jobs
You can view the jobs that have already been pruned from a storage policy copy from the Jobs for Storage Policy Copy
window.
Back to Top



Storage Policy Copy Considerations
   For DataMigrator, making multiple copies of migrated data is recommended to avoid a single point of failure. For
   example, you can create a secondary copy on a magnetic library that has a short retention period for faster recovery,
   and you can also create a secondary copy on tape media that has a longer retention period.
   For NAS NDMP environments, refer to Storage Policy Considerations for additional information.
   Multi-stream backups of the Microsoft SQL, DB2, and Sybase agents will not be copied during an auxiliary copy operation
   to a copy that combines streams.
   A custom calendar must first be defined before it can be associated with a storage policy copy.
   The same MediaAgent must be used for both the primary and inline copy.
   The View Mount Paths option is only available if the storage policy copy directs its data to magnetic media.
   You do not need to define an Auxiliary Copy schedule for an Automatic copy once the Automatic Copy option is enabled.
Auxiliary Copy
   When refreshing media, if the source of the auxiliary copy is a synchronous copy, the new copy can be a selective or
   synchronous copy. However, if the source of the auxiliary copy is a selective copy, the new copy must be a selective
   copy also.
   If a copy is using a source copy for auxiliary copy operations, the source copy should not then designate the original
   copy as a source copy also.
   By default, when a browse or data recovery operation is requested (without specifying copy precedence), the software
   attempts to browse/restore/recover from the storage policy copy with the lowest copy precedence. If the media for the
   copy with the lowest precedence is offsite, damaged, or if hardware resources are unavailable, then a specific storage
   policy copy must be specified in the Copy Precedence tab of the Storage Policy Properties dialog box. For more
   information, see Change the Copy Precedence.
Backups On and After Date
  You can only move the Backups On And After Date forward for a storage policy copy once it has been defined. This
  date must be a date which occurs after the date you originally selected.
  If you change the Backups On And After Date to a date after the one selected, all data protection operations that
  were to be copied from the primary copy before the new date will not be copied when an auxiliary copy is run.
Data Path
   If you change the data path between two different libraries, and the storage policy copy using the source library is
   configured for data path failover, you must reconfigure the data path(s) for that storage policy copy once the data
   path is changed.
   If your storage policy is configured for Alternate Data Paths, do not change the data path for media associated with
   the primary copy of that storage policy.








  When you change the data paths make sure that the libraries and drives in the source and destination library are
  compatible. Specifically, the destination library must be capable of reading the bar codes on the media for which you
  have changed the data path. See the library manufacturer's documentation for compatible bar codes.
  Make sure that the firmware of the source and target library can read the barcodes of media exactly the same way.
  Make sure that the drives of the destination drive pool are compatible with the recording format and hardware type of
  the migrating media.
      An example of recording format incompatibility: Data paths can be changed from DLT 4000 drives to DLT 7000 drives
      as tapes written by DLT 4000 drives can be read in DLT 7000 drives. However, data paths cannot be changed from
      DLT 7000 drives to DLT 4000 drives, as tapes written by DLT 7000 drives cannot be read in DLT 4000 drives.
      Another example of hardware type incompatibility: DLT tapes cannot be inserted into AIT or Mammoth drives or vice-
      versa.
      Media from an NDMP drive pool can only be changed to another NDMP drive pool. (An NDMP drive pool is one
      containing drives that are attached to a NAS filer rather than to a MediaAgent.) Data Paths cannot be changed from
      NDMP to non-NDMP libraries.
   Make sure that there are sufficient drives in the destination drive pool to accommodate all of the streams of the copy
   for which the data paths are changed. For example, a drive pool should contain at least three drives to accommodate a
   three-stream copy.
   It is possible to change the data paths from compatible libraries to stand-alone drives and vice versa, and between
   compatible stand-alone drives.
  When you change the data paths from one library to another, you must physically remove all of the media from the
  source library and insert them into the destination library. It is strongly recommended that you export such media from
  the source library and immediately import them in the target library. If you want to export all the media from the
  library, you can use the Mark Media Exported option from the library level. For more information, see Export Media.
   Once you change the data path and import the media into the target library, subsequent data protection operations
   that use these media will mark the active media as Appendable and use new media.
  If you change the data path on media from a shared library, there is no need to export media from the source library, as
  the source library and the target library are the same.
  Media is marked as Appendable once it is migrated from any library to another library.
  Data paths for NAS attached libraries can only be added if the MediaAgent used in that data path also has the File
  System iDataAgent installed on that computer. This is applicable only for Windows MediaAgents.


Delete a Storage Policy Copy
  When a copy is deleted, any data on that copy is permanently lost and hence becomes unavailable for data recovery
  operations.
  All corresponding media become available for reuse and are moved to the corresponding scratch pool.
  A secondary copy cannot be deleted if there are any data aging, data recovery operations, or Auxiliary Copy jobs
  running. Check to see if there are any of these jobs running before attempting to delete the copy.
  A primary copy of a storage policy cannot be deleted. If you want to delete a primary copy then the entire storage policy
  must be deleted.
Disable a Job
   Marking a job disabled is an irreversible operation.
   A job that has been disabled can still be restored/recovered.
   If a primary copy has a disabled job, and during a data recovery operation the software cannot find any data, data from
   the disabled job will be used.
   If you disable a backup of the last cycle that has occurred, this forces the next backup to be a full backup.
   For the Exchange Database and Image iDataAgents, if you disable an incremental or differential backup, all subsequent
   backups will be disabled up to the next full backup.
Hardware Compression
  You cannot enable hardware compression on a copy that uses an optical or magnetic library.
  NetApp attached drive pools: This procedure cannot change the hardware compression setting, which is determined by
  the access path selected when configuring the drive. By default, hardware compression will be shown as enabled.
Inactive Storage Policy Copy
  If you mark a copy inactive, your primary copy data can still be pruned without being copied.
  Once a copy is marked as inactive, it cannot be used to transfer data to media.








Prune a Job
  Once a job is pruned from a storage policy copy, it cannot be restored.
  If you prune a job of the last cycle that has occurred, this forces the next job to be a full data protection operation. This
  is done to ensure a consistent cycle that includes all available data.
Retain a Job
   From the CommCell Console, only full data protection operations can be manually retained. However, if necessary, the
   Command Line Interface jobretention qoperation can be used to manually retain any type of data protection operation
   for all agent types.
   Once a job has been manually retained, it will not be pruned during a data aging operation.
   If a job is manually retained, it will still be pruned as a result of the following operations:
       Deletion of a backup set or instance/partition.
       Deletion of a Storage Policy
       Deletion of a Storage Policy copy
       The Overwrite Media option is enabled on the library
       Deleting the contents of a media
Retention Rules
  Do not change the retention rule of a copy while a data protection, data recovery, or auxiliary copy operation is running.
  It is recommended that secondary copies have a retention period that is greater than or equal to that of the primary
  copy.
  If the copy is of a storage policy that is associated with an incremental storage policy, the retention period of the
  primary copy of the full storage policy must be greater than or equal to the retention period of the primary copy of
  the incremental storage policy.
  The Basic Retention Rule for a DataMigrator/Archiver storage policy can only be defined by time, not cycles.
  Basic Retention Rules must be defined before Extended Retention Rules can be selected.
  You can set either 0 days or 0 cycles as a basic retention rule on a primary copy, only when there is an active
  synchronous copy for the storage policy.
  Extended Retention Rules must be chosen in ascending order by the number of days and the selected rule type.
  All non-full backups after the Basic Retention Rules are met are pruned, regardless of any Extended Retention Rules set.
  If multiple backups reside in the same time period of the Extended Retention Rule, the retention order of priority is as
  follows:
      The backup is fully copied, but not disabled.
      The backup is fully copied, but disabled.
      The backup is marked as Partial or To be Copied (data for the job is available on the primary copy)
      The backup is marked as Partial (data from the job is pruned, disabled, or Partial on the primary copy)
Secondary Copy
  A synchronous copy must be active to be promoted to be a primary copy.
  Selective copies cannot be promoted to be a primary copy.
  It is recommended that the synchronous copy be synchronized with the primary copy before it is promoted. If it is not,
  you may suffer data loss if data from the primary copy has not yet been copied to the secondary copy before the
  secondary copy is promoted. Also, unsynchronized promotion causes the next backup to be a full backup.
  The retention period and all defined attributes of a copy are retained when it is promoted. Therefore, it is recommended
  that you change the retention period of the copy so that it is greater than or equal to that of the primary copy. See
  Change the Retention Rules of a Storage Policy Copy for details on changing the retention period of a storage policy
  copy.
  It is recommended that you do not promote a copy to be the primary copy while a data protection, data recovery, or
  auxiliary copy operation is running.
  If the secondary copy that you want to promote uses the Combined to <n> Streams option, then that copy must
  have the same number of drives available as the primary copy.
  Data Multiplexing is not supported when creating a secondary copy belonging to a Centera library.
Spool Copy
  Only a primary copy can be marked as a spool copy with a 0 days and 0 cycles basic retention rule.
  There must be an active synchronous copy.
  Once data is copied to a secondary copy, all data is pruned from a spool copy during a data aging operation.







Alternate Data Paths (GridStor)

For a detailed description of the feature, see the following topics:
   Overview
   Configuring Alternate Data Paths for Primary Copies
      Adding Data Paths to Primary Copies
      Automatically Adding Data Paths for Existing Libraries
      Defining the Criteria for Using Alternate Data Paths
      Setting the Number of Streams for Alternate Data Paths
   Configuring Alternate Data Paths for Secondary Copies
   Configuring Alternate Data Paths for Subclients
   Data Protection Operations using Alternate Data Paths
      Media Usage
   Data Recovery Operations using Alternate Data Paths
      When Media is Available in a Library
      When Media is Exported
   Index Check Pointing
   Important Considerations
   Best Practices
   Support Information - Alternate Data Paths (GridStor)
   How To
   Troubleshoot Data Paths
   Troubleshoot Shared Indexes
See Also:
   Load Balancing Operations Between Libraries
   When WAN Links Cannot Support the Full Backup Data Transfer


Overview
Several data paths can be added to a storage policy copy, to ensure the success of data protection and other operations
conducted using the storage policy. A data path is the combination of MediaAgent, Library, Drive Pool and Scratch Pool used
by the storage policy copy to perform a data protection operation. Each storage policy copy has a default data path which
will be used to perform data protection operations. In addition, you can also define alternate data paths in each of the
storage policy copies.
Alternate data paths provide the following advantages:
   Alternate data paths provide the facility to automatically switch over to an alternate data path, when one of the
   components in the default data path is not available. In addition to ensuring the successful completion of data protection
   jobs, alternate data paths utilize available libraries and drives in the event of failure or non-availability of these
   resources.
   Alternate data paths can be used to minimize media utilization by routing data protection operations from several
   subclients to the same storage policy, and hence the same media, instead of creating several storage policies that
   in turn use different media for each subclient.
   In addition, the facility to load balance (spill and fill) between alternate data paths provides a mechanism to evenly
   distribute data protection operations among available resources.
Alternate data paths are supported for both the primary and secondary copies associated with storage policies for all
libraries. (See Support Information - Alternate Data Paths (GridStor) for additional details.) Note, however, that there are
several differences between the operations performed using primary and secondary copies with alternate data paths. These
are explained in detail in the following sections. In addition, within the selected storage policy (and its data paths), the
facility to define a subset of the data paths at the subclient level is also provided.
License Requirement

This feature requires a Feature License to be available in the CommServe.







Review general license requirements included in License Administration. Also, View All Licenses provides step-by-step
instructions on how to view the license information.




Configuring Alternate Data Paths for Primary Copies
The following options are provided while defining alternate data paths on primary copies:
   Facility to automatically configure the data paths for shared libraries.
   Facility to select the alternate data path based on any one of the following:
      Use alternate data path when resources are busy or offline,
      Load balance (Spill and fill) between available resources, or
      Use alternate data path to perform LAN-free data protection operations.
      Note that the client and the MediaAgent must be on the same computer, in order to perform a LAN-free operation.
The following sections describe each of these options in detail.
Adding Data Paths to Primary Storage Policy Copies
If a storage policy is created during the library configuration
process, a default data path is created for the primary copy
using the MediaAgent, Library, Drive Pool and default scratch
pool combination for drive pools configured within the library. If
you create a new storage policy, you must specify a Library,
MediaAgent, Drive and Scratch pool combination for the primary
copy.
Additional data paths for the primary copy can be defined from
the Data Paths tab of the Copy Properties dialog box. (See
Add a Data Path to a Storage Policy Copy for step-by-step
instructions.)
The data paths that are available to be added as alternate data
paths depend on the option selected in this dialog box. In
addition, some of the options require the index cache to be
shared before a MediaAgent can be used as a data path. (These
details are explained in the subsequent sections of this
document.)
After defining additional data paths, if necessary, you can set
any of the data paths as the default data path for the storage
policy copy. (See Set a Data Path as the Default Data Path for
step-by-step instructions.)




Automatically Adding Data Paths for Existing Libraries
When multiple MediaAgents share the same library (SAN DDS or direct-attached shared library configurations), the system
can automatically add the alternate data paths for each of the storage policies when this option is enabled. As each of
these data paths (MediaAgent, Library, Drive Pool and Scratch Pool) uses the same resources, additional index cache
configuration is not required. The criteria for using the alternate data path (described in the following section)
must also be specified.
Defining the Criteria for Using Alternate Data Paths
A storage policy can be configured to use an alternate data path using the following criteria:
   When resources are busy or offline - use this option to configure your system to use an alternate data path when
   resources are busy or offline.







   Load balance between the data paths - use this option to evenly distribute data protection operations amongst drive-
   pools, thereby not overloading a specific drive-pool.
   LAN preferred datapath - use this option to automatically perform LAN-free data protection operations.
When resources are busy or offline
When this option is selected, the system automatically uses an
alternate data path when resources are busy or offline.
If the When resources are offline option is selected, the
storage policy will use an alternate data path when one of the
following resources is broken or not available, and is hence
marked as offline by the user or by the MediaAgent:

   MediaAgent
   Library
   Master Drive Pool
   Drive Pool
   All the drives in the Drive Pool
   No spare media in the scratch pool associated with the copy
If the When resources are busy option is selected, the storage
policy will use an alternate data path when all the drives in the
library are busy.
For both of the above options, you can indicate whether an
alternate data path must be used immediately or only after a
specified amount of time.




The list of data paths that will be available when this option is selected includes the following:
   Data paths associated with MediaAgents that share the library with the default data path. In this case, it is not necessary
   to share the index cache, but the alternate data paths are limited to the MediaAgents that share the library.
   Data paths associated with MediaAgents that share the index with the MediaAgent in the default data path. In this case,
   because other libraries can be included in the list of data paths, several alternate data paths can be added. However,
   keep in mind that the index cache must be shared. (See Create Shared Index Cache for step-by-step instructions.)
Load balance (Spill and fill) between Data Paths

When this option is selected, the system automatically performs load balancing between the resources (drives in a library or
writers in a magnetic library) available in all the data paths. Keep in mind that the load balancing is performed at the
drive-pool level, not at the MediaAgent level.
The following example illustrates the load balancing operation (a sketch of this assignment logic also follows below):
If you have defined 5 data paths with a total of 15 resources, and have 25 data protection operations running concurrently
at a given time, load balancing causes the following to occur:
   The first operation is performed on the default data path.
   The second (and subsequent) operations are performed on the next data paths, in the order in which they are added in
   the Data Paths tab of the Copy Properties dialog box.
   Once the first 5 operations have reserved resources, the sixth operation is routed to another resource in the default
   data path, if one is available. Subsequent operations are routed to the next data path, in the order in which the paths
   are added in the Data Paths tab of the Copy Properties dialog box, until all the resources are occupied.
   Once all the resources are in use, the system constantly checks for an available resource, and as soon as one is freed
   the next job in the queue is automatically routed to use that resource.
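
The assignment pattern described in this example can be modeled as a simple round-robin over the listed data paths. The
following Python sketch is illustrative only; the path names, resource counts, and the spill_and_fill helper are hypothetical
and are not part of the product.

# A minimal sketch (not product code) of the spill-and-fill assignment
# described above: jobs are handed out one per data path in listed order,
# cycling back to the default path once every path holds one job, until
# all resources are in use. Names and counts below are hypothetical.

def spill_and_fill(data_paths, jobs):
    """data_paths: list of (name, resource_count) in the order they appear
    on the Data Paths tab, default path first. Returns job -> path or 'Waiting'."""
    free = {name: count for name, count in data_paths}
    order = [name for name, _ in data_paths]
    assignment = {}
    i = 0  # round-robin index over the data paths
    for job in jobs:
        placed = False
        for _ in range(len(order)):          # try each path once, starting at i
            path = order[i % len(order)]
            i += 1
            if free[path] > 0:
                free[path] -= 1
                assignment[job] = path
                placed = True
                break
        if not placed:
            assignment[job] = "Waiting"      # queued until a resource frees up
    return assignment

paths = [("DefaultPath", 3), ("AltPath1", 3), ("AltPath2", 3),
         ("AltPath3", 3), ("AltPath4", 3)]          # 5 paths, 15 resources
jobs = [f"Job{n}" for n in range(1, 26)]            # 25 concurrent operations
result = spill_and_fill(paths, jobs)
print(sum(v == "Waiting" for v in result.values()))  # -> 10 jobs wait
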
All the MediaAgents that share the index cache with the MediaAgent in the default data path will be available as an
alternate data path when this option is selected.







See Also:
   Load Balancing Using Spill and Fill
Use preferred datapath

When this option is selected, the system automatically performs LAN-free backups wherever possible. It is not necessary to
share the index cache for this operation, and all available MediaAgents can be used as alternate data paths when this
option is selected.
Setting the Number of Streams for Alternate Data Paths
When you add or delete an alternate data path, you must reset the number of streams that are defined for the Storage
Policy.
The maximum number of streams for a storage policy whose primary copy has alternate data paths should be equal to
the sum of all unique drives associated with the drive pools and/or the sum of all writers in the mount paths associated with
magnetic libraries in all alternate data paths. Consider the following scenarios, in which the maximum number of streams is
set too high or too low, when you have specified the criteria to immediately use alternate data paths when resources are
busy (a small worked calculation follows this list):
   If the maximum number of streams in a Storage Policy is less than the sum of drives/writers in mount paths associated
   with all the data paths in a primary copy, then not all of the resources (drives/writers in mount paths) available in the
   data paths will be utilized. For example, if the sum of drives/writers in mount paths in all the data paths is 20, and you
   have specified 10 as the maximum number of streams, only 10 jobs can run at any given time, and the remaining jobs
   go into the Waiting status with the Job Delay Reason stating that no resources are available for the job. In such a
   situation, to fully utilize all the available resources, the maximum number of streams should be set to 20.
   If the maximum number of streams in a Storage Policy is more than the sum of drives/writers in mount paths associated
   with all the data paths in a primary copy, only as many jobs as the total number of available drives can run. For
   example, if the sum of drives/writers in mount paths in all the data paths is 20, and you have specified 30 as the
   maximum number of streams, only 20 jobs can run at any given time, and the remaining jobs go into the Waiting status
   with the Job Delay Reason stating that no resources are available for the job.
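
The worked calculation referenced above can be expressed as a one-line comparison. The Python sketch below is
illustrative only; the numbers and the concurrent_jobs helper are hypothetical.

# Illustrative arithmetic only (not product code): how many jobs can run at
# once given the storage policy's maximum streams and the total drives/writers
# across all data paths of the primary copy.

def concurrent_jobs(max_streams, total_resources):
    """Jobs that can run simultaneously; the rest wait for resources."""
    return min(max_streams, total_resources)

total_resources = 20                          # sum of drives/writers in all data paths

print(concurrent_jobs(10, total_resources))   # 10 -> streams are the bottleneck
print(concurrent_jobs(30, total_resources))   # 20 -> resources are the bottleneck
print(concurrent_jobs(20, total_resources))   # 20 -> fully utilized, as recommended
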
Jobs with Multiple Streams

For multi-stream jobs, failover occurs only when all the streams can obtain the necessary resources. For example, if you
have a 5-stream job and the necessary resources are not available in the default data path, failover occurs only when an
alternate data path has all the necessary resources - a MediaAgent, Library and drive pool with 5 available drives. This is
the case irrespective of the criteria (when resources are busy, or load balance (spill and fill) between data paths) specified
for using alternate data paths.
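
The all-or-nothing failover check for multi-stream jobs amounts to comparing the required stream count against what an
alternate path can offer. The sketch below is a rough illustration with hypothetical data structures; it is not product code.

# Minimal sketch (not product code): a multi-stream job fails over to an
# alternate data path only if that path can supply every required stream.

def can_fail_over(required_streams, alternate_path):
    """alternate_path: dict with the resources offered by the path."""
    return (alternate_path["mediaagent_online"]
            and alternate_path["library_online"]
            and alternate_path["free_drives"] >= required_streams)

alt = {"mediaagent_online": True, "library_online": True, "free_drives": 4}
print(can_fail_over(5, alt))   # False: only 4 of the 5 required drives are free
print(can_fail_over(3, alt))   # True: all 3 streams can be accommodated
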




Configuring Alternate Data Paths for Secondary Copies
Data paths can be added to secondary copies to enable LAN free Auxiliary Copy operations, so that network resources can
be freed wherever possible.
Adding Data Paths to Secondary Copies
When a secondary copy is created, you must select the default
data path for the copy by selecting the MediaAgent, Library,
Drive Pool and scratch pool combination. This data path will be
used to access the secondary copy when an Auxiliary Copy
operation is performed.
However, you can add data paths for the secondary copy so that
Auxiliary Copy operations can be performed using a LAN-free
data path.
As with the primary copy, additional data paths for the
secondary copy can also be defined from the Data Paths tab of
the Copy Properties dialog box. (See Add a Data Path to a
Storage Policy Copy for step-by-step instructions.)
Note that even when the Use preferred datapaths option is
selected, LAN-free Auxiliary Copy operations on the copy are
not performed until the alternate data paths are selected.
When you add data paths for the secondary copy, it is
sufficient to add one path per MediaAgent-Library combination.
The system automatically uses an available data path to
perform LAN-free Auxiliary Copy operations. Keep in mind that
when you add data paths to secondary copies, the system
automatically tries to perform a LAN-free read operation. (This
is in contrast to primary copies, where the system strives to
perform both the read and write operations LAN-free when the
LAN-free option is selected.) See also: Although common data
paths are defined in primary and secondary copies, another data
path is being used for Auxiliary Copy operations.




Examples
Alternate data paths on Secondary Copies can be used to perform LAN free Auxiliary Copy operations as follows:
   Using magnetic as primary and a tape/optical library for secondary copies. See Example 1 for more information.
   Using a tape/optical library for both the primary and secondary copies. See Example 2 for more information.




Configuring Alternate Data Paths for Subclients
Each subclient can be configured with a subset of the data paths available in the storage policy associated
with the subclient. The following options are provided while defining the data paths for a subclient:
   Facility to select a subset of data paths from the list of available data paths.
   Facility to assign a priority to the selected data paths.
   If necessary, facility to override the default data path of the storage policy and use another data path from the subset
   of data paths available for the subclient.
Note that the data paths and the priority established at the subclient level take precedence over the data paths defined on
the storage policy copy.

Adding Data Paths to Subclients
By default, the system uses the data path associations defined in the
primary copy of the storage policy to perform data protection
operations. (This is depicted in the sample image shown on the right.
Note that the Override Datapaths option is not selected and the
default data path is displayed using a bold font-face and a special
icon.) If necessary, you can perform the following operations:
   Select a subset of the available data paths for the subclient.
   Set a priority for the selected data paths.








Points to Remember
Consider the following for configuring data paths at the subclient level:
   Subclient data paths are supported by all agents that require an iDataAgent Backup Storage Policy.
   If necessary, different data path subsets can be selected for database agents that use different storage policies for data
and log files. (See Classification of Agents based on Index Usage for a definition of database agents.)
When a secondary (storage policy) copy is promoted to the primary copy, the data paths defined in the secondary copy
are automatically used by the subclient. If necessary, re-establish the default data path and set the priority once
the secondary copy is promoted.
   When a data path is deleted from the storage policy (or a library is deconfigured) the data path is automatically removed
   from the subclient.
   Subclient data paths are not supported by Subclient Policies.
   Subclient data paths cannot be configured using the Command Line interface.
In the case of Incremental Storage Policies, which use two different storage policy copies for full and non-full backups,
different data path subsets can be selected for full or non-full backups using the same or a different storage policy. In
such a situation, make sure that the selected data paths share the index.
Examples
The subclient data paths can be used effectively in the following situations:
   To configure subclients to use certain data paths, and minimize media utilization. See Example 1 for information on how
   this works.
   Restrict some subclients to use only a subset of data paths and at the same time reduce media utilization. See Example
   2 for information on how this works.




Data Protection Operations using Alternate Data Paths
When a data protection operation is initiated, the storage policy copy attempts to write the data using the default data
path. If the default data path is not available, an alternate data path is automatically used to perform the data protection
operation. If more than one alternate data path is defined, the first data path listed in the Data Paths tab of the Copy
Properties dialog box is tried, followed by the second and so on, until an available data path is found.
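
As a rough model, the write-path selection described above walks the data path list from the default path downward. The
following Python sketch is illustrative only; the path labels and the pick_write_path helper are hypothetical.

# Minimal sketch (not product code): pick the default data path if it is
# available, otherwise the first available path in the order listed on the
# Data Paths tab of the Copy Properties dialog box.

def pick_write_path(default_path, alternate_paths, is_available):
    for path in [default_path] + list(alternate_paths):
        if is_available(path):
            return path
    return None   # no path available; the job waits

paths_down = {"MA1-Lib1"}                                   # hypothetical outage
available = lambda p: p not in paths_down
print(pick_write_path("MA1-Lib1", ["MA2-Lib1", "MA3-Lib2"], available))  # MA2-Lib1
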

Media Usage
If both the default and alternate data paths are configured to use the same library, as a result of a shared library
configuration (configured as a SAN DDS library or direct-attached shared library) the MediaAgent will automatically use the
appropriate Assigned media for the data protection operation.

If the default and alternate data paths are configured to use different libraries, the MediaAgent marks the previously used
Assigned media as Appendable and uses new media from the library associated with the alternate data path.

Such Appendable media can be re-used in the library by enabling the Use Appendable Media option in the Library
Properties dialog box associated with the library.




Data Recovery Operations using Alternate Data Paths
Data can be restored/recovered from any compatible library and drive type in the CommCell.
When Media is Available in a Library
When a Data Recovery operation is initiated, and if the media is not exported, the software attempts to restore/recover
data using the appropriate data path associated with that library, instead of the default data path in the following order:
   The first priority is provided to the path which results in LAN-free restore/recover to the client computer from which the
   restore/recover operation was initiated. LAN-free operation is possible only when the client initiating the restore/recover
   operation and MediaAgent are on the same computer.
   If the LAN-free operation is not possible, the Data Recovery operation attempts to restore/recover the data using the
   default data path.
   If the appropriate media is not available in any of these data paths, the software automatically identifies the media on
   which the data resides and performs the restore/recover operation from that library. (A sketch of this selection order
   follows the list.)
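
The restore-path preference described in this list can be modeled roughly as three ordered checks. The sketch below is
illustrative only; the field names and the pick_restore_path helper are hypothetical and not a product API.

# Minimal sketch (not product code) of the restore path preference described
# above, for media that has not been exported.

def pick_restore_path(client, data_paths, media_library):
    """data_paths: list of dicts with 'mediaagent', 'library', 'is_default',
    'online'. Returns the path to use, following the documented order."""
    online = [p for p in data_paths if p["online"]]
    # 1. LAN-free: the MediaAgent lives on the client that asked for the restore.
    for p in online:
        if p["mediaagent"] == client:
            return p
    # 2. Otherwise, the default data path.
    for p in online:
        if p["is_default"]:
            return p
    # 3. Otherwise, whichever library actually holds the media.
    for p in online:
        if p["library"] == media_library:
            return p
    return None

paths = [
    {"mediaagent": "client1", "library": "Lib2", "is_default": False, "online": True},
    {"mediaagent": "ma1", "library": "Lib1", "is_default": True, "online": True},
]
print(pick_restore_path("client1", paths, "Lib1")["library"])   # Lib2 (LAN-free wins)
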
When Media is Exported
When a Data Recovery operation is initiated and the media is exported, the software will prompt you to import the media
into a library attached to the appropriate MediaAgent computer. This is done as follows:
If a LAN-free restore/recover is possible, the operation prompts you to import the media into the
library from which the LAN-free restore is possible. (A LAN-free restore is possible only when the client
initiating the restore/recover operation and the MediaAgent are on the same computer.)
If a LAN-free restore/recover operation is not possible, the operation prompts you to import the media into the
library that was last used to write to the media.
If the resources in that library are offline, the operation prompts you to import the media into the
library associated with the default data path.
If the resources associated with the default data path are offline, the operation identifies a library from
one of the alternate data paths assigned in the data path list, and prompts you to import the media there.
See Also:
   Restore From Anywhere




Index Check Pointing
During data protection operations, the system restarts the backup from the Pre-Scan phase if the job is terminated
abnormally in the backup phase due to any of the following situations:

   Loss of connectivity to shared index cache folder
   Power or any sudden failure on a MediaAgent computer
   Cluster fail-over of a MediaAgent
You can use the index check pointing feature to continue the data protection operation from the point of failure. This can be
done by adding the sindexcheckpoint registry key on the Windows and Unix MediaAgents.

Index check pointing is not supported by the NAS NDMP iDataAgents, the Active Directory iDataAgent, or Agents that
support Content Indexing when Content Indexing is installed. If this registry key is created, these agents ignore the registry
key and continue the data protection operation without index check pointing.




Important Considerations
Consider the following information when using alternate data paths:
General Considerations
   Jobs that run using an alternate data path cannot preempt other jobs. Such jobs can, however, be preempted by
   other jobs that do not use an alternate data path.
   Change Data Path (right-click option) should not be performed on Storage Policies with Storage Policy Copies that have
   Alternate Data Paths (GridStor). If Change Data Path is performed on such a setup, data recovery operations can be
   performed from the media. However, subsequent data protection operations will not re-use the migrated media.


Clustered Environment
   On clustered computers the system automatically performs LAN free operations for Agents installed on the virtual
   machines with the storage policy copy (attached to the subclient) pointing to the MediaAgent on the physical node.
   Consider the following example:
   A file system iDataAgent is installed on Virtual Machine (VM1) and can be controlled by Node 1 or Node 2 at any given
   time.







   The subclient (subclient1) associated with this file system iDataAgent on VM1 points to a Storage Policy Copy (SP1)
   which in turn uses the following data paths:
      default data path using MediaAgent (Node1) and Library 1
      alternate data path MediaAgent (Node2) and Library 1
   When a backup is run on subclient1, the system automatically determines the node controlling VM1 and uses the
   appropriate MediaAgent. For example, if VM1 is controlled by Node 2 at the time of the backup, the system automatically
   uses the MediaAgent on Node2 to perform the LAN-free backup.
This capability allows you to install the MediaAgent on the physical nodes of a cluster, instead of installing multiple
instances on the virtual server. However, you will need GridStor to provide the failover capability.
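
The MediaAgent selection in this clustered scenario reduces to looking up the data path that belongs to the currently
active physical node. The sketch below is a hypothetical illustration, not product code; the node and path labels are
made up.

# Minimal sketch (not product code): LAN-free MediaAgent selection on a cluster,
# based on which physical node currently controls the virtual machine.

def lan_free_mediaagent(active_node, data_paths):
    """data_paths: mapping of physical node name -> data path label."""
    return data_paths.get(active_node)

paths = {"Node1": "Node1 + Library1 (default)", "Node2": "Node2 + Library1"}
print(lan_free_mediaagent("Node2", paths))   # Node2 + Library1
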


Considerations for NAS attached libraries
   Data paths for NAS attached libraries can only be added if the MediaAgent used in that data path also has the File
   System iDataAgent installed on that computer. This is applicable only for Windows MediaAgents.
   For NAS NDMP environments, refer to Storage Policy Considerations for additional information.


Considerations for backing up the Microsoft Virtual Server
   If the MediaAgent software is installed on the virtual server, configure a magnetic library to back up the data.
   If you wish to configure a tape/optical library, install the MediaAgent software on the physical computer.
   Add a data path which uses this MediaAgent, Library, Drive Pool and Scratch Pool combination to the Storage Policy used
   to backup the virtual server. (See Add a Data Path to a Storage Policy Copy for step-by-step instructions.)
   Assign this as a high priority data path in the subclient(s) used to backup the virtual server. (See Assign Priorities for
   Subclient Data Paths for step-by-step instructions.)




Best Practices
Consider the following information and recommendations, while creating and using alternate data paths:
   For LAN-free clients, do not enable the When Resources are Busy option to choose an alternate data path. This
   ensures LAN-free operations wherever possible.
   It is not necessary to share the index on MediaAgents with LAN-free data paths. However, if even one additional
   alternate data path on the LAN is added to the storage policy, you must share the index for all the MediaAgents in the
   data path list.
   Create and use fewer storage policies, as a large number of storage policies results in the fragmentation of
   data on media. Consider the following:
       Control data retention by creating copies within the Storage Policies. (As opposed to creating many storage policies
       with different retention periods.)
       Consolidate each Client's data by creating Subclient-Based Storage Policy Copies.
   Before you deconfigure a library, verify that no Storage Policy Copy's default data path points to the
   library. (See View the Storage Policies Accessing a Library for step-by-step instructions on how to view the storage
   policies associated with a library.) If necessary, set an alternate data path as the default data path before deconfiguring
   the library.
   If you have a storage policy copy with no default data path, use Change Data Path option to migrate the storage policy
   to point to another data path. See Change Data Path for more information.
   NAS NDMP and EMC Centera Load-Balancing - In addition to the resource load balancing that the Alternate Data Paths
   feature provides, NAS NDMP and EMC Centera can be configured to load balance the processing tasks associated with
   backup, restore, and auxiliary copy jobs, which normally run on a single client machine, and spread the processing among
   different MediaAgents. For more information, see Subclients - NAS NDMP - Load-Balancing Considerations or Subclients -
   EMC Centera - Load-Balancing Considerations.








Jobs on a Storage Policy Copy

Choose from the following topics:
   Overview
   Manually Retain a Job on a Storage Policy Copy
   Manually Select a Job To be Copied to a Selective Copy
   Prune a Job From a Storage Policy Copy
   Disable a Job From a Storage Policy Copy
   How To
   Support Information - Storage Policy Copy
Related Topics:
   Storage Policy Copy Operations
   Storage Policy Copy Properties
   Effect of Disabled Jobs on Data Aging


Overview
The software provides information about the jobs that are written to each storage policy copy of an iDataAgent Backup
Storage Policy, a DataMigrator/Archiver Storage Policy, or a Disaster Recovery Backup Storage Policy.
These jobs can be viewed using the View Jobs option on the copy. Once selected, jobs for these storage policies are
displayed in the Jobs for Copy window.




Disaster Recovery backups on a Disaster Recovery Backup storage policy can be viewed from the DR Backups for Copy
window.
Status Levels
From these windows, jobs are displayed with the following status levels (the copy types to which each status applies are
shown in parentheses):

Available (Primary, Synchronous, and Selective) - Data is available on the primary copy for data recovery, data aging, and
auxiliary copy operations. On the secondary copy, this status displays after data is copied from the primary copy and is now
available for data recovery and data aging operations.
Available and Retained (Primary, Synchronous, and Selective) - A job is retained and will not be pruned when a data aging
operation is run.
Not Selected (Selective) - A job has not been manually selected for being copied to a selective copy.
Partial (Synchronous and Selective) - Data is partially copied from the primary copy.
Partial & Not Selected (Selective) - A job was partially copied and de-selected for being copied to a selective copy.
To Be Copied (Selective and Synchronous) - Data is selected to be copied from the primary copy to the secondary copy.
To Be Copied and Deferred (Selective and Synchronous) - Data is selected to be copied from the primary copy to the
secondary copy, and it has not met the specified number of deferred days.

The View Jobs option is useful in various circumstances, including the following:

   To know when a job was created, and therefore when it will become eligible for pruning.
   To check that all jobs from a primary copy were successfully copied to a secondary copy.
   To see those jobs that have not yet been copied to a secondary copy.
   To pick a job to be verified during a Data Verification operation.
   To manually retain a job so it is not pruned during a data aging operation.
   To manually select a job on a copy to be copied to a selective copy.
   To prune a job from a storage policy copy.
   To disable a job on a storage policy copy.
Back to Top



Manually Retain a Job on a Storage Policy Copy
The Retain Job feature allows you to prevent a job from being pruned during a data aging operation. A job can be
manually retained from the Jobs for Copy window.




Once a job is selected to be retained, the status for that job is displayed as Available and Retained in the Status
column. Once selected, this job will not be pruned when a data aging operation is run.








Once retained, the job can be de-selected for manual retention using the Do Not Retain Job Option.




Back to Top



Manually Select a Job To be Copied to a Selective Copy
During an auxiliary copy operation, a full data protection operation is copied to the selective copy. The data protection
operation to copy is normally selected automatically by the software, according to the rules specified in the Selective
Copy tab of the Storage Policy Properties dialog box when the copy was created. However, if a selective copy is created as
a Manually Select Full Backups Copy, a full data protection operation will be copied to it during an auxiliary copy
operation only if that job is manually selected to be copied. This is especially useful if you need to control which full
data protection operations are copied to the selective copy.
For example, suppose a selective copy named manual selection was created as a Manually Select Full Backups copy. To
select a job to be copied to this selective copy, you must first view the list of jobs that are not selected to be copied.
To view the jobs on this selective copy, right-click the copy, and select View --> Jobs. This will initiate the Job Filter for
Storage Policy: <Storage Policy Name>, Copy: <Selective Copy Name> dialog box.








Click the Advanced button to initiate the Backup History for Copy Advanced Options dialog box. From this dialog box,
check the Jobs that will not be copied option, and click OK.




This will bring you back to the Job Filter for Storage Policy dialog box. Click OK to initiate the Job for Storage Policy
window, which will display the jobs that will not be copied to the selective copy.








Right-click on the job that you want to copy to the selective copy, and select the Select for copy option.




When the job is selected, it is displayed with a status of To be Copied. The job will now be copied to the selective copy
during the next auxiliary copy operation.








You can also deselect the job for the copy by simply right-clicking on the job and selecting the Deselect for Copy option.




           When a selective copy is not created with the Manually Select Full Backups Copy option, users can
           still manually select and deselect the full data protection operations that will be copied or not copied to the
           selective copy. This provides users with the flexibility to always control the jobs that are copied or not
           copied to the selective copy during an auxiliary copy.

Back to Top



Prune a Job From a Storage Policy Copy
The Job Based Pruning feature allows you to manage the aging of your data and tape rotation manually. If your jobs are
unnecessarily taking up crucial media resources and you do not want to wait until these jobs meet the specified storage
policy copy retention criteria, these jobs can be individually pruned.
Pruning a job from a copy permanently removes the job from the CommServe database before it has exceeded its
retention criteria. The software will also prune all jobs that are dependent on the job you have selected for pruning. For
example, if you select a full data protection operation job to be pruned, all subsequent jobs will be pruned up to the next
full data protection operation. Also, if an incremental or differential data protection operation job is selected, only that job
will be pruned. In addition, if the most recent full data protection operation is pruned from the primary storage policy copy,
the next job of that subclient will be converted to a full data protection operation.
A job can be pruned from a copy from the Jobs for Copy window:




Once pruned from the copy, by default the job cannot be restored during a data recovery operation. However, if you have
the need to recover such data, see Accessing Aged Data.

                          For all agents, pruning a job may prevent you from recovering any data from the data protection
                          operation cycle.

                          In job-based pruning for the Image Level ProxyHost iDataAgent, if you prune an incremental or
                          differential backup, all subsequent backups will be pruned up to the next full backup.

                          In job-based pruning for the Microsoft Exchange Server Database iDataAgent, if you prune an
                          incremental or differential backup, all subsequent backups will be pruned up to the next full
                          backup.

                          Within the Microsoft Windows File System, in job-based pruning for the Windows XP (32 and 64
                          bit) iDataAgent and the Windows Server 2003 iDataAgent, ASR backups are treated
                          independently. If you prune an ASR Backup, it will only prune the data protection operations that
                          you select.

                          Job-based pruning for Oracle, Sybase, Informix, and DB2 is only supported for selective copies.

Back to Top



Disable a Job From a Storage Policy Copy

The Disable for Copy feature allows you to manage your Auxiliary Copy operations manually. This feature is useful if an
auxiliary copy operation fails to read data that is on bad media and cannot continue. When media becomes bad or
inoperable, the Auxiliary Copy operation is not able to successfully copy data from that media to the destination media.
Marking a job disabled on a storage policy copy prevents the job (which may be on bad media) from being copied
during an Auxiliary Copy operation. A disabled full data protection operation is also not counted as a valid cycle during a
data aging operation. The software will also disable those jobs that are dependent on the job you have selected to be
marked disabled. For example, if you select a full data protection operation to be disabled, all subsequent data protection
operations will be disabled up to the next full data protection operation. Also, if an incremental or differential data
protection operation job is selected, only that job will be disabled. In addition, if the most recent full data protection
operation is disabled from the primary storage policy copy, the next data protection operation of that subclient will be
converted to a full data protection operation.
Jobs can be disabled on a copy from the Jobs for Copy window.








                       In marking a job disabled for the Image Level ProxyHost iDataAgent, if you disable an
                       incremental or differential backup, all subsequent backups will be disabled up to the next full
                       backup.

                       In marking a job disabled for the Microsoft Exchange Server Database iDataAgent, if you disable
                       an incremental or differential backup, all subsequent backups will be disabled up to the next full
                       backup.

                       Within the Microsoft Windows File System, in marking a job disabled for the Windows XP (32 and
                       64 bit) iDataAgent and the Windows Server 2003 iDataAgent, ASR backups are treated
                       independently. If you disable an ASR Backup, it will disable the data protection operations that
                       you select.

                       Marking a job disabled for Oracle, Sybase, Informix, and DB2 is only supported for selective
                       copies.

Back to Top



How To
  View the Jobs of a Storage Policy Copy
  Manually Retain a Job on a Storage Policy Copy
  Prune a Job From a Copy
  Prune a Disaster Recovery Backup From a Disaster Recovery Backup Storage Policy Copy
  Disable a Job From a Copy
  Disable a Disaster Recovery Backup From a Disaster Recovery Storage Policy Copy
  View the Job Details of a Job on a Copy
  View the Events of a Job on a Storage Policy Copy
  View the Items that Failed for a Job on a Storage Policy Copy
  View the Media of a Job on a Storage Policy Copy
Back to Top








Data Multiplexing

Select the desired topic:
   Overview
   How Data Multiplexing Works
   Configure for Data Multiplexing
   Determining the Multiplexing Factor
   Perform a Multiplexed Data Protection Operation
   Using QiNetix Features With Data Multiplexing
   Best Practices
   License Requirement
   Support Information - Storage Policy Copy
   How To


Overview
In a typical storage policy configuration, many clients/subclients can point to the same storage policy. Each storage policy
copy has one or more streams, related to the number of drives in a drive pool. On a particular stream, only one subclient
can perform a data protection operation at any one time; therefore, only one data protection operation can be sent to a
media/drive at any one time.
This limitation has its disadvantages. Backing up one client/subclient to a single piece of media does not fully utilize the
drive's throughput, as reading the client's data can be much slower than the speed at which the tape drive can write.
In a large enterprise with many clients, many data protection operations may need to be performed within a fixed backup
window. This may lead to high hardware costs if the drives or media used for those data protection operations are
underutilized.
To optimally use the high speed tape drives available today, data from several clients/subclients can be multiplexed and
written to media.
Chunk Size of Data That is Multiplexed
The chunk size of data that is multiplexed is determined by the types of data that is being multiplexed. These two types of
data are file system data and database data.
   If the first backup is a file system type backup, all other backups joining multiplexing will have a chunk size of 2 GB.
   If the first backup is a database type backup, all other backups joining multiplexing will have a chunk size of 8 GB.
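
The chunk-size rule above can be stated as a single conditional. The following sketch is illustrative only; the function
name and type labels are hypothetical.

# Minimal sketch (not product code): chunk size used by backups that join a
# multiplexed write, based on the type of the first backup.

GB = 1024 ** 3

def multiplexed_chunk_size(first_backup_type):
    return 2 * GB if first_backup_type == "filesystem" else 8 * GB

print(multiplexed_chunk_size("filesystem") // GB)   # 2
print(multiplexed_chunk_size("database") // GB)     # 8
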


How Data Multiplexing Works
During a data protection operation, agent data is transferred to media over a data pipeline. This data is transferred by data
movers that read agent data then write the data to the media.
During data multiplexing, many such data movers must read and write data to the same piece of media. To achieve this,
the data movers are composed of two components: data receivers and data writers. During data multiplexing, one data
receiver per backup stream reads the data coming through the data pipeline, and one data writer per media receives data
from multiple data receivers and then writes the data to the media.
In the sample image that follows, Subclient_A and Subclient_B are being backed up at the same time and their data is
being multiplexed. Multiple data receivers read the data and then one data writer writes the data to a single piece of media.








Configuring for Data Multiplexing
To configure your subclients to use this feature, data multiplexing must be enabled from the Media tab of the Copy
Properties dialog box of the primary copy.

For example, if three subclients of this storage policy are to be backed up in a multiplexed manner, then the multiplexing
factor would be set to three.








Determining the Multiplexing Factor
The multiplexing factor should be determined by analyzing your network configuration and by examining your needs for
maximizing disk throughput to decrease the total amount of time it takes to protect your data. The multiplexing factor is
determined by the following:
     Network card speed
     Network switch speed
     Drive speed
The following examples will help you determine the multiplexing factor. Keep in mind that these are only hypothetical
examples.

1.   Let's analyze a network configuration that involves three clients, without and with multiplexing.
2.   What happens when a fourth client is added to the example and the multiplexing factor is set to four.
3.   A fifth client is added, and the multiplexing factor is set to five, is this over-multiplexing?
4.   If you have over-multiplexed, either set the multiplexing factor lower and multiplex less clients, or add some gigabit
     Ethernet switches to your network.
5.   In another example, client disk speeds are fast and they become slower after multiplexing.

Note that the maximum multiplexing factor that can be set from the CommCell Console is 10 and the system displays a
warning message when the multiplexing factor is set to 5 or above.



Perform a Multiplexed Data Protection Operation
Once the multiplexing factor is set on the primary copy of the storage policy whose subclients are to be backed up, all data
protection operations of the storage policy can run at the same time, to the same piece of media.
In the sample image that follows, Job IDs 142, 140, and 141 are all backing up to Lib_Drive1.




Perform Data Multiplexing Using a Magnetic Library
Data Multiplexing can be performed on a magnetic library by setting the maximum number of streams on the magnetic
storage policy to a value equal to the number of data protection operations that are to be performed simultaneously. For
more information on setting the number of data streams, see Storage Policy Copy Properties.
De-Multiplexing Multiplexed Data
De-multiplexing segregates the data of selected clients/subclients from the larger set of multiplexed clients. The
software does not require de-multiplexing; however, if you want to de-multiplex data that you have multiplexed, you
can create a subclient-based storage policy copy for each subclient within the original storage policy copy, and then perform
an auxiliary copy operation on that copy.








Be sure to adhere to Best Practices when using the data multiplexing feature.



Multiplexing and Data Streams
Data Multiplexing is performed differently based on whether or not you are performing multiple stream data protection
operations.
Data Multiplexing With Single Stream Data Protection Operations
In the following example, J1, J2, J3, and J4 have been run as single-stream data protection operations. There are two
drives available, D1 and D2.

If there is no data multiplexing:
J1 will use D1 and J2 will use D2. J3 and J4 will go into a waiting state until J1 and J2 have completed.

If data multiplexing is used with a multiplexing factor of two:
J1 and J2 will use D1. J3 and J4 will use D2.
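
The drive assignments in this example can be reproduced with a small fill-then-move-on model of the multiplexing factor.
The sketch below is illustrative only; the assign_jobs helper is a simplification, not product code.

# Minimal sketch (not product code): how a multiplexing factor groups
# single-stream jobs onto drives, as in the example above.

def assign_jobs(jobs, drives, multiplexing_factor):
    capacity = {d: multiplexing_factor for d in drives}
    assignment = {}
    for job in jobs:
        drive = next((d for d in drives if capacity[d] > 0), None)
        if drive is None:
            assignment[job] = "Waiting"
        else:
            capacity[drive] -= 1
            assignment[job] = drive
    return assignment

print(assign_jobs(["J1", "J2", "J3", "J4"], ["D1", "D2"], 1))
# {'J1': 'D1', 'J2': 'D2', 'J3': 'Waiting', 'J4': 'Waiting'}
print(assign_jobs(["J1", "J2", "J3", "J4"], ["D1", "D2"], 2))
# {'J1': 'D1', 'J2': 'D1', 'J3': 'D2', 'J4': 'D2'}
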

Data Multiplexing With Multiple Stream Data Protection Operations
The following examples illustrate data multiplexing with data protection operations that use multiple streams.
Data Multiplexing with File System Multiple Stream Data Protection Operations

In the following example, there are two jobs, J1 and J2. Each job was run with three streams. There are two drives, D1 and
D2.

If there is no data multiplexing:
J1 has three streams, and each stream uses D1, but they run one after another.

J2 also has three streams, and each stream uses D2, and they also run one after another.

If there is data multiplexing with a multiplexing factor of three:
The three streams of J1 can run concurrently to D1.

The three streams of J2 can run concurrently to D2.

Data Multiplexing with Database Multi Streaming

In the following example, three-stream database data protection operations are performed with a multiplexing factor of
three. J1, J2, and J3 are database data protection operations, and each uses three streams. There are three drives
available, D1, D2, and D3.

If there is no data multiplexing:
D1 - J1

D2 - J1

D3 - J1

The second and third job (J2 and J3) must wait for the necessary resources.

If there is data multiplexing with a multiplexing factor of three:
The first job (J1) uses three drives, D1, D2, and D3:

D1 - J1

D2 - J1

D3 - J1








The second and third job (J2 and J3) are multiplexed and use the same drives as J1:

D1 - J1, J2, J3

D2 - J1, J2, J3

D3 - J1, J2, J3

Therefore, J1, J2, and J3 use D1, D2, and D3 in parallel.



Using QiNetix Features With Data Multiplexing
The following features can be performed on multiplexed data without significant degradation of performance:
   A data recovery operation
   A data recovery operation using Media Explorer



Best Practices
It is recommended that you keep the following in mind when performing data multiplexing:
   Use different storage policies for file system and database type data before performing data multiplexing. This ensures
   that there are no differences in the chunk sizes of the different types of data.
   If possible, use the Restore by Jobs option to restore multiplexed data, especially when restoring a large amount of data.
   This provides the optimum performance during the restore operation, as fewer tape rewinds are needed to secure the
   data.
   It is recommended that you perform data multiplexing for jobs that have similar speeds (e.g., two database jobs), instead
   of mixing faster jobs (e.g., file systems) with slower jobs (e.g., databases). Mixing faster and slower jobs results in data
   that is not stored uniformly on the media; hence, data recovery operations for the slower clients incur an added
   performance penalty.
   Multiplexing is recommended if you are planning to recover:
       Individual items, files and folders.
       Entire computers or databases.
   It is not recommended under the following conditions:
       If you are planning to recover scattered folders, as multiplexing will further scatter the data. It also adds extra
       tape mounts and rewinding/forwarding on the media.
       Clients that undergo very frequent restore requests.
   The multiplexing factor is determined by the ratio of how fast the tape drive is compared to the disk. For example,
   consider the following speeds:
       Tape write speed = 80 GB per hour
       Disk read speed (backup) = 25 GB per hour
       Tape read speed = 80 GB per hour
       Disk write speed (restore) = 60 GB per hour
   Tape write speed / disk read speed (backup) = 80/25 = 3.2
   Tape read speed / disk write speed (restore) = 80/60 = 1.33
   It is recommended that the lower of the two ratios be used as the multiplexing factor if you want no-penalty data
   recovery operations. (This calculation is written out in the sketch after this list.)
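
The ratio arithmetic from the last bullet, written out in Python for clarity; the speeds are the hypothetical figures from the
example above, and this is an illustration rather than product code.

# Illustrative arithmetic only: the tape-to-disk speed ratios from the example
# above (all speeds in GB per hour) and the recommended multiplexing factor.

tape_write, disk_read = 80, 25      # backup direction
tape_read, disk_write = 80, 60      # restore direction

backup_ratio = tape_write / disk_read      # 80 / 25 = 3.2
restore_ratio = tape_read / disk_write     # 80 / 60 ~= 1.33

# The lower ratio is the recommended factor for no-penalty restores;
# in practice it would be rounded to a whole number.
recommended = min(backup_ratio, restore_ratio)
print(backup_ratio, round(restore_ratio, 2), round(recommended, 2))   # 3.2 1.33 1.33
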



License Requirement
This feature requires a Feature License to be available in the CommServe.
Review general license requirements included in License Administration. Also, View All Licenses provides step-by-step
instructions on how to view the license information.








Data Verification

Select the desired topic:
   Overview
   Configure a Copy for Data Verification
   Pick a Job for Data Verification
   Perform a Data Verification Operation
   Data Verification Considerations for NetApp NAS NDMP
   License Requirement
   Support Information - Storage Policy Copy
   How To


Overview
The software typically protects/archives data on various types of media. Once these operations have been performed, there
is no way to ascertain whether the data is valid for recovery until a data recovery operation has been performed on that
data. During a data verification operation, data is checked to confirm that it is valid for recovery and can be successfully
copied during an auxiliary copy operation. You can verify data on all copies, or on a specific copy, and in parallel streams. A
specific data protection/archive operation can also be verified on a copy. It is also possible to define how long the data
verification of a data protection job remains valid for a storage policy copy.
Back to Top



Configure a Copy for Data Verification
You can configure a copy for data verification so that all data
protection/archive operations, all full operations, or only those
occurring on or after a certain date will be verified during a data
verification operation. A copy can be configured for data verification
from the Data Verification tab.

Note that data verification is not set by default for any agents.




Back to Top



Pick a Job for Data Verification
You can select individual jobs to be verified during a data verification operation from the Backups for Copy window. You
can verify jobs on all types of storage policies. The example below is for an iDataAgent Backup storage policy.
The following describes the data verification statuses displayed in the Data Verification Status column:

Not Picked - A job has not been picked for a data verification operation.
Picked for Verification - A job has been picked for a data verification operation.
Successful - A data verification operation ran and successfully verified this job.
Failed - A data verification operation ran and failed to verify this job.
Partial - A data verification operation ran and has not yet completed verifying this job.
Once a job is picked, its status in the Data Verification Status field is displayed as Picked for Verification.




Once a data verification operation is run, the date and time of the operation is displayed in the Date and Time of Last
Verification column.




If this copy has already been configured for data verification, then individual backups do not have to be selected on the
copy.
Back to Top



Perform a Data Verification Operation
The operation can then be performed from the Data Verification
dialog box.








More advanced options are available from the Advanced Options for Data Verification dialog box, in which you can
identify the media in which data should be verified.
Back to Top



Data Verification Considerations for NetApp NAS NDMP
In addition to verifying data which has been backed up using NRS, data which has been backed up to a library attached to a
NetApp file server can also be verified to ensure that it is valid for recovery.
Two Windows registry keys are provided which allow you to redirect NetApp NAS NDMP data verification operations:
  sSaveFileHistory: If turned on, all files in the backup will be written to the AuxCopy log as they are encountered in the
  backup image.
  Note that this can be a large amount of data. If you choose to turn this option on, it is recommended you increase the
  log file size.
  sVerifyHeaderOnly: If turned on, the data verification operation will not run through the entire backup image. Instead,
  it will only process the dump header of the backup.
For more information on these registry keys, see Registry Keys and Parameters.
Back to Top



License Requirement
This feature requires a Feature License to be available in the CommServe.
Review general license requirements included in License Administration. Also, View All Licenses provides step-by-step
instructions on how to view the license information.
Back to Top



How To
  Configure a Storage Policy Copy for Data Verification
  Start a Data Verification Operation
  Schedule a Data Verification Operation
  Picking a Job on a Storage Policy Copy for Data Verification
Back to Top








Scheduling

Select the desired topic:
   Overview
   How To Use Scheduling
   Set Holidays
   Disable or Enable all Scheduled Operations
   Support Information - Scheduling
   Special Considerations
   How To
Related Topics:
   Schedule Policy
   Custom Calendar


Overview
Scheduling jobs within the CommCell is more than a convenience; it helps to ensure that the jobs in the CommCell are
automatically performed on a regular basis without user intervention. Scheduling can be based on the standard Gregorian
Calendar or a Custom Calendar.
You can create schedules for the following jobs:
   Data Protection Operations, such as:
     Backup, Migration, Archive (see Schedule Backups/Migrations/Archives)
     Recover Point / Consistent Recovery Point creation (see Create a Recovery Point Using CDR)
     QR Volume Creation (see Schedule QR Volume Creation)
     Snapshot Management Workflow (see Create/Edit/Delete a Snapshot Management Workflow Schedule)
   Data Recovery Operations, such as:
     Restore (see Restore Backup Data)
     Recovery (see Recover Data - DataMigrator for Exchange, Recover Data - DataMigrator for File System and Network
     Storage, and Recover Migrated Data)
     Retrieve (see Retrieve Archived Data - DataArchiver for Exchange)
     QR Volume Recovery (See Recover QR Volumes)
   Administration jobs, such as:
     Auxiliary Copy (see Schedule an Auxiliary Copy)
     Data Aging (see Schedule Data Aging)
     Data Verification (see Schedule a Data Verification Operation)
     Disaster Recovery Backup (see Schedule a Disaster Recovery Backup)
     Download Updates (see Update)
     Drive Cleaning (Schedule Drive Cleaning)
     Erase Backup/Migrated Data
     Erase Media (see Erase Spare Media)
     Export Media (See Schedule an Export Media Operation)
     Install Automatic Updates (see Automatic Updates)
     FTP Download (see Automatic Updates)
     Quick Inventory and Full Inventory for blind libraries
     Reports (See Schedule a Report and Modify a Scheduled Report)
     Vault Tracker (See Schedule a Tracking Policy)
If you have a large number of clients/backup sets/subclients, or storage policies in your CommCell that require the same
schedule, then you may want to create a schedule policy for a data protection or auxiliary copy operation. See Schedule
Policy for more information.
Back to Top







How to Use Scheduling
The task of scheduling jobs ensures that jobs can be run on a regular basis. To schedule a job, you need to define the
following entities:
   The job (i.e., data protection, data recovery, or administration operation) that you want to schedule.
   Whether the schedule is a one-time, daily, weekly, monthly or yearly schedule.
   The schedule pattern, depending on whether it is a one-time, daily, weekly, monthly or yearly schedule and the time
   zone in which it should be executed.
   The start and end time for the schedule.
If a job is not initiated at the scheduled time, the job will start if the problem causing the scheduling delay was rectified
within 120 minutes of the originally scheduled time. You can, however, use the schedWindow registry key to set a different
time window.
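As a rough illustration (not product code), the catch-up decision described above can be thought of as the following Python sketch; the 120-minute default and the configurable window mirror the behavior described, while the function and variable names are invented for the example.

from datetime import datetime, timedelta

DEFAULT_WINDOW_MINUTES = 120  # default catch-up window described above

def should_run_missed_job(scheduled_time, now=None, window_minutes=DEFAULT_WINDOW_MINUTES):
    # A job that missed its scheduled start is launched only if the delay was
    # resolved within the catch-up window (120 minutes by default, adjustable
    # via the schedWindow registry key).
    now = now or datetime.now()
    return scheduled_time <= now <= scheduled_time + timedelta(minutes=window_minutes)

# Example: a job scheduled for 02:00 that could not start until 03:15 still runs.
missed = datetime(2024, 1, 10, 2, 0)
print(should_run_missed_job(missed, now=datetime(2024, 1, 10, 3, 15)))   # True
print(should_run_missed_job(missed, now=datetime(2024, 1, 10, 4, 30)))   # False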
Jobs can be scheduled from the dialog box of the specific operation, or certain types of tasks can be scheduled from the Schedule Tasks dialog box, which is available from the Scheduled Jobs window.

Scheduling a Job From the Dialog Box of the Operation
Jobs can be scheduled using the Schedule Details dialog box of the specific job.

Scheduling a Task from the Scheduled Jobs Window
In addition to scheduling tasks from the dialog box of the operation, you can also create task schedules from the Schedule
Tasks dialog box, which is available from the Add button of the Scheduled Jobs window.

The following are the types of tasks that can be scheduled from this dialog box:
   Administration operations such as:
     Disaster Recovery Backup
     Data Aging
     Auxiliary Copy
   Data Protection operations
   Data Recovery operations

The types of tasks that can be scheduled from this dialog box vary according to the type of CommCell object that is selected. The following is an example of the types of tasks that can be scheduled for a selected backup set:




View a Job Schedule
Once a schedule is defined, it can be viewed from the Scheduled Jobs window from the following entities within the
CommCell:

Entity                          Type of Schedules That Can Be Viewed
CommServe                       All data protection, data recovery, administration operations, and reports.
Client                          All data protection and data recovery operations related to the client.
Agent                           All data protection and data recovery operations related to the agent.
Backup Set/Instance/Partition   All data protection and data recovery operations related to the backup set/instance/partition.
Subclient                       All data protection operations related to the subclient.
Storage Policy                  All data protection operations, Auxiliary Copy, Data Verification, and Disaster Recovery Backup jobs (if a Disaster Recovery Backup storage policy is selected) related to the Storage Policy.
Storage Policy Copy             All data protection operations, Auxiliary Copy, and Data Verification jobs related to the Storage Policy Copy.
Vault Tracker Policies          All schedules related to all Vault Tracker policies within the CommCell.
Vault Tracker Policy            All schedules related to the specific Vault Tracker policy.
View a Schedule List

Once schedules are created, they can be viewed from the Scheduled Jobs window.




Filter Operation

Filters are provided as an option in the Scheduled Jobs window to allow users to customize their view of the displayed scheduled jobs and/or schedule policies; for example, a user can configure a filter to display only the data operations for a particular client. Users can select from predefined filters or create filters from the Filter Definition dialog box. To Create a Schedule Filter, users must access the Scheduled Jobs window from the CommServe level. When viewing the Scheduled Jobs window from all other entities, users can select job-type filters for viewing purposes.

                           CommCell Administrators can utilize filters created by all users. All other users can only utilize
                           the filters that they create. If a user account is deleted, their filters will automatically be deleted
                           as well.

Monthly View of Job Schedules

A monthly view of these schedules can be created from the Schedule Filter Dialog. Once selected, the Schedule
Viewer dialog box displays the data protection operation schedules that were chosen from the Schedule Filter Dialog.
Keep in mind that this view shows only the jobs that are scheduled to occur at some future time; it does not show jobs that have already run. If you want to view the history of these jobs, see Job History.
Also, more detailed information can be viewed from the Schedule Viewer window.

Back to Top


Disable or Enable a Schedule
Disabling a schedule stops the jobs that are part of the schedule from running at their scheduled times. Once a schedule is disabled, its jobs do not run until the schedule is enabled again. You can enable or disable one or multiple schedules simultaneously.
Back to Top








Set Holidays
If you need to define days and times that you do not want your schedules to run, you can establish holidays for those
schedules. You can establish a holiday as an annual holiday or for a specific date within a given year. Annual holidays never
expire; other holidays are automatically deleted once the holiday occurs. Only one holiday per date is allowed. If you set a
holiday to occur on an annual basis, this supersedes any holiday that is already set for the same day.
Holiday scheduling applies to all the schedules within a CommServe and cannot be attached to any individual schedule. You
can, however, run immediate jobs on a holiday. Holidays are defined at the CommCell level and are evaluated in the
CommServe Time Zone. These holidays can be set from the Holidays dialog box.
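The holiday rules above (one holiday per date, annual entries never expiring and superseding one-time entries for the same day, one-time entries deleted once they occur) can be sketched as follows; this is an illustrative model only, with invented names.

from datetime import date

# Illustrative stores: annual holidays keyed by (month, day), one-time holidays by full date.
annual_holidays = {(1, 1)}                 # e.g., January 1 every year
one_time_holidays = {date(2024, 7, 5)}     # deleted automatically once the day has passed

def is_holiday(d):
    # Annual holidays take precedence; otherwise check the one-time list.
    if (d.month, d.day) in annual_holidays:
        return True
    return d in one_time_holidays

def purge_expired(today):
    # One-time holidays are removed after they occur; annual holidays never expire.
    one_time_holidays.difference_update({d for d in one_time_holidays if d < today})

print(is_holiday(date(2025, 1, 1)))   # True -- scheduled jobs on this day are skipped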

An appropriate event message is generated for each of the scheduled jobs that are skipped on a holiday. For example, if an annual holiday is set for January 1 and the Job Skipped alert is configured on the CommServe, an alert will be generated on January 1 of each year indicating that the scheduled jobs have been skipped. For more information on configuring alerts, see Alerts and Monitoring.
Back to Top



Disable or Enable all Scheduled Operations
All scheduled operations within the CommCell can be disabled or enabled. For more information, see Activity Control.
Back to Top



Special Considerations
Deconfiguration
During Deconfiguration of an Agent or Client:
All Data Protection/Data Recovery job schedules (independent and those part of a schedule policy), which are related to the
specific Agent or Client, will be disabled, and thus, not displayed in the CommCell Browser.
Agent Reinstallation
Reinstalling the Agent will enable any pre-existing schedules (independent and those part of a schedule policy) that were
previously disabled when the agent was uninstalled. However, if a change, such as an upgrade, is made to a deconfigured
agent prior to its reinstallation, the schedules related to that specific agent will no longer be recognized and will need to be
reconfigured manually after the reinstallation.


Back to Top



How To
   Create a Job Schedule
   View a Job Schedule
   View Job Schedules From the Schedule Viewer
   Modify a Job Schedule
   Disable a Schedule
   Delete a Job Schedule
   Run a Scheduled Job Immediately
   Schedule a Task from the Scheduled Jobs Window
   Exclude Schedules on Holidays
   Create a Schedule Filter
   View Details/Modify a Schedule Filter
   Delete a Schedule Filter
Alerts
   Configure Job-Based Alerts
   Modify Alerts








Scheduled Data Protection Operations

Choose from the following topics:
  Overview
  When to Schedule Data Protection Operations
  Scheduling Data Protection for Subclients, Backup Sets or Instance Level
  How To


Overview
Scheduled data protection operations provide a convenient means of securing data without user intervention. You can
establish data protection schedules for each subclient using the CommCell Console, as described in the online help. When
scheduling data protection operations, you need to establish a schedule for each subclient.
For example, a backup schedule always contains a full backup and may contain one or more other backup operations. When
combined for a given subclient, these backups comprise a full backup cycle.
Types of Data Protection Operations that can be Scheduled
The following types of data protection operations can be scheduled:
  Backup, Migration, Archive (see Schedule Backups/Migrations/Archives)
  File Replication
  QR Volume Creation (see Schedule QR Volume Creation)
  Snapshot Management Workflow (see Create/Edit/Delete a Snapshot Management Workflow Schedule)
Back to Top



When to Schedule Data Protection Operations
Data protection operations, like other processes, consume system resources. The extent to which any given operation
affects other applications depends on several factors:
  Amount of data to be secured
  Processing power of the computer from which the data is being secured
  Number of other operations occurring on the computer from which the data is being secured
  Compression mode of the operation (if applicable)
We suggest that you schedule regularly occurring data protection operations for times of low system utilization. For
example, you may want to avoid backing up or migrating data during office hours. If the data must travel across a network
to reach the destination library, then scheduling operations during off-peak hours can be even more important since
launching many operations simultaneously could diminish network responsiveness. The extent, if any, to which network
responsiveness is degraded depends on a number of issues including the quantity of data being secured concurrently,
capacity of the network, network configuration, etc.
It is often prudent to distribute the scheduled operations in a CommCell over some period of time in order to avoid media
drive and media group contention. The length of time for any particular CommCell depends on the amount of data to be
secured and the specific configuration of the CommCell with respect to libraries and storage policies. If the number of
media drives is small compared with the number of subclients in the CommCell, drive contention can occur. If the number
of storage policies is small compared with the number of subclients or if a specific storage policy is the target of many
subclients, then media group contention can occur. (Media group contention is discussed in Media Contention within
Removable Media Groups.) If either condition occurs, data protection operations will queue until the needed resource
becomes available. Consequently, operations may extend beyond the operation window that was intended for the
CommCell.
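One simple way to distribute start times, sketched below in Python with invented names, is to stagger subclient schedules evenly across an off-peak window so they do not all compete for the same drives or media groups at once.

from datetime import datetime, timedelta

def staggered_start_times(subclients, window_start, window_minutes):
    # Spread one start time per subclient evenly across an off-peak window.
    step = window_minutes / max(len(subclients), 1)
    return {name: window_start + timedelta(minutes=i * step)
            for i, name in enumerate(subclients)}

starts = staggered_start_times(
    ["Finance_DB", "HR_Files", "Web_Content", "Mail_Store"],
    window_start=datetime(2024, 1, 10, 22, 0),   # 10:00 P.M.
    window_minutes=240)                          # spread across four hours
for name, when in starts.items():
    print(name, when.strftime("%H:%M"))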
Back to Top



Scheduling Data Protection at the Subclient, Backup Set or Instance Level
The QiNetix system allows you to schedule or initiate backups at the subclient, instance and/or backup set level depending upon the agent. Selecting the backup set or instance saves you from having to select the individual subclients. If you select a backup set or instance, the QiNetix system applies the same schedule or data protection operation request to all constituent subclients.
Remember that the rules for initiating operations of sibling subclients still apply. If the subclients within a backup set or
instance are mapped to the same storage policy and that storage policy is not configured for multiple data streams, then
the system queues the operations, performing one data protection operation at a time. This topic is discussed further in
Establishing Parallel Backups via Subclients.
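The expansion and queuing behavior described above can be pictured with the following sketch (invented names, not product code): a request at the backup set level fans out to every constituent subclient, and subclients sharing a single-stream storage policy are processed one at a time.

def expand_request(backup_set):
    # A schedule placed on a backup set applies to all of its subclients.
    return list(backup_set["subclients"])

def run_serially_if_single_stream(subclients, storage_policy_streams):
    # With one data stream, sibling subclients are queued and run one at a time.
    if storage_policy_streams <= 1:
        for sc in subclients:          # queued: one data protection operation at a time
            yield [sc]
    else:
        yield subclients               # multiple streams allow parallel operations

backup_set = {"name": "defaultBackupSet", "subclients": ["default", "sub_A", "sub_B"]}
for batch in run_serially_if_single_stream(expand_request(backup_set), storage_policy_streams=1):
    print("run:", batch)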
Back to Top



How To
   Create a Job Schedule
   Schedule a Task from the Scheduled Jobs Window
Back to Top








Scheduled Data Recovery Operations

Choose from the following topics:
   Overview
   How To


Overview
When you want to restore/recover data, you usually need it immediately. However, there may be times when you need the
data by some specific time, but not necessarily right away. For example, perhaps you need to restore/recover some data
that you do not intend to use until the following day.
Using the scheduling feature, you can schedule a data recovery operation. As with scheduled data protection operations, a
scheduled data recovery operation relieves you of having to manually initiate the operation. This feature can be particularly
useful if you want to restore/recover a large amount of data, but would prefer to do so at a time when either the client
computer is not in use or at a time when network utilization is low (assuming the data must travel a network).
Types of Data Recovery Operations
The following types of data recovery operations can be scheduled:
   Restore
   Recovery
   Retrieve
   QR Volume Recovery
   Replica Recovery
   Restore by Jobs


How To
   Create a Job Schedule
   View a Job Schedule
   Modify a Job Schedule








Schedule Policy

Select the desired topic:
   Overview
   Auxiliary Copy Schedule Policy
     Auxiliary Copy Schedule Policy Operations
   Data Protection Schedule Policy
     Agent Specific Data Protection Schedule Policy
     All Agent Types Data Protection Schedule Policy
     Data Protection Schedule Policy Operations
   Support Information
Related Topics:
   Scheduling


Overview
If you have a large number of clients/backup sets/subclients or storage policies in your CommCell that require the same
backup or auxiliary copy schedule, then you can create a schedule policy. Schedule policies allow you to associate a
schedule or groups of schedules to any number of clients/backup sets/subclients or storage policies within your CommCell.
There are two types of schedule policies, Data Protection and Auxiliary Copy. A Data Protection schedule policy can be
agent specific, or applicable for all types of agents.
Back to Top



Auxiliary Copy Schedule Policy
An auxiliary copy schedule policy allows you to create an auxiliary copy schedule, and associate it with any number of
storage policies or storage policy copies. When the auxiliary copy schedule policy is run, an auxiliary copy operation will be
performed on all associated storage policies and/or copies.
If storage policy level association is given to an auxiliary copy schedule policy, then all future additions of copies for the
selected storage policies will be included in the auxiliary copy schedule.
Auxiliary Copy Schedule Policy Operations
The following operations can be performed on an auxiliary copy schedule policy:
Create an Auxiliary Copy Schedule Policy

You can create an Auxiliary Copy schedule policy from the New Schedule Policy dialog box.

For example, an auxiliary copy schedule of a weekly auxiliary copy
operation is created to copy data to the selective copies of all the
storage policies within a CommCell. In the sample image that follows,
a schedule policy named Selective_AuxCopy is created from the New
Schedule Policy dialog box.








Add Schedules
In order to add schedules to this schedule policy:
   A schedule for a weekly auxiliary copy operation is created from the Schedule Details tab of the Schedule Details
   dialog box.
   Additional auxiliary copy options are selected from the AuxCopy Options tab.

Once the auxiliary copy schedule is defined, the schedule can be
viewed from the Aux Copy Schedule pane of the New Schedule
Policy dialog box.




Give the Schedule Policy CommCell Object Association
In order to include the storage policy copies in the auxiliary copy operation, you must give the schedule policy an
association for those storage policy copies.

In the sample image that follows, schedule policy Selective_AuxCopy
was given association to all the selective copies within the CommCell.








Assign an Alert to the Schedule Policy
You can assign an alert to a schedule policy so that the configured alert is generated for the operations of the schedule policy.
View the Schedule Policy

Once schedule policy Selective_AuxCopy is created, it can be viewed from the right-hand side of the CommCell Browser
from the Schedule Policies node. If an alert is configured for the schedule policy, the alert name can be viewed in the Alert
column.
The schedules of this schedule policy can also be viewed from the Scheduled Jobs window. The jobs related to this
schedule policy are identified with a schedule policy icon.
Edit

Once created, the options of an Auxiliary Copy schedule policy can be changed by using the Edit option that is available from an existing schedule policy in the right pane of the CommCell Browser, under the Schedule Policies level.
Schedule policies cannot be edited by multiple users simultaneously. If a user attempts to edit a schedule policy that is
currently being viewed by another user, a warning message will display indicating that the property sheet is currently being
edited, and therefore, locked. Users will only have the option to view the schedule policy without making any changes to it.
Delete

A schedule policy can be deleted by using the Delete option available from the right pane of the CommCell Browser, under
the Schedule Policies level. Subsequently, all of the schedules of the schedule policy will be deleted and will not run at
their scheduled time.
Disable/Enable an Auxiliary Copy Schedule Policy

Disabling a schedule policy allows you to disable all the schedules that were part of the schedule policy. Once a schedule
policy is disabled, all jobs of all associated schedules will not run until the schedule policy is enabled again.
A schedule policy can be disabled or enabled by right-clicking an auxiliary copy schedule policy from the Schedule
Policies node of the CommCell Browser.
Clone a Schedule Policy

Cloning a schedule policy allows you to duplicate an existing schedule policy without having to configure a new one from scratch. Once
a schedule policy is cloned, the new schedule policy will have the same associations and schedules as the original schedule
policy. However, the status of the newly created schedule policy will be disabled.
A schedule policy can be cloned from the Schedule Policies node of the CommCell Browser. See Cloning for an overview.
Run Schedules

The schedules of a schedule policy can be run immediately using the Run option available from the right pane of the
CommCell Browser, under the Schedule Policies level.
Decouple a Schedule From an Auxiliary Copy Schedule Policy







Decoupling a schedule from an auxiliary copy schedule policy removes its association from the schedule policy. Once a schedule is decoupled, it is treated as a normal schedule without any schedule policy association. A schedule can be
decoupled from a Schedule Policy from the Scheduled Jobs window.

Back to Top



Data Protection Schedule Policy
There are two types of data protection schedule policies, an agent specific data protection schedule policy and an all agent
types data protection schedule policy.
Agent Specific Data Protection Schedule Policy
A Data Protection schedule policy allows you to define a maximum of six schedules, and associate them to any number of
client computer groups, client computers, backup sets, and subclients for a particular agent. If the association is at the
client computer group, client, or backup set, all items under the one chosen will be associated with this schedule policy.
You can define which type of data protection operation you want to schedule, as well as configure Advanced Backup Options
and assign an object association in the CommCell Browser.
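As a hypothetical sketch of the constraints just described, the helpers below enforce the six-schedule limit and cascade an association placed at a higher level down to the items beneath it; the names and data structures are invented for illustration only.

MAX_SCHEDULES = 6   # a data protection schedule policy may define up to six schedules

def add_schedule(policy, schedule):
    if len(policy["schedules"]) >= MAX_SCHEDULES:
        raise ValueError("a schedule policy may contain at most six schedules")
    policy["schedules"].append(schedule)

def effective_subclients(association, tree):
    # Association at the client or backup set level covers everything beneath it.
    if association["level"] == "subclient":
        return [association["name"]]
    return [sc for sc in tree if sc.startswith(association["name"] + "/")]

policy = {"name": "Tech_Subclients", "schedules": []}
add_schedule(policy, {"type": "daily incremental"})

tree = ["clientA/setA/default", "clientA/setA/sub1", "clientA/setB/default"]
print(effective_subclients({"level": "backup set", "name": "clientA/setA"}, tree))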
All Agent Types Data Protection Schedule Policy
An All Agent Types schedule policy is a generic data protection schedule policy that has most of the same options as a
regular Data Protection schedule policy, and is supported by all agents that support scheduling. Like a regular Data
Protection schedule policy, this schedule policy allows you to define a maximum of six schedules, and associate them to any
number of client computers, backup sets, and subclients for a particular agent. If the association is at the client computer
group, client or backup set, all items under the one chosen will be associated with this schedule policy.
You can define which type of data protection operation you want to schedule (either full or incremental), as well as
configure Advanced Backup/Migrate/Archive Options and assign object association in the CommCell Browser.
Upon installation of the software, the All Agent Types (System Created) schedule policy is automatically created. You
can also create additional All Agent Types schedule policies manually.
Due to the generic nature of schedule policies, certain agent-specific options such as the Oracle iDataAgent's Backup Level
and Cumulative options are unavailable in the Schedule Details (Job Options) dialog. As a result, when a data protection
schedule policy is used for Oracle iDataAgents, only a default incremental backup (Level 1, non-cumulative) can be
scheduled.
Data Protection Schedule Policy Operations
The following operations can be performed on a data protection schedule policy:
Create a Data Protection Schedule Policy

You can create a Data Protection schedule policy from the New Schedule Policy dialog box.

For example, three schedules of a daily incremental backup, weekly differential backup, and a monthly full backup are
needed to back up the subclients of multiple clients within a CommCell. In the sample image that follows, a schedule policy
named Tech_Subclients is created from the New Schedule Policy dialog box.









Add Schedules

In order to add schedules to this schedule policy:
   A schedule for a daily incremental backup is first added from the
   Schedule Details tab of the Schedule Details dialog box.
   The type of Data Protection operation is selected from the Backup
   Options tab.
   Once the schedules for a weekly differential backup and a monthly
   full backup are defined, they can be viewed from the Data
   Protection Schedules pane of the New Schedule Policy dialog box.




Give the Schedule Policy CommCell Object Association
In order for Backup:Differential, Backup:Incremental, and Backup:Full to back up data for the necessary subclients,
you must give the schedule policy an association for those subclients.







When selecting SQL Server 2000/2005 (File / File Group Subclients) as the iDataAgent on the General tab, user-defined databases and user-defined subclients will appear in the Associations tab; system databases will not display. When selecting the box at the database level, only the user-defined subclients are backed up; the default subclient is not backed up.
In the sample image that follows, schedule policy Tech_Subclients was given association at the Tech_Data, Tech_Support,
Tech_Integrate, and Tech_Test subclients from the Associations tab of the New Schedule Policy dialog box.





Assign an Alert to the Schedule Policy
You can assign an alert to a schedule policy so that the configured alert is generated for the operations of the schedule policy.
View the Schedule Policy

Once schedule policy Tech_Subclients is created, it can be viewed from the right-hand side of the CommCell Browser
from the Schedule Policies level. If an alert is configured for the schedule policy, the alert name can be viewed in the
Alert column.

The schedules of this schedule policy can also be viewed from the Scheduled Jobs window. The jobs related to this
schedule policy are identified with a schedule policy icon.
Edit

Once created, the options of a schedule policy can be changed by using the Edit option that is available from an existing schedule policy in the right pane of the CommCell Browser, under the Schedule Policies level.
Schedule policies cannot be edited by multiple users simultaneously. If a user attempts to edit a schedule policy that is
currently being viewed by another user, a warning message will display indicating that the property sheet is currently being
edited, and therefore, locked. Users will only have the option to view the schedule policy without making any changes to it.
Delete

A schedule policy can be deleted by using the Delete option available from the right pane of the CommCell Browser, under
the Schedule Policies level. All of the schedules of the schedule policy will be deleted, and, subsequently, the scheduled
jobs will not run at their scheduled time.
Disable or Enable

Disabling a schedule policy allows you to disable, at one time, all the schedules that were part of a schedule policy. Once a
schedule policy is disabled, all jobs of all associated schedules will not run until the schedule policy is enabled again.
A schedule policy or multiple schedule policies can be disabled or enabled from the Enable or Disable option that is available
from the right pane under the Schedule Policies level of the CommCell Browser.
Clone

Cloning a schedule policy allows you to duplicate an existing schedule policy without having to configure a new one from scratch. Once
a schedule policy is cloned, the new schedule policy will have the same associations and schedules as the original schedule
policy. However, the status of the newly created schedule policy will be disabled.








A schedule policy can be cloned from the Schedule Policies level of the CommCell Browser. See Cloning for an overview.
Run Schedules

The schedules of a schedule policy can be run immediately using the Run option available from the right pane of the
CommCell Browser, under the Schedule Policies level.
Decouple a Schedule From a Data Protection Schedule Policy

Decoupling a scheduled job from a Data Protection schedule policy removes its association from the schedule policy; this causes all associated schedules to be decoupled. Once decoupled, these schedules are treated as normal schedules without any schedule policy association. Note that you cannot decouple a schedule from an All
Agent Types Data Protection schedule policy.
Back to Top



How To
All schedule policy types:
   Add Additional Schedules To a Schedule Policy
   Decouple a Scheduled Job From a Schedule Policy
   Delete a Schedule Policy
   Disable or Enable a Schedule Policy or Groups of Schedule Policies
   Clone a Schedule Policy
   Modify a Schedule of a Schedule Policy
   Modify the Name of a Schedule Policy
   View a Schedule Policy
   Run the Schedules of a Schedule Policy Immediately
   Delete a Schedule From a Schedule Policy
   Configure an Alert for a Schedule Policy
   Modify an Alert for a Schedule Policy
   Delete an Alert of a Schedule Policy
Auxiliary Copy Schedule Policy Only:
   Create an Auxiliary Copy Schedule Policy
   Change the Association of an Auxiliary Copy Schedule Policy
Agent Specific or All Agent Types Data Protection Schedule Policy Only:
   Create an Agent Specific or All Agent Types Schedule Policy
   Change the Association of an Agent Specific or All Agent Types Data Protection Schedule Policy
Back to Top








Data Encryption

Choose from the following topics:
   Overview
   Auxiliary Copy Operations and Encryption
   Replication Encryption
   Disabling Encryption
   Important Considerations
   License Requirement
   Support Information - Data Encryption
   How To


Overview
QiNetix allows encrypting data both for transmission over non-secure networks and for storage on media. While data is
always encrypted in the same way (Blowfish algorithm with 128-bit keys), the flexibility of key management schemes
makes QiNetix encryption useful in a wide variety of configurations.
If you need only network security, the encryption keys are randomly chosen for every session. Data is encrypted on the
Client and is decrypted on the MediaAgent and the keys are discarded at the end. The entire process is completely
transparent. All you have to do is to enable encryption.
If you are concerned that media may be misplaced, QiNetix can encrypt data before writing it to the media and store the
keys in the CommServe database. In this way, recovery of the data without the CommServe is impossible - not even with
Media Explorer. This mode is also completely transparent. Once enabled, it will work requiring no additional activity on your
part.
Additionally, encryption keys can be protected with your own pass-phrase (RSA algorithm with 1024-bit keys) before being
stored in the database. If the database is accessed by unauthorized users, and the media is stolen, the data will still not be
recoverable without the pass-phrase. This highest level of security comes at the price of having to enter the pass-phrase for
every recovery operation and not being able to run synthetic full backups. But even this mode can further be customized to
fit specific needs:
   By exporting a file that contains the scrambled pass-phrase of the client computer to a dedicated directory on another
   computer, QiNetix can recover the client’s data to that (and only that) computer without prompting you for the pass-
   phrase.
   Explicitly enabling synthetic full backups in the GUI will create a copy of unlocked encryption keys in the database, which
   will be accessible only to synthetic full data protection operations. In this case the regular data recovery operations will
   still prompt you for a pass-phrase, but synthetic full data protection operations will not.
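The key-handling idea (a random data key that can optionally be wrapped with a pass-phrase-derived key before being stored) is sketched below using only the Python standard library. This is a conceptual stand-in, not the product's Blowfish/RSA implementation, and every name in it is invented.

import hashlib
import secrets

def new_data_key():
    # Random per-session key; used as-is when only network security is needed.
    return secrets.token_bytes(16)          # 128-bit key, as in the scheme above

def wrap_key_with_passphrase(data_key, passphrase, salt):
    # Conceptual stand-in for pass-phrase protection of the stored key.
    kek = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000, dklen=len(data_key))
    return bytes(a ^ b for a, b in zip(data_key, kek))   # XOR wrap for illustration only

salt = secrets.token_bytes(16)
key = new_data_key()
locked = wrap_key_with_passphrase(key, "my pass-phrase", salt)
# Recovery requires the same pass-phrase; unwrapping is the same XOR operation.
assert wrap_key_with_passphrase(locked, "my pass-phrase", salt) == key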




Auxiliary Copy Operations and Encryption
When you run an auxiliary copy operation, the copy assumes the settings of the primary copy. Therefore, if the primary
copy data is encrypted, then the auxiliary copy data will be encrypted; and if the primary copy data is not encrypted, then
the auxiliary copy data will not be encrypted.
Impacts on Media Explorer data recovery operations from auxiliary copies are explained in Change Encryption Settings.




Replication Encryption
Data being replicated can be encrypted between the source and destination computers.
When encryption is enabled, data is encrypted on the source computer, replicated across the network to the destination
computer, and decrypted on the destination computer. Encryption for replication is specified on the Replication Set level,
and applies to all of its Replication Pairs. For a given Replication Set, you can enable or disable encryption between the source and destination machines. See Configure the Replication Set for Data Encryption for step-by-step instructions.




Disabling Encryption
Once you have enabled encryption functionality, there are different approaches to backing out of the functionality. You need
to be aware of the behaviors that result from each approach. Refer to Change Encryption Settings.
If an exported pass-phrase was not synchronized with the last source client's pass-phrase at the time encryption was
disabled (setting change from With a Pass-Phrase directly to Disabled), subsequent recovery operations may present an
erroneous message "Invalid pass-phrase specified. Please check the spelling and try again". If the data you are recovering
was not encrypted, this message can be ignored as the recovery will run successfully. If the data was encrypted with pass-
phrase protection, you will need to provide the correct (last) source client's pass-phrase.
When you disable encryption after having exported a pass-phrase, the exported file is not deleted. To remove the file,
locate the <hostname>.pf file in the <software installation path>\PF folder that is named for the source client.


                             Do not delete the exported synched pass-phrase file when DataMigrator is present on the client
                             computer. If a migration was done using encryption and the key is deleted, stub recoveries will
                             not be possible. At that point, your remaining option would be to perform a browse/recovery and
                             provide the correct Decryption key.




Important Considerations
Keep the following in mind when encrypting data:
   Since encrypting data converts it into a random form (not easily understood), it becomes less compressible than non-
   encrypted data. It is therefore recommended that you do not enable hardware or software compression on encrypted
   data, as doing so may actually make the data grow in size.
   The backup throughput for encrypted data will be lower when compared to non-encrypted data. Enabling Client
   Compression may provide a higher throughput for encrypted data which is not already compressed.
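The compressibility point can be demonstrated with a few lines of standard-library Python: compressing random bytes (which is what encrypted data looks like) yields little or no savings and can even grow the data slightly, while repetitive plain data shrinks substantially.

import os
import zlib

plain = b"database page contents " * 4096          # highly repetitive sample data
random_like = os.urandom(len(plain))                # stands in for encrypted data

print(len(plain), "->", len(zlib.compress(plain)))              # large reduction
print(len(random_like), "->", len(zlib.compress(random_like)))  # little or no reduction, may grow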



License Requirement
This feature requires a Feature License to be available in the CommServe.
Review general license requirements included in License Administration. Also, View All Licenses provides step-by-step
instructions on how to view the license information.




How To
   Configure the Client for Data Encryption
   Configure the Subclient for Data Encryption
   Configure the Replication Set for Data Encryption
   Configure the Instance for Third-party Command Line Encrypted Operations

   Export an Encryption Pass-Phrase

   Recover Encrypted Data (Regular)
   Recover Encrypted Data (With a Pass-Phrase)

   Change Encryption Settings








Data Compression

Choose the following topic:
   Overview
   Software Compression
      Client Compression
         Considerations for Quick Recovery Agent
      Replication Compression
      MediaAgent Compression
   Hardware Compression
      EMC Centera and NAS NDMP iDataAgents
   Replication Compression
   Important Considerations
      Auxiliary Copy and Data Compression
      NetWare File System iDataAgent
      Subclient Policies and Software Compression
   Support Information - Data Compression
   How To


Overview
Data compression options are provided for data secured by data protection operations. Compression reduces the quantity of
data sent to storage, often doubling the effective capacity of the media (depending on the nature of the data). If the data is
later restored/recovered, the system automatically decompresses the data and restores it to its original state.
The following data compression options are provided:
   Software compression which includes options to compress the data in the:
      Client
      MediaAgent
   Hardware compression for libraries with tape media at the individual data path
As compressed data often increases in size if it is again subjected to compression, the system only applies one type of
compression for a given data protection operation. You can redefine the compression type at any time without
compromising your ability to restore/recover data.
When hardware compression is available and applied, it has precedence over the other compression selections. If hardware
compression is enabled for a data path, then all data conducted through that data path is compressed using the hardware
compression. If hardware compression is disabled for a data path, then the data is handled in accordance with the software
compression selection of each subclient that backs up to the data path. Selections under each subclient include options for
Client compression, MediaAgent compression, or no compression.
Also keep in mind that hardware compression is not applicable to magnetic libraries; hence, the software compression selection for the subclient is used for data paths associated with magnetic libraries.
Note that at any given time you can view the compression scheme used for protecting a subclient's data by viewing the
details of the data paths associated with the subclient. See View Data Paths Associated With a Subclient for step-by-step
instructions.
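The precedence rules above reduce to a small decision, sketched here with invented names: hardware compression on the data path wins when the path goes to a tape library; otherwise the subclient's software compression selection (client, MediaAgent, or none) applies.

def effective_compression(data_path, subclient_selection):
    # Return which single compression scheme applies to a data protection operation.
    # data_path: dict with 'library_type' ('tape' or 'magnetic') and
    #            'hardware_compression' (True/False).
    # subclient_selection: 'client', 'mediaagent', or 'none'.
    if data_path["library_type"] == "tape" and data_path["hardware_compression"]:
        return "hardware"                      # hardware compression takes precedence
    return subclient_selection                 # otherwise the subclient setting is used

print(effective_compression({"library_type": "tape", "hardware_compression": True}, "client"))
print(effective_compression({"library_type": "magnetic", "hardware_compression": False}, "mediaagent"))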



Software Compression

Client Compression
The client compression is specified on the subclient level for most
agents. (For database iDataAgents, this is specified on the instance
level.)








Client compression is available for all storage media. This scheme
compresses the data on the client computer using the compression
software. The compressed data is then sent to the MediaAgent
which in turn directs it to a storage media. Client compression is
useful if the client and MediaAgent reside on separate computers
and therefore the client must send its data using a network. Client
compression reduces the network load since the data is
compressed before it leaves the client.
Note that client compression may not be suitable in all
circumstances. Using software to compress data can be processor
intensive. Consequently, you may not want to use client
compression for client systems with limited processing power. In
such cases, MediaAgent compression may be more efficient.
The diagram on the right illustrates Client Compression.

Considerations for Quick Recovery Agent

For the Quick Recovery Agent, when the client compression is enabled, objects are compressed on the source computer in
the beginning of the copy and uncompressed on the destination computer at the end of the copy.

Replication Compression
Data being replicated can be compressed between the source and
destination computers. When compression is enabled, data is
compressed on the source computer, replicated across the network
to the destination computer, and uncompressed on the destination
computer, thereby reducing the network load. Compression for
replication is specified on the Replication Set level, and applies to
all of its Replication Pairs. For a given Replication Set, you can
enable or disable client compression between the source and
destination machines. See Enable or Disable Software Compression
for a Replication Set for step-by-step configuration instructions.
The diagram on the right illustrates Replication Compression.

MediaAgent Compression
The MediaAgent compression is specified on the subclient level for
most agents. (For database iDataAgents like SAP, Oracle, etc. the
compression type is specified on the instance level). For a given
subclient or instance as appropriate, you can enable or disable
MediaAgent compression for all data paths which do not have
hardware compression enabled.
MediaAgent compression is available for all storage media. This
scheme compresses the data on the MediaAgent using compression
software in the MediaAgent. The compressed data is then sent
from the MediaAgent to the storage media. MediaAgent
compression can be useful if the MediaAgent software resides on a
computer that is more powerful than the client computer. Using
software to compress data can be processor intensive;
consequently, you may want to use MediaAgent compression for
client computers with limited processing power.
The diagram on the right illustrates MediaAgent Compression.

              Note that data compressed on the MediaAgent during data protection is decompressed on the client
              computer during data recovery.



See the following procedures for step-by-step instructions on enabling (or disabling) software compression:







   Enable or Disable Software Compression for a Subclient
   Enable or Disable Software Compression and Network Bandwidth for a QR Subclient
   Enable or Disable Software Compression for Command Line Backups (DB2, Informix, Oracle, Oracle RAC, SAP)
   Enable or Disable Software Compression for Log Backups (DB2, Informix, Oracle, Oracle RAC, Sybase)
   Enable or Disable Software Compression for a Replication Set


Hardware Compression
The hardware compression is established on the data path level. This
kind of compression is only available for data paths that direct data to
tape libraries. This compression scheme sends uncompressed data
from the client computer through the data path to the media. There
the tape drive hardware compresses the data before writing it to the
media.
Generally, hardware compression is faster than software compression
since it is performed by dedicated circuitry. This compression scheme
is particularly suited for direct-connect configurations where the
subclient and MediaAgent are hosted by the same physical computer.
In such configurations, there are no network bottlenecks that can
throttle the transfer of data to the media drives. Therefore, the drives
can compress the data as quickly as it is sent by the subclient. In such
configurations, hardware compression can not only boost the virtual
capacity of the tape but the performance of the data protection
operation as well, because the tape, operating at high speed, stores
more data per unit time than it would otherwise.
The diagram on the right illustrates Hardware Compression.
Note that hardware compression is only supported by tape libraries. Hardware compression is not applicable for magnetic
and optical libraries.
Hardware compression may be less useful when data secured by data protection operations must compete with other data
for network bandwidth. If the network becomes congested, the tape drives can become starved for data. In this condition,
the drives still compress the data, but because data is not supplied quickly enough, the drives must stop and restart the
media as more data becomes available. As a result, performance may suffer.
See Enable or Disable Hardware Compression for step-by-step instructions.
EMC Centera and NAS NDMP iDataAgents
Note the following for using Hardware Data Compression for data protection operations involving the NAS NDMP
iDataAgents:
   For a tape drive attached to a NetApp file server, hardware compression is always on.
   For a tape drive attached to a BlueArc or EMC Celerra file server, hardware compression can be configured in the normal
   manner.
   For EMC Centera, or any of the NAS NDMP iDataAgents, when using NDMP Remote Server (NRS) and a drive pool
   configured to a MediaAgent, hardware compression can be configured in the normal manner.


Important Considerations
   Auxiliary Copy and Data Compression
   Software compressed data is not uncompressed during an auxiliary copy operation.
   NetWare File System iDataAgent
   The NetWare File System data can be in compressed format on a volume that supports compression, and data backed up in compressed format can only be restored to a volume that supports compression. The Decompress Data before Backup option allows you to select whether to decompress the data before it is backed up, so that the data on the backup media can be restored to either a compressed or an uncompressed volume. By default, data is backed up in a compressed format if the data is on a volume that supports compression.
   See Enable Decompress Data before Backup for a NetWare File System Backup Set for step by step instructions.








  Subclient Policies and Software Compression
   Software compression settings are part of a subclient policy. Once the subclient policy is associated with a backup set, software compression can be overridden at the subclient level, if necessary.



How To
Software Compression
   Enable or Disable Software Compression for a Subclient
   Enable or Disable Software Compression and Network Bandwidth for a QR Subclient
   Enable or Disable Software Compression for Command Line Backups (DB2, Informix, Oracle, Oracle RAC, SAP)
   Enable or Disable Software Compression for Log Backups (DB2, Informix, Oracle, Oracle RAC, Sybase)
   Enable or Disable Software Compression for a Replication Set (ContinuousDataReplicator)
Hardware Compression
  Enable or Disable Hardware Compression
Replication Compression
  Configure Software Compression for a Replication Set
General
  Enable Decompress Data before Backup for a NetWare File System Backup Set
  View Data Paths Associated With a Subclient
Back to Top








Auxiliary Copy

Choose the following topic:
   Overview
   Auxiliary Copy With Synchronous and Selective Copies
      Synchronous Copies
      Selective Copies
   Auxiliary Copy and Other Copy Features
      Auxiliary Copy with Deferred Copies
      Auxiliary Copy with Automatic Copies
      Auxiliary Copy and Spool Copies
      Auxiliary Copy and Inline Copies
      Auxiliary Copy With Combined Streams
   Auxiliary Copy With Multiple Stream Parallelism
      Allow Maximum Number of Streams Option
      Limit to Number of Streams Option
   Auxiliary Copy Operations
   Sequence in Which Data is Copied During an Auxiliary Copy Operation
   Recovering Data From Copies
      Browse/Restore/Recover from Copy Precedence
      Browsing From Copy Precedence Across Multiple Storage Policies
      Restoring Data from a Secondary Copy using a Third-Party Command Line
   Safeguarding Your Data Using Auxiliary Copy With Selective Copies
   Auxiliary Copy Considerations
   Customize Auxiliary Copy Operations through Registry Keys
   Support Information - Auxiliary Copy
   How To
Related Topics:
   Command Line Interface - qoperation auxcopy
   Media Refresh
   Save a Job as Script
   Storage Policy Copies
      Automatic Copy
      Deferred Copy
      Inline Copy
      Spool Copy
      Source Copy
      Subclient-Based Storage Policy Copy
   Storage Policy Copy Properties
   Streams


Overview
An auxiliary copy operation allows you to create secondary copies of data associated with data protection operations,
independent of the original copy. For a full understanding, you should have some basic knowledge of storage policy and
storage policy copy configurations. See Storage Policies and Storage Policy Copies for more information.
The auxiliary copy operation can be useful for creating additional standby copies of data. The primary and secondary copies
use different media and often use different libraries, depending on the configuration. Should the primary copy become
inoperative, perhaps due to a storage media failure, or a library or network malfunction, you can promote a synchronous
copy to become the primary copy. This allows you to continue operations as before and make repairs without interrupting
data protection and data recovery operations.







                          If it is necessary to separately maintain the media for the copies in different libraries, you can
                          configure your auxiliary copy operation to prevent the primary and secondary copies' data paths
                          from using the same library. This can be achieved by utilizing the SetCopyFlags utility, which is
                          available in the Resource Pack CD-ROM.

When an auxiliary copy operation is started, all valid data from a source copy is copied to all, or to one, of the active secondary copies within the storage policy. A source copy can either be the primary copy (the default) or a secondary copy that has been selected as the source copy. The following figure illustrates a primary copy as the source copy for an auxiliary copy operation:




                          If the tape for a requested browse/data recovery operation is outside of a library, you will be
                          prompted to manually input it into the library. The tape from a secondary copy will not be
                          automatically used even if the data exists on the tape for the secondary copy.
                          While performing Auxiliary Copy operations, priority is provided to perform the operation in a
                          LAN-free environment.

Auxiliary Copies vs. Standard Data Protection Operations
An auxiliary copy should not be confused with a standard data protection operation. The two operations are unrelated,
except, of course, that a data protection operation must precede an auxiliary copy. In all other ways the two operations are
distinct and must be initiated or scheduled individually. A data protection operation is specific to a particular subclient,
copying the subclient content from the client computer to the primary storage policy copy. An auxiliary copy, however, does
not involve clients; instead, it copies backed up data from a source copy to one or more secondary copies. If you want the
auxiliary copy operation to capture the data of only one subclient, then you must ensure that subclient has a dedicated
storage policy.
Auxiliary Copy and Hardware Compression
Auxiliary copy does not manipulate software compression on the data. The data is transferred as it is. The hardware
compression is transparent. Thus, hardware compressed data is uncompressed by the tape device on read, and
recompressed during tape write.



Auxiliary Copy With Synchronous and Selective Copies
An auxiliary copy operation copies valid data from a source copy of a specific storage policy to all, or to one, of the active secondary copies within the storage policy. Data from a source copy is not copied to inactive secondary copies.
These secondary copies can be either synchronous or selective copies. The following sections describe how data is copied
during an auxiliary copy job on both types of copies.
Synchronous Copies
The auxiliary copy operation will copy full, incremental, and differential backup data from a source copy to other
synchronous copies based on the All Backups option or the Backups On And After date you selected from the Copy
Policy tab of the Copy Properties dialog box.
All Backups Option

If you select the All Backups option when creating a synchronous copy, all data protection operations on a source copy will
be copied to the synchronous copy.
In the following example, if an auxiliary copy is run on 9/5, F1, I1, D1, and F2 will be copied to the synchronous copy. The
source copy for the operation is the primary copy.








When the All Backups option is selected, all data protection operations on a source copy will be copied to the synchronous
copy starting from the first full backup. If there are non-full backup operations that do not have an associated full backup,
these operations will not be copied to the synchronous copy.
In the following example, if an auxiliary copy is run on 9/7, only F2, I2, D2, and F3 will be copied to the synchronous copy.
The source copy for the operation is the primary copy.




                          When auxiliary copies are run on DataMigrator storage policies, all migration operations will be
                          copied, regardless of whether a new Index exists.







Backups On and After Option

If you select a date from the Backups On and After field, then all data protection operations starting from the first full
backup on or after the date you select (starting from 12:00 A.M.) will be copied to the synchronous copy. This option is
useful if you do not want all the data protection operations of a source copy to be copied to the synchronous copy.
In the following example, the Backups On and After date was set to 9/4. All data protection operations starting from 9/4
are copied to the synchronous copy when an auxiliary copy operation is run on 9/9. All data protection operations prior to
9/4 are not copied.




If the Backups On and After date is set to 9/5, all data protection operations starting from 9/7 will be copied when an
auxiliary copy is run on 9/9. Because the date of 9/7 contains the first full backup after the Backups On And After Date,
F3 will be the first job copied to the synchronous copy.
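Both rules can be expressed compactly; the sketch below (with invented job records) starts copying at the first full backup, or at the first full backup on or after a chosen date, and leaves out non-full jobs that precede that full.

from datetime import date

jobs = [  # (date, type) in chronological order, as in the examples above
    (date(2023, 9, 1), "incremental"),
    (date(2023, 9, 2), "full"),         # F2
    (date(2023, 9, 3), "incremental"),  # I2
    (date(2023, 9, 4), "differential"), # D2
    (date(2023, 9, 7), "full"),         # F3
]

def jobs_to_copy(jobs, on_and_after=None):
    # Copy everything starting from the first full backup (optionally on/after a date).
    start = next((i for i, (d, kind) in enumerate(jobs)
                  if kind == "full" and (on_and_after is None or d >= on_and_after)), None)
    return [] if start is None else jobs[start:]

print(jobs_to_copy(jobs))                                  # starts at F2
print(jobs_to_copy(jobs, on_and_after=date(2023, 9, 5)))   # starts at F3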








Selective Copies
Selective copies contain only full backup data that occurred on or after a specified date. Full backups are copied based on one of the following selective criteria:
   Manually Select Full Backups
   All Fulls
   Time Based
   Copy most recent full backups when Auxiliary Copy starts
   Copy most recent full backups when Auxiliary Copy starts, with Reset Selected Backups

The selective copy type can be selected from the Selective Copy tab of the Copy Properties dialog box. Once you set
the Backups On and After date from the Copy Properties dialog box, then full backups starting from the first full backup on or after the date you select (starting from 12:00 A.M.) will be copied to the selective copy, based on
the selective copy type.
Manually Select Full Backups

If a selective copy is defined as a Manually Select Full Backups copy, a backup can only be copied to a selective copy
after that backup is manually selected to be copied from the Backups for Copy window of the selective copy. See Manually
Select a Backup To be Copied to a Selective Copy for more information.
All Fulls

If a selective copy is defined as an All Fulls copy, all full backups associated with the storage policy are copied to the
selective copy.
Time Based Selective Copy

If a selective copy is defined as time based, an auxiliary copy operation copies the first or last full backup of a time period
based on the following parameters:
   Weekly - The first or last full backup of a specified starting day of a week will be copied from 12:00 A.M. of that day up
   to 11:59 P.M. on the last day of that week.






   Monthly - The first or last full backup of a specified starting day of a month will be copied from 12:00 A.M. on that day
   up to 11:59 P.M. on the last day of that month.
   Quarterly - The first or last full backup of the first day of a quarter will be copied from 12:00 A.M. on that day up to
   11:59 P.M. on the last day of that quarter.
   Half Yearly - The first or last full backup of the first day of a half year will be copied from 12:00 A.M. on that day up to
   11:59 P.M. on the last day of that half year.
   Yearly - The first or last full backup of the first day of a year will be copied from 12:00 A.M. on that day up to 11:59
   P.M. on the last day of that year.
Selective storage policy copies associated with custom calendars will have data copied during auxiliary copy operations
either monthly, quarterly, half yearly, or yearly based on the days defined in the calendar. See Custom Calendar for more
information.
Example
The following example illustrates the time interval of a primary copy with four full backups that were run across two weekly
time intervals, T1 and T2, every one week starting on Monday.




In the following example, a time based selective copy was specified as every one week starting on Monday, with the First
full backup option enabled on the copy. Data protection operations were run across two time intervals on the primary
copy, T1 and T2.

If an auxiliary copy operation is run on Monday 9/16, the first full backup within each weekly time interval (F1 and F3) will
be copied to the selective copy.








If the Last full backup option is enabled on the copy instead of the First full backup option, then F2 and F4 will be copied
instead of F1 and F3.
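
As a rough illustration of the weekly selection described in this example, the sketch below (not product code) groups
hypothetical full backups into Monday-based weeks and keeps either the first or the last full backup of each week.

from datetime import date

# Hypothetical full backups on the primary copy across two weeks (T1, T2)
fulls = {
    "F1": date(2023, 9, 4),      # Monday of week T1
    "F2": date(2023, 9, 8),      # Friday of week T1
    "F3": date(2023, 9, 11),     # Monday of week T2
    "F4": date(2023, 9, 15),     # Friday of week T2
}

def weekly_selection(fulls, keep="first"):
    """Keep the first or last full backup of each Monday-based week."""
    by_week = {}
    for job_id, run_date in sorted(fulls.items(), key=lambda item: item[1]):
        week = run_date.isocalendar()[:2]        # (year, ISO week number)
        if keep == "first":
            by_week.setdefault(week, job_id)     # keep the earliest full
        else:
            by_week[week] = job_id               # keep overwriting -> last full
    return sorted(by_week.values())

print(weekly_selection(fulls, keep="first"))     # ['F1', 'F3']
print(weekly_selection(fulls, keep="last"))      # ['F2', 'F4']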

Most Recent Full Backup Selective Copy

If the Copy most recent full backups when Auxiliary Copy starts option is selected when a selective copy is
created, the most recent full backup of each subclient will be copied during an auxiliary copy job.
In the following example, three full backups are on the primary copy, F1, F2, and F3. When an Auxiliary Copy is run on 9/4,
F3 will be copied because it is the most recent full backup.




In another example, if the same auxiliary copy is run on 9/4, but is killed before it is completely copied, F3 will only be
partially copied.








If another full backup (F4) occurs on 9/5, and an auxiliary copy is run on 9/6, the partially copied full backup F3 will again
be copied, and the full backup F4 will be skipped.
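
The sketch below (not product code) models the pick-up behavior just described under the Copy most recent full backups
when Auxiliary Copy starts option: if a full backup was only partially copied by an earlier run, the next run finishes that job
and skips any newer full; otherwise the most recent full backup of each subclient is picked. The job list and statuses are
hypothetical.

from datetime import date

# Hypothetical full backups: (subclient, job_id, run_date, copy_status)
fulls = [
    ("SubA", "F3", date(2023, 9, 4), "partial"),     # killed mid-copy earlier
    ("SubA", "F4", date(2023, 9, 5), "not_copied"),
    ("SubB", "F1", date(2023, 9, 3), "not_copied"),
]

def next_aux_copy_picks(fulls):
    """For each subclient, pick the job the next auxiliary copy would copy."""
    by_subclient = {}
    for subclient, job_id, run_date, status in fulls:
        by_subclient.setdefault(subclient, []).append((run_date, job_id, status))
    picks = {}
    for subclient, jobs in by_subclient.items():
        jobs.sort()                                  # oldest first
        partial = [job for job in jobs if job[2] == "partial"]
        if partial:
            picks[subclient] = partial[0][1]         # finish the partial copy first
        else:
            picks[subclient] = jobs[-1][1]           # otherwise the most recent full
    return picks

print(next_aux_copy_picks(fulls))    # {'SubA': 'F3', 'SubB': 'F1'}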




Most Recent Full Backup Selective Copy With Reset Selected Backups

For the most recent full backup selective copy type, you can select the Reset Selected Backups option. This option will
disable all partially copied data protection operations and those operations will not be copied the next time an auxiliary copy
is run.








Changing the Selective Copy Type Or Selective Criteria

If you change the selective copy type or selective criteria and then run an auxiliary copy operation, the auxiliary copy
operation will copy the data based on the old and new selective copy criteria.
In the following example, on 6/9 the criteria of a selective copy was changed from being weekly based to monthly based.
If an auxiliary copy is run on 6/30, F1 and F2 are copied because they meet the weekly based criteria. F5 is also copied
because it meets the monthly based criteria.








Auxiliary Copy and Other Copy Features
This section describes how auxiliary copy is performed on copies that have other copy features enabled, such as:
   Combined Streams
   Inline Copy
   Deferred Copy
   Automatic Copy
Auxiliary Copy with Deferred Copies
For an auxiliary copy operation on a copy that has the Defer Auxiliary Copy for <n> day(s) option enabled, data will be
copied starting at 12:01 A.M. the set number of days after valid data becomes available on the source copy. See
Deferred Copy for an overview.
Auxiliary Copy with Automatic Copies
An auxiliary copy operation on a copy with the Enable Automatic Copy (no schedule required) option enabled will be
performed within 15 minutes of valid data becoming available on the source copy. See Automatic Copy for an
overview.
Auxiliary Copy and Spool Copies
An auxiliary copy must be performed on a primary copy that has the Spool Copy (no retention) option enabled, before data
on that copy can be pruned the next time data aging is run. It is recommended that regular or automatic auxiliary copy
operations are performed for storage policies using spool copies. See Spool Copy for an overview.
Auxiliary Copy and Inline Copies
If a data protection operation of a subclient whose storage policy has an inline copy enabled does not successfully create an
Inline Copy, that data will be copied to a secondary copy the next time an auxiliary copy is run. See Inline Copy for an
overview.








Auxiliary Copy With Combined Streams
Auxiliary Copy normally copies data stream by stream, meaning that if there were four data streams on the primary copy,
the auxiliary copy operation would use four data streams to copy the data to the secondary copy.
Alternatively, auxiliary copy can copy data from a primary copy that has multiple streams to a secondary copy that has less
than that number of streams, by using the Combined to <n> Streams option on the copy.

Combining the data streams onto fewer media improves media usage, as the media storage is optimized. Media
recycling is also more efficient, because data aging is more effective when secondary copies of the data reside on fewer
media than were required for the original data protection operation. The following example illustrates an auxiliary copy
operation performed for a copy that combined data streams into one stream:




Multi-stream backups of the Microsoft SQL, DB2, and Sybase agents will not be copied during an auxiliary copy operation to
a copy that combines streams.
Advantages for Media Usage

The main advantage of using the Combined to <n> Streams option is that you can use fewer media when copying data to a
secondary copy.
See Combine the Data Streams of a Storage Policy Copy for step-by-step instructions.
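
A minimal sketch (not product code) of the idea behind Combined to <n> Streams: data from several source streams is
funneled onto fewer destination streams. The stream names and the round-robin assignment are assumptions made for the
example.

def combine_streams(source_streams, combined_to):
    """Assign each source stream to one of `combined_to` destination streams."""
    destinations = {f"DestStream{i + 1}": [] for i in range(combined_to)}
    for index, stream in enumerate(source_streams):
        target = f"DestStream{(index % combined_to) + 1}"
        destinations[target].append(stream)
    return destinations

# Four primary-copy streams combined onto a single secondary-copy stream.
print(combine_streams(["Stream1", "Stream2", "Stream3", "Stream4"], combined_to=1))
# {'DestStream1': ['Stream1', 'Stream2', 'Stream3', 'Stream4']}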



Auxiliary Copy With Multiple Stream Parallelism
You can select the number of data streams to be copied at the same time during an auxiliary copy operation. This can be
achieved by using the maximum number of available streams or by specifying the number of streams to use.







Allow Maximum Number of Streams Option
If enough storage resources are available, you can use the Allow Maximum option so that all data streams are copied
concurrently during an auxiliary copy operation.
For example, if four streams were required for the auxiliary copy job, then all four streams will be copied in parallel.




                             If an auxiliary copy is configured to copy with parallel streams, and the associated storage
                             policy copy is configured with combined streams, the auxiliary copy operation will attempt to
                             use no more than the number of streams defined in the storage policy copy's Combined to n
                             streams field.
                             Auxiliary Copy operations only use streams to perform copies of jobs with a "to be copied"
                             status. Therefore, all available streams may not be used when performing an auxiliary copy.

Limit to Number of Streams Option
If not enough storage resources are available, or you do not want to use all available resources, you can select the number
of data streams that will be copied at the same time during an auxiliary copy operation.
For example, if four streams were required for the auxiliary copy job, and two streams are selected to copy in parallel, then
the auxiliary copy operation will copy two streams at a time. Note that Stream 3 will start when Stream 1 or Stream 2 is
completed. Hence, Stream 3 does not have to wait for both Stream 1 and Stream 2 to complete before starting.
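
The sketch below (not product code) illustrates the scheduling behavior described above using a worker pool limited to two
streams: Stream 3 starts as soon as either of the first two streams finishes. The stream durations are made up.

import time
from concurrent.futures import ThreadPoolExecutor

def copy_stream(name, seconds):
    time.sleep(seconds)            # stand-in for copying one data stream
    return f"{name} copied"

streams = [("Stream1", 2), ("Stream2", 1), ("Stream3", 1), ("Stream4", 1)]

# max_workers=2 mirrors "Limit to Number of Streams" set to two: a waiting
# stream starts as soon as any running stream completes.
with ThreadPoolExecutor(max_workers=2) as pool:
    for result in pool.map(lambda stream: copy_stream(*stream), streams):
        print(result)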








Auxiliary Copy Operations
You can start or schedule an auxiliary copy at either the CommServe or storage policy levels from the Auxiliary Copy
dialog box.
From this dialog box, you can:
   Select the storage policy from which this auxiliary copy operation will be performed. (CommServe level only.)
   Copy the data from the primary copy to all secondary copies.
   Copy the data from the primary copy to a specific secondary copy.
   Start new media for all secondary copies.
   Mark a media full after a successful operation.
   Select a number of streams to copy in parallel.
   Configure an alert for the operation.
   Select vault tracking, change priority, start suspended.
   Specify Job Running Time, and Job Restart interval options. (You can also specify the maximum number of allowed
   restart attempts and the interval between restart attempts for all auxiliary copy jobs. For procedures, see Specify Job
   Restartability for the CommCell.)


Sequence in Which Data is Copied During an Auxiliary Copy Operation
Data is copied to secondary copies during auxiliary copy operations according to the media of the original data protection
operations, per destination copy and data stream. Data is copied in the following sequence:
   Media
   For example, J1 and J2 used media M1. J3 and J4 used media M2. J1 and J2 are copied first, and then J3 and J4.
   Drive Pool
   For example, J1, J2, J3 and J4 used drive pool D1. J5, J6, J7 and J8 used drive pool D2. J1, J2, J3 and J4 are copied first
   grouped by media and volume. J5, J6, J7 and J8 are copied second.
   MediaAgent
   For example, J1, J2, J3, J4, J5, J6, J7, and J8 use MediaAgent MA1. J9, J10, J11, J12, J13, J14, J15, and J16 use MediaAgent
   MA2. J1 through J8 are copied first grouped by media, volume, and drive pool. J9 through J16 are copied second.

See View the Media Not Copied for step-by-step instructions.
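
One way to picture the copy order described above is as a sort of the "to be copied" jobs by MediaAgent, then drive pool,
then media, as in this illustrative sketch (not product code). The job attributes are hypothetical.

# Hypothetical jobs awaiting auxiliary copy
jobs = [
    {"job": "J3", "media_agent": "MA1", "drive_pool": "D1", "media": "M2"},
    {"job": "J9", "media_agent": "MA2", "drive_pool": "D2", "media": "M5"},
    {"job": "J1", "media_agent": "MA1", "drive_pool": "D1", "media": "M1"},
    {"job": "J2", "media_agent": "MA1", "drive_pool": "D1", "media": "M1"},
]

copy_order = sorted(
    jobs, key=lambda j: (j["media_agent"], j["drive_pool"], j["media"])
)
print([j["job"] for j in copy_order])    # ['J1', 'J2', 'J3', 'J9']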



Recovering Data From Copies
By default, when a browse or data recovery operation is requested (without specifying copy precedence), the software
attempts to browse/restore/recover from the storage policy copy with the lowest copy precedence. If the media for the
copy with the lowest precedence is offsite, damaged, or if hardware resources are unavailable, then a specific storage policy
copy must be specified in the Copy Precedence tab of the Storage Policy Properties dialog box. For more information,
see Change the Copy Precedence.
If the data that you want to browse/restore/recover was already pruned from that copy, the software will search for the
requested data first from synchronous copies starting with the lowest copy precedence number to the synchronous copy
with the highest copy precedence number, and then from selective copies in the same order.
In the following example, a storage policy includes three copies, a primary copy (with copy precedence 1) and two
additional synchronous copies. If File B is unavailable from the primary copy, then, when performing a data recovery
operation, data will automatically be restored/recovered from Synchronous1 that has a copy precedence of 2.




If, however, the copy precedence of the two synchronous copies was changed so that Synchronous1 has a copy precedence
of 3 and Synchronous2 has a copy precedence of 2, then the data recovery operation will be performed from data obtained
from Synchronous2 that has a copy precedence of 2.
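
A simplified sketch (not product code) of the fallback order described above: the copy with the lowest precedence is tried
first, and if the data has been pruned from it, synchronous copies are searched in ascending precedence order, followed by
selective copies. The copy names and contents are hypothetical.

# Hypothetical copies: (name, copy_precedence, copy_type, files still on the copy)
copies = [
    ("Primary",      1, "synchronous", {"FileA"}),            # File B pruned
    ("Synchronous1", 2, "synchronous", {"FileA", "FileB"}),
    ("Selective1",   3, "selective",   {"FileB"}),
]

def pick_copy_for_recovery(copies, wanted):
    """Search synchronous copies (lowest precedence first), then selective copies."""
    ordered = sorted(copies, key=lambda c: (c[2] != "synchronous", c[1]))
    for name, precedence, _, files in ordered:
        if wanted in files:
            return name, precedence
    raise LookupError(f"{wanted} is not available on any copy")

print(pick_copy_for_recovery(copies, "FileB"))    # ('Synchronous1', 2)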








Browse/Restore/Recover from Copy Precedence
When a copy is configured, the system automatically assigns it a copy precedence number, which you can change at any
time.
If you specify a copy precedence number for a data recovery operation, the software searches only the storage policy copy
with that precedence number in each of the storage policies through which the data was secured. If data does not exist in
the specified copy, the data recovery operation fails even if the data exists in another copy of the same storage policy.
Copy precedence is useful if:
   The primary copy is no longer available for a data recovery operation due to a hardware failure.
   You know that the media containing the data from data protection operations for a particular copy have been removed
   from the storage library. In this case, you can choose to browse/restore/recover from a copy whose media are inside the
   library.
   You want to browse/restore/recover from a selective copy.
   You want to browse/restore/recover from a copy that accesses faster magnetic disk media rather than slower tape
   media.
   You know that the media drives used by a particular copy are busy with another operation and want to
   browse/restore/recover from a different copy to avoid resource conflicts.
In the following example, a storage policy has a primary copy, Primary1, two synchronous copies, Synchronous1, and
Synchronous2, and two selective copies, Selective1 and Selective2. If you choose to browse/restore/recover your data
from Selective2, you must specify that you want to browse/restore/recover from selective copy precedence 2 so that data
will be restored/recovered from that copy.








Browsing From Copy Precedence Across Multiple Storage Policies
When you browse at either the client, agent, or backup set levels, keep in mind that the data for the subclients included in
these levels may have been secured through more than one storage policy. If you specify a copy precedence for a data
recovery operation, the data is restored/recovered from the storage policy that has data on the specified copy.
For example, data is restored/recovered at the backup set level that has two subclients, SubA and SubB. SubA uses storage
policy SP1 and SubB uses storage policy SP2. SP1 has two copies, Copy1 and Copy2. Copy1 has a copy precedence of 1, and
Copy2 has a copy precedence of 2. If copy precedence 2 was selected for the data recovery operation, only data from SubA
will be restored/recovered.




Restoring Data from a Secondary Copy using a Third-Party Command Line
The Oracle and SAP for Oracle/MAXDB iDataAgents provide the capability of restoring data from secondary copies using a
third-party command line, such as RMAN and the SAP command line. Using a third-party command line for this operation
provides an alternative to the CommCell Console, and is useful for restoring data when the primary copy is unavailable. To
utilize this feature, some minor setup configuration is required depending on the agent, as described briefly below:
   For Oracle, the setup involves adding the PARMS="ENV=(CV_restCopyPrec=2)" parameter statement into the RMAN
   restore script.
   For SAP, the setup consists of adding the CV_restCopyPrec parameter followed by the copy precedence number 2 into
   the parameter file prior to running the restore.
See Restore Data from a Secondary Copy using a Third-Party Command Line for step-by-step instructions.
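
As a rough illustration only: the fragment below writes an RMAN restore script that carries the PARMS="ENV=(CV_restCopyPrec=2)"
statement mentioned above on the channel allocation. The channel name, script body, and file name are placeholders rather
than the product's prescribed script; follow the step-by-step procedure referenced above for the supported setup.

# Write a placeholder RMAN restore script that includes the documented
# CV_restCopyPrec=2 parameter so the restore reads from copy precedence 2.
rman_script = """run {
  allocate channel ch1 type 'sbt_tape'
    PARMS="ENV=(CV_restCopyPrec=2)";
  restore database;
  recover database;
  release channel ch1;
}
"""

with open("restore_from_copy2.rman", "w") as script_file:
    script_file.write(rman_script)
print("Wrote restore_from_copy2.rman")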








Safeguarding Your Data Using Auxiliary Copy With Selective Copies
You can use the Auxiliary Copy feature to copy your data to selective copies, and then use the Export Media option to keep
these copies of your data in a safe offsite location. By defining your selective copy to contain one full backup in a three
month period, and by using the Export Media option, you can guarantee that you can keep a secondary copy of full backup
data every three months in a safe location. This provides for extra protection in the event of data loss.
To accomplish this:
  Create a Selective Copy
  Schedule an Auxiliary Copy
   To guarantee that data will be copied to the selective copy you have defined, create an Auxiliary Copy schedule. This
   schedule will dictate when the full backup(s) will be copied from the primary copy to the selective copy media. If
   resources permit, create a schedule to run every day, for example, every day at 9:00 A.M. New, eligible full backups will
   be copied when the Auxiliary Copy operation is run. No drive resources will be used if no eligible full backups have
   occurred since the last Auxiliary Copy was performed for the Storage Policy.
  Perform an Export Media Operation Using Vault Tracker
  Once you have copied your data to selective copies using auxiliary copy operations, you can now export your media to
  keep it in a safe offsite location.
   Perform a Manual Export Based on a Report
   If you want to take the tapes out manually (either by opening the library door or by selecting a list), run the Media
   Information report. Based on the report (which you can run at the same frequency and with the same options as above),
   the tapes may be manually exported.


Auxiliary Copy Considerations
  Software compressed data is not uncompressed during an auxiliary copy operation.
  In a NAS NDMP environment, you must be aware of these considerations in creating copies for the Storage Policy.
      Libraries attached to a MediaAgent - copies for the Storage Policy must be pointed to drive pools connected to a
      MediaAgent.
      Libraries attached to a NAS NDMP file server - copies for the Storage Policy must be pointed to drive pools connected
      to a NAS-attached device.
      If NAS data is backed up using a Storage Policy pointing to a MediaAgent with NDMP Remote Server (NRS) installed,
      then any copies of that Storage Policy must also point to a MediaAgent with NRS installed.
  If you disable a data protection operation associated with a primary copy, that backup will not be copied during an
  Auxiliary Copy operation.
  Once started, if an auxiliary copy job cannot be completed, the Job Manager will retry the job up to a total of two days at
  20-minute intervals for a maximum of 144 times.
   The Skip job on read errors during Auxiliary copy option specifies whether the Auxiliary Copy job will skip data
   protection jobs that encounter read errors during auxiliary copy operations. If you have many data protection jobs that
   need to be copied, enabling this option allows the auxiliary copy job to continue copying other data protection
   operations while skipping over those jobs with read errors. Once all other jobs that need to be copied have completed,
   the skipped jobs will be retried. This option is enabled by default. However, if you disable the option and the Auxiliary
   Copy job encounters a read error, it will return with a Pending status and not continue. The Skip job on read errors
   during Auxiliary copy option can be modified from the Media Management Configuration (Service Configuration)
   dialog box available in the Control Panel. If you continue to encounter read errors, contact your software provider.
  NOTE: Pending reasons are displayed in the Reason for Job delay field, which is located in the Administration Job
  Details (General) tab.
  Multiple streams will not be copied in parallel to a copy that combines streams.
  To avoid possible media contention, which can affect performance, it is recommended that you do not start an auxiliary
  copy operation if the selected storage policy is already being used by a data protection or data recovery operation. To
  determine the jobs scheduled for a storage policy:
      Identify the storage policy associated with a job, see the Storage Policy column of the Job Controller.
      To view a list of jobs scheduled in the CommServe, click the CommServe icon or a specific storage policy, then select
      View Schedules.
  Auxiliary copy operations will maintain the data format of multiplexed data. For more information, see De-Multiplexing
  Multiplexed Data.
   When you run an auxiliary copy operation, the copy assumes the settings of the primary copy. Therefore, if the primary
   copy data is encrypted, then the auxiliary copy data will be encrypted; and if the primary copy data is not encrypted,
   then the auxiliary copy data will not be encrypted.
  When defining the rules for a selective copy in the Copy Properties (Selective Copy) dialog box, it is recommended to
  select the Select Full Backups at frequency option. This option enables you to select and customize the interval at
  which full backups will be written to the Storage Policy Copy. Selecting the Copy Most Recent Full Backup when
  Auxiliary Copy Starts option will write a full backup to the copy only when you run the auxiliary copy operation.
  The amount of data transferred for an Auxiliary Copy job is updated every 512 MB or when a data chunk is closed. The
  progress for the entire job is displayed on the Administration Job Details (General) tab, and the per copy progress is
  displayed on the Auxiliary Copy Job Details (Streams) tab.
  NOTE: The data chunk size can be changed from the Media Management Configuration (Chunk Size) or storage policy
  copy's Data Path Properties dialog boxes.


Customize Auxiliary Copy Operations through Registry Keys
  Periodically, Auxiliary Copy operations update the status of the jobs that are currently being copied. By default the
  status will be updated every 60 minutes, as well as at the end of the Auxiliary Copy operation. The time interval for the
  status updates can be modified using the AUXCOPY_MARKCOPIED_MINUTES registry key.
  Auxiliary Copy operations copy all eligible data, even from data protection operations that have finished after the
  auxiliary copy operation has started. Auxiliary Copy operations can be prevented from copying this new data by
  configuring the AUXCOPY_NOT_PICK_NEW_BACKUPS registry key.


How To
  Start an Auxiliary Copy
  Schedule an Auxiliary Copy
  Recover Your Data From Copies
  Restore Data from a Secondary Copy using a Third-Party Command Line (Oracle and SAP iDataAgents)
Back To Top








Data Aging

For a detailed description of the feature, see the following topics:
   Overview
   Basic Retention Rules
      Retention Time
      Retention Cycles
      Data Aging Based on Basic Retention Rules
   Extended Retention Rules
      Extended Retention Rules for Standard and Custom Calendars
      Data Aging Based on Extended Retention Rules
   Agents with Unique Rules
      DB2, Informix, and Sybase
      Lotus Notes Database
      Oracle and Oracle RAC
      Microsoft SQL Server
      DataMigrator and DataArchiver
      Quick Recovery Agent and Recovery Director
      ASR Backups
      Transaction/Archive/Logical Log Backups
   Data Aging and Other Scenarios
      Data Aging of a Secondary Copy
      Data Aging of a Primary Copy with Synchronous and Selective Copies
      Data Aging of a Primary Copy without Secondary Copies
      Data Aging of an Incremental Storage Policy
      Effect of Disabled Jobs on Data Aging
      Data Aging from a Source Copy that is not a Primary Copy
      Data Aging and Thresholds for Managed Disk Space on Magnetic Libraries
   Data Aging of Other Types of Data
      Job History Data
      Audit Trail Data
      Erase Backup/Migrated Data
   Data Aging Operations
      Start or Schedule Data Aging
      Enable or Disable Data Aging
      Alert Configuration
      Job Restarts and Job Running Time
   Data Aging and Media Recycling
   Special Considerations
   Related Reports
   Support Information
   How To
Related Topics:
   Command Line Interface - qoperation agedata
   Save a Job as Script
   Storage Policy Copies
   Auxiliary Copy
   Accessing Aged Data








Overview
Data Aging is the removal of aged data from media. Data Aging first ages data based on the retention rules of a
storage policy copy, and then removes that data based on the media recycling rules of the associated media. This ensures
data recoverability going back the number of days specified in the user-defined retention rules. Only data that is classified
as aged is removed when data aging operations are run.
Storage policy copies have configurable parameters called retention rules: the Basic Retention Rules and the Extended
Retention Rules, which are explained in the following sections. In order to age data, both the Basic Retention Rules of
retention days and retention cycles, and the Extended Retention Rules based on extended intervals of days, must be
exceeded for all jobs in a cycle. Most agents follow these rules. However, some have their own unique rules, as
described in Agents With Unique Rules.
Data must be copied from a source copy (either the primary or a specified source copy) to a secondary copy during an
auxiliary copy operation before data can be aged from that source copy.
After data is qualified to be aged, this data will be removed from media based on the following rules:
   Data on tape media will be removed when media needs to be reused by data protection operations.
   Data on magnetic media will be removed based on the Managed Disk Space rule, using the thresholds for Managed Disk
   Space defined in the library properties. If this option is disabled, all aged magnetic data on a copy will be removed by
   data aging.
When all the data on a specific media is aged, that media is automatically recycled back to a scratch pool to be used again
for future data protection operations. If you wish to perform data recovery operations from data that has been aged or save
the media containing the data for future use, see Accessing Aged Data.




Basic Retention Rules
Most agents follow basic retention rules. See Support Information - Data Aging for more information.
Basic retention rules are defined by retention time and retention cycles. These parameters determine how much data is
retained and for how long. For a cycle to be eligible for data aging, both of its retention time and retention cycles must be
exceeded.
Retention Time
Retention time is defined as the amount of time that a cycle needs to be available for a single subclient. Retention time is
calculated in terms of 24 hour days from the completion time of the last data protection job in a cycle until the start time of
the data aging job.
Retention Cycles
Retention cycles are defined as the number of cycles that needs to be available for a single subclient. A full cycle begins
with a full backup and includes all other non-full backups up to, but not including, the next full backup. When a full backup
job runs, it is counted as a cycle.
The default settings for time and cycle parameters are set to infinite, but can be changed to better suit the retention
requirement of the data being secured.
A cycle is defined as a group of data protection operations, starting with a full backup, including all subsequent data
protection operations up to, but not including, the next full backup. A full backup without any subsequent incremental or
differential backups is still counted as a cycle.
Each subclient has its own cycles. Each subclient is associated with a storage policy, and multiple subclients can share the
same storage policy. Therefore, a single storage policy can contain cycles for multiple subclients.
This example illustrates a storage policy that
contains four full backup cycles of Subclient A and
two full backup cycles of Subclient B.








The backup data is retained in terms of the following:
   Retention Time
   Retention Cycles
The retention period implicitly suggests a relation between the amount of time that you want the data to remain restorable
and the number of full backup cycles that you expect to complete during that time period. For example, a retention period
of 14 days, 2 full cycles suggests that 2 full backup cycles are completed in a time period of 14 days, one full cycle a week.
Consequently, for this subclient, it would be appropriate to schedule full backups on a weekly basis.
As another example, a retention period of 28 days, 2 full cycles suggests that 2 full backup cycles are completed in a time
period of 28 days, one full cycle every two weeks. It would therefore be appropriate to schedule full backups on a biweekly
basis.
As demonstrated by these examples, the full backup cycle (i.e., the length of time between full backups) for subclients,
should be established by the ratio of the retention period parameters, specifically:
Length of time/Number of full cycles = Full backup cycle
Data Aging Based on Basic Retention Rules
For data to be aged from a primary or source copy, the following basic retention rules apply:
   Retention days must be exceeded for all jobs in a cycle.
   Retention cycles must be exceeded.
   Data protection operations that are candidates to be copied to secondary copies must be copied before they are eligible
   for data aging.
For data to be aged from a secondary copy (that is not a source copy), the following basic retention rules apply:
   Retention days must be exceeded for all jobs in a cycle.
   Retention cycles must be exceeded.
The following Storage Policy Copy Properties settings would result in data not being retained in the storage policy copy.
Thus, these settings are only permitted for primary copies configured with at least one synchronous copy association:
   Spool Copy (No retention)
   Retain for n number of cycles - set to zero (0).
Basic retention rules can be established from the Basic Retention Rules pane of the Retention tab of the Copy
Properties dialog box; the aging of data on that copy is then based on time and cycles.
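
The sketch below (not product code) is a simplified check of basic retention eligibility for the oldest cycle on a copy: enough
newer cycles must exist, and every job in the oldest cycle must be older than the retention days. It ignores the auxiliary
copy requirement noted above, and the cycle data is hypothetical.

from datetime import date

def oldest_cycle_is_aged(cycles, retention_days, retention_cycles, aging_date):
    """True when the oldest cycle has exceeded both retention days and cycles."""
    ordered = sorted(cycles, key=lambda cycle: cycle["last_job_completed"])
    newer_cycles = len(ordered) - 1
    if newer_cycles < retention_cycles:          # cycle retention not exceeded
        return False
    oldest = ordered[0]
    age_in_days = (aging_date - oldest["last_job_completed"]).days
    return age_in_days > retention_days          # time retention exceeded

cycles = [
    {"name": "Cycle1", "last_job_completed": date(2023, 9, 6)},
    {"name": "Cycle2", "last_job_completed": date(2023, 9, 15)},
    {"name": "Cycle3", "last_job_completed": date(2023, 9, 22)},
]

# With retention of 14 days and 2 cycles, the oldest cycle is eligible on 9/23.
print(oldest_cycle_is_aged(cycles, 14, 2, date(2023, 9, 23)))   # True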
This example illustrates four cycles, Cycle1, Cycle2, Cycle3, and Cycle4. Cycle1 includes F1, I1 and I1. Cycle2 includes F2,
I2, and I2, Cycle3 includes F3 and I3, and Cycle4 includes F4 (F4 is still considered a cycle). When F2 completes on 9/8, the
cycle count is two, when F3 completes on 9/15, the cycle count is three, and when F4 completes on 9/22, the cycle count
is four.




The way data is aged during data aging is dependent on the data retention rule that is defined for storage policy copies. If
the retention rule is set for every 14 days and three cycles, data from the first full backup cycle will become eligible for
data aging only when F4 completes on 9/22. Once F4 completes on 9/22, three cycles are available to meet the cycle
retention criteria. Because Cycle1 completes on 9/6, it also meets the retention criteria you set of 14 days. Since the data
from Cycle1 exceeds both retention criteria, it is eligible for data aging on 9/23.




            When a user changes the storage policy association of a subclient, a subclient is deleted, or an agent or
            client is deconfigured, only the retention days must be exceeded for data to be aged. In these cases,
            retention cycles are set to zero (0).
            With this, users can temporarily suspend the activity of a client to age data without uninstalling the client
            software and without meeting the cycle retention requirement, thereby freeing up media faster for new
            data. For more information, see Suspend Use of a Client Computer Temporarily.



Extended Retention Rules
Extended retention rules are defined in terms of weeks, months, and years. This enables you to retain data for a much
longer period of time than with Basic Retention Rules. Each rule starts on the current day and counts backward in time
according to each rule. Hence, if there are multiple extended retention rules, then there may be multiple reasons why a full
backup is retained and not aged.
For each extended retention rule you must specify whether the first or last full backup within the time period of the rule is
to be kept. If you select last full backup, only the last full backup within the time period will be picked if there are no
remaining full data protection schedules for that subclient during that period.
See Also:
   Grandfather-Father-Son (GFS) Tape Rotation
You can use a standard calendar or define and use a specific custom calendar. See Custom Calendar for more information.

                          Only full backup jobs can be retained by extended retention rules. Note that full backup jobs for
                          some agents are not self contained because they require data from subsequent jobs in order to
                          be successfully restored, and therefore, cannot be retained by extended retention rules. Since
                          Oracle online and SQL File/File Group (FFG) full backup jobs are dependent upon corresponding
                          transaction logs for restorability, their data will not be retained by extended retention rules.


Extended Retention Rules for Standard and Custom Calendars
Extended Retention Rules are based on the following:

All Fulls           All Full Backups
Weekly Full         The first or last full backup of every calendar week.
Monthly Full        The first or last full backup of every standard or custom calendar month.
Quarterly Full      The first or last full backup of every standard or custom calendar quarter year.
Half-yearly Full    The first or last full backup of each standard or custom calendar half year.
Yearly Full         The first or last full backup of each standard or custom calendar year.

Note: If there are no custom months defined for the current time period, data aging operations will use the default standard
calendar month definitions.
Data Aging Based on Extended Retention Rules
Extended retention rules allow you to retain data for extended periods of time. These rules can be defined from the
Retention tab of the Copy Properties dialog box.
The following example illustrates how data is retained if extended retention rules are selected for this copy and a data
aging operation is run, based on the dates of a standard calendar. Note that these dates may differ if you are using a
custom calendar.

The retention rules of this example are as follows:
   Basic Retention Rules = 15 Days, 2 Cycles.
   First Extended Rule = For 90 days, keep last weekly full.
   Second Extended Rule = For 365 days, keep last monthly full.
   Third Extended Rule = For 1825 days, keep last yearly full.
If data aging is run on 1/23/03, then:
   All backups run between 1/22/03 and 1/7/03 are retained.
   From 10/24/02 to 1/07/03, the last full backup of every week is retained.
   From 1/22/02 to 10/24/02, the last full backup of every month is retained.
   From 1/23/98 to 1/22/02, the last full backup of every year is retained.
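
A simplified sketch (not product code) of one extended retention rule from this example: within the last rule_days days,
keep the first or last full backup of each calendar period. It uses ISO weeks as the period and ignores custom calendars;
the backup dates are hypothetical.

from datetime import date, timedelta

def kept_by_extended_rule(fulls, aging_date, rule_days, period_key, keep="last"):
    """fulls: {job_id: run_date}. Keep the first or last full of each period
    whose run date falls within rule_days counted back from the aging date."""
    window_start = aging_date - timedelta(days=rule_days)
    per_period = {}
    for job_id, run_date in sorted(fulls.items(), key=lambda item: item[1]):
        if not (window_start <= run_date <= aging_date):
            continue
        period = period_key(run_date)
        if keep == "first":
            per_period.setdefault(period, job_id)
        else:
            per_period[period] = job_id              # overwrite -> last full wins
    return sorted(per_period.values())

fulls = {"F1": date(2003, 1, 3), "F2": date(2003, 1, 5),
         "F3": date(2003, 1, 10), "F4": date(2003, 1, 17)}

weekly = lambda d: d.isocalendar()[:2]               # standard calendar week
# "For 90 days, keep last weekly full" evaluated on 1/23/03.
print(kept_by_extended_rule(fulls, date(2003, 1, 23), 90, weekly))   # ['F2', 'F3', 'F4']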

              Data backed up through file/file group subclients for the Microsoft SQL Server iDataAgents cannot be
              pruned through extended retention rules.
              Extended retention rules are not supported for ASR Backups of the Windows XP and Windows Server
              2003 (32 and 64-bit) agents.
              For the Oracle iDataAgent, extended retention rules are supported for offline and selective online full
              backups only.



Agents with Unique Rules
Not all agents support standard data aging rules. The following agents have unique data aging rules due to the nature of
their data and operations. See Support Information - Data Aging for more information.
DB2, Informix, and Sybase
Data Aging rules are unique for the DB2, Informix, and Sybase iDataAgents. This section describes these rules.
DB2 log files are aged at the backup set level; Informix and Sybase log files are aged at the instance level. For these
iDataAgents, archive and logical log backups are not considered part of the backup cycle. Therefore, storage policy cycle
retention parameters do not apply to them; these log backups have their own set of data aging rules, as described in
Transaction/Archive/Logical Log Backups.

            For the DB2 and Informix iDataAgents, retention cycles are not used for copies involved in operations from the
            command line. For such operations, data is aged according to the associated retention time.


Lotus Notes Database
Data Aging rules are unique for the Lotus Notes Database iDataAgents. This section describes these rules.
The Lotus Notes Database iDataAgent transaction log backups are not considered part of the backup cycle. Therefore,
storage policy cycle retention parameters do not apply to them; these log backups have their own set of data aging rules,
as described in Transaction/Archive/Logical Log Backups.
Oracle and Oracle RAC
Data Aging rules are unique for the Oracle and Oracle RAC iDataAgents. The following sections describe these rules.
Data Aging of the Oracle Recovery Catalog Database

When a Data Aging job is run, the BackupPieceName UNAVAILABLE command is automatically issued to RMAN to disable
specific backup pieces in the Oracle Recovery Catalog database that were pruned from the Media Manager CommServe
tables. Any backup pieces that were aged from the system's database that have exceeded their retention criteria will be
marked as unavailable in the Oracle Recovery Catalog database through this methodology. You can delete these specific
backup pieces by creating and enabling the OracleDeleteAgedBackupPiece registry key. For mixed mode environments,
where the CommServe is at the current release and the Oracle client is at a prior release, the synchronization is achieved
through different means (e.g., CROSSCHECK) and you should consult the documentation for that prior release for more
information.
Timeout for Oracle Crosscheck Per Instance During Data Aging

By default the timeout for Oracle CROSSCHECK per instance is 600 seconds during data aging in mixed mode
environments. You can modify this value (or disable the option) by using the OraCrossCheckTimeOut registry key.

            For the Oracle and Oracle RAC iDataAgents, keep in mind that after uninstalling the iDataAgent software,
            CROSSCHECK will no longer be performed by the system to synchronize entries in the CommServe
            Database with the RMAN catalog. If either of these iDataAgents is later re-installed, then the next data
            aging job will synchronize the RMAN catalog with the CommServe Database unless the data on tape has
            been deleted (such as the case where the tape/volume was used for other backups and has been pruned).

Data Aging Rules for Archive Log files

Data Aging for Archive Log files needs to be understood to prevent unintended loss of data. The storage policy for the
archive log files is defined at the Instance (or database) level. This means that all the subclients defined for the instance
use the same storage policy for backing up the archive log files; these log backups have their own set of data aging rules,
as described in Transaction/Archive/Logical Log Backups.
Data Aging Rules for Selective Online Full Backups

A selective online full operation that consists of archive logs and Oracle data can also be linked to the logs of a separate job,
which was initiated within the time frame of the selective online full operation. These logs and the selective online full are
then considered as one entity within the software, regardless of whether or not separate jobs have the same job ID.
Therefore, they are copied to synchronous and selective copies together during auxiliary copy operations and are aged
together. If any part of the selective online full is missing from a copy, the full will not be considered as a valid full and will
not be counted as a cycle during data aging. Consider the following:
   Data from selective online full backups are considered the same as regular full and offline full backups for each Oracle
   subclient in terms of basic retention rules of cycles and days. However, if any logs on a primary copy have not been fully
   copied to a secondary copy, the selective online full cannot be aged.
   Data from selective online full backups are considered the same as offline full backups for each Oracle subclient in terms
   of extended retention rules of days. Selective online full backups and all logs linked with it must be retained together on
   the same storage policy copy.
   Logs that are linked with a selective online full (and the logs of the selective online full itself) can be aged only if they
   are older than the oldest data that can be aged, and only if the corresponding data of the selective online full can be or
   has been aged.
Data Aging Rules for Oracle On Demand Backups

Data Aging for Oracle On Demand backup jobs uses days/time, and ignores cycles, as the determining factor for pruning
the data. Therefore, once the retention time criteria has been met, all data (both data and logs) that was backed up using
the storage policy specified in the RMAN script run through the Command Line Interface is pruned.
Consider the following when developing a storage policy strategy for Oracle On Demand backups:
   The same storage policy should not be used for regular Oracle backups and Oracle On Demand backups.
   The storage policy copy containing logs of Oracle On Demand backups should have a much longer retention time than
   other storage policies used by regular Oracle backups for the same instance. This is to prevent the logs of Oracle On
   Demand backups from being pruned before the data of regular Oracle backups, and allow the database to be fully
   restored and recovered using the data of old regular Oracle backups and logs afterwards.








See Running RMAN Scripts using the Command Line Interface for more information.
Microsoft SQL Server
Data Aging rules are unique for the Microsoft SQL Server, Microsoft SQL Server 2000, and the Microsoft SQL Server 2005
iDataAgents. The following sections describe these rules.
Data Aging for the SQL Server iDataAgents executes the following stored procedures, which you may have been running
manually from Enterprise Manager. When Data Aging is run, the system ages these histories from the CommServe database
and the SQL Server.
   sp_delete_backuphistory
   sp_delete_database_backuphistory
   sp_delete_backup_and_restore_history
SQL Back in Time Restores and Data Aging Rules

When you perform a back in time restore (i.e., restoring to a backup cycle earlier than the current backup cycle), all
differential and transaction log backups which were run after the full backup from which the restored data was obtained will
not be able to be aged until a new full backup is run. Running a full backup after performing a back in time restore releases
the older backups and subsequent log backups for data aging.
Data Aging Rules for Transaction Log Backups

The SQL Server iDataAgent's transaction log backups are not considered part of the backup cycle. Therefore, storage policy
retention cycle parameters do not apply to them; these log backups have their own set of data aging rules, as described in
Transaction/Archive/Logical Log Backups.
DataMigrator and DataArchiver
Data Aging rules are unique for the DataMigrator and DataArchiver agents. All data and jobs in a migration or archive
operation must meet the basic or extended retention rules in days (as specified in a copy of a DataMigrator/Archiver
storage policy) in order to be aged. The DataMigrator and DataArchiver agents do not support retention cycles.

           To promote consistent data availability of all of the data migrated or archived by the DataMigrator and
           DataArchiver agents, it is recommended that all subclients be associated with storage policies whose
           primary copies have the same retention period.

Quick Recovery
The Quick Recovery Agent can age, or delete, QR Volumes that are older than the retention period specified in the
associated QR Policy. The Quick Recovery Agent will also attempt to free any destination volumes that were marked as
allocated, or locked, but are not currently referenced by any QR Volume. For example, this condition may arise when a
Create/Update QR Volume job is terminated unexpectedly. QR Volume data aging takes place in the context of the data
aging job that runs periodically on the CommServe.
The following constraints apply to QR Volume data aging:
   QR Volumes with the In Use flag set will not be deleted. You can set this flag from the QR Volume Details screen if you
   wish to exclude the QR Volume from being aged.
   QR Volumes with incremental updates scheduled will never be aged. They must be deleted manually in order to age the
   volume(s).
   QR Volumes are not aged if any QR Agent data protection or recovery operations are running.
Recovery Director
The QR Agent creates snapshots or QR Volumes for use with Recovery Director. These snapshots and QR Volumes are aged
as described in the Quick Recovery section above. However, Recovery Director has the following exception: When a
snapshot or QR Volume is part of a work flow, and jobs from that workflow are running, the snapshot or QR Volume cannot
be aged until all of the jobs in the work flow are complete.
This is true even in cases where the snapshot or QR Volume has passed its retention period, and Data Aging is run on the
CommServe.
ASR Backups
ASR backups are only performed in full; there are no incremental or differential ASR backups. Therefore, ASR backups are
aged in full, and independently of other backup operations. For example, an ASR backup will be preserved past its retention
period until another ASR backup is run, even if full backups have been run on the client.








Transaction/Archive/Logical Log Backups
Transaction/archive/logical log (log) backups are not considered part of the backup cycle. Therefore, storage policy cycle
retention parameters do not apply to them. The following are the unique data aging rules for log backups:
   To-be-copied logs will not be aged (on primary or non-primary source copy).
   Logs that exist on only one copy will be aged when they are older than the oldest non-removable data.
   Logs that exist on multiple copies (except those logs with the longest retention days) will be aged according to copy
   retention days. Logs that exist on multiple copies with the longest retention days will be aged when they are older than
   the oldest non-removable data.
   Oracle online full logs will not be aged if their data on the same copy is non-removable data.
   Partial, disabled logs will be aged when they are older than the oldest non-removable data.


Data Aging of Other Scenarios
The sections that follow describe how data aging is handled in other scenarios.
Data Aging of a Primary Copy Without Secondary Copies
If data aging is performed on a primary copy and there are no secondary copies defined, the data on the primary copy can
be aged provided the data has exceeded its specified retention criteria.
In this example, the retention criteria of the primary copy is set to five days and one cycle. If data aging is run on 9/13,
no data is eligible for data aging. Although the retention criteria of one cycle is met, data has not been retained for a 5 day
period.




However, if data aging is run again on 9/17, F1 and I1 can be aged, as the retention rule of one cycle and five days is met.




Data Aging of a Secondary Copy
The data aging of a secondary copy is dependent on the selected retention criteria set for that copy. In the following
example, the primary copy has a retention rule of four days and two cycles, and the synchronous copy has a retention rule
of two days and two cycles.
In this example, if data aging is run on 9/18, no data can be aged from the primary copy because the retention rule of the
primary copy was not met. However, F1 and I1 from the synchronous copy can be aged because the retention rule of the
synchronous copy of two days and two cycles was met.








Data Aging of a Primary Copy with Synchronous and Selective Copies
Data aging can be performed on a storage policy that has synchronous and/or selective copies defined. Data can be aged
(according to its retention rule) from the primary copy only when all data that is eligible to be aged has been copied to all
active copies during an auxiliary copy operation.
In this example, the data retention rule for the primary copy is set to one day and one cycle. (The retention rule of the
secondary copy is greater than the retention rule of the primary copy.) Data from Tape 1, Tape 2, and Tape 3 was copied to
the secondary copy when an auxiliary copy job is run on 9/13. If data aging is run on 9/15, data from Tape 1, Tape 2, and
Tape 3 can be aged.




In this example, the data retention rule for the primary copy is also set to one day and one cycle. (The retention rule of
the secondary copy is greater than the retention rule of the primary copy.) Data from Tape 1, Tape 2 and Tape 3 has not
yet been copied to the secondary copy. If data aging is run on 9/14, no data can be aged from the primary copy. Data that
has exceeded its retention criteria cannot be aged until that data is copied to the secondary copy.








Data Aging from a Source Copy that is not a Primary Copy
The source copy feature allows you to select a copy other than the primary copy to be used as the source copy from which
data is copied during an auxiliary copy operation. A source copy can be selected from the Copy Policy tab of the Copy
Properties dialog box.
The rules for data aging on source copies are as follows:
   Data can be aged from the primary copy when there are To Be Copied jobs and the primary copy is not the source
   copy. See Jobs on a Storage Policy Copy for an overview of To Be Copied jobs.
   Data can be aged from a secondary copy that is a source copy once all of its data has been copied to the copies for
   which it is the source.
The following examples illustrate how data is aged from a storage policy that has three copies; primary copy Primary_01,
Secondary_01, and Secondary_02. Secondary_02 uses Secondary_01 as a source copy. The retention rules for each copy
are as follows:
   Primary_01 = 15 days and 2 cycles
   Secondary_01 = 1 day and 1 cycle
   Secondary_02 = 30 days and 2 cycles
In the first example, if a data aging operation is run on 10/2, F1 can be aged from the primary copy as the retention rule
for F1 was met. Data from Secondary_01 cannot be aged as it is the source copy for Secondary_02, and data has not yet
been copied to Secondary_02.








In the next example, F1 can be aged from Secondary_01 after an auxiliary copy operation copies that data onto
Secondary_02 on 9/2, and a data aging operation is run on 10/2.




Data Aging of an Incremental Storage Policy
If data aging is performed on a storage policy that has an incremental storage policy enabled, the data aging operation
counts backup cycles across both full and incremental storage policies. Data on a full storage policy is aged based on the
retention of the full storage policy, and data on the incremental policy is aged based on the retention rules of the
incremental policy.
If the incremental storage policy is also being used as a regular storage policy (and has full backups), the full backups will
be also aged according to any basic and extended retention rules that are set.
It is recommended that the retention rule for the full storage policy be greater than that of the incremental storage policy.
Data on the incremental policy will be aged earlier if it has a shorter retention than the full storage policy. If the incremental
storage policy has a longer retention than the full storage policy, this may result in dangling incremental jobs.

In this example, a full storage policy has a basic retention rule of 7 days and 3 cycles. An incremental storage policy has a
basic retention rule of 3 days and 2 cycles. If data aging is run on 10/2, F1 and F2 can be aged from the full storage policy.
I1, D1, I2, D2, I3, and D3 can be aged from the incremental storage policy.




For more information on incremental storage policies, see Incremental Storage Policy.
Effect of Disabled Jobs on Data Aging
If data aging is performed on a storage policy copy that has disabled jobs, these jobs are aged differently. If the disabled
job is a full backup job, the entire cycle is marked as disabled. In this case, data aging does not count the disabled full
backup as a valid cycle. If the disabled job is an incremental or differential backup and the full backup job is not disabled,
the cycle is counted as a valid cycle. For more information on disabled jobs, see Storage Policy Copy Operations.

In this example, the data retention rule for the primary copy is set to one day and four cycles. Prior to the job being
disabled, there are five cycles (the most recent being F5 and the oldest being F1). In this case, the oldest cycle (F1, I1)
could be aged because they satisfy the retention criteria of both days and cycles.
When the second cycle becomes disabled, there are only four valid cycles. The oldest cycle will count F1, I1, F2, and I2 as
one cycle, and F3, F4, and F5 will be counted as independent cycles. If a data aging operation is run on 9/14, and F2 and I2
are marked as disabled, none of the cycles on this storage policy can be aged.


Data Aging and Thresholds for Managed Disk Space on Magnetic Libraries
Magnetic disks offer the best choice of media type for fast data protection and recovery operations. Magnetic disks are
often used to spool the data before it is archived to tape for long term and/or offsite storage. Using Managed Disk Space,
data on magnetic disks can be retained as long as possible without running out of disk capacity and affecting future data
protection operations. Managed Disk Space provides a way to prune data according to disk capacity, in addition to the
existing retention criteria, which is usually defined by a number of days as well as full cycles of data. This adds another
layer of retention parameters on top of the specified days/cycles.
Two disk capacity thresholds for managed disk space can be defined. They are:
   A threshold (in percentage) for starting the data aging operation (upper limit)
   A threshold (in percentage) for stopping the data aging (lower limit)








When disk capacity reaches the high threshold (for example, 85%), older data automatically qualifies for removal. Data is
removed from the disk if it meets its retention criteria and has been copied to the appropriate secondary copies. The aging
process automatically stops when disk capacity reaches the low threshold (for example, 70%).
Once managed disk space is set up, the process runs automatically, without user intervention, to manage disk capacity.
Data protection operations are retained on disk longer than usual, providing the benefits of magnetic storage without the
manual effort of managing disk capacity.
The Enable Managed Disk Space for magnetic data option is available in the Retention tab of the Copy Properties
dialog box. (By default, this option is disabled in all copies.)
The pre-defined thresholds for disk capacity for a magnetic library can be defined in the Mount Paths tab of the Library
Properties (associated with a magnetic library) dialog box.
Data Aging determines pruning on a per-job basis: jobs with older data (based on the creation time of each job's first
archive file) are pruned first. Once the Data Aging operation determines the jobs to be pruned, data will be deleted based on the
established threshold. The frequency for checking the disk space and deleting data is determined by the frequency
established in the Interval (Minutes) between magnetic space updates option established in the Service
Configuration tab of the Media Management Configuration dialog box in the Control Panel.
See Enable Managed Disk Space for Magnetic Data for step-by-step instructions.
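
The threshold behavior described above can be summarized with the following minimal sketch; the class and attribute
names (Disk, Job, retention_met, copied_to_secondary) are assumptions used only to illustrate the high/low watermark
logic, not product interfaces.

    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        size_gb: float
        retention_met: bool = True        # basic retention criteria satisfied
        copied_to_secondary: bool = True  # copied to the appropriate secondary copies

    class Disk:
        def __init__(self, capacity_gb, used_gb):
            self.capacity_gb, self.used_gb = capacity_gb, used_gb
        def used_pct(self):
            return 100.0 * self.used_gb / self.capacity_gb
        def delete(self, job):
            self.used_gb -= job.size_gb

    def prune_for_capacity(disk, jobs, high_pct=85, low_pct=70):
        # jobs are assumed to be ordered oldest first (by the creation time of
        # the job's first archive file), so older data is pruned first.
        if disk.used_pct() < high_pct:
            return                        # upper threshold not reached; do nothing
        for job in jobs:
            if disk.used_pct() <= low_pct:
                break                     # lower threshold reached; stop pruning
            if job.retention_met and job.copied_to_secondary:
                disk.delete(job)

    disk = Disk(capacity_gb=1000, used_gb=900)              # 90% used, above the 85% threshold
    jobs = [Job(f"job{i}", size_gb=50) for i in range(10)]
    prune_for_capacity(disk, jobs)
    print(round(disk.used_pct()))                           # 70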




Data Aging of Other Types of Data
The following other types of data can be removed during a data aging operation:
Job History Data
The following table describes when job history is removed based on the type of job history and the status of the job:

Job Type                                   Job Status                           When it is Aged
Data Protection Job History/Disaster       Successful                           With its associated data, which is aged based
Recovery Backup Job History                                                     on the associated storage policy copy's
                                                                                defined retention rules.
                                           Failed/Killed                        90 Days
Data Recovery Job History (including       Any                                  90 Days
CDR Recovery operations)
Administration Job History                 Any                                  90 Days

           Job history is also removed during the deletion of a storage policy or storage policy copy.




The number of days job history will be retained before it is aged can be changed from the default of 90 days using the
jobHistoryLifeSpan registry key.

If you want to change the number of days that DataArchiver recovery job history is kept, you can use the
archiverRestoreHistoryLifeSpan registry key.
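
The table and registry keys above amount to a simple rule; the following is an illustrative sketch only (the function and
parameter names are assumptions, chosen to mirror the registry key names, and do not represent product code):

    from datetime import date, timedelta

    def history_is_aged(job_type, job_status, finished_on, today,
                        associated_data_aged=False,
                        job_history_lifespan_days=90,
                        archiver_restore_history_lifespan_days=90):
        # Successful data protection / disaster recovery backup history follows
        # the retention of its associated data; everything else falls back to a
        # fixed lifespan (90 days unless overridden).
        if job_type in ("data_protection", "disaster_recovery_backup") and job_status == "successful":
            return associated_data_aged
        if job_type == "archiver_recovery":
            lifespan = archiver_restore_history_lifespan_days
        else:
            lifespan = job_history_lifespan_days
        return today - finished_on > timedelta(days=lifespan)

    # An administration job that finished more than 90 days ago is aged.
    print(history_is_aged("administration", "killed", date(2006, 6, 1), date(2006, 10, 2)))   # True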

Audit Trail Data
When the specified retention rule of Audit Trail data is exceeded, Audit Trail operations are aged when a data aging
operation is run. For more information, see Audit Trail.
Erase Backup/Migrated Data
Data from Erase Backup/Migrate operations can be aged as follows:
   If the Erase Backup/Migrate operation is older than all the backup data of the agent that the Erase backup/migrate job
   was run for.
   If there are already three Erase Backup/Migrate operations that were run for the agent.








Data Aging Operations
Start or Schedule
Data aging operations can either be scheduled or run on demand. By default, data aging is run every day at
12:00 p.m.
The Data Aging Options dialog box allows you to either start a data aging operation immediately from the Run
Immediately option, or schedule it using the Schedule option. You can also run a data aging operation from the
Command Line Interface. For more information, see Command Line Interface.
Enable or Disable
Data Aging can be enabled or disabled on a storage policy copy. Disabling Data Aging prevents a Data Aging operation from
aging data from a copy when the operation is run, even after the retention rules for that copy have been met.
Alert Configuration
Alerts allow you to send notifications related to data aging events. Alerts can be configured during the initiation of a data
aging operation. For a detailed explanation of Alerts, see Alerts and Monitoring.
Job Restarts and Job Running Time
Click the Job Retry tab in the Data Aging Options dialog box to access the Job Running Time options, when you either
Start Data Aging or Schedule Data Aging.
You can also specify the maximum number of allowed restart attempts and the interval between restart attempts for all
data aging jobs. For procedures, see Specify Job Restartability for the CommCell.
For more information on these subjects, see Job Restart and Job Running Time.




Data Aging and Media Recycling
If data stored on tape media exceeds its retention rule and the data aging operation is run, the data is logically deleted. If
all of the data on a media is aged, the media is recycled; that is, it is returned to the scratch pool that is currently
associated with the storage policy copy that writes to the media. If you wish to save the data on the media for future
recovery, see Accessing Aged Data. Once the tape media is re-used, the data that was originally written to it cannot be
restored.
Media that has an active status is not recycled back to the scratch pool until the media has a non-active status.
For more information, see Media Recycling.
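
As a minimal sketch of the recycling rule just described (the Media and Job structures here are assumptions for
illustration): a tape returns to the scratch pool only when every job on it has been aged and the media is no longer active.

    from collections import namedtuple

    Job = namedtuple("Job", "name aged")
    Media = namedtuple("Media", "barcode active jobs")

    def can_recycle(media):
        # All data on the media must be aged, and the media must have a non-active status.
        return all(job.aged for job in media.jobs) and not media.active

    tape = Media("000123", active=False, jobs=[Job("F1", True), Job("I1", True)])
    print(can_recycle(tape))   # True -> eligible to return to the scratch pool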



Special Considerations
Keep in mind these considerations when performing a data aging operation:
   For data to be aged from a spool copy, the following rule must apply:
      The data must be copied to an active synchronous copy. For more information about spool copies, see Spool Copy
   The way in which data is aged is also contingent upon the following:
      Whether the data protection operation was successful, killed, or failed.
      Whether the storage policy has secondary copies and whether they are active.
      Whether jobs are disabled on the storage policy copy.
      Whether jobs still need to be copied to secondary copies.
   The system may override the user-defined retention rules in the following cases:
      Jobs on a source copy that still need to be copied to a secondary copy will not be pruned, regardless of the
      user-defined retention.
      When a user changes the storage policy association of a subclient, a subclient is deleted, or an agent or client is
      deconfigured, only the retention days must be exceeded for data to be aged. In these cases, retention cycles are set
      to zero (0).
      If a user changes the storage policy association of a subclient:








     All subclient data that was backed up through the previous storage policy will be aged based on its storage policy
     copy retention time (days) rule only. If you select to run a full backup after changing the storage policy, all subclient
     data on the new storage policy will be aged according to its retention time and cycle rules. If you select to run a non-
     full backup as the next backup operation, it is recommended that you run a full backup as soon as possible. All non-
     full backups run before a full backup will be retained as a partial cycle according to the new storage policy copy's
     retention cycle rule (even though not a full cycle). The non-full backups (partial cycle) will be aged when the new
     storage policy copy's retention time and cycle rules are met.
     For more information, see Associate a Subclient to a Storage Policy.
  Invalid archive files of valid jobs will not be aged for 90 days; however, users can change this by creating the
  DaysKeepInvalidArchivesForValidJob registry key. It may be necessary to decrease the number of days to keep the
   invalid archive files in order to free up space sooner, especially if there are many invalid archive files. Conversely, it may
   be necessary to increase the number of days to keep the invalid archive files when troubleshooting issues related to the
   archive files.
   Data Aging on Centera Clusters - Consider the retention rules established in the Centera Cluster when you set the
   rules for data aging on Storage Policy Copies that write data to a Centera Cluster. Retention rules should be configured
   based on the retention rules established in Centera as follows:
      Standard read-write access on Centera - retention rules for data aging on Storage Policy Copies will take effect.
      WORM or Compliance mode - it is recommended that the retention rules for data aging on Storage Policies be set to
      prune later than the pruning on Centera.
      Compliance Plus mode on Centera - it is recommended that the retention rules for data aging on Storage Policy Copies
      be set to "infinite".
  This software supports all these modes by setting the retention property on each data object when submitting a Cclip
  which is used to register the data with Centera for storage. The Cclip reflects a retention date that is based on the
  number of retention days on the Storage Policy Copy Properties. Once the date is set on the Cclip, changing the Storage
  Policy Copy to a shorter retention will not change the existing Cclips and attempts to prune the data on the device will be
  denied until it is aged based on the Cclip value. (For information on Cclips and setting-up the data retention on Centera
  Clusters, refer to the Centera documentation.)
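
The interaction between the Storage Policy Copy retention and the Cclip can be modeled as follows; this is a plain
illustration of the dates involved (the function names are assumptions, and no Centera SDK calls are shown):

    from datetime import date, timedelta

    def cclip_retention_date(submit_date, copy_retention_days):
        # The retention date is fixed on the Cclip at submit time, based on the
        # number of retention days set on the Storage Policy Copy Properties.
        return submit_date + timedelta(days=copy_retention_days)

    def prune_allowed(clip_retention, today):
        # Centera denies deletion until the Cclip's own retention date has passed,
        # even if the Storage Policy Copy is later changed to a shorter retention.
        return today >= clip_retention

    retention = cclip_retention_date(date(2006, 1, 1), copy_retention_days=365)
    print(prune_allowed(retention, today=date(2006, 6, 1)))   # False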



Related Reports
Data on Media and Aging Forecast Report
The Data on Media and Aging Forecast Report provides real-time and historical information about data protection jobs,
associated media, the data aging forecast, and recyclable media.
Data Aging Job Summary Report
The Data Aging Job Summary Report provides real-time and historical information about data aging operations.




How To
  Start Data Aging
  Schedule Data Aging
  Enable or Disable Data Aging
  Enable Managed Disk Space for Magnetic Data
  Suspend Use of a Client Computer Temporarily


Back to Top








User Accounts and Passwords

Choose from the following topics:
   Overview
   Change CommServe Accounts
      Changing the CommCell Network Password
      Changing the Account for Accessing Automatic Updates
   Change MediaAgent Accounts
      Changing the Media Password
      Changing the Index Cache Account
   Change Accounts for Accessing the Maintenance Advantage Page
   Change Agent Accounts
      Changing Accounts for Accessing Application Servers/Filers
      Changing Accounts for Accessing Databases
      Changing Accounts for Accessing Instances
      Using/Changing Accounts for Data Classification
      Changing Accounts for Auto-Discovery of Mailboxes by Active Directory User Group
      Configuring Single Sign-On (SSO) for DataMigrator Outlook Add-In
      Changing Accounts for Configuring UNC Paths
      Changing Accounts for Discovering Network Drives
      Changing Accounts for Executing Pre/Post Commands (Data Protection)
      Changing Accounts for Executing Pre/Post Commands (Data Recovery)
      Changing Accounts for Executing Pre/Post Commands (Recovery Points)
      Changing Accounts for Restoring to Mapped/Share Network Drives and Restricted Directories
   Other Considerations
      Windows File System
   How To
Related Topics:
   Command Line Interface - Qmodify password


Overview
User accounts and passwords can be administered for various components of the product, including the CommServe,
MediaAgents, and agents. These accounts and passwords allow you to perform various operations per the affected
component. In many cases, user accounts and passwords are established during the install of the specific component and
can be changed after the install from the CommCell Browser. To change an account, you use the CommCell Browser to
enter the new account information into the appropriate account dialog boxes (or the account fields within other dialog
boxes).
For some account-related operations, you may have the option or be required to administer a User Impersonation account.
This account is intended to be used in place of the Local System Account because it provides additional security. However,
note that "impersonated users" must have write permissions to the product installation folders; otherwise, the User
Impersonation account may not take effect. This is especially true if the associated computer is not part of a domain and if
the user is not a domain user.
Back To Top



Change CommServe Accounts
Select the desired topic.
   Changing the CommCell Network Password
   Changing the Account for Accessing Updates








Back To Top


Changing the CommCell Network Password
The CommCell network password is an internal security measure used to ensure that communications occur only between
CommCell computers. By default, the software assigns each computer in the CommCell a different password. You can, at
any time, define a new CommCell network password for any computer in the CommCell. Although you do not need to know
the existing password to define a new one, you do need to have administrative privileges.
The CommCell network password can be changed from the Change Network Password dialog box. See Change the
CommCell Network Password for step-by-step instructions.
Top of Section


Changing the Account for Accessing Automatic Updates
The CommServe uses an account to access updates. This account can be changed from the User name and password
dialog box. See Change Account for Accessing Updates for step-by-step instructions.
Top of Section



Change MediaAgent Accounts
Select the desired topic.
   Changing the Media Password
   Changing the Index Cache Account
Back To Top


Changing the Media Password
The Media Password is used to enforce credentials while using the Media Explorer (DR Tool) to restore data from a media.
This password prevents unauthorized access to data on media. The password is assigned during the installation of the
CommServe, can be changed later, and becomes necessary in the case of a disaster. The password and related information
(including the SQL metadata) are stored as encrypted strings on the On Media Label (OML) of the tape and in the SQL
database.
Only one media password is allowed per media. If you change the media password, it takes effect for the next media.
Keep in mind that existing media can be accessed using only the old media password.
The Media Password can be changed from the Change System Password dialog box. See Change the Media Password for
step-by-step instructions.

                            If you want to provide more security by not allowing anyone else to read and decipher data on
                            the media, you may want to enable Data Encryption.

Top of Section


Changing the Index Cache Account
The index cache password is not assigned during the MediaAgent install.
Top of Section



Change Accounts for Accessing the Maintenance Advantage Page
You can change the account for accessing the Maintenance Advantage web site. See Change Account for Accessing the
Maintenance Advantage Page for step-by-step instructions.



Change Agent Accounts
The account information that is established during an agent install can be used by the agent to perform various tasks (e.g.,
to access the associated application server, execute specific commands, etc.). In some cases, if you have the necessary
permissions, you can replace (change) this information with information from other accounts that already exist. To this end,
you can use the CommCell Browser to populate either account-like dialog boxes or spaces within other types of dialog
boxes with this information.
Depending upon the agent and the associated task(s), account information for agents can be administered from the agent,
instance/partition, backup set, and/or subclient levels in the agent tree. Note, however, that if you create a new item (i.e.,
an instance, backup set, or subclient) from a higher level and supply account information while creating it, that path is
available only once; any later change to the account must be made at the item's own level. For example, some agents allow
you to create a new subclient from the agent level, which displays the Subclient Properties dialog box. This dialog box may
have a Pre/Post tab that allows you to set up Pre/Post commands to be executed before data protection operations, and you
may have the option of changing the user account for executing these commands. If you do this for the subclient you
create, any subsequent change to this user account cannot be made from the agent level, because the subclient has already
been created and there is no other way to access it from the agent level; instead, you must change the user account from
the subclient level.
Note: For information on using Data Encryption pass-phrases, see Data Encryption.
Select the desired topic:
   Changing Accounts for Accessing Application Servers/Filers
   Changing Accounts for Accessing Databases
   Changing Accounts for Accessing Instances
   Changing Accounts to Authenticate Against the Active Directory Domain Controller
   Changing Accounts for Configuring UNC Paths
   Changing Accounts for Discovering Network Drives
   Changing Accounts for Executing Pre/Post Commands (Data Protection)
   Changing Accounts for Executing Pre/Post Commands (Data Recovery)
   Changing Accounts for Executing Pre/Post Commands (Recovery Points)
   Changing Accounts for Restoring to Mapped/Share Network Drives and Restricted Directories
Back To Top


Changing Accounts for Accessing Application Servers/Filers
During backup/migrate, browse, and restore/recovery jobs for some agents, the system logs on to the related server or
filer and accesses the data using the specified user account. You can change these accounts for all such agents from the
agent level. For each group of agents discussed in the following sections, see Change Account for Accessing Application
Servers/Filers for step-by-step instructions.
Also, see these sections as appropriate for agent considerations:
   Considerations for Active Directory, DataMigrator for NetWare, Novell GroupWise, NetWare, NDS
   Considerations for NAS NDMP File Servers
   Considerations for Exchange-based Agents
   Considerations for Quick Recovery Agent
   Considerations for SharePoint
Top of Section


Considerations for Active Directory, DataMigrator for NetWare, Novell GroupWise, NetWare, and NDS

If a user account does not have sufficient privileges, the job for the associated iDataAgent may fail either in whole or part.
To ensure that all data or attributes can be backed up and restored, we recommend the following:
   For the Active Directory iDataAgent, to back up Active Directory objects in a single domain, the named user account
   should be a member of a Domain Administrator group (e.g., CN=administrator, CN=users, DC=company, DC=com).
   For the File System iDataAgent and the DataMigrator for NetWare Agent, the named user account should have
   supervisor privileges for the NetWare server.
   For the NDS iDataAgent, the named user account should have supervisor privileges for the root of the NDS tree.
When specifying another Active Directory or NetWare account, you must specify an account that already exists. If the
desired account does not exist, you must create it using an Active Directory or NetWare administration tool (e.g.,
ConsoleOne).
Using NTLM Bind with Active Directory

When installing the agent, you now have the option of using NT LAN Manager Bind for authentication.
If you selected this option during installation because you chose to use an Active Directory account residing in an
organizational unit other than the Users organizational unit, you must reconfigure the account using the Agent's Properties
menu option in the CommCell Console. When reconfiguring the account, you must use the LDAP path of the desired account
as the user name (e.g., CN=administrator, CN=users, DC=company, DC=com). The account must be a member of the
Domain Administrator group or have Read, Change, and Create Child Objects permissions for the Active Directory domain.
If you did not select this option during installation, to use NT LAN Manager Bind encrypted authentication with the Active
Directory iDataAgent you must edit the Agent properties.
See Change Account for Accessing Application Servers/Filers for step-by-step instructions.
Top of Section


Considerations for NAS NDMP File Servers

For the BlueArc and Hitachi NAS NDMP file servers, the User Account is user-defined.
For EMC Celerra NAS NDMP file servers, the User Account must be ndmp. If you are not using Celerra OS 4.0 (e.g., you're
using Celerra OS 2.2.49), you must contact EMC Celerra Technical Support to change your NDMP password on the data
mover.
For NetApp NAS NDMP file servers, the User Account must be root.

Top of Section


Considerations for Exchange-based Agents

This section pertains to the following agents: Exchange 5.5 Database, Exchange Mailbox, Exchange Public Folder,
DataArchiver for Exchange, DataMigrator for Exchange Mailbox, and DataMigrator for Exchange Public Folder.
These agents allow you to change the Exchange Administrator Account and/or the Exchange Site Service Account for the
site in which the associated Exchange Server resides.
Top of Section


Considerations for Quick Recovery Agent

For Quick Recovery Agent with Exchange, you can select the Exchange application and change the user account; also, if you
are including another Exchange Server, you can change the Exchange Server Name.
When configuring the Exchange server(s) for the Quick Recovery Agent on a cluster, be sure to enter the Exchange server
name into the Exchange Server Name field of the Change User Account dialog box. If you do not enter the server
name, the agent may not be able to detect the Exchange Server.
It is also recommended to enter the Exchange Server name here if you are having difficulty detecting the Exchange server.
If you have tried entering the Exchange Server into the Change User Account dialog box and still cannot detect the server,
you can manually add the server to the registry using the sExchangeServerName Registry Key.
Similarly, for Quick Recovery Agent with SQL Server, you can select the SQL Server application and change the user
account. If the Quick Recovery Agent has difficulty detecting the SQL server on a cluster, you can manually add the server
to the registry using the SClusteredSQLServerName Registry Key.
Top of Section








Considerations for SharePoint

For the SharePoint 2003 iDataAgents, consider the following:
For both the SharePoint Server 2003 Document and SharePoint Server 2003 Database iDataAgents, run Base Services on
the Client using an account that meets the following criteria:
  member of the local Administrator Group
  member of the SharePoint Portal Administrator Group
  System Administrator role on the SQL Server Instance
Refer to the article, Galaxy Service Account User Information for Windows 2003 and Window Server 2003 clients available
from the Maintenance Advantage web site.
For SharePoint 2003 Database, you can change:
  the SSO Administrator Account for the service on the associated SharePoint Portal server; and
  the Administrative Group Account
For both SharePoint 2001 and 2003 Document, you can change the SharePoint Document iDataAgent Account.
Top of Section


Changing Accounts for Accessing Databases
For DB2 on Windows and DB2 on Unix, you can change the account for accessing DB2 databases. Similarly, for Oracle on
Windows, Oracle on Unix, SAP for Oracle on Unix, SAP for Oracle on Windows, and QR Agent with Oracle, you can change
the account for accessing Oracle databases. QR Agent allows you to do this when you are adding or editing an Oracle
instance. For Informix, you can change the account for accessing Informix databases.
  From the agent level, you can do this for all the supported Unix and Windows agents.
  From the instance level, you can do this for all the supported Unix and Windows agents except QR Agent.
  From the backup set level, you can do this for DB2 on Windows.
For DB2, the DB2 user account that you use must already exist. If you choose to use an account that does not exist, you
must create the account. To this end, you must have either administration privileges or SYSADM, SYSCTRL, and SYSMAINT
user group privileges to change the user account.
For Oracle, SAP for Oracle, and QR Agent with Oracle, ensure that you have administrator privileges for the Oracle database
you want to back up and restore. Also, ensure that the User Account and password are already set up in the client machine.
See Change Account for Accessing Databases for step-by-step instructions.
Top of Section


Changing Accounts for Accessing Instances
For SQL Server and SQL Server 2005, you can change from the instance/partition level the Windows account that the
system uses to access the SQL Server instance. Similarly, for Sybase, you can change the account for accessing the Sybase
instance. See Change Account for Accessing Instances for step-by-step instructions.
Top of Section


Using/Changing Accounts for Data Classification
For DataMigrator for Windows with Data Classification, you can specify a user account to authenticate against the Active
Directory domain users whose files you want to migrate. See Use Users and User Groups - Data Classification Enabler for an
overview.
Top of Section


Changing Accounts for Auto-Discovery of Mailboxes by Active Directory User Group
For the DataMigrator for Exchange 2000/2003 Mailbox Agent and the Exchange 2000/2003 Mailbox iDataAgent, you can
specify a user account to authenticate against the Active Directory domain user groups whose mailboxes you want to
configure for Auto-Discovery operations. See Discovering and Assigning New Mailboxes for an overview.
Top of Section


Configuring Single Sign-On (SSO) for DataMigrator Outlook Add-In
CommCell authentication is required for end-users to perform advanced message recovery operations such as find
recoveries and browse recoveries from Outlook using the DataMigrator Outlook Add-In. The single sign-on (SSO) feature
allows Exchange administrators to establish a CommCell User Group for Outlook Add-In end-users to perform these
functions using their existing Windows user accounts and passwords residing in the Active Directory domain. Through the
use of the ExportADUsers tool on the Resource Pack, Exchange administrators can export an Active Directory User Group
to a file which can then be imported to the CommServe database and added to a new or existing CommCell User Group that
has the permissions to perform advanced message recovery operations.
Once Single Sign-On has been configured, then Outlook users may perform find and browse recoveries of migrated
messages without the need to enter CommCell authentication credentials. When users select the Outlook Add-In option to
Find and Recover Messages, their Windows user accounts are automatically granted rights to access the CommServe to
perform this function as part of a CommCell User Group. See the Readme_exportADusers.txt file located in the
ExportADUsers directory on the Resource Pack for step-by-step instructions on using this tool to set up Single Sign-On for
your site.
Top of Section


Changing Accounts for Configuring UNC Paths
For Windows File System, you can define from the subclient level an account to be used when configuring a UNC Path as
part of the subclient's content.
The user name and password that you use must have sufficient rights to access the share to which the UNC Path is
pointing. Also, the user name and password must have the right to log on to the client machine that is running the backup,
as well as rights to the logs on that computer.
To perform a backup or a restore operation using a UNC Path as either the content of the subclient or the destination for a
restore, we recommend using an account that has administrative privileges. Also, the User Account that is used must be an
account that already exists. If you choose to use an account that does not exist, it must be created.
Once you establish the account, you can modify the account. See Change Account for Configuring UNC Paths for step-by-
step instructions.
Top of Section


Changing Accounts for Discovering Network Drives
For Quick Recovery Agent with NAS, you can change from the agent level the account for discovering network drives. This
account has permissions on both the NAS data server and the Quick Recovery Agent machine. In effect, this account also
has permissions on the CIFS shares that are backed up.
Since the NAS data server and the Quick Recovery Agent machine can never be the same machine, the account is a
network (and not a local) account. Therefore, since the account has permissions on both machines, the machines must
either be in the same domain or have an appropriate trust set up.
See Change Account for Discovering Network Drives for step-by-step instructions.
Top of Section


Changing Accounts for Executing Pre/Post Commands (Data Protection)
For several agents, you can define a user or an account with permissions to execute Pre/Post commands for migration,
backup, and QR volume creation jobs. To this purpose, you must designate either the Local System Account or another
account. Once this account is established, you can modify the account.
  From the agent level, you can do this for DataArchiver for Exchange, DataMigrator for Network Storage, DataMigrator for
  Windows, Image Level on Windows, Image Level ProxyHost, ProxyHost, and Serverless Data Manager.







   From the instance/partition level, you can do this for Lotus Notes Database.
   From the backup set level (when creating a new subclient), you can do this for DB2 on Windows, Exchange Mailbox,
   Exchange Public Folder, Exchange Web Folder, DataMigrator for Exchange Mailbox, DataMigrator for Exchange Public
   Folder, Lotus Notes Document, NAS, and Windows File System.
   From the subclient level, you can do this for Active Directory, DataArchiver for Exchange, DataMigrator for Exchange
   Mailbox, DataMigrator for Exchange Public Folder, DataMigrator for Network Storage, DataMigrator for Windows, DB2 on
   Windows, Exchange, Image Level on Windows, Lotus Notes, NAS, Oracle on Windows, ProxyHost, Image Level
   ProxyHost, Quick Recovery, Serverless Data Manager, SharePoint, SQL Server, and Windows File System.
See Change Account for Executing Pre/Post Command (Data Protection) for step-by-step instructions.
Top of Section


Changing Accounts for Executing Pre/Post Commands (Data Recovery)
For several agents, you can define a user or an account with permissions to execute Pre/Post commands for data recovery
jobs. To this purpose, you must designate either the Local System Account or another account. Once this account is
established, you can modify the account.
   From the agent level, you can do this for Exchange Database, Oracle on Windows, SQL Server, SQL Server 2005 and
   Windows File System.
   From the instance/partition level, you can do this for Oracle on Windows, SQL Server and SQL Server 2005.
   From the backup set level, you can do this for Exchange Database and Windows File System.
See Change Account for Executing Pre/Post Commands (Data Recovery) for step-by-step instructions.

                         Note that this option is not available for SQL Server 2005 if you click Files/File Groups in the
                         Browse Options dialog box on the way to the Restore dialog box.

Top of Section


Changing Accounts for Executing Pre/Post Commands (Recovery Points)
For ContinuousDataReplicator, you can define a user or an account with permissions to execute Pre/Post commands when
creating Recovery Points. To this purpose, you must designate either the Local System Account or another account. Once
this account is established, you can modify the account.
See Change Account for Executing Pre/Post Commands (Recovery Points) for step-by-step instructions.
Top of Section


Changing Accounts for Restoring to Mapped/Share Network Drives and Restricted Directories
For several agents, you can define a user or an account with permissions to restore data to either mapped/shared network
drives or directories to which you have no write privileges.
   From the agent level, you can do this for DataMigrator for NetWare, DataMigrator for Network Storage, DataMigrator for
   Windows, Image Level on Windows, Lotus Notes Database, Lotus Notes Document, NetWare File System, NDS, Image
   Level ProxyHost, ProxyHost, and Windows File System.
   From the instance/partition level, you can do this for Lotus Notes Database and Lotus Notes Document.
   From the backup set level, you can do this for Lotus Notes Document, NetWare File System, NDS, and Windows File
   System.
See Change Account for Restoring to Mapped/Share Network Drives and Restricted Directories for step-by-step instructions.
Top of Section



Other Considerations
There are scenarios where it is possible, preferable, or even mandatory to use other accounts to perform specific tasks.
These scenarios are discussed per the affected agents in this section.







Windows File System
Create a User to Run QiNetix Services and Operations

This section discusses how to create a user (and not the local system account) to run QiNetix services and operations. By
default, QiNetix services run as a "local system account." The created user will run QiNetix services to back up and restore
files and folders regardless of ownership, permissions, encryption, or auditing settings.
"Backup Operator", "Administrator", and "Local Administrator", as discussed in this section, are built-in groups. These
groups have the necessary permissions and user rights defined. Only a member of the Administrator group can assign users
as Backup Operators.
NOTES
     You may be required to edit the QiNetix registry. However, before you do this, back it up and ensure that you
     understand how to restore it if a problem occurs. For information about how to do this, refer to Windows Help.
     Creating and using a user to run QiNetix services and operations currently works only for the Windows File System
     iDataAgents.
To create a user that can run QiNetix services/operations (a scripted sketch of the first two steps follows the list):

1.   Create a Windows user.
2.   On the user's Members tab, add the group Backup Operators.
3.   Verify the user (or the Backup Operators group) has the following local security settings:
        log on as a service
        back up files and folders
        restore files and folders
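
A scripted equivalent of steps 1 and 2, for systems where a command-line approach is preferred, might look like the
following Python sketch. It simply drives the standard Windows net commands from an elevated prompt; the account name
and password shown are placeholders, and step 3 (the local security settings) must still be granted separately.

    import subprocess

    SERVICE_USER = "qinetixsvc"        # placeholder account name
    SERVICE_PASSWORD = "ChangeMe!123"  # placeholder; use a real, secure password

    # Step 1: create the Windows user.
    subprocess.run(["net", "user", SERVICE_USER, SERVICE_PASSWORD, "/add"], check=True)

    # Step 2: add the user to the built-in Backup Operators group.
    subprocess.run(["net", "localgroup", "Backup Operators", SERVICE_USER, "/add"], check=True)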

Following are account setup considerations as they relate to functions:
     QiNetix Services Considerations
     Backup Considerations
     QiNetix Registry and Directory Considerations
     Restore Considerations
     Considerations When Using a Domain Controller
     Set Up or Modify User Permissions and Rights
     Related Issues
QiNetix Services Considerations
If the "Log on as a Service" user right is not set for the QiNetix service user, it is granted automatically when that user is
assigned to the QiNetix services in the Services applet. This right needs to be granted only once. See Set Up or Modify User
Permissions and Rights for more information.
Backup Considerations
Generally, to run backups, the user must be either an administrator or a member of the Backup Operators group. Each such
member acquires backup rights. Backup operators (or QiNetix Service Users) are designed to have full control to the
QiNetix registry and the install folder.
To back up the System State data, the QiNetix service user must be either an administrator or a backup operator. Also,
system state backups require backup operator group permissions on the HKLM\SYSTEM\SETUP key to enable system-
protected file backups.

                           The 1-Touch component of system state backups will fail whenever you run QiNetix services as a
                           backup operator. As a workaround, either skip backing up 1-Touch information during system
                           state backups by using the SKIP_1TOUCH_BACKUP registry key, or run the backups using the
                           local system account.

An administrator or a backup operator in a local group can back up any file and folder on the local computer to which the
local group applies. An administrator or backup operator on a domain controller can back up any file and folder on any
computer in the domain or any computer in a domain where a two-way trust relationship exists.
To back up files if you are not an administrator or a backup operator, you must be the owner of the files and folders you
want to back up or have one or more of the following permissions for the files and folders you want to back up: Read, Read
and execute, Modify, or Full Control. See Set Up or Modify User Permissions and Rights for more information.







QiNetix Registry and Directory Considerations
You must enable backup operator access to the QiNetix registry and directory. See Set Up or Modify User Permissions and
Rights for more information.
Restore Considerations
Generally, only restore rights are required to restore files. For a Windows 2000 Server, these rights are inherited by backup
operators. For a Windows 2003 Server, you must add backup operators to the 'Restore Files and Folder' Local Security
Policy.
To restore System State data, one of the following must be true: the QiNetix service user is a local administrator, or QiNetix
Services will be run as a local system. See Set Up or Modify User Permissions and Rights for more information.
Considerations When Using a Domain Controller
To add a user to the Backup Operators group on a domain controller, use Active Directory Users and Computers.
Also, on a domain controller, you may need to modify the Domain Controller Security Policy, since it overrides the Local
Security Policy. In addition, when you set the Domain Controller Security Policy, it is applied to the local policy as an
"Effective Policy Setting"; that is, the domain controller uses a policy that has overwritten the Local Policy Setting.
Set Up or Modify User Permissions and Rights
You can do the following to administer user permissions and rights:
     View or modify user rights assignments on a Workgroup or Member Server
     View or modify user rights assignments on a domain controller
     Set up user permissions and rights on a Windows Workgroup or Member Server
     Set up rights on a Windows 2000 domain controller
     Set up QiNetix registry permissions on Windows 2000
     Set up folder permissions
To view or modify user rights assignments on a Workgroup or Member Server

1.   Click Start > Settings > Control Panel > Administrative Tools.
2.   From Administrative Tools, select the local security policy and add the QiNetix Service user to all the required rights
     (logon as service, backup, restore).

To view or modify user rights assignments on a domain controller

1.   Click Start > Settings > Control Panel > Administrative Tools.
2.   From Administrative Tools\Domain Controller Security Policy, expand the tree to Security Settings, Local
     Policies, and User Rights Assignment. Add the user to all the required rights (logon as service, backup, restore).

To set up user permissions and rights on a Windows Workgroup or Member Server

1.   Click Start > Settings > Control Panel > Administrative Tools.
2.   From Administrative Tools, double-click Computer Management.
3.   Create or prepare to manage a Windows user who will run the QiNetix services.
4.   In Computer Management, expand Local Users and Groups and then Users. Double-click or create the User
     who will be running the QiNetix services.
5.   Right-click the User (if new), click Properties, and click Member of. Then add the 'Backup Operators' group to the
     User.
6.   Change the QiNetix services account to the User and re-start the services.
7.   Log off and log in as the Administrator for the policies to take effect. Sometimes you may have to restart the computer
     to this end.

To set up rights on a Windows 2000 domain controller

1.   Create or prepare to manage a Windows user who will run the QiNetix services.
2.   From Administrative Tools\Active Directory Users and Computers\Users, double-click or create the user who will
     be running the QiNetix services.
3.   Right-click the User (if new), click Properties, and click Member of. Then add the 'Backup Operators' group to the
     User.
4.   Change the QiNetix services account to the user and re-start the services.
5.   Log off and log in as the Administrator for the policies to take effect. Sometimes you may have to restart the computer
     to this end.

To set up QiNetix registry permissions on Windows 2000

1.   Run regedt32.
2.   Highlight "CommVault Systems" under HKEY_LOCAL_MACHINE\SOFTWARE and click Security > Permissions from the
     menu bar.
3.   Add Backup Operators (or the QiNetix service user) with full control to the QiNetix registry key.

To set up folder permissions (a scripted sketch follows the steps)

1.   As appropriate, provide the QiNetix service user with full control to the installation directory or confirm that such control
     is in place. The default location is C:\Program Files\CommVault Systems.
2.   Right-click the installation directory, select Properties, select the Security tab, and then add Backup Operators (or the
     QiNetix service user) with full control rights.
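
On systems where the built-in icacls tool is available, the same folder permissions can be granted from a script; the
following Python sketch is illustrative only and assumes the default installation path mentioned in step 1.

    import subprocess

    install_dir = r"C:\Program Files\CommVault Systems"

    # Grant Backup Operators full control, inherited by subfolders and files
    # ((OI)(CI)F); /t applies the change to existing child objects as well.
    subprocess.run(
        ["icacls", install_dir, "/grant", "Backup Operators:(OI)(CI)F", "/t"],
        check=True,
    )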

Related Issues
Please note the following issues.
     Share permissions will not be backed up or restored

     You must use the local system account for this purpose.

     Full iDataAgent restore requires a local system account

     To perform a full iDataAgent restore, see Disaster Recovery. A full iDataAgent restore should be performed using only
     the local system account because this account has full rights to restore the data on the computer. If a full iDataAgent
     restore runs as a User, registry and other data will not be restored properly because the User does not have the required
     rights to write to the system. Even when performing a system state restore or a system database restore, the services
     must be running as a local system account; otherwise, the restore will fail. Also, following a full iDataAgent restore, the
     user no longer has rights to the QiNetix registry key. Therefore, you may have to give the User permissions to this key
     in the registry following a full iDataAgent restore.

     QSnap requires a local system account

     QSnap fails to snap volumes when QiNetix services are running as a backup operator account. Therefore, use a local
     system account to this purpose.

     OST file comes back corrupted for Microsoft Outlook

     When you open Outlook, a pop-up message indicating that you have an older OST file is displayed. You can delete this
     file and create a new one if necessary.

     You may not be able to set the archive attribute

     During a file system backup, there may be a message logged in clbackup.log indicating that the archive attribute could
     not be turned off after the backup. This happens because the user does not have the rights to write to this folder (e.g.,
     C:\Program Files\).

     Ignore a specific message when running backups

     During file system backups and restores, the following message is displayed:

         W2K File System: Failed to enumerate share information [[997 Access is denied.]

     Note that backups will be successful even if this message is displayed.








Job Controller

The Job Controller allows you to manage and monitor the following types of jobs:
   Data protection operations
   Data recovery operations
   Administration operations
You can view detailed information about these jobs as well as job events and the media used for each job. Information
about a job is continually updated and available in the Job Controller window. When a job is finished, the job stays in the
Job Controller for five minutes. Once a job is finished, more information about that job is obtainable using the Job History.
Select one of the following topics:
   The Job Controller Window
   Job Controller Actions
   View the Information of a Job
   Job Filters
   What Happens When There are no Resources for a Job
   What Happens When a Job is Preempted
   How To
Related Topics:
   Command Line Interface - qlist job
   Command Line Interface - qlist jobsummary
   Command Line Interface - qoperation jobcontrol
   Command Line Interface - qoperation jobretention


The Job Controller Window
The Job Controller window displays all the current jobs in the CommCell. A status bar at the bottom of the Job Controller
shows the total number of jobs; the number of jobs that are running, pending, waiting, queued, and suspended; and the
high and low watermarks. The watermarks indicate the minimum and maximum number of streams that the Job Manager
can use simultaneously.
See Control the Number of Simultaneously Running Streams for more information.

Pause and Play buttons allow you to control how the Job Controller displays real-time information from active jobs.
The Pause button stops the Job Controller from displaying real-time information collected from jobs. The Play button allows
the Job Controller to display real-time job updates.
By default, the Job Controller displays the following information:

Job ID                     A unique number allocated by the Job Manager that identifies the data protection, data recovery,
                           or administration operation.
Operation                  The type of data protection, data recovery, or administration operation being conducted.
Client Computer            For data protection operations, the client computer to which the backup set and subclient belong.
                           For data recovery operations, the computer from which the data originated.
Agent Type                 The agent that is performing the operation. (e.g., Windows 2000 File System).
Subclient                  The subclient that is being included in the operation.
Backup Type                The type of data protection operation that is being conducted.
Storage Policy             The storage policy to which the operation is being directed.
MediaAgent                 The MediaAgent to which the operation is being directed.
Status                     The status of the operation. For job status descriptions, see Job Status Levels
Progress                   A status bar indicating its progress. The progress bar is not visible for certain operations (e.g.,
                           data aging) or for the initial phases of some data protection operations.
Errors                     Displays any errors that have occurred during the operation, such as a hardware problem or the
                           job has run outside of an operation window.








You can also display the following columns:

Backup Set                The backup set to which the subclient belongs.
Instance                  The instance to which the subclient belongs.
Phase                     The current phase of the operation. The number of phases varies depending on the operation.
User Name                 The name of the user who initiated the operation.
Priority                  The priority that is assigned to the operation. (For more information, see Job Priorities and Priority
                          Precedence).
Start                     The date and time on the CommServe when the operation started.
Elapsed                   The duration of time consumed by the operation.
Libraries                 The libraries that are being used by the operation.
Drives/Mount Paths        The drives/mount paths that are being used by the operation.
Last Update Time          The last time the Job Manager received job updates for the operation.
Transferred               The amount of data that has been transferred for the operation at the present time.
Estimated Completion      The time that the system estimates for this job to be completed.
Time
Delay Reason              The description of the reason why the operation may be pending, waiting, or failing.
Alert                     The name of the job-based alert, if configured for the job.
Job Initiation            The origin of the operation: the CommCell Console (Interactive), a schedule (Scheduled), or a
                          third party interface (Third Party).
Maximum Number of         The maximum number of readers that can be used for the operation.
Readers
Number of Readers in      The number of readers currently in use for the operation.
Use
Restart Interval          The amount of time the Job Manager will wait before restarting a job that has gone into a pending
                          state. This is set in the Job Management (Job Restarts) tab.
Max Restarts              The maximum number of times the job will be restarted after a phase of the job has failed. This is
                          set in the Job Management (Job Restarts) tab.

To see all the columns in the Job Controller window, use the scroll bar at the bottom of the window.

                          Pop-up messages for reporting job completion can be enabled or disabled using the F12 key.



Back to Top



Job Controller Actions
You can suspend, resume, or kill jobs individually or on a group-selection basis.
Types of Job Controller Actions
You can perform the following actions on jobs:

Suspend                   Temporarily stops a job. A suspended job is not terminated; it can be restarted at a later time.
                          Only preemptible jobs can be suspended.
Resume                    Resumes a job and returns the status to Waiting, Pending, Queued, or Running depending on the
                          availability of resources or the state of the operation windows and activity control settings.
Kill                      Terminates a job.
Actions You Can Perform on Jobs Based on Their Job Status
The status of a job and the preemptibility of the phase of the job in the Job Controller determines the actions (either Kill,
Suspend, or Resume) that you can perform. Based on the original status and the action performed, a new job status will
result.

Original Status           Actions Available               New Status
Running                   Suspend                         Suspended
                          Kill                            Killed
Waiting                   Suspend                         Suspended
                          Kill                            Killed
Interrupt Pending         N/A                             N/A
Pending                   Suspend                         Suspended
                          Resume                          Returns to original state, resources and
                                                          other conditions permitting
                          Kill                            Killed
Suspend Pending           N/A                             N/A
Queued                    Suspend                         Suspended
                          Resume (scheduled jobs only)    Changes into a state of an active job,
                                                          resources and other conditions permitting
                          Kill                            Killed
Suspended                 Resume                          Returns to original state, resources and
                                                          other conditions permitting; or changes
                                                          into a state of an active job, resources
                                                          and other conditions permitting
                          Kill                            Killed
Kill Pending              N/A                             N/A
Dangling Cleanup          N/A                             N/A
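
The table above can be read as a simple lookup from job status to the permitted actions; the following hypothetical sketch
(the names are assumptions) captures that mapping.

    # Which control actions the Job Controller permits, keyed by job status.
    ALLOWED_ACTIONS = {
        "Running":           {"Suspend", "Kill"},
        "Waiting":           {"Suspend", "Kill"},
        "Interrupt Pending": set(),
        "Pending":           {"Suspend", "Resume", "Kill"},
        "Suspend Pending":   set(),
        "Queued":            {"Suspend", "Resume", "Kill"},   # Resume applies to scheduled jobs only
        "Suspended":         {"Resume", "Kill"},
        "Kill Pending":      set(),
        "Dangling Cleanup":  set(),
    }

    def actions_for(status):
        return ALLOWED_ACTIONS.get(status, set())

    print(sorted(actions_for("Suspended")))   # ['Kill', 'Resume']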
Control Individual Jobs
You can select a job in the Job Controller and perform a control action on that job individually.
Multi-Job Control
If you have many jobs in the Job Controller that you need to perform an action on, and you do not want to select each one
individually, you can control groups of these jobs from the Multi-Job Control dialog box.




From this dialog box, you can perform actions on:
   All jobs in the Job Controller.
   All selected jobs in the Job Controller, providing you have the correct security associations at the proper level for each
   job selected.
   All data protection operations/data recovery operations running for a particular client or client/agent.
   All data protection operations running for a particular MediaAgent.
Back to Top



View the Information of a Job
You can view the following information about a job in the Job Controller:
   Job Status
   Job Details
   Media
   Events
   Log Files
Job Status Levels
A job in the Job Controller window may have one of the following status levels:

Completed                The job has completed.
Completed With One       The job has completed with errors.
or More Errors
                         The following administration conditions will result in the Completed With One or More Errors
                         status level.
                            Disaster Recovery Backup
                               During the operation, Phase 1 failed and Phase 2 completed, or Phase 1 completed and
                               Phase 2 failed.
                            Data Aging
                               During the operation, one or more components failed, e.g., subclients failed to be aged or
                               job history failed to be removed.


                         The following iDataAgent-specific conditions will result in the Completed With One or More
                         Errors status level.
                            Microsoft Windows File System
                               During a system state backup operation, one or more non-critical components failed to back
                               up.
                               During a file system restore operation, one or more files failed to restore or were locked.
                               During a system state restore operation, one or more non-critical components failed to
                               restore.
                            Microsoft Exchange Server
                               During a backup operation of a storage group assigned to a subclient, one or more
                               databases failed to back up.
                               During a restore operation, one or more databases failed to restore.
                            Data Archiver for Exchange
                               During a restore operation, one or more files failed to restore.
                            DataMigrator for Exchange, Data Migrator for Exchange Mailbox, Data Migrator for
                            Exchange Public Folder and Data Migrator for Exchange Web Folder
                               During a restore operation, one or more files failed to restore.
                            Informix
                               During a backup operation, one or more files failed to back up.
                            Oracle, Oracle RAC
                               During a backup operation, one or more files failed to back up.
                            SAP
                               During a backup operation, one or more files failed to back up.
                            SharePoint
                               During a backup operation, one or more elements in the subclient content failed to back up.
                               During a restore operation, one or more elements in the subclient content failed to restore.
                            Sybase
                               During a backup operation, one or more files failed to back up.
                            UNIX File System
                               During a backup operation, one or more files failed to back up.
Dangling Cleanup         A job phase has been terminated by the job manager, and the job manager is waiting for the
                         completion of associated processes before killing the job phase.
Failed                   The job has failed due to errors or the job has been terminated by the job manager.
Interrupt Pending        The job manager is waiting for the completion of associated processes before interrupting the job
                         due to resource contention with jobs that have a higher priority, etc.
Kill Pending             The job has been terminated by the user using the Kill option, and the job manager is waiting for
                         the completion of associated processes before killing the job.
Killed                   The job is terminated by the user using the Kill option or by the Job Manager.*
Pending                  The Job Manager has suspended the job due to phase failure and will restart it without user
                         intervention.
Queued                      The job conflicted with other currently running jobs (such as multiple data protection
                            operations for the same subclient), and the Queue jobs if other conflicting jobs are active
                            option was enabled from the General tab of the Job Management dialog box. The Job Manager
                            will automatically resume the job only if the condition that caused the job to queue has
                            cleared.
                            The activity control for the job type is disabled, and the Queue jobs if activity is disabled
                            option was enabled from the General tab of the Job Management dialog box. The Job Manager
                            will automatically resume the job only if the condition that caused the job to queue has
                            cleared.
                            The Queue Scheduled Jobs option was enabled from the General tab of the Job Management
                            dialog box. Scheduled Jobs can be resumed manually using the Resume option or resumed
                            automatically by disabling the Queue Scheduled Jobs option.
                            The job started within the operation window's start and end time.
                            The running job conflicted with the operation window and the Allow running jobs to
                            complete past the operation window option was not enabled from the General tab of the
                            Job Management dialog box. (This is only applicable for jobs that can be restarted. See Job
                            Restart for more information.)
Running                  The job is active and has access to the resources it needs.
Running (Cannot be       During a running operation, the Job Alive Check failed. See Job Alive Check Interval for more
verified)                information.
Suspend Pending          A job is suspended by a user using the Suspend option, and the Job Manager is waiting for the
                         completion of associated processes before stopping the job.
Suspended                   A running, waiting or pending job has been manually stopped by a user using the Suspend
                            option. The job will not complete until it is restarted using the Resume option.
                            A job has been started in a suspended state using the Start Suspended or Startup in
                            Suspended State options available from the dialog box of the job that was initiated.
System Kill Pending      The job has been terminated by the Job Manager*, and the Job Manager is waiting for the
                         completion of associated processes before killing the job.
Waiting                  The job is active, waiting for resources (e.g., media or drive) to become available or for internal
                         processes to start.

*The Job Manager will terminate a job when:
  The number of job retries has exceeded the value set in the Job Retry dialog box.
  The total running time has exceeded the amount of time set in the Job Retry dialog box.
  Conflicting jobs overlap, i.e., a new backup job is initiated for the same subclient as a job that is currently running.
  Note that the Job Manager will only terminate a conflicting job if the new backup job encompasses the earlier job and if
  the earlier job has yet to transfer any data to media. If these conditions exist, the earlier job is killed by the system and
  replaced by the newer job. "More encompassing" means that a full backup can kill jobs such as incrementals,
  differentials and other fulls; incrementals, however, cannot kill fulls. If the current job has already started transferring
  data, then the normal queue rules for the new job will apply.
  This feature must be enabled on the CommServe with the JMKillPreviousBackupJobForSameSubclient registry key.

  The free space in the CommServe installation directory is less than 25 MB.
Job Details
You may want to view the details about a data protection, data recovery, or administration operation from the Job
Controller window. To view details about a particular job, right-click the job in the Job Controller window and select Detail.
General Tab

The General tab of a Job Details dialog box provides general information about the selected job, such as the subclient,
storage policy, etc.
Progress Tab

The Progress tab of a Job Details dialog box of the selected job provides more specific statistical information about the
selected job’s current phase.
Streams Tab

The Streams tab of a Job Details dialog box of the selected job shows the data transferred by each stream on the
MediaAgent that the job is using.
Attempts Tab

The Attempts tab of a Job Details dialog box includes information on each attempt of each phase of the selected job, such
as the status of each phase of the job. Each phase has a corresponding client log that can aid in troubleshooting data
protection problems.
Media/Mount Paths
You can view the media/mount paths associated with a job from the Media Used by Job ID dialog box. For more
information about media, see Media Operations.
Job Events
You can view the events of a job from the All Found Events window. For more information about events, see the Event
Viewer.
Log Files
You can view the log files of an active job in the Job Controller. For more information about viewing log files, see Log Files.
Back to Top



Job Filters
You can filter the jobs that are displayed in the Job Controller by creating a job filter from the Filter Definition dialog box.
You can filter by Data Protection, Data Recovery, and Administration operations. The filter can also be based on an active
job for a particular CommCell entity.

                          CommCell Administrators can utilize filters created by all users. All other users can only utilize
                          the filters that they create. If a user account is deleted, their filters will automatically be deleted
                          as well.

Back to Top



How To
   Change Table Views
   Open a Console Window
   Control Jobs Through Job Queuing
   Kill a Job
   Resume a Job
   Suspend a Job








Operation Window

Choose the following topic:
   Overview
   Define an Operation Rule
   How To


Overview
By default, all operations can run at any time of the day. To prevent certain operations from running during certain time
periods of the day, you can define operation rules so that these operations are disabled during those times. The main
purpose of this feature is to help you prevent an unexpected, time-consuming operation from disrupting normal operations.
Operation rules are defined at both the CommServe and agent levels. Operation rules established at the CommServe level
apply across the entire CommServe. Operation rules established at the agent level apply only to the specified agent. When
an operation rule is defined at both the CommServe and agent levels, the job will run outside of the total time frame of
both levels.
Note that at the agent level:
   Not all operations are available to be assigned an operation rule, such as administration and synthetic full operations.
   You can also elect to ignore the operation rules set at the CommServe level from the Operation Window dialog box.
   The client time zone is displayed in the Operation Rules Details dialog box.

Jobs that are started at any time within an operation rule's interval will enter the queued (not pending) state. Once the
window of time of the operation rule has passed, these queued jobs will resume automatically and run to completion.
However, jobs that are started before an operation rule takes effect can run to completion if the Allow running jobs to
complete past the operation window option is enabled from the General tab of the Job Management dialog box.

Jobs that are not interruptible (such as certain database jobs) will not be terminated if they fall within the time an operation
rule is defined. See Job Preemption Control for the types of jobs that are not interruptible.
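As a rough illustration of this queuing behavior, the sketch below checks a start time against a list of do-not-run intervals. It is illustrative Python only, assuming intervals expressed as times of day; the real Operation Window also handles days of the week, levels, and the non-interruptible job exceptions noted above.

from datetime import time

# Assumed representation of an operation rule's "do not run" intervals.
DO_NOT_RUN = [(time(8, 0), time(18, 0))]   # e.g. 8:00 a.m. to 6:00 p.m.

def state_at_start(start_time):
    """Jobs started inside an interval are queued (not pending); they resume
    automatically once the interval has passed."""
    for begin, end in DO_NOT_RUN:
        if begin <= start_time < end:
            return "Queued"
    return "Running"

print(state_at_start(time(9, 30)))    # Queued
print(state_at_start(time(19, 0)))    # Running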



Define an Operation Rule
You can add an operation rule from the Operation Window.




For example, data protection operations are not to be run between the
working hours of 8:00 a.m. and 6:00 p.m., Monday through Friday.
From the Operation Rule Details dialog box, select the details of the
operation rule.
NOTES
The client time zone is displayed in the dialog box when adding,
viewing or modifying an Operation Window from the Agent level only.








Select the times for the operation rule from the Time Intervals dialog
box.




Once selected, the time is displayed in the Do not run intervals pane
of the Operation Rule Details dialog box. The operation rule is then
displayed in the Operation Window.




Back to Top








Activity Control

Choose the following topic:
   Overview
   Support Information
   How To


Overview
The Activity Control feature allows you to enable or disable operations at the following levels in the CommCell hierarchy:

CommCell                  Allows you to enable/disable all activity, data protection, data recovery, auxiliary copy, data
                          aging operations, or scheduled operations for all client computers within the CommCell.
Client Computer Groups    Allows you to enable/disable all data protection and/or data recovery operations on all client
                          computers that are members of a client computer group.
Client                    Allows you to enable/disable all data protection and/or data recovery operations on a specific
                          client computer.
Agent                     Allows you to enable/disable the data protection and/or data recovery operations of a specific
                          agent on a specific client computer.
Subclient                 Allows you to enable/disable the data protection of a specific subclient.

When disabling operations, the CommCell level has the highest precedence while a subclient has the lowest precedence. For
example, if you disable data protection operations at the CommCell level, then all data protection operations throughout the
CommCell are disabled regardless of the corresponding settings of the individual client computer groups, client computers,
agents, and subclients. If, however, a data protection operation is enabled at the CommCell level, you can still disable data
protection operations at the client computer group, client computer, agent, or subclient level. By default, all operations are
enabled at all levels of the CommCell hierarchy.
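A minimal sketch of this precedence, assuming a simple True/False flag per level (illustrative Python only, not the product's implementation; the function and argument names are assumptions):

def data_protection_enabled(commcell, client_group, client, agent, subclient):
    """Each argument is True (enabled) or False (disabled) at that level.
    A disable at any level wins, and the CommCell level has the highest precedence."""
    return all((commcell, client_group, client, agent, subclient))

# Disabling at the CommCell level overrides every lower level.
print(data_protection_enabled(False, True, True, True, True))    # False
# With the CommCell enabled, a disable at the agent level still applies.
print(data_protection_enabled(True, True, True, False, True))    # False
print(data_protection_enabled(True, True, True, True, True))     # True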
The CommCell Browser displays enhanced icons for the CommServe and Clients. The icons are based on the Activity Control
options established for Data Protection and/or Data Recovery in that entity. See CommCell Browser Icons and Activity
Control for more information.
CommCell Browser Icons and Activity Control
The icons associated with the CommServe and Clients in the CommCell Browser include information on the activity control
options for data protection and/or data recovery operations in that specific entity, i.e., the CommServe or the Client. The
icons have two arrows - an outbound arrow to denote activity control for data protection operations and an inbound arrow
to denote activity control for data recovery operations. These arrows appear in green when the activity control is enabled,
and in red when disabled.
The following table illustrates the appearance of these icons:

         A Client with both Data Protection and Data Recovery jobs enabled.

         A Client with only Data Protection jobs disabled.

         A Client with only Data Recovery jobs disabled.

         A Client with both Data Protection and Data Recovery jobs disabled.

Note that the icons do not reflect the status of other activity control options that may be available in these entities.




How To
   Enable or Disable Operations
   Queue Jobs if Activity Control is Disabled








Job Preemption Control

Select one of the following topics:
   Overview
   Types of Preemptible and Non-preemptible Jobs
   Job Preemption Control for the CommCell
   Preemptibility of Job Types
   Support Information - Job Management
   How To


Overview
Jobs or operations fall into two main categories: preemptible and non-preemptible. A preemptible phase of a job is one that
can be interrupted by the Job Manager or suspended by the user, and then restarted, without having to start the phase
over again from the beginning. Preemption is defined by the Job Manager at each phase of a job. A File System backup
phase is one example of a preemptible phase; the Job Manager can interrupt this phase when resource contention occurs
with a higher priority job. You can also suspend this phase in progress and resume it later.
A non-preemptible phase is one that cannot be interrupted by the Job Manager or suspended by the user. It can only run to
completion, be killed by administrative action, or be failed by the system. For example, the data recovery operations of
database agents are non-preemptible.
Both preemptible and non-preemptible jobs can also be defined in terms of their restartability; preemptible jobs are always
restartable. In addition, even jobs that are not preemptible might fail to start and be in a "waiting" state; these are
restartable as well. For more specific information on this topic, see Job Restart.
Back to Top



Types of Preemptible and Non-preemptible Jobs
The following lists the types of preemptible and non-preemptible jobs:

Preemptible and Restartable
   Data protection operations for most non-database agents (except as noted in Support Information - Job Management).
   DataArchiver archive jobs during the Archive Index and Archive Content Index phases of the job.
   Data recovery operations for most File System-like (indexing-based) agents during the restore phase.
   Most administration jobs, including Install Automatic Updates and Download Automatic Updates.
   Jobs that are run using an alternate data path cannot preempt other jobs; similarly, such jobs can also be preempted
   by other jobs that do not use an alternate data path.

Non-preemptible and Non-Restartable
   Data recovery operations for database-like agents.
   Media export, erase media, and inventory jobs.
   SAN volume data protection jobs (non-preemptible in the scan phase).

Non-preemptible but Restartable
   Data protection operations for database agents (except as noted in Support Information - Job Management).
   DataArchiver archive jobs during the Archive and Create Content Indexing phases of the job.
   The system state phase of Windows File System data protection operations.

Back to Top



Job Preemption Control for the CommCell







You can specify that certain operations will preempt other operations
based on their job priority, in cases where multiple jobs are competing
for media and drives.
If a running job is preemptible, the Job Manager can interrupt the
running job and allocate the resources to a higher-priority job. (The
interrupted job enters a waiting state and resumes when the resources it
needs become available.)
You can:
   Allow restores and browse backup data index restores to preempt
   other jobs of lower priority such as backups, synthetic fulls, and
   auxiliary copy operations.
   Allow backups (including Disaster Recovery backups) to preempt other
   backups of lower priority.
   Allow backups (including Disaster Recovery backups) to preempt
   auxiliary copy jobs of lower priority.
See Set Job Preemption Control for the CommCell.




Back to Top



Preemptibility of Job Types
You can specify which of the following types of jobs are preemptible:
  Data Protection and Data Recovery operations of indexing-based file
  system-like agents. (For a complete list of the agents that support/do
  not support preemption, see Support Information - Job Management.)
  Disaster Recovery backup
  Auxiliary Copy
To configure preemptibility in the CommCell for specific job types, see
Specify Preemptibility of Job Types.








Back to Top



How To
  Set Job Preemption Control for the CommCell
  Specify Preemptibility of Job Types
Back to Top








Job Priorities and Priority Precedence

Select one of the following topics:
   Overview
   Job Priority Numbers
   Set Job Priorities
   View Job Priorities
   Change Job Priorities
   Supported Agents
   How To
Related Topics:
   Command Line Interface


Overview
Job priorities determine which competing jobs can access limited resources (such as drives and media). When a job is
started, the Job Manager assigns the job a priority number. The lower the job priority number, the higher the priority. The
job with the highest priority gets the resources first. The priority of a job is based on the:
   Job Priority Number of the operation type, client computer performing the operation, and the type of agent from which
   the operation originated
   Priority Precedence (the weighting of the priority of the client computer relative to the priority of the agent).
When several jobs have the same priority, resources are allocated on a first-come, first-served basis. When a job is
completed, then the Job Manager automatically assigns the newly freed resources to the next job.
When several jobs have a different priority, if job preemption is allowed, the Job Manager will interrupt the running job and
then allocate the resources to the higher priority job.
The priorities of jobs (except data aging and export media) can also be changed when they are being initiated, scheduled,
or are active.
Back to Top



Job Priority Numbers
Job Priorities are based on a 3-digit integer. The first digit always represents operation priority. If client precedence is
chosen, the second digit represents client priority and the third digit represents agent priority. If agent precedence is
chosen, the second digit represents agent priority and the third digit represents client priority.
Operation Priority
Operation priorities (the first digit of the job priorities number) are assigned by the Job Manager and can be changed using
the Change Job Priority feature. The operation priority assignments are:

Operation                                                        Assigned Priority
Data Recovery Operations                                         0
Data Protection Operations                                       1
Client Computer Priority
The Job Manager automatically assigns a priority of 6 to all client computers. Therefore, the Job Manager evaluates all
client computers as having the same priority. Based on this assumption, you may want to change the client priority.
For example, you may want operations originating from a specific file server to take precedence over operations originating
from a user’s computer. To do this, you would assign a file server a higher priority than other computers within the
CommCell.
Agent Priority
The Job Manager automatically assigns a priority of 6 to all agents. Therefore, the Job Manager evaluates all agents as
having the same priority. Based on this assumption, you may want to change the priority of agents.
For example, you may want Exchange database operations to take precedence over Windows 2000 File System operations.
To do this, you would assign the Exchange Database iDataAgent a higher priority than the Windows 2000 File System
iDataAgent.
To illustrate this point further, suppose you have chosen client precedence and you run a data protection operation with a
client that has a priority of two and an agent with a priority of three; the 3-digit integer will then be 123 (operation
priority 1, client priority 2, agent priority 3).
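The same composition can be sketched in Python (illustrative only; the digit layout follows the description above, and the function name is an assumption):

def job_priority(operation_priority, client_priority, agent_priority,
                 client_precedence=True):
    """Build the 3-digit job priority number; a lower number means a higher
    priority. The first digit is always the operation priority; the order of
    the remaining digits depends on the chosen precedence."""
    if client_precedence:
        second, third = client_priority, agent_priority
    else:
        second, third = agent_priority, client_priority
    return operation_priority * 100 + second * 10 + third

# Data protection (1), client priority 2, agent priority 3.
print(job_priority(1, 2, 3, client_precedence=True))    # 123
print(job_priority(1, 2, 3, client_precedence=False))   # 132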




Job Priority Exceptions
The following table lists jobs that do not follow the job priorities rules listed above. The priorities of all administration jobs,
with the exception of auxiliary copy jobs, cannot be changed. However, the priority of an auxiliary copy job can be changed
using the Change Job Priority feature.

Operation                                                        Assigned Priority
Data Aging, Export Media                                         Not applicable
Inventory, Download Updates, Install Updates                     0
Disaster Recovery Backups                                        66
Data Verification                                                366
Erase Media                                                      466
Auxiliary Copy                                                   266

NOTES
   A search inventory job is of highest priority.
Back to Top



Set Job Priorities
You can set the job priority of an agent or client.
Set the Priority of an Agent
From the CommCell level, the priority of an agent can be set from the Priority column of the Job Priorities pane of the
General tab of the Job Management dialog box.
Set the Job Priority of a Client
From the Client level, the priority of a client can be changed from the Job Priority (0-9) field of the Job Configuration
tab of the Client Computer Properties dialog box.
Back to Top



View Job Priorities
You can view the priority of a specific job by viewing the details of the job, or you can view the priority of all jobs in the job
controller. By default, the Job Controller window does not display the priorities of running jobs. To view the job priorities for
all jobs, you need to enable the Priority column.
Back to Top



Change Job Priorities
The Change Job Priority feature allows you to change the priority of a job or groups of jobs. Using this feature may be
necessary if you need to change the priority of jobs based on how you want the Job Manager to allocate necessary
resources. Note that if you change the priority of a job because you want that job to interrupt another job for resources,
that job must be able to preempt the other job.
This feature allows you to:
   Change the job priority for an active job (in any state) from the Job Controller.
   Specify a priority to be used at the time of submitting a job (scheduled or immediate).
   Specify a priority for multiple jobs at once by selecting multiple jobs in the Job Controller.
Types of Jobs That Can Be Changed
You can change the job priority number of the following jobs:
   Auxiliary Copy
   Data Protection Operations
   Data Recovery Operations
Change Job Priorities From the Job Controller
The priority of a job or groups of jobs can be changed from the
Change Job Priority dialog box.




Change the Priority of an Individual Job Being Submitted
The priority of a job that is being submitted immediately or is being
scheduled can be changed from the Change Priority dialog box.




Back to Top



Priority Precedence
By default, client computers (the second digit in the job priority number) have higher precedence over agents (the third
digit in the priority number). You can reverse the order of the digits so that the agent has priority over the client computer.
You can change the priority precedence of a client or agent from the General tab of the Job Management dialog box.







Back to Top



Supported Agents
All agents support the Change Priority feature except the following agent:
   Quick Recovery Agent
Back to Top



How To
   Set the Priority Precedence of a Client or Agent
   Set the Priority of an Agent
   Set the Priority of a Client
   View the Priorities of All Jobs
   View the Priority of a Specific Job
   Change the Job Priority of an Active Job
   Change the Job Priority of Groups of Active Jobs
   Change the Job Priority of an Immediate Job
   Change the Job Priority of a Scheduled Job
Back to Top








Job Alive Check Interval

Overview
The Job Alive Check Interval option within the General tab of the Job
Management dialog box allows you to specify the time interval by which
the Job Manager will check active jobs to determine if they are still
running.




How To
  Set the Job Alive Check Interval








Job Running Time

Select from the following topics:
   Overview
   How to Configure Job Running Time
Related Topics
   Job Management


Overview
At the time of job initiation, you can determine the total amount of time a job can run before it is killed by the Job Manager.
The configurable parameters for Job running time allow you to control the following:
   Total Running Time - the maximum elapsed time, in hours and minutes, from the time that the job is created. When
   the specified maximum elapsed time is reached, as long as the job is in the "running" state, it will continue; if its state is
   not in the "running" state when the specified time is reached, Job Manager will kill the job.
   Example: Total Running Time for a job is specified as 1 hour.
      If the job is still running at the 1 hour point, it will continue to run.
      If the job is still running at the 1 hour point, but 30 minutes later you suspend the job, Job Manager will kill the job.
      If the job begins running, and 15 minutes later is suspended and left in that state, 45 minutes later (when the
      specified Total Running Time of 1 hour has elapsed) Job Manager will kill the job.
      If the job is started in the suspended state and left in that state, 1 hour later (when the specified Total Running Time
      of 1 hour has elapsed) Job Manager will kill the job.
   Kill running jobs when total running time expires - option to kill the job if the specified Total Running Time has
   elapsed, even if its state is "running". This option is available only if you have specified a Total Running Time.
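The rules above can be restated as a small decision function. This is a sketch only, with assumed parameter names rather than product settings:

def kill_on_running_time(elapsed_hours, total_running_time_hours,
                         state, kill_running_when_expired=False):
    """Return True if the Job Manager would kill the job under the
    Total Running Time rules described above."""
    if elapsed_hours < total_running_time_hours:
        return False                           # limit not reached yet
    if state == "running":
        # A running job continues unless the explicit kill option is set.
        return kill_running_when_expired
    return True                                # suspended/other state past the limit

print(kill_on_running_time(1.0, 1.0, "running"))      # False - keeps running
print(kill_on_running_time(1.0, 1.0, "suspended"))    # True
print(kill_on_running_time(1.5, 1.0, "running", kill_running_when_expired=True))  # True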


How to Configure Job Running Time
You can configure the Total Running Time and whether to Kill running jobs when total running time expires in the
Job Retry tab of the job initiation dialog box for the following types of jobs:
   For an Auxiliary Copy job, see Start an Auxiliary Copy or Schedule an Auxiliary Copy. In the Auxiliary Copy dialog, click
   Advanced, then select the Job Retry tab.
   For a Data Aging job, see Start Data Aging or Schedule Data Aging. In the Data Aging dialog, select the Job Retry tab.
   For a Data Protection operation, in the Archive Options, Backup Options, or Migrate Options dialog, click Advanced,
   then select the Job Retry tab. Refer to information specific to your Agent, beginning with the Archive, Backup Data, or
   Migration Operations page.
   For a Data Recovery Operation, in the Restore Options or Recover Options dialog, click Advanced, then select the Job
   Retry tab. Refer to information specific to your Agent, beginning with the Retrieve Archived Data, Restore Backup Data,
   or Recover Migrated Data page.
   For a Disaster Recovery Backup operation, see Starting a Disaster Recovery Backup or Scheduling a Disaster Recovery
   Backup. In the Disaster Recovery Backup Options dialog, select the Job Retry tab.
   For an Erase Data job for DataMigrator for Exchange Mailbox, see Erase Migrated Data. In the Erase All Migrated Data
   dialog, select the Job Retry tab.


How To
   Set the Total Running Time for a Job
Back to Top








Job Restart

Select from the following topics:
   Overview
   How to Configure Job Restarts
   QR Volume Creation Restartability
   How To
Related Topics
   Job Management
   Agent Types Included in the Job Type List for Restartability
   Support Information - Job Management


Overview
A restartable job is one that can be restarted, either by a user, or automatically by the Job Manager. Both preemptible and
non-preemptible jobs can be restartable; preemptible jobs are always restartable after they are suspended; jobs that are
not preemptible might fail to start and be in a "waiting" state and can be restartable as well. Additional insight about jobs
that fail to start can be gained from reviewing What Happens When There are no Resources for a Job.
Job Restartability can be configured in the Job Management Control Panel; restartability can be turned on or off, the
maximum number of restart attempts can be specified, and the time interval between each restart attempt can be
configured. These settings are for the entire CommCell, so that all jobs in the CommCell of a selected type will behave
according to the Job Restart settings you have specified.
The following types of operations can be restarted, if so configured:
   Auxiliary Copy
   Data Aging
   Data Protection operations of indexing-based, file system-like agents, and certain database-like agents**
   Data Recovery operations of indexing-based, file system-like agents**
   Disaster Recovery backup
   Erase Data (for DataMigrator for Exchange Mailbox only, a job-based setting is available)
**For a complete list of the agents that are included in the data protection and data recovery categories Job Type list in the
Job Management Control Panel, see Agent Types Included in the Job Type List for Restartability.
For a specific job, you can override one of these settings, the maximum number of restart attempts, by specifying the
Number of Retries in the Job Retry tab of the job initiation dialog box for that particular job. See How to Configure Job
Restarts for more specific direction on this.
In all cases, whether the Max Restarts setting is used in the Job Management Control Panel, or the Number of Retries
setting in the Job Retry tab, once the maximum number of retries has been reached, if the job has still not restarted
successfully, the Job Manager will kill the job.
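A minimal sketch of this retry accounting (illustrative Python; the function and argument names are assumptions, not product APIs):

def action_after_failed_restart(attempts_so_far, max_restarts, restartable=True):
    """max_restarts is the CommCell-wide Max Restarts value, or the job's own
    Number of Retries where that per-job override is used."""
    if not restartable:
        return "kill job"
    if attempts_so_far < max_restarts:
        return "retry (attempt %d of %d)" % (attempts_so_far + 1, max_restarts)
    # Maximum number of retries reached without a successful restart.
    return "kill job"

print(action_after_failed_restart(2, 3))   # retry (attempt 3 of 3)
print(action_after_failed_restart(3, 3))   # kill job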
NOTES
   The job-based setting will have no effect unless restartability has been turned on in the Job Management Control Panel.
   You cannot configure the interval between restart attempts for an individual job, only the number of attempted restarts.
   Data Aging restartability can only be set in the Job Management Control Panel; you cannot set it in the Job Retry tab of
   the job initiation dialog box for that particular job.
   Restarting Unix raw partition backup jobs, either manually or by the system, is not supported. Therefore, you should
   run such jobs at a high priority.
   Data Protection Jobs that enter a Running (Cannot be verified) job state during a temporary network or CommServe
   service outage will not be restarted. These jobs do not enter a pending state; they will continue, without interruption,
   when the network or CommServe services become available. For more information, see Fault Tolerance.
   Restarting an Oracle On Demand backup job for multiple instances will cause the instance whose backup was
   interrupted to be backed up again from the beginning. Because of this restart behavior, if the archive files for that
   instance were successfully backed up before the restart, they will be backed up again after the restart. As a result, Job
   Manager may count the data size of archive files twice for the instance from which the Oracle On Demand backup job
   was restarted. Therefore, the size of data reported as backed up for this job (in the Job Details and Backup Job History)
   will reflect the duplicate size of the archive files that were backed up twice for that instance.



How to Configure Job Restarts
1.   Using the Job Management control panel, Job Restarts are configured for the entire CommCell. For each job type, see
     Specify Job Restartability for the CommCell.
2.   For Agents that support the capability, to override the CommCell's Max Restart setting for a particular job, you can
     specify the Number of Retries in the Job Retry tab of the job configuration dialog box for the following types of jobs:
        For an Auxiliary Copy job, see Start an Auxiliary Copy or Schedule an Auxiliary Copy. In the Auxiliary Copy dialog,
        click Advanced, then select the Job Retry tab and specify Number of Retries.
        For a Data Protection operation, in the Archive Options, Backup Options, or Migrate Options dialog, click Advanced,
        then select the Job Retry tab and specify Number of Retries. Refer to information specific to your Agent,
        beginning with the Archive, Backup Data, or Migration Operations page.
        For a Data Recovery Operation, in the Restore Options or Recover Options dialog, click Advanced, then select the
        Job Retry tab and specify Number of Retries. Refer to information specific to your Agent, beginning with the
        Retrieve Archived Data, Restore Backup Data, or Recover Migrated Data page.
        For a Disaster Recovery Backup operation, see Starting a Disaster Recovery Backup or Scheduling a Disaster
        Recovery Backup. In the Disaster Recovery Backup Options dialog, select the Job Retry tab and specify Number of
        Retries.
        For an Erase Data job for DataMigrator for Exchange Mailbox, see Erase Migrated Data. In the Erase All Migrated
        Data dialog, select the Job Retry tab and specify Number of Retries.




QR Volume Creation Restartability
QR Volume Creation restartability is only supported on Windows platforms. See Create a QR Volume for more information.
Single Volume Subclient
     The Quick Recovery Agent maintains a restart string during the Volume Creation (copying) phase of full and incremental
     copy jobs to keep track of the progress made on each volume being copied. This restart string is updated on the
     CommServe database every time 1 GB of data is copied per volume. If a job is resumed from a suspended or pending
     state, this restart string will be retrieved and used to identify the location in the volume from where to resume the
     copying. For example, a job was suspended with 2.8 GB of the data copied for a particular volume; since the restart
     string on the volume was last updated when 2 GB completed copying, the job resumed from that point.
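     The 1 GB checkpointing can be illustrated with a short sketch (illustrative Python only; the restart string is modeled
     here as a simple byte offset, whereas the product stores it in the CommServe database):

GB = 1024 ** 3
CHECKPOINT_INTERVAL = 1 * GB        # restart string updated per 1 GB copied

def restart_offset(bytes_copied):
    """Offset recorded in the restart string: the last whole 1 GB boundary
    completed for the volume; copying resumes from here, not from the exact
    point of interruption."""
    return (bytes_copied // CHECKPOINT_INTERVAL) * CHECKPOINT_INTERVAL

# 2.8 GB copied when the job was suspended; the restart string was last
# updated at 2 GB, so the resumed job continues from the 2 GB mark.
print(restart_offset(int(2.8 * GB)) / GB)   # 2.0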
Multi-Volume Subclient
     In the QR Volume Creation phase, volumes are copied sequentially (i.e., not in parallel). This affects job restartability
     behavior for a multi-volume subclient. When a QR Volume Creation job is interrupted (suspended or pending), some of
     the volumes in the subclient may be completely copied while others may not be copied yet at all. If the job is restarted
     (either manually or automatically), the behavior toward each volume in the subclient will depend on the condition of the
     volume at the time of job interruption. Refer to the following table for the expected behavior (for each volume) when
     resuming an interrupted QR Volume Creation job for a multi-volume subclient.
     Volume Condition at the Time of Job Interruption / Behavior when Job Restarts:

     Volume was successfully copied
        The Quick Recovery Agent copies any changes to the volume that occurred after the starting point of the original
        job up to the time of the restart.
        For example: A job was initiated at 2:00 P.M. At 2:30 P.M., you suspended the job. This job was suspended in the
        QR Volume Creation (copying) phase, after the volume was successfully copied. At 3:00 P.M. you restarted the job.
        Upon the resume, the Quick Recovery Agent copied the changes made to the volume from 2:00 to 3:00 P.M.
     Volume was partially copied
        The Quick Recovery Agent runs the full or incremental copy, and then copies any changes to the volume that
        occurred after the starting point of the original job up to the time of the restart.
        For example: A job was initiated at 2:00 P.M. At 2:30 P.M., you suspended the job. This job was suspended in the
        QR Volume Creation (copying) phase, during the copying of the volume. At 3:00 P.M. you restarted the job. Upon
        the resume, the Quick Recovery Agent ran the initial copy job and then copied the changes made to the volume
        from 2:00 to 3:00 P.M.
     Volume was not yet copied
        If it is a full copy, the Quick Recovery Agent runs a normal full copy.
        For example: A job was initiated at 2:00 P.M. At 2:02 P.M., you suspended the job. This job was suspended in the
        QR Volume Creation (copying) phase, before it copied any parts of the volume. At 3:00 P.M. you restarted the job.
        Upon the resume, the Quick Recovery Agent ran a full copy job, copying all the data in the volume up to 3:00 P.M.
        If it is an incremental copy, the Quick Recovery Agent copies any changes that the original incremental would have
        copied, as well as any changes to the volume that occurred after the starting point of the original incremental copy
        job up to the time of the restart.
        For example: A job was initiated at 2:00 P.M. At 2:02 P.M., you suspended the job. This job was suspended in the
        QR Volume Creation (copying) phase, before it copied any parts of the volume. At 3:00 P.M. you restarted the job.
        Upon the resume, the Quick Recovery Agent copied the data that the original incremental copy would have copied,
        as well as the changes made to the volume from 2:00 to 3:00 P.M.



How To
  Specify Job Restartability for the CommCell
Back to Top








Job Management - Data Protection Operations

Choose the following topic:
   When a Non-Full Backup is Automatically Converted to a Full Backup


When a Non-Full Backup is Automatically Converted to a Full Backup
Under the following conditions, a non-full backup is automatically converted to a full backup:
   If it is the first backup of the subclient.
   If you re-associate a subclient to another storage policy.
   If you promote a secondary storage policy copy that is not synchronized with a primary copy (for all the subclients of a
   storage policy).
   If a backup job within the most recent backup cycle is pruned or disabled from a primary copy.
   If a new content path is added to the subclient.
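As a rough sketch of these general rules only (illustrative Python; the function and argument names are assumptions, and the agent-specific conditions listed next are not modeled):

def runs_as_full(first_backup_of_subclient=False,
                 reassociated_to_new_storage_policy=False,
                 unsynchronized_secondary_copy_promoted=False,
                 recent_cycle_backup_pruned_or_disabled=False,
                 new_content_path_added=False):
    """Return True when a requested non-full backup is converted to a full."""
    return any((first_backup_of_subclient,
                reassociated_to_new_storage_policy,
                unsynchronized_secondary_copy_promoted,
                recent_cycle_backup_pruned_or_disabled,
                new_content_path_added))

# An incremental requested for a brand-new subclient runs as a full backup.
print(runs_as_full(first_backup_of_subclient=True))   # True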
Some agents have additional scenarios in which a non-full backup is also automatically converted to a full backup:
   Exchange Database iDataAgents
     If an Exchange Database has been restored
     If an Exchange Database has been auto-discovered
     If the Pre-Selected backup type has been changed
   Image Level and Image Level ProxyHost iDataAgents
     After a failover occurs in a clustered environment, without having CXBF bitmap persistence enabled. For more
     information, see Configure Persistence.
     After an in-place Volume Level restore
   Oracle iDataAgent
     If an incremental backup is selected for an Oracle subclient that includes Archive Logs and/or control files only
   SQL Server iDataAgent
     See Default Subclient Backup Conversion Rules and File/File Group Subclient Backup Conversion Rules for complete
     listings.
   NetWare File System iDataAgent
     The first NetWare File System backup run after having selected the backup set option Decompress Data Before
     Backup is converted to a full backup for all subclients that belong to that backup set.
Back to Top








Job Management - Data Recovery Operations

Hardware Considerations
When a hardware failure occurs during a restore, the restore job will go into a device wait state indefinitely and will need to
be killed.








Backup Job History

Choose from the following topics:
  Overview
  Items That Failed
  Content Indexing Failures
   Items That Were Backed Up
  Supported Features
  How To


Overview
You can view the backup and restore history of iDataAgents, BackupSets/Instances, and subclients.
The Backup Job History Filter dialog box allows you to view detailed, historical information about backup jobs. Once you
have chosen your filter options, they are displayed in the Backup Job History window. The Backup Job History window
displays the following information:

Job ID                   The unique number allocated by the Job Manager that identifies the operation.
Status                   Displays the Job Status of a particular operation.
iDataAgent               The agent that performed the operation.
Instance/Partition       The instance/partition in the client computer that represents the database that was included in
                         this operation.
Backup Set               The backup set that was protected/recovered during the operation.
Subclient                The subclient that was protected during the operation. Note that a deleted subclient will have a
                         Unix time stamp appended to its name in cases where another subclient is currently using the
                         same name as the deleted subclient.
Storage Policy           The storage policy to which the data protection operation was directed.
Backup Type              The type of backup that was conducted: Differential, Full, Incremental or Synthetic.
Failed Folders           The number of folders that were not included in the operation.
Failed Files             The number of files that were not included in the operation.
Start Time               The date and time on the CommServe when the operation started.
End Time                 The date and time on the CommServe when the operation was completed.
Size on Media            The amount of data that was transferred to the media.
                         NOTE: When viewing the jobs from the client level, the amount displayed is an uncompressed size
                         and includes valid and invalid attempts of the backup jobs, and thus may be larger than the size
                         displayed when viewing the jobs from any other level.
User Name                The name of the user who initiated the operation. For DataMigrator stub recoveries from Outlook,
                         the Exchange Mailbox Alias Name will be displayed.
Content Indexed          Displays whether content indexing was used (Yes or No) for the operation.

From this window, you can right-click a backup job to:
  View/change the fields that are displayed in the Backup Job History window
   Browse the data backed up by the backup set or instance from the Backup Job History window. This is provided as a
   right-click option for each job. (This menu option, when selected, initiates the Browse Options dialog box preset with
   the values needed to browse the data.)
  View items that failed during the backup job
  View details of the backup job
  View files that were not indexed during a backup job that performed content indexing
  View associated media
  View events of the backup job
  View a list of items that were backed up
  View the log files of the backup job.








Back to Top



Items That Failed
The items that failed for a data protection operation are individual files that failed to be backed up, even though the job
itself may have completed successfully. You can determine the degree of success for these jobs using this window.
Filters can be used in conjunction with the "Items That Failed" list on the data protection Job History Report to eliminate
backup or migration failures by excluding items that consistently fail and are not integral to the operation of the system
or applications. Some items fail because they are locked by the operating system or application and cannot be opened at
the time of the data protection operation. This often occurs with certain system-related files and database application files.
Also, keep in mind that you will need to run a full backup after adding failed files to the filter in order to remove them.
NOTES
   A listing of files and folders that failed is not available for the Quick Recovery Agent, nor the Image Level, Image Level
   ProxyHost, and Serverless Data Manager iDataAgents. These agents do not perform a file level backup/copy.
   Certain application related files can never be backed up by the File System iDataAgent due to the nature of the data. For
   example, Microsoft SQL Server database files cannot be backed up by the File System iDataAgent. In this and other
   similar circumstances, consider entering files such as these as exclusions in the corresponding subclient filter.
Back to Top



Content Indexing Failures
Content Indexing Failures allows you to look at the files that could not be indexed during a migration/archive job that
performed Content Indexing. Content Indexing looks at each file (of the supported data types) and indexes its contents,
allowing advanced searches of backed up/archived/migrated data.
Files that were not indexed, (perhaps because the file’s content could not be read) are added to the Content Indexing
Failures list, and are viewable from the View Content Indexing Failures option in a Job History window.
Back to Top



Items That Were Backed Up
The View backup file list option allows you to view a list of the files that were backed up during a backup job, along with
the data sizes of each backed up file. The View backed up messages option allows you to view a list of messages that
were backed up, along with the alias name, display name, email address, sender name, and recipient of each
message.
From these windows you can conduct searches based on a particular string, allowing you to find particular files quickly and
easily.
NOTES
   It is not recommended that this option be used to view a very large list of items that were backed up (such as lists that
   total over 100,000 items). It is suggested that the Browse option be used to find a list of backed up items in such cases.
Back to Top



Supported Features
   The NAS NDMP iDataAgents do not support the ability to view items that failed.
   The Image Level and Image Level ProxyHost iDataAgents do not support the ability to Browse the data of a selected
   backup job in Backup Job History.
Back to Top



How To







Restore Job History

Choose from the following topics:
   Overview
   Items That Restored
   Supported Features
   How To


Overview
The Restore History Filter dialog box allows you to view detailed, historical information about restore jobs.
Once you have chosen your filter options, they are displayed in the Restore Job History window. From this window you
can right-click a restore job to:
   View Restore Items; items in the job that were Successful, Failed, Skipped or All. These items, if any, will be listed
   in the Restored Files window.
   View Job Details of the restore job. The job details will be listed in the Job Details window.
   View Events of the restore job. The job events will be listed in the All Found Events window.
   View Log files of the restore job. The job log files will be listed in the Log File window.
Back to Top



Items That Restored
When viewing files that were restored in the Restored Files window, each of the files is listed with the restore status level
appended at the end of the file path. The possible status levels are: RESTORED, FAILED and OLDER.

Successfully restored files will be listed with RESTORED appended to the file path. If files are not restored/recovered due to
errors, the file paths will be appended with FAILED. Under some circumstances, the system may not restore/recover certain
files because they are older versions of the same files already present in the file system; these files are appended with the
word OLDER.

Back to Top



Supported Features
The NAS NDMP iDataAgents do not support the ability to view failed/successful item lists.
Back to Top



How To
   View Admin Job History
   View Job History Details
   View the Restored Items of a Job History
   View the Events of a Job History
   View the Media of a Job History
   View the Log Files of a Job History
Back to Top








Hardware-Specific Resource Issues

Choose from the following topics:
   Overview
   Removable Media Libraries
   Magnetic Disk Libraries


Overview
Storage policy copies and streams associate logical data entities with physical media. In order to configure storage policies
and copies for maximum efficiency, you must understand how QiNetix uses your storage media and the hardware-specific
limitations that apply to each media type. For example, you can run multiple operations simultaneously if they are directed
to magnetic disk media. However, this may cause resource contention if the jobs are directed to a tape library, since a
given tape is only available for one operation at a time. The sections that follow describe issues relating to each media type.
Back to Top



Removable Media Libraries
As removable media (tape cartridges and optical disks) can only be accessed by one drive (and consequently one
operation) at a time, you must plan carefully to avoid resource contention. The sections that follow explain how contention
can arise.
Removable Media Groups
A media group is simply one or more related media to which data is written during a data protection operation. There is a
one-to-one correspondence between media groups and data streams. Each time a given stream is in use, it transfers data
to or from the same media group. Consequently, the data stored by a media group tends to be from the same subclient(s).
Within a media group, only one media receives the data secured by the data protection operation. This media is called the
active media. Once the active media reaches capacity, either through one large backup/migration or a series of smaller
ones, the MediaAgent gets a new media from a scratch pool, designating it as the active media. While the original active
media still contains valid data, it is no longer used for data protection operations; however, it will be used if the data it
contains is needed for a data recovery operation. Over time, additional media are cycled through the active state and the
media group grows. The size (i.e., the number of tapes) of the media group depends on the retention period of the copy
through which the data was backed up/migrated and the quantity of data backed up/migrated to the media during the
retention period.
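
The active-media cycle described above can be pictured with a small model. The following Python sketch is illustrative
only; the class names, barcodes and capacities are invented for the example and do not correspond to any product
interface.

# Illustrative model of a removable media group: data is always written to the
# active media, and when the active media fills up, a new media is taken from
# the scratch pool and becomes the active media. Names and sizes are invented.
class ScratchPool:
    def __init__(self, barcodes):
        self.available = list(barcodes)          # new or recycled media

    def acquire(self):
        if not self.available:
            raise RuntimeError("scratch pool is empty")
        return self.available.pop(0)

class MediaGroup:
    def __init__(self, scratch_pool, capacity_gb):
        self.scratch_pool = scratch_pool
        self.capacity_gb = capacity_gb
        self.full_media = []                     # older media still holding valid data
        self.active_media = scratch_pool.acquire()
        self.used_gb = 0.0

    def write(self, size_gb):
        """Write backup/migration data, cycling the active media as it fills."""
        while size_gb > 0:
            room = self.capacity_gb - self.used_gb
            written = min(room, size_gb)
            self.used_gb += written
            size_gb -= written
            if self.used_gb >= self.capacity_gb:
                # Active media is full: retire it (it remains usable for data
                # recovery) and appropriate a new media from the scratch pool.
                self.full_media.append(self.active_media)
                self.active_media = self.scratch_pool.acquire()
                self.used_gb = 0.0

pool = ScratchPool(["TAPE001", "TAPE002", "TAPE003"])
group = MediaGroup(pool, capacity_gb=100)
for backup_size_gb in (40, 40, 50, 30):          # a series of data protection jobs
    group.write(backup_size_gb)
print(group.full_media, group.active_media)      # ['TAPE001'] TAPE002

As in the description above, the number of media the group accumulates depends on how much data is written and on how
long the retained media continue to hold valid data.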




Backup/Migration Series within Removable Media Groups








A removable media group can contain the data of more than one subclient. The data mix, if any, depends on whether the
backups/migrations of other subclients are mapped to the same storage policy. Since a storage policy has a primary copy,
all data sent to that storage policy is ultimately written to the same set of streams, and therefore to the same media group(s).
When the data secured by two or more subclients are mapped to the same storage policy, the destination media group(s)
become a composite of different backup/migration series; one backup/migration series per subclient.
Take a simple example in which the data protection operations of two File System clients, coral and onyx, are associated
with the same storage policy, A, which is associated with a tape library. Assume no subclients are declared; hence, each
client computer comprises only the default subclient. When data protection operations of these subclients are initiated, the
data is written to the same media group in the form of archive files, as shown in the following figure. Each data protection
operation produces one archive file. Although the data resides on the same media, the data retains the identity of its
origin, thus preserving its integrity for data restoration/recovery.
[Figure: archive files produced by the coral and onyx default subclients, written to the same media group]
In the previous example, the media group comprised two backup/migration series. If additional subclients were associated
with the same storage policy, even subclients from different Agents, then the media group would contain one more
backup/migration series for each additional subclient.
When you associate subclients with storage policies (and consequently copies), it is important to realize that only one
subclient can access a given media at a time, unless data multiplexing is performed during the data protection operation, in
which case the operations of different subclients can run in parallel.
However, regardless of data multiplexing, data recovery operations that need access to multiple backup/migration series on
media cannot run simultaneously. In the example above, a restore/recovery of Backup 1 to coral cannot run at the same
time as a restore/recovery of Backup 2 to onyx.
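
One way to visualize this constraint is to treat each media group as a resource that admits a single operation at a time,
so jobs that need the same media group run back to back while jobs on other media groups are unaffected. The Python
sketch below is a scheduling thought experiment with invented job names, media group labels and durations; it does not
describe the product's scheduler.

from collections import defaultdict

# Jobs that share a media group are forced to run one after another; a job on a
# different media group can start immediately. All values here are invented.
jobs = [
    {"name": "Restore Backup 1 to coral", "media_group": "MG-A", "minutes": 30},
    {"name": "Restore Backup 2 to onyx", "media_group": "MG-A", "minutes": 45},
    {"name": "Restore to a third client", "media_group": "MG-B", "minutes": 20},
]

finish_time = defaultdict(float)           # per media group: when it becomes free
for job in jobs:
    start = finish_time[job["media_group"]]
    end = start + job["minutes"]
    finish_time[job["media_group"]] = end
    print(f'{job["name"]}: runs from t={start:g} to t={end:g} minutes')
# The two restores sharing MG-A cannot overlap; the MG-B restore starts at t=0.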

Media Contention within Removable Media Groups
When you direct the data protection operations from different subclients to the same storage policy, you increase the
likelihood of resource contention for those storage policy copies that are associated with removable media libraries. A media
group can support one operation at a time. As a result, data protection or data recovery operations that access the same
storage policy at the same time may actually be performed serially. This is particularly true if the corresponding storage
policy is configured to provide only one data stream. Removable media contention tends to lessen as the number of
configured streams increases. Even so, since a given backup/migration can use any stream, it is possible that the data for
different clients could, over time, be written to the same stream, and therefore to the same tapes. Consequently, removable media
contention can arise when backing up/migrating or restoring/recovering data to different clients that share the same
storage policy.
Remember, the system does not require you to consolidate the data of different subclients or client computers within the
same storage policy. To avoid the effects of media contention, you may want to create additional storage policies.
Scratch Pools
A scratch pool is a repository of new and pruned media. Each storage policy copy that is associated with a media library is
also associated with a scratch pool. Removable media cycle through the scratch pool.
When a media group exhausts the capacity of the active media, it marks the media as full and appropriates another from
the scratch pool. Over time and in accordance with the associated retention period, data is pruned by the pruning utility.
Once all of the data on a given media has been pruned, the system recycles the tape by reassigning it from the media
group back to the scratch pool where it can be reused. Of course, if the associated retention period is unlimited, the data
never expires; consequently, the media never recycles and the size of the media group continues to grow with each data
protection operation.
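
The recycling decision described above can be summarized in a few lines. The sketch below is purely illustrative; the
function name, the day-based retention arithmetic and the sample dates are assumptions for this example, and an
unlimited retention period is represented here by None.

from datetime import date, timedelta

# Illustrative sketch of the recycling decision: a full media returns to the
# scratch pool only when all data on it has aged past the retention period;
# with unlimited retention it is never recycled. Values below are invented.
UNLIMITED = None

def recyclable(media_backup_dates, retention_days, today=None):
    """Return True if every job on the media is older than the retention period."""
    if retention_days is UNLIMITED:
        return False                       # data never expires, media never recycles
    today = today or date.today()
    cutoff = today - timedelta(days=retention_days)
    return all(backed_up < cutoff for backed_up in media_backup_dates)

jobs_on_media = [date(2024, 1, 5), date(2024, 1, 20)]
print(recyclable(jobs_on_media, 30, today=date(2024, 6, 1)))         # True
print(recyclable(jobs_on_media, UNLIMITED, today=date(2024, 6, 1)))  # False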
Drive Pools and Resource Contention
A drive pool is a group of drives within a single tape library that are controlled by a specific MediaAgent. Each storage policy
copy that is associated with a tape library is also associated with a drive pool.
To get the most out of your tape libraries, you can allocate the arm changer and drives within a library to different
MediaAgents within the CommCell. The system creates a drive pool for all of the drives within a given library that are
controlled by a specific MediaAgent.
If you divide control of a library’s drives among multiple MediaAgents, you must take the following into account to avoid
resource contention:
   When a library’s resources are divided among MediaAgents, jobs running via a particular MediaAgent can only use drives
   that are attached to that MediaAgent. This means that fewer drives are available and resource contention is more likely
   than if the library were not shared.
   When you configure storage policies, the number of drives in the smallest drive pool associated with any copy of the
   storage policy determines the maximum number of streams that can be created simultaneously by any copy of the
   storage policy, as illustrated in the sketch that follows this list.
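
A compact way to state the second point: the stream ceiling for a storage policy is the drive count of its smallest
associated drive pool. The Python sketch below illustrates only that arithmetic; the copy names and drive counts are
invented for the example.

# The smallest drive pool associated with any copy sets the stream ceiling for
# the storage policy. Copy names and drive counts here are invented.
def max_simultaneous_streams(drive_pool_sizes_by_copy):
    """Map of copy -> drive count of its drive pool; returns the stream ceiling."""
    return min(drive_pool_sizes_by_copy.values())

copies = {
    "Primary copy (drive pool of 4 drives)": 4,
    "Secondary copy (drive pool of 2 drives)": 2,
}
print(max_simultaneous_streams(copies))   # 2 -- the 2-drive pool is the bottleneck

The same reasoning underlies the magnetic disk example later in this section: a tape copy limited to five drives caps the
disk copy at five streams as well.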
Back to Top



Magnetic Disk Libraries
NOTES
   For NAS NDMP iDataAgents, magnetic disk libraries can be used only with NDMP Remote Server policies.



Theoretically, there is no limit to the number of streams that can access a magnetic disk simultaneously (though if too
many simultaneous operations are attempted, performance suffers).
Consequently, resource contention is not an issue for a storage policy if all of the storage policy’s copies are associated with
magnetic disk libraries. Still, all copies of a storage policy must have the same number of streams. If one copy of a storage
policy is associated with a magnetic disk library while another copy is associated with a media type that places physical
limitations on the number of streams supported (e.g., tape), the copy directed to magnetic disk is subject to those
limitations as well.
For example, assume that we have both a tape library and a magnetic disk library attached to a MediaAgent. We want to
use tape media for long-term archive storage while using magnetic disk media for day-to-day operations. Within the tape
library we plan to use one drive pool, which contains five media drives. For the reasons explained in Allocating Data
Streams, when we create the storage policy that accesses the drive pool, we must set the maximum number of streams for
all copies of the storage policy to five. If we try to run a five-stream database backup and a single stream file system
restore/recovery from the magnetic disk library simultaneously, resource contention will occur. Although a magnetic disk
can easily support many more than five streams, the physical limitation of the tape hardware imposes a logical limitation on
the magnetic disk hardware.
Back to Top



