					COTS Assessment Process                                  Version 6.20


           COTS Assessment Process (CAP)
        Pasadena High School Computer Network Study



                               Team #3


        Ajithkumar Kattil            (Project Manager)
        Chris Yuan                   (Tester & Test Reviewer)
        Kunal Kadakia                (Business Process Analyst)
        Ashwin Kusabhadran           (Test Designer)
        Andrew Ha                    (Tester & Prototype)
        Devesh Thanvi                (Requirements Analyst)
        Vincent Chu                  (IV&V)
        Winston Kwong                (IV&V)
        Mrs. Jeanine Foote           (Client)
        Erin Shaw                    (Sponsor & Researcher)
        Pasadena High School         (Customer)




CAP_LCA_F05_T03_V06.20           I                          12/04/05



Version History
Date       Author        Version   Changes made                             Rationale
09/28/05   Andrew Ha     1.0        Initial Draft Release                   Initial draft release sections 1.1, 1.2,
                                                                              1.3, 3, 4.1.2
09/29/05   Chris Yuan    1.1        Added Section 2.1, 4.1.1, 4.1.3         Added section 2.1, 4.1.1, 4.1.3
09/30/05   Chris Yuan    1.2        Added Table of Contents                 Added table of contents
09/30/05   Chris Yuan    1.3        Added Table of Tables                   Added table of tables
10/07/05   Chris Yuan,   2.0        Added section 2.1.1, revised section    Revise to incorporate the comments
           Andrew Ha                 1, 3 and section 4.1.2                   in QR report
10/09/05   Andrew Ha     2.1        Added Section 2.2, 2.3, 5               Added Section 2.2, 2.3, 5
10/10/05   Chris Yuan    2.2        Added Section 4.2, 4.3, revised         Added Section 4.2, 4.3. Revised
                                     section 1.2, revised Table of            section 1.2 to reflect the reference
                                     Contents, Table of Tables, added         used in section 4.2 and 4.3. Revised
                                     Table of Figures                         TOC to reflect the change.
10/10/05   Chris Yuan    2.3        Revised Section 2.1.                    Convert all importance into weights
                                                                              to reflect the evaluation criteria
                                                                              importance.
10/11/05   Chris Yuan    2.4        Revised Section 1.2, 4.2                Revised section 4.2 and updated the
                                                                              reference in section 1.2 accordingly
10/11/05   Andrew Ha     2.5        Revised Section 4.1.2                   Changed the description for one of
                                                                             the software packages
10/16/05   Chris Yuan    2.6        Revised Section 2.1.2, 2.2.1            Revised evaluation criteria and
                                                                              weight in 2.1.2 and corrected dates
                                                                              in sections 2.2.1
10/17/05   Chris Yuan    2.7        Revised Section 4.2.2                   Revised section 4.2.2 to reflect the
                                                                              server qualification
10/18/05   Andrew Ha     3.0        Revised 2.1 and added Approach          Revised all listed sections to
                                     Adopted                                 incorporate the comments from
                                    Updated Milestones & Schedule            the ARB review
                                     in Section 2.2.1
                                    Added Project Timeline to 2.2.1
                                    Updated Individual Roles in Section 3
                                    Updated Figure 3
                                    Updated 4.1.2 to incorporate
                                     suggested server and client
                                    Added 2.3.2
                                    Added 2.2.2
10/21/05   Chris Yuan    3.1        Added section 5.2                       Added section 5.2
                                    Revised TOC, TOT and TOF                Revised TOC, TOT and TOF





Date       Author           Version   Changes made                          Rationale
10/24/05   Chris Yuan       3.2        Revised section 2.1.2, 2.1.3         Revised section 2.1.2 based on
                                       Revised section 4.2.1.2               ARB presentation
                                       Revised section 4.3                  Revised section 4.2.1.2 based on
                                                                              the ARB comments
                                                                             Revised section 4.3 based on
                                                                              current risks
10/25/05   Chris Yuan       3.3        Revised section 1.2                  Revised sections 1.2, 2.1.2, 2.1.3,
                                       Revised section 2.1.2                 and 4.2.2.1 based on the IV&Ver’s
                                       Revised section 2.1.3                 comments on the LCO Draft
                                       Revised section 4.2.2.1
11/10/05   Chris Yuan       3.4        Revised section 2.1.2                Revised sections 2.1.2, 2.1.3,
                                       Revised section 2.1.3                 4.2.2.1, and 4.2.2.2 based on the
                                       Revised section 4.2.2.1               IV&Ver’s comments on the LCO
                                       Revised section 4.2.2.2               Package
11/20/05   Andrew Ha,       4.0        Revised section 2.1.2                Revised 2.1.2 and 2.1.3 and added
           Kunal Kadakia,              Revised section 2.1.3                 2.1.2.1, 2.1.2.2, 2.1.3.1, and
           Chris Yuan                  Added section 2.1.2.1 and 2.1.2.2     2.1.3.2 for the LCA Draft
                                       Added section 2.1.3.1 and 2.1.3.2    Revised section 2.2.2 figure 3 for
                                       Revised section 2.2.2                 the current schedule
                                       Revised section 2.3.2                Revised 2.3.2 deliverable versions
                                       Revised section 4                    Revised section 4 to reflect the
                                                                             current risks
11/21/05   Chris Yuan       4.1        Added Glossary section               Added Glossary section
                                       Revised section 2.1.2.2              Revised section 2.1.2.2
                                       Revised section 2.1.3.1              Revised section 2.1.3.1
11/21/05   Andrew Ha        4.2        Revised section 2.2.2                Revised section 2.2.2 and 5.1 to
                                       Revised section 5.1                   incorporate LCO Package feedback





Date       Author       Version   Changes made                Rationale
11/21/05   Chris Yuan   5.0        Revised section 1.2        Revised all listed sections to
                                   Revised section 2.1.2       incorporate the internal QR
                                   Revised section 2.2.2       comments for the LCA Draft
                                   Revised section 4.1.2
                                   Revised section 4.2.1.1
11/27/05   Chris Yuan   5.1        Revised section 2.1.2.1    Revised sections 2.1.2.1 and 2.1.3.1
                                   Revised section 2.1.2.2     for AT-S06 and AT-S07
                                   Revised section 2.1.3.1    Revised sections 2.1.2.2 and 2.1.3.2
                                   Revised section 2.1.3.2     for AT-C01
11/27/05   Andrew Ha    5.2        Revised section 2.1        Revised the overall strategy diagram
                                   Revised section 4.1.3       in section 2.1
                                                              Added the COTS assessment process
                                                               diagram in section 4.1.3
11/27/05   Chris Yuan   5.3        Revised section 1.2        Added one more reference to
                                                               section 1.2
12/4/05    Andrew Ha    6.0        Revised 2.1.3.1            Revised the 2.1.3.1 and 4.1.2 tables
                                   Revised 4.1.2               to fix formatting and revised 4.3
                                   Revised 4.3 Table 40        Table 40, incorporating the IV&V
                                                               LCA Draft comments for the LCA
                                                               Package
12/4/05    Chris Yuan   6.1        Revised section 2.1.2.1    Revised the listed sections to
                                   Revised section 2.3.2       incorporate the LCA ARB
                                                               comments for the LCA Package
12/4/05    Chris Yuan   6.2        Revised section 2.1.2      Revised the listed sections to
                                   Revised section 2.1.3       incorporate the IV&V LCA Draft
                                   Revised Glossary            comments for the LCA Package





Table of Contents
VERSION HISTORY ....................................................................................................... II

TABLE OF CONTENTS.................................................................................................. V

TABLE OF TABLES ..................................................................................................... VII

TABLE OF FIGURES .................................................................................................... IX

1.     Introduction ..................................................................................................................................................... 10

     1.1      Purpose, Scope, and Assumptions .......................................................................................................... 10

     1.2      References ............................................................................................................................................... 11

     1.3      Change Summary..................................................................................................................................... 13

2.     Milestones and Products .................................................................................................................................. 16

     2.1      Overall Strategy ....................................................................................................................................... 16

             2.1.1             Selection and Assessment Iterations ...................................................................................... 17
             2.1.2             Attributes for Evaluation ....................................................................................................... 18
             2.1.3             Assessment Activity .............................................................................................................. 23

     2.2      Milestones and Schedules ........................................................................................................................ 29

             2.2.1             Inception Phase ...................................................................................................................... 29
             2.2.2             Elaboration Phase .................................................................................................................. 32

     2.3      Deliverables ............................................................................................................................................. 39

             2.3.1             Inception Phase ...................................................................................................................... 39
             2.3.2             Elaboration Phase .................................................................................................................. 40

3.     Responsibilities ................................................................................................................................................ 41

4.     Approach ......................................................................................................................................................... 44

     4.1      Assessment Framework ........................................................................................................................... 44

             4.1.1             Instruments ............................................................................................................................ 44
             4.1.2             Facilities ................................................................................................................................ 45
             4.1.3             COTS Assessment ................................................................................................................. 48

     4.2      Complementary Activity.......................................................................................................................... 49


              4.2.1              Market Trend Analysis .......................................................................................................... 49
              4.2.2              Product Line Analysis ........................................................................................................... 52

     4.3       Risk Management .................................................................................................................................... 56

5.      Resources ......................................................................................................................................................... 60

     5.1       Work Breakdown ..................................................................................................................................... 60

     5.2       Effort Estimation ..................................................................................................................................... 62

GLOSSARY .................................................................................................................. 71





Table of Tables
Table 1: Activity – Initial Filtering .............................................................................................................................. 17

Table 2: Activity – Detailed Assessment ...................................................................................................................... 17

Table 3: Server Evaluation Attributes ......................................................................................................................... 18

Table 4: Server Performance Attributes ...................................................................................................................... 18

Table 5: Server Cost Attributes ................................................................................................................................... 19

Table 6: Server Intercomponent Compatibility Attributes ........................................................................................... 19

Table 7: Server Interoperability Attributes ................................................................................................................. 19

Table 8: Server Vendor Support Attributes ................................................................................................................. 20

Table 9: Server Flexibility Attributes .......................................................................................................................... 20

Table 10: Server Security Attributes ............................................................................................................................ 20

Table 11: Client Evaluation Attributes ........................................................................................................................ 21

Table 12: Client Cost Attributes .................................................................................................................................. 21

Table 13: Client Performance Attributes..................................................................................................................... 21

Table 14: Client Vendor Support Attributes ................................................................................................................ 22

Table 15: Client Flexibility Attributes ......................................................................................................................... 22

Table 16: Server Assessment Activity for Performance ............................................................................................... 23

Table 17: Server Assessment Activity for Cost ............................................................................................................ 23

Table 18: Server Assessment Activity for Intercomponent Compatibility.................................................................... 24

Table 19: Server Assessment Activity for Interoperability .......................................................................................... 24

Table 20: Server Assessment Activity for Vendor Support .......................................................................................... 25

Table 21: Server Assessment Activity for Flexibility ................................................................................................... 25

Table 22: Server Assessment Activity for Security....................................................................................................... 26

Table 23: Client Assessment Activity for Cost ............................................................................................................. 27

Table 24: Client Assessment Activity for Performance ............................................................................................... 27




Table 25: Client Assessment Activity for Vendor Support ........................................................................................... 28

Table 26: Client Assessment Activity for Flexibility .................................................................................................... 28

Table 27: Deliverable for Inception Phase .................................................................................................. 39

Table 28: Deliverable for Elaboration Phase ............................................................................................................. 40

Table 29: Responsibilities............................................................................................................................................ 43

Table 30: Assessment Instruments ............................................................................................................................... 44

Table 31: Assessment Facilities................................................................................................................................... 47

Table 32: Worldwide Thin Client Market Share in 2004............................................................................................. 50

Table 33: Worldwide PC Market Share in 2004 ......................................................................................................... 50

Table 34: Risk-01 Single point of failure of the proposed system prototype ............................................................... 56

Table 35: Risk-02 Thin-client capability of handling multimedia application ............................................................ 56

Table 36: Risk-03 Continuous development of prototypes .......................................................................................... 56

Table 37: Risk-04 COTS integration issues ................................................................................................................. 57

Table 38: Risk-05 COTS integration and maintenance cost still uncertain ................................................................ 57

Table 39: Risk-06 Single COTS Vendor ...................................................................................................................... 57

Table 40: Risk-07 Lack of COTS vendor support ........................................................................................................ 58

Table 41: Risk-08 Faulty Vendor Claims .................................................................................................................... 58

Table 42: Risk-09 Team member availability .............................................................................................................. 58

Table 43: Risk-10 Budget overruns ............................................................................................................................. 59

Table 44: Risk-11 Schedule conflict between client and evaluators ............................................................................ 59

Table 45: Individual Responsibilities .......................................................................................................... 61





Table of Figures
Figure 1 - Overall Strategy.......................................................................................................................................... 16

Figure 2 - MS Project Plan for Inception Phase ......................................................................................................... 34

Figure 3 - MS Project Plan for Elaboration Phase (1/6) ............................................................................................ 35

Figure 4 - MS Project Plan for Elaboration Phase (2/6) ............................................................................................ 35

Figure 5 - MS Project Plan for Elaboration Phase (3/6) ............................................................................................ 36

Figure 6 - MS Project Plan for Elaboration Phase (4/6) ............................................................................................ 36

Figure 7 - MS Project Plan for Elaboration Phase (5/6) ............................................................................................ 37

Figure 8 - MS Project Plan for Elaboration Phase (6/6) ............................................................................................ 37

Figure 9 - Project Timeline ......................................................................................................................................... 38

Figure 10 - COTS Assessment Process ....................................................................................................... 48

Figure 11 - Worldwide Enterprise Thin Client Shipment by Region, 2003 and 2007 ................................. 49

Figure 12 - Current Thin Client Model with Wyse Winterm 3250 .............................................................................. 53

Figure 13 - Wyse Winterm Thin Client Feature Comparison Chart ........................................................................... 54

Figure 14 - Price Comparison Chart of Thin Client Vendors ..................................................................................... 55

Figure 15 – COCOTS Project Information ................................................................................................................. 63

Figure 16 – COCOTS Product Used ........................................................................................................................... 63

Figure 17 - COCOTS Assessment Effort ..................................................................................................................... 64

Figure 18 – Assessment Effort for Generic Component .............................................................................................. 65

Figure 19 - Assessment Effort for Middleware ............................................................................................................ 66

Figure 20 - Assessment Effort for Network Manager .................................................................................................. 67

Figure 21 - Assessment Effort for Operating System ................................................................................................... 68

Figure 22 - Assessment Effort for Word Processing.................................................................................................... 69

Figure 23 – COCOTS Assessment Result .................................................................................................................... 70





1. Introduction
   This CAP document contains the minimum essentials for the “why/whereas, what/when,
   who/where, how, and how much” aspects of the assessment activity being planned. This
   project involves an analysis of the existing thin client network infrastructure of Pasadena
   High School. The current network infrastructure at the PHS library consists of two main
   servers (TC95, TC96) connected to more than forty thin client terminal units. The system
   was originally designed and deployed by Tangent Computer, which also currently serves as
   the school’s main source of technical support. The librarian is currently experiencing
   extremely long delays (up to 30 minutes per classroom) during the login process between
   the thin client terminals and the authentication server. In addition, the client reports that
   the thin client network can only support up to eight simultaneous users when running
   multimedia-intensive software (Wayang Outpost).

   The proposed system design (by Team 3 of CS577a, Fall 2005), consisting of main servers
   and multiple thin client stations, will provide a low-cost solution that facilitates easy
   deployment and requires only minimal maintenance. This CAP is a living document and is
   updated as and when new risks, opportunities, or changes emerge.

 1.1 Purpose, Scope, and Assumptions
The purpose of this document is to provide the minimum essential set of plans needed to perform
a COTS assessment for the Pasadena High School network. Its scope covers the COTS assessment
aspects of the network and infrastructure of the system.

The following assumptions must continue to hold in order to implement the plans below within
the resources specified:

      •   The thin client terminals must have a solid physical network connection to the application
          server (TC95) and must be able to communicate with it.

      •   The client must be able to provide the assessment team full access to the application
          servers (TC95, TC96) and access to the Wyse thin client terminals.

      •   The client will continue the support contract with Tangent Computer after project
          deployment, and Tangent Computer will serve as the primary Database and System
          Maintainer.

      •   The existing infrastructure and topology (network connections, PC setups, thin client
          setups, server configuration, and server location) must not change drastically during the
          assessment period.

      •   All COTS product purchases will be made by the Client and the Customer (PHS).





 1.2 References
       •   DART – Distributed Assessment of Risks Tool

       http://greenbay.usc.edu:8080/dart/appManager?event=HOMEPAGE

       •   IDC, “Worldwide Enterprise Thin Client 2004-2008 Forecast and Analysis: The Challenge of
           Staying Thin”

       http://www.idc.com/research/viewtoc.jsp?containerId=31945

       •   IDC, “Thin Clients: Selecting the Right Desktop Strategy for Your Organization”

       http://www.wyse.com/resources/whitepapers/PDF/IDC_DesktopStrategy.pdf

       •   InfoWorld, “Think thin”

       http://www.infoworld.com/article/05/07/14/29FEthin_1.html

       •   Mercury LoadRunner, a load performance and simulation test tool for server-based systems

       http://www.mercury.com/us/products/performance-center/loadrunner/

       •   Observer, a network monitoring tool

       http://www.networkinstruments.com/products/observer.html

       •   Red Herring, “To be Wyse is to be thin”

       http://www.redherring.com/Article.aspx?a=10913&hed=To%20be%20Wyse%20is%20to%20be
       %20thin

       •   Renaissance Place, an accelerated reading program

       http://www.renlearn.com/RenaissancePlace/default.htm

       •   Tangent Computer (provides server and technical support to PHS)

       http://www.tangent.com/

       •   Wayang Outpost, a USC ISI project for Flash-based online SAT and geometry learning

       http://kulit.isi.edu/

       •   WebLoad, a performance and simulation test tool for web-based applications

       http://www.radview.com/default.asp




       •   Wyse Thin Client Specification

       http://www.wyse.com/service/discontd/winterm/wint3000/3230.asp

       •   Wyse Thin Client Comparison Chart

       http://www.wyse.com/products/winterm/WYSE_thinclient_comparison.pdf

       •   Ye Yang and Barry Boehm, “Guidelines for Producing COTS Assessment Background,
           Process, and Report Documents,” USC-CSE technical report

       http://greenbay.usc.edu/csci577/fall2005/site/guidelines/CBA-AssessmentIntensive.pdf





 1.3 Change Summary
      •  Version 1.0        Andrew released the initial draft with sections 1 and 3.

      •  Version 1.1        Chris added sections 2 and 4 for the LCO core.

      •  Version 1.2        Chris added the table of contents.

      •  Version 1.3        Chris added the table of tables.

      •  Version 2.0        This version incorporates the comments and suggestions given in the CAP
                            QR report. Andrew revised sections 1 and 3 for better readability. Andrew
                            added COTS tools in 4.1.2 to address network/user load simulation.
                            Chris added section 2.1.1 to supply the missing information. Chris revised
                            section 2.1.2 to show more details in AT-05.

      •  Version 2.1        Andrew added sections 2.2, 2.3, and 5.

      •  Version 2.2        Chris added sections 4.2 and 4.3 and the Table of Figures. Revised section
                            1.2 to reflect the references used in sections 4.2 and 4.3; also revised the
                            Table of Contents and Table of Tables to reflect the change.

      •  Version 2.3        Chris revised section 2.1 by converting all importance ratings into weights
                            to better reflect the criteria importance. Also added some detail to each
                            attribute table.

      •  Version 2.4        Chris revised section 4.2 because it was missing some COTS-related
                            information. Section 1.2 was also updated to reflect the change of
                            information in section 4.2.

      •  Version 2.5        Andrew revised section 4.1.2 to correct the information for one of the
                            software packages.

      •  Version 2.6        Chris revised section 2.1.2 to change the evaluation criteria and weights
                            according to the risks identified. Also revised some of the milestone
                            dates in section 2.2.1.

      •  Version 2.7        Chris revised section 4.2.2 to add the server qualification.

      •  Version 3.0        Andrew revised 2.1, added the Approach Adopted figure, added the timeline
                            in 2.2.1, revised individual roles in section 3, updated figure 3, updated
                            4.1.2, and added 2.3.2 and 2.2.2 to incorporate the comments from the
                            ARB review.




      •  Version 3.1        Chris added section 5.2 to reflect the effort estimation in calendar months;
                            also updated the table of contents, table of tables, and table of figures to
                            reflect all the changes.

      •  Version 3.2        Chris revised sections 2.1.2 and 2.1.3 based on the ARB presentation.
                            Revised section 4.2.1.2 based on the ARB comments. Revised section 4.3 to
                            reflect the current risks, including some of the new risks suggested
                            during the ARB session.

      •  Version 3.3        Chris revised sections 1.2, 2.1.2, 2.1.3, and 4.2.1.1 based on the IV&Ver’s
                            review comments on the LCO Draft. Section 1.2 was reordered
                            alphabetically. In section 2.1.2 the section description was revised to
                            specify that all the evaluation criteria are for the detailed assessment. In
                            section 2.1.3 the maintenance cost was added to the assessment activity.
                            In section 4.2.1.1 the table references in the paragraph were revised to
                            reflect the current table numbers in the document.

      •  Version 3.4        Chris revised sections 2.1.2, 2.1.3, 4.2.2.1, and 4.2.2.2 based on the
                            IV&Ver’s review comments on the LCO Package. In section 2.1.2 the
                            attribute in table 10 was broken down into three sub-attributes. In section
                            2.1.3 more details were added to the assessment activities in tables 15
                            and 17. In sections 4.2.2.1 and 4.2.2.2 the figure references were
                            updated to match the current figure numbers in the document.

      •  Version 4.0        Andrew and Kunal revised sections 2.1.2 and 2.1.3 and added sections
                            2.1.2.1, 2.1.2.2, 2.1.3.1, and 2.1.3.2 to reflect the change of evaluation
                            attributes in CAR sections 4.1.2 (server product) and 4.2.2 (client product)
                            for the LCA Draft release. Also modified the timeline in section 2.2.2
                            figure 3 to reflect the current schedule. Chris revised section 2.3.2 for the
                            deliverable version numbers for the elaboration phase. Also revised
                            section 4 to reflect the current risks.

      •  Version 4.1        Chris revised section 2.1.2.2 for attribute AT-C02 and section 2.1.3.1 to
                            reflect the current assessment activities. Also added the glossary section.

      •  Version 4.2        Andrew revised section 2.2.2 to update the milestones and schedule for the
                            elaboration phase. Also updated the MS Project plan for the elaboration
                            phase. Revised section 5.1 to add detailed assessment activities to the
                            work breakdown.


      •  Version 5.0        Chris updated the section 1.2 references. Revised sections 2.1.2, 2.2.2,
                            4.1.2, and 4.2.1.1 to incorporate the review comments from the QR.

      •  Version 5.1        Chris updated the attribute weight for the Security attribute in sections
                            2.1.2.1 and 2.1.3.1 and changed its identifier from AT-S06 to AT-S07; the
                            Flexibility attribute identifier changed from AT-S07 to AT-S06. Changed
                            the AT-C01 attribute weight in sections 2.1.2.2 and 2.1.3.2.

      •  Version 5.2        Andrew updated the overall strategy diagram in section 2.1 to reflect the
                            current assessment strategy. Added the COTS assessment process
                            diagram in section 4.1.3 to help explain the assessment process.

      •  Version 5.3        Chris added the DART tool reference to section 1.2.

      •  Version 6.0        Andrew revised 2.1.3.1 to add table item numbering, revised the 4.1.2
                            table to fix its formatting, and revised 4.3 Table 40. All changes were
                            made to incorporate the IV&V LCA Draft comments for the LCA Package.

      •  Version 6.1        Chris revised sections 2.1.2.1 and 2.3.2 to incorporate the LCA ARB
                            comments for the LCA Package.

      •  Version 6.2        Chris added section descriptions to sections 2.1.2 and 2.1.3. Also added
                            one more glossary entry based on the IV&Ver’s LCA Draft review
                            comments.






2. Milestones and Products
This section describes the COTS assessment process for the Pasadena High School network
solution and the rationale behind that process. It contains the schedules and milestones indicating
when each requirement goal will be completed.

 2.1 Overall Strategy
This section describes the COTS assessment strategy for our project. The assessment started with
a preliminary network study for PHS, resulting in a list of attributes used as evaluation criteria.
Each attribute is assigned an importance based on the results of the Win-Win negotiation and
client meetings. From these criteria, our objective is to research possible COTS candidates and
incorporate these COTS products into an initial prototype. The prototypes will then be reviewed
for feasibility and presented to the client for feedback. Figure 1 below summarizes the overall
approach of the project.



 [Figure 1 (Approach Adopted) relates the following activities: Review Client’s Requirements,
 Preliminary Network Assessment, Practical Network Assessment, COTS Products Research,
 COTS Assessment Process, Prototype Proposal, Prototype Feasibility Analysis (Identify Risks),
 Present Prototype for Client Feedback, Finalize Prototypes, and Submit Finalized Prototype to
 Client.]

                                      Figure 1 - Overall Strategy





       2.1.1       Selection and Assessment Iterations
The following tables describe the iterations of the top level selection and assessment activities of
the project.


 Activity Identifier   ACT-01
 Activity Name         Initial Filtering
 Description           A list of possible COTS candidates that meet the general capabilities of
                       the project requirements is compiled. The initial COTS candidates are
                       then filtered based on the client’s initial OC&Ps to arrive at a final list
                       of COTS candidates.
 Duration              3 days
 # Iteration           1

                                   Table 1: Activity – Initial Filtering


 Activity Identifier   ACT-02
 Activity Name         Detailed Assessment
 Description           Each COTS candidate from the initial filtering will be evaluated against
                       a set of evaluation criteria agreed on by both the evaluators and the
                       client. Each evaluation criterion is assigned a weight determined by the
                       client during the OC&P discussion, and each COTS candidate will be
                       scored based on the evaluation criteria and weights.
 Duration              1 week
 # Iteration           3

                                Table 2: Activity – Detailed Assessment
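
As a worked illustration of this scoring step, the Python sketch below multiplies each
criterion's raw score by its weight and sums the products; the candidate with the highest
weighted total ranks first. The candidate names, raw scores, and the 0-10 scale are
hypothetical placeholders, not actual assessment data (the weights echo Table 3).

      # Weighted-scoring sketch for the detailed assessment step (hypothetical data).
      weights = {"Performance": 270, "Cost": 150, "Vendor Support": 120}   # cf. Table 3

      candidates = {                      # raw scores on an assumed 0-10 scale
          "Server A": {"Performance": 8, "Cost": 6, "Vendor Support": 7},
          "Server B": {"Performance": 7, "Cost": 9, "Vendor Support": 5},
      }

      def weighted_total(scores):
          """Sum of (criterion weight x raw score) over all evaluation criteria."""
          return sum(weights[name] * score for name, score in scores.items())

      for name in sorted(candidates, key=lambda c: weighted_total(candidates[c]), reverse=True):
          print(name, weighted_total(candidates[name]))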





       2.1.2       Attributes for Evaluation
This section summarizes the evaluation attributes for the COTS assessment process. The
evaluation attributes have been broken down into two categories: server and client. Because the
server and the client play different roles on PHS’s thin client network, each is evaluated against
different criteria.

                       2.1.2.1             Evaluation Attributes for Server COTS Product
The following table lists the high-level COTS attributes to be used as evaluation criteria during
the detailed assessment of the server COTS product:

 Identifier     Name Of Attribute                                       Weight
 AT-S01         Performance                                             270
 AT-S02         Cost                                                    150
 AT-S03         Intercomponent Compatibility                            140
 AT-S04         Interoperability                                        130
 AT-S05         Vendor Support                                          120
 AT-S06         Flexibility                                             100
 AT-S07         Security                                                50

                                Table 3: Server Evaluation Attributes

Each of the attributes is described in detail below:
AT-S01
Performance
The performance attribute covers the following sub-attributes:

 Name of the Sub Attribute                             Weight
 Number of concurrent users                            80
 System response time (multimedia applications)        70
 System login time                                     65
 System response time (regular applications)           35
 Network bandwidth                                     20
Total: 270
                              Table 4: Server Performance Attributes





AT-S02
Cost
The cost attribute covers the following sub-attributes:

 Name of the Sub Attribute                        Weight
 Initial purchase cost                            60
 Upgrade cost                                     50
 Annual maintenance cost                          40
Total: 150
                                   Table 5: Server Cost Attributes
AT-S03
Intercomponent Compatibility
The intercomponent compatibility attribute covers the following sub-attributes:

 Name of the Sub Attribute                        Weight
 Wayang Outpost                                   50
 Renaissance Place                                45
 MS Office Suite                                  30
 Choice                                           15
Total: 140
                       Table 6: Server Intercomponent Compatibility Attributes
AT-S04
Interoperability
The interoperability attribute covers the following sub-attributes:

 Name of the Sub Attribute                        Weight
 Authentication                                   30
 Application processing                           30
 Active Directory service                         30
 Print service                                    20
 User profile/folder                              20
Total: 130
                              Table 7: Server Interoperability Attributes




AT-S05
Vendor Support
The vendor support attribute covers the following sub-attributes:


 Name of the Sub Attribute                         Weight
 Response time for critical problems               40
 Remote assistance                                 30
 Hardware support                                  20
 Software bundling                                 20
 Warranty                                          10
Total: 120

                             Table 8: Server Vendor Support Attributes

AT-S06
Flexibility
The flexibility attribute covers the following sub-attributes:


 Name of the Sub Attribute                         Weight
 Downward compatibility                            40
 Upgradeability                                    60
Total: 100
                                Table 9: Server Flexibility Attributes


AT-S07
Security
The security attribute covers the following sub-attribute:


 Name of the Sub Attribute                         Weight
 User privileges                                   50
Total: 50

                                Table 10: Server Security Attributes
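
Note that each attribute weight in Table 3 is intended to equal the sum of its sub-attribute
weights in Tables 4 through 10 (for Performance, 80 + 70 + 65 + 35 + 20 = 270). The Python
sketch below simply transcribes those tables and can catch transcription errors whenever the
weights are revised; it is a convenience check, not part of the assessment itself.

      # Consistency check: each server attribute weight (Table 3) should equal the
      # sum of its sub-attribute weights (Tables 4-10). Data transcribed from the tables.
      attribute_weights = {
          "AT-S01 Performance": 270, "AT-S02 Cost": 150,
          "AT-S03 Intercomponent Compatibility": 140, "AT-S04 Interoperability": 130,
          "AT-S05 Vendor Support": 120, "AT-S06 Flexibility": 100, "AT-S07 Security": 50,
      }
      sub_attribute_weights = {
          "AT-S01 Performance": [80, 70, 65, 35, 20],
          "AT-S02 Cost": [60, 50, 40],
          "AT-S03 Intercomponent Compatibility": [50, 45, 30, 15],
          "AT-S04 Interoperability": [30, 30, 30, 20, 20],
          "AT-S05 Vendor Support": [40, 30, 20, 20, 10],
          "AT-S06 Flexibility": [40, 60],
          "AT-S07 Security": [50],
      }
      for name, total in attribute_weights.items():
          assert sum(sub_attribute_weights[name]) == total, f"weight mismatch for {name}"
      print("All sub-attribute weights roll up to their attribute weights.")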






                         2.1.2.2            Evaluation Attributes for Client COTS Product

The following table lists the high-level COTS attributes to be used as evaluation criteria during
the detailed assessment of the client COTS products:

 Identifier     Name Of Attribute                                       Weight
 AT-C01         Cost                                                    120
 AT-C02         Performance                                             120
 AT-C03         Vendor Support                                          90
 AT-C04         Flexibility                                             80

                               Table 11: Client Evaluation Attributes

Each of the attributes is described in detail below:
AT-C01
Cost
The cost attribute covers the following sub-attributes:

 Name of the Sub Attribute                        Weight
 Initial purchase cost                            70
 Annual maintenance cost                          50
Total: 120
                                   Table 12: Client Cost Attributes
AT-C02
Performance
The performance attribute covers the following sub-attribute:

 Name of the Sub Attribute                        Weight
 Hardware/ Software specification                 120
Total: 120
                              Table 13: Client Performance Attributes




AT-C03
Vendor Support
The vendor support attribute covers the following sub-attributes:


 Name of the Sub Attribute                         Weight
 Hardware support                                  40
 Warranty                                          50
Total: 90

                             Table 14: Client Vendor Support Attributes

AT-C04
Flexibility
The flexibility attribute covers the following sub-attribute:


 Name of the Sub Attribute                         Weight
 Upgradeability                                    80
Total: 80
                                Table 15: Client Flexibility Attributes






       2.1.3       Assessment Activity
This section summarizes the assessment activities for the COTS assessment process. The
assessment activities have been broken down into two categories: server and client. Because the
server and the client play different roles on PHS’s thin client network, each is evaluated against
different criteria.

                               2.1.3.1 Assessment Activity for Server COTS Product
This section describes the detailed assessment activities to be performed for each of the
evaluation attributes, including any assumptions and constraints that apply.


 Attribute            AT-S01
 Identifier
 Attribute Name       Performance
 Assessment Type      Execution test/ analysis
 Assessment           1. Have the users log in to the system. As each user logs in, record the
 Activity                time it takes and monitor how many users can be logged in at the same
                         time before the system goes down.
                      2. Have the users run different applications. We will monitor and record
                         the system response time for the different types of applications at
                         different levels of concurrent users.
 Constraints &        All users will perform about the same amount of work on the system.
 Assumptions

                      Table 16: Server Assessment Activity for Performance
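
The measurements for this activity can be recorded by hand or exported from a load-testing
tool such as LoadRunner or WebLoad (see Section 1.2). The Python sketch below is a minimal
example, assuming the readings have been saved to a CSV file with hypothetical columns
concurrent_users and login_seconds; it reports the average and worst-case login time at each
concurrency level.

      # Minimal sketch: summarize recorded login times per number of concurrent users.
      # Assumes a CSV log with hypothetical columns: concurrent_users, login_seconds.
      import csv
      from collections import defaultdict

      times = defaultdict(list)
      with open("login_times.csv", newline="") as f:
          for row in csv.DictReader(f):
              times[int(row["concurrent_users"])].append(float(row["login_seconds"]))

      for users in sorted(times):
          samples = times[users]
          print(f"{users:3d} concurrent users: "
                f"avg {sum(samples)/len(samples):.1f}s, max {max(samples):.1f}s")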


 Attribute            AT-S02
 Identifier
 Attribute Name       Cost
 Assessment Type      Supplier inquiry
 Assessment           We will get quotes from each of the vendors for the cost of the server,
 Activity             including estimates of the future upgrade and annual maintenance costs
                      required for each of the solutions.
 Constraints &        We need to provide a solution for the educational environment.
 Assumptions

                          Table 17: Server Assessment Activity for Cost
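
The quotes gathered for AT-S02 can then be compared on a simple multi-year basis: total cost =
initial purchase + expected upgrades + annual maintenance x number of years. The sketch below
only illustrates the arithmetic; the vendor names and dollar figures are hypothetical, not actual
quotes.

      # Hypothetical multi-year cost comparison; figures are placeholders, not quotes.
      YEARS = 5
      quotes = {
          "Vendor A": {"initial": 5000, "upgrade": 1200, "annual_maintenance": 600},
          "Vendor B": {"initial": 4200, "upgrade": 1800, "annual_maintenance": 750},
      }
      for vendor, q in quotes.items():
          total = q["initial"] + q["upgrade"] + q["annual_maintenance"] * YEARS
          print(f"{vendor}: ${total:,} over {YEARS} years")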





Attribute            AT-S03
Identifier
Attribute Name       Intercomponent Compatibility
Assessment Type      Analysis
Assessment           We will run all the required applications on the new system, monitor the
Activity             processes, and record any abnormal activities.
Constraints &        Minimum system specifications/requirements should be supplied by each
                     of the application vendors.
Assumptions

             Table 18: Server Assessment Activity for Intercomponent Compatibility


Attribute            AT-S04
Identifier
Attribute Name       Interoperability
Assessment Type      Supplier inquiry, analysis
Assessment           1. Talk to the supplier about the current implementation of the
Activity                authentication and application server as well as how they handle the
                        directory and print services and user folders.
                     2. Analyze the information exchanged between the server and clients during
                        the authentication process. Analyze and monitor the performance of the
                        directory and print services using the Windows Task Manager. Monitor
                        the user folder utilization level on the server.
Constraints &        All the thin client terminals have to be connected to the server.
Assumptions

                    Table 19: Server Assessment Activity for Interoperability






Attribute          AT-S05
Identifier
Attribute Name     Vendor Support
Assessment Type    Supplier inquiry, reference check
Assessment         The purchase of the new server will include a service plan for remote
Activity           administration. We will need to talk to the vendor about exactly what
                   degree of technical support they can provide and the lifecycle of the
                   product, including how long the vendor will provide support after a model
                   has been phased out. We will also try to find customer satisfaction
                   ratings for a particular vendor if they are available.
Constraints &      The number of vendors able to provide a good solution for an educational
                   environment may be limited.
Assumptions

                  Table 20: Server Assessment Activity for Vendor Support


Attribute          AT-S06
Identifier
Attribute Name     Flexibility
Assessment Type    Supplier inquiry, prototype application
 Assessment           1. We will make a prototype for each of the proposed solutions to reflect
 Activity                the possible upgrades.
                      2. We will get information about the hardware from the COTS vendor as a
                         benchmark for downward compatibility.
 Constraints &        The school can provide continuing funding for future upgrades.
Assumptions

                    Table 21: Server Assessment Activity for Flexibility






Attribute         AT-S07
Identifier
Attribute Name    Security
Assessment Type   Analysis
 Assessment        1. Check if the password policy is enforced on user accounts (a spot-check
 Activity             sketch follows Table 22).
                   2. Check if users are assigned proper rights.
                   3. Check if user information is accessible by other users.
Constraints &     Vendor is responsible for creating the student account database based on
                  the information provided by the school librarian.
Assumptions

                    Table 22: Server Assessment Activity for Security
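
Part of the password-policy check above can be spot-checked from the server's command prompt. The
sketch below is a hypothetical helper rather than a planned assessment instrument: it calls the
built-in Windows "net accounts" command and reports whether a minimum password length is enforced.
It assumes an English-language Windows installation, and the parsing is deliberately loose.

    # Minimal sketch (Python): spot-check whether a minimum password length is enforced
    # by parsing the output of the built-in Windows "net accounts" command.
    # Assumes an English-language Windows server; adjust the matched text otherwise.
    import subprocess

    def minimum_password_length():
        output = subprocess.run(
            ["net", "accounts"], capture_output=True, text=True, check=True
        ).stdout
        for line in output.splitlines():
            if "Minimum password length" in line:
                value = line.split()[-1]  # the configured value is the last token
                return 0 if value.lower() == "none" else int(value)
        return None  # policy line not found in the command output

    if __name__ == "__main__":
        length = minimum_password_length()
        if length:
            print(f"Password rule enforced: minimum length {length}")
        else:
            print("Warning: no minimum password length appears to be enforced")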






                      2.1.3.2             Assessment Activity for Client COTS
                                          Product

This section describes the detailed assessment activities to be performed for each of the
evaluation attributes, including any assumptions and constraints for the activities to be performed.


 Attribute            AT-C01
 Identifier
 Attribute Name       Cost
 Assessment Type      Supplier inquiry
 Assessment           We will get a quote from each of the vendors for the possible cost of the
 Activity             thin-client solution, including an estimate of the initial cost and annual
                      maintenance cost for each of the solutions.
 Constraints &        We need to provide a solution for the educational environment.
 Assumptions

                          Table 23: Client Assessment Activity for Cost


 Attribute            AT-C02
 Identifier
 Attribute Name       Performance
 Assessment Type      Supplier inquiry
                      The performance of thin-clients is measured by their Hardware/Software
                      specification based on:
 Assessment
 Activity                   CPU Performance
                            Memory Capacity
                            Embedded Operating System
                      As the thin client depends primarily on the central server for processing
                      activities, its performance is largely determined by the configuration and
                      performance of the server. Thus we need to inquire with the COTS vendor
                      how the current server configuration will affect the performance of
                      the thin client.
 Constraints &        The COTS vendor must be reachable for product inquiries
 Assumptions

                      Table 24: Client Assessment Activity for Performance






Attribute          AT-C03
Identifier
Attribute Name     Vendor Support
Assessment Type    Supplier inquiry, reference check
Assessment         The purchase of the new thin-clients will include a service plan for
Activity           hardware support. We will need to talk to the vendor about exactly what
                   degree of technical support they can provide and the lifecycle of the
                   product, including how long the vendor will provide support after a model
                   has been phased out. We will also try to find customer satisfaction
                   ratings for a particular vendor if they are available.
Constraints &      The number of vendors able to provide a good solution for an educational
                   environment may be limited.
Assumptions

                  Table 25: Client Assessment Activity for Vendor Support


Attribute          AT-C04
Identifier
Attribute Name     Flexibility
Assessment Type    Supplier inquiry
Assessment         As the thin clients are not easily upgradeable, we need to contact the
Activity           COTS vendor to inquire which components of the thin client are
                   upgradeable.
Constraints &      The COTS vendor must be reachable for product inquiries and the
                   components must be readily available.
Assumptions

                     Table 26: Client Assessment Activity for Flexibility





 2.2 Milestones and Schedules
This section describes the assessment planning, preparation, execution, and analysis activities for
the Pasadena High School Network Study. The planning utilizes Gantt charts made using
Microsoft Project. The items below highlight the major milestones in the project. The MS
Project Plan describes in detail the activities needed to reach the milestones.

       2.2.1       Inception Phase
              Team formation (09/07/05)

               o   A team (Ajithkumar Kattil, Chris Yuan, Kunal Kadakia, Ashwin Kusabhadran,
                   Andrew Ha, and Devesh Thanvi) was formed after careful consideration of each team
                   member’s profile. Members were selected based on their preferred roles, strength,
                   software experience, industry experience, and time availability while taking into
                   consideration all the factors that are critical to the functioning of an efficient team.

              Project Assignment (09/09/05)

                   o The project preferences were submitted to instructional staff.
                    o Project 3, the Pasadena High School Network Study, was assigned to the team;
                        thereafter this team is referred to as Team 3.

              First Meeting with the client (09/12/05)

               o   We had the first meeting with our client’s representative Erin Shaw of ISI who gave
                   us a brief overview of the project. She also provided us with the contact information
                   of Mrs. Jeanine Foote, Pasadena High School Librarian, who is our client.

              Team meeting (09/13/05)

               o   We finalized the roles of the members of the team. We also discussed various issues
                   to be raised with the client in order to get a better perspective on the project and
                   the client’s expectations.

              Meeting with the client at PHS to get an overview of the project. (09/14/05)

               o   The clients discussed the existing system infrastructure, problems, and issues with the
                   current system.
               o   The client also provided us the printouts of the current system specifications.
               o   The process of Easy Win-Win negotiations was also explained to the client.
               o   Team 3 also did some preliminary network evaluations of the current system.

              Team meeting to discuss the Early OCD document (09/16/05)

               o   The team met to discuss contents of the Early OCD.
               o   Further discussions were made on the existing system based on the client interviews
                   and preliminary network assessment at PHS by the team members.





            First session of the Win-Win negotiations (09/18/05)

             o   Two team members (Andrew Ha and Chris Yuan) decided to go to PHS to help the
                 client with the Win-Win Tool. The remaining team members participated in the
                 Win-Win Session at USC.
              o   The first set of win conditions of the stakeholders was obtained through discussions
                  based on the brainstorming data, using the Easy Win-Win Tool.
              o   A hard copy of the win conditions was provided to all the stakeholders for review.

            Contact with the IV & V member (09/20/05)

              o   Two IV & V members, Vincent Chu and Winston Kwong, were assigned to Team 3.

            Completion of the Early OCD (09/20/05)

             o   The early OCD sections were completed and an informal peer review was conducted.

            Peer Review of Early OCD sections (09/20/05)

             o   The Early OCD was reviewed by the team members and the appropriate changes
                 were incorporated by the authors.

            Submission of the Early OCD and corresponding QR (09/21/05)

             o   The early OCD was submitted on the website along with the QR based on the peer
                 review to cs577a staff.

            Second session of the Win-Win negotiations (09/22/05)

              o   Team members Andrew Ha and Chris Yuan went to PHS to help the librarian conduct
                  the second Win-Win session.
             o   A few more modifications were made to the win conditions obtained in the first
                 session.
             o   The stakeholders voted on the win conditions based on the criteria of business
                 importance and ease of implementation and the win conditions were prioritized.

            Team meeting to discuss the Initial Prototype (09/25/05)

             o   The team met to discuss system functionality and Initial Prototypes.
              o   The team planned to release four different prototypes for the client.
             o   A plan for the IV & V reviews was discussed.

            Initial prototype (09/25/05)

             o   A prototype and a system flowchart were developed as per the discussions from the
                 Win-Win negotiations for submission.






            Submission of the prototype (09/26/05)

             o   The prototype report was submitted and posted on the team website.

            Win-Win Report Submission (09/26/05)

             o   The Win-Win Report containing all the agreements derived from the prioritized Win
                 conditions was submitted and posted on the team website.

            Second client meeting to discuss finalized Win-Win report and prototypes (09/27/05)

             o   Team 3 finalized Win-Win report with the client.
             o   Team 3 also discussed and reviewed OCD document with the client.
             o   Team 3 presented the client with initial proposed prototypes and received feedback
                 on client’s prototype preferences.
             o   Team 3 scheduled client for ARB presentation.

            Team meeting for discussion on the CAB & CAP (09/27/05)

             o   A team meeting was conducted to discuss the requirements that have been extracted
                 during the Win-Win negotiations.
             o   Members of Team 3 decided to form sub-teams to work on CAP and CAB.
             o   Discussions were made on the COTS assessment boundaries and environment and
                 the prioritized system capabilities to be evaluated.
              o   The team members also discussed the process of converting the early OCD into the
                  CAB & CAP.
             o   The UML model of the system was discussed and various processes were identified.
             o   The various risks were identified and prioritized and a mitigation plan was
                 formulated.

            Completion of the LCO drafts (09/29/05)

             o   The LCO drafts composed of CAB and CAP documents were completed and were
                 made available on the team website.

             Agile Review Process of the OCD, CAB, CAR, and Rose Model (10/10/05)

              o   The above-mentioned documents were sent for Agile Independent Review.
              o   The documents were also submitted for Agile Internal Review.
              o   Comments and feedback from the reviews were taken into consideration during the
                  document revision process.

            Update of the LCO drafts (10/12/05)

             o   The LCO drafts were updated and were made available on the team website.






             Practical Network Evaluations at PHS (10/12/05)

               o   A team meeting was conducted to discuss possible test tools, test scenarios, and test
                   methods for the PHS network.
               o   Members reviewed the list of software applications at PHS.
               o   Team 3 spent approximately 4.5 hours running various test scenarios.
               o   Team 3 recorded the outcomes in the network analysis results.

             LCO - ARB presentation (10/18/05)

               o   The completed LCO was reviewed by the Architectural Review Board, comprising
                   the clients and the instructional staff.

             Prototype submitted to COTS Expert (10/19/05)

              o Prototype proposed in ARB Meeting was submitted to COTS Expert (Tangent
                Computer) for evaluation.
              o The results from COTS Expert will then be used to refine the prototype model to be
                submitted to the client.

             LCO Package (10/24/05)

              o   A team meeting was held to discuss and review ARB Presentation Results and IV&V
                  Peer Review Results.
               o   The team will incorporate the changes suggested in the ARB Presentation and IV&V
                   Peer Review and will submit the final LCO Package.


      2.2.2       Elaboration Phase
             Prototype Finalized with COTS Expert & Client (10/26/05)

              o The results from COTS Expert and comments from the client will then be used to
                  refine the finalized prototype model.

             Submit Final Specifications to COTS Expert (11/03/05)

              o The final specification will be submitted to COTS expert.
              o The client will then submit a formal request for price quote.

             Customer & Client Submit Purchase Order to COTS Vendor (11/08/05)

               o Based on the price quote given by the COTS Vendor (Tangent Computer), the
                   customer (PHS) will submit a formal Purchase Order (PO) to Tangent Computer.

             Developer Survey (11/15/05)

              o Complete developer survey for IV&V.


            Detailed COTS Assessment Evaluation Process (11/18/05)

             o Categorize COTS products based on evaluation criteria.
             o Generate Test Results table for Test Cases for each COTS product.
             o Team evaluates COTS products and assigns points based on evaluation.

            LCA Draft Completed (11/21/05)

             o   Complete all sections of CAP
             o   Complete all sections of CAB
             o   Complete all sections of CAR
             o   Submit LCA Draft for Agile Internal Review

            LCA ARB Presentation (11/29/05 )

            LCA Package Completed (12/05/05)

              o Submit finalized sections of CAP, CAB, and CAR to IV&V.
              o Incorporate IV&V changes into CAP, CAB, and CAR.
              o Finalize all sections of CAP, CAB, and CAR.

            Tangent Computer Installs Server & Setup Software (12/09/05)

             o Tangent Computer will install Server, thin clients, setup Active Directory, and setup
                 related software for PHS.

            Team 3 Perform Post Installation Analysis (12/12/05)

             o Team 3 will perform post installation analysis on new server to test for system
                 performance.

            Project Completed (12/16/05)

             o   Estimated project completion date.








                      Figure 2 - MS Project Plan for Inception Phase








                   Figure 3 - MS Project Plan for Elaboration Phase (1/6)




                   Figure 4 - MS Project Plan for Elaboration Phase (2/6)







                   Figure 5 - MS Project Plan for Elaboration Phase (3/6)




                   Figure 6 - MS Project Plan for Elaboration Phase (4/6)







                   Figure 7 - MS Project Plan for Elaboration Phase (5/6)




                   Figure 8 - MS Project Plan for Elaboration Phase (6/6)







The following figure summarizes the project timeline.




                                   Figure 9 - Project Timeline





 2.3 Deliverables
This section lists the deliverables at the various stages of the project.

         2.3.1     Inception Phase
The following table shows the list of deliverable items for the inception phase for the Pasadena
High School Network Study project.


  Date       Content        Version        Required         Recipient        Acceptance Criteria
                                            Format
09/21/05    OCD          OCD_2.0         Document        OCD Draft          Lean MBASE
                                                                            Guidelines V1.2
                                         (Printed and
                                         electronic)
09/23/05    EWW          EWW_1.0         Document        Easy Win-Win       Easy Win-Win Report
                                                         Report             Template
                                         (Printed and
                                         electronic)
09/26/05    Prototype    Prototype_1.0 Document          Prototype          Prototype Template
                                         (Printed and
                                         electronic)
10/24/05    CAP          LCO_3.2         Document        LCO package        Guidelines for
                                                                            Developing COTS-
                                         (Printed and
                                                                            Based Applications_v
                                         electronic)
                                                                            1.0
10/24/05    CAB          LCO_5.2         Document        LCO package        Guidelines for
                                                                            Developing COTS-
                                         (Printed and
                                                                            Based Applications_v
                                         electronic)
                                                                            1.0
10/24/05    CAR          LCO_3.2         Document        LCO package        Guidelines for
                                                                            Developing COTS-
                                         (Printed and
                                                                            Based Applications_v
                                         electronic)
                                                                            1.0

                              Table 27: Deliverables for Inception Phase





         2.3.2    Elaboration Phase
The following table shows the list of deliverable items for the elaboration phase for the Pasadena
High School Network Study project.


  Date      Content       Version        Required       Recipient         Acceptance Criteria
                                          Format
12/05/05   CAP          LCA_6.2        Document       LCA package        Guidelines for
                                                                         Developing COTS-
                                       (Printed and
                                                                         Based Applications_v
                                       electronic)
                                                                         1.0
12/05/05   CAB          LCA_7.1        Document       LCA package        Guidelines for
                                                                         Developing COTS-
                                       (Printed and
                                                                         Based Applications_v
                                       electronic)
                                                                         1.0
12/05/05   CAR          LCA_6.5        Document       LCA package        Guidelines for
                                                                         Developing COTS-
                                       (Printed and
                                                                         Based Applications_v
                                       electronic)
                                                                         1.0
12/05/05   Prototype    Prototype_2.0 Document        Prototype Final    Prototype Template
                                       (Printed and
                                       electronic)

                           Table 28: Deliverables for Elaboration Phase





3. Responsibilities
This section identifies key stakeholder roles and summarizes their responsibilities in planning,
preparing, executing, analyzing, or reviewing the COTS assessment. This section also
summarizes the objectives, contents, and stakeholder roles involved in key milestone reviews.


 Stakeholders           Stakeholders’                 When                   Where
                        responsibility
 Evaluators:            Creating, modifying,          Win-Win negotiations   Team meeting and
                        using, and reporting          and client meeting.    client meeting at PHS
 CS577a Team 3
                        evaluation criteria &                                and USC.
 Andrew Ha              results to the client.
 Kunal Kadakia
                        Creating & modifying          Team meetings.         Contact by Email,
 Ajithkumar Kattil      an evaluation plan.                                  Phone, MSN
 Ashwin                                                                      Messenger, and
 Kusabhadran                                                                 meeting.
 Devesh Thanvi          Providing and                 Team meeting and       Team meeting and
 Chris Yuan             Revising Prototypes           client meetings.       client meeting at PHS.
                        model to the client.

                        Consult COTS vendor           In parallel with the   One team member
                        (Tangent Computer)            evaluation process,    (Andrew Ha) will be
                        to evaluate available         provide information    assigned this
                        features of the COTS          for helping decision   responsibility. He will
                        package; analyze and          making in evaluation   report to the team
                        provide a reprioritized       reviews.               leader and other
                        list of features for the                             stakeholders.
                        system under
                        consideration.
 IV&V                   Review the COTS               Each time the          Contact via email.
                        assessment                    evaluators post new
 Vincent Chu
                        documents, and                documents (OCD,
 Winston Kwong          provide evaluators            CAB, CAP, CAR, and
                        with COTS                     Prototypes) on the
                        assessment process            class web site.
                        feedback through peer
                        review.
 Client /               Provide the evaluators Client meetings, ARB          Contact via email,
 Administrator          with the requirements                                phone, and client
                        criteria for the project.                            meeting at PHS.
 Jeanine Foote
 (School Librarian)




                      Provide feedback on         Continuous process
                      proposed prototypes.        after initial prototype
                                                  is proposed.

                      Contact COTS vendor
                      and request formal          After final prototype
                      quotations for COTS         is accepted.
                      product.


Sponsor &             Provide software            Any time as needed.       Contact by phone or
Researcher            support,                                              email.
                                                  Client meeting, ARB.
                      documentation, and
Erin Shaw
                      performance test data
                      for Wayang Outpost.

                      Research on the             In parallel with the
                      performance of the          evaluation process,
                      multimedia                  provide information
                      application hosted by       for helping decision
                      ISI.                        making in evaluation
                                                  reviews.
System Maintainer     Provide technical           Any time as needed        Contact by phone and
                      support for database        (typically prior to the   by email. An excel
Nick Haddad,
                      maintenance by              beginning of the          spreadsheet of the
Technical Support
                      adding & deleting           school year where the     student roster list is
Department, Tangent
                      user accounts from          client wishes to enter    sent by the client to
Computer
                      server (TC95, TC96,         the class roster into     the database
                      TC97)                       the system database).     maintainer.
                      Provide technical
                      support and remote
                      software updates and
                      system maintenance.
                      Provide feedback on         During Prototype
                      COTS prototype              phase of the project.
                      feasibility.
COTS Vendor,          Providing COTS              Any time as needed,       Contact by email and
COTS Expert, and      documentation, price        and regularly.            phone.
Authorized            model, possible future
Representative        release info, and
                      helping to identify
Louise O'Sullivan,
                      difficulty of meeting
Sales Department,
                      the requirements gap.
Tangent Computer




CAP_LCA_F05_T03_V06.20                       42                                   12/04/05
COTS Assessment Process                                                     Version 6.20

Users                  Serve as test subjects      Any time as needed     Contact by phone and
                       for system                  and regularly.         special appointment
Students and Faculty
                       performance and                                    needed to schedule a
members at PHS
                       reliability testing.                               class meeting.
                       Provide feedback for
                       system (COTS)
                       usability and
                       accessibility.
Customer               Provide the resources       Client & Customer      Contact through
                       necessary to fund the       meeting.               client.
Pasadena High
                       project.
School (PHS)

                       Submit formal               After client decides
                       Purchase Order to           on the finalized
                       COTS Vendor.                product to be
                                                   purchased.

                                  Table 29: Responsibilities





4. Approach
This section explains the approach used for evaluating the hardware and software COTS solution
for Pasadena High School.

 4.1 Assessment Framework
This assessment includes several hardware and software COTS solutions. We will evaluate each
COTS solution and give an overall score to see which solution can satisfy most of the system’s
OC&P’s.

       4.1.1       Instruments
The following table provides a description of the instruments used in the assessment activities for
the hardware and software COTS product.


 Assessment Instrument              Description
 Evaluation Criteria and Weights    A spreadsheet file is created from the system’s OC&P’s and
                                    maintained by the evaluation team.
 COTS Product Literature            Product manuals and relevant documentation obtained from
                                    the client and the vendors.
 COTS Hardware Demos                The demos will be conducted by the local representatives of
                                    each vendor.
 COTS Software Demos                The demos will be conducted by the evaluation team to
                                    simulate the user environment.
 Evaluation Data Collection         A spreadsheet file is created from the evaluation criteria and
 Form                               used by the evaluation team to fill in evaluation results after
                                    reviewing each COTS candidate.

                                Table 30: Assessment Instruments





       4.1.2      Facilities
The following table provides a description of the facilities involved in each type of assessment
for the hardware and software COTS product.


 Facilities                  Descriptions
 Hardware                    SERVER (TC97) Proposed
                             Location: Library
                             Specification: Dual Intel Pentium 4 Xeon 3.06 GHz,
                                            4GB RAM, 120GB hard drive, 8MB ATI-RAGE
                                            XL, 10/100/1000 Base T network interface card.
                             SERVER (TC95) Existing
                             Location: Library
                             Specification: Pentium III Xeon 1.0 GHz,
                                            3GB RAM, 80GB hard drive (x2), 4MB ATI
                                            VGA, CD-RW, SVGA 24-bit colors, 1024x768
                                            resolution, 10/100 Base T network interface card.
                             SERVER (TC96) Existing
                             Location: PC Lab
                             Specification: Pentium III 933 MHz,
                                            1GB RAM, 80GB hard drive (x2), 4MB ATI
                                            VGA, CD-RW, SVGA 24-bit colors, 1024x768
                                            resolution, 10/100 Base T network interface card.
                             SERVER (AR) Existing
                             Location: Library
                             Specification: Pentium III 1.0 GHz,
                                            1GB RAM, 80GB hard drive (x2), 4MB ATI
                                            VGA, CD-RW, SVGA 24-bit colors, 1024x768
                                            resolution, 10/100 Base T network interface card.


                             CLIENT (WYSE Winterm V90) Proposed
                                            1GHz x86 CPU coupled to a high resolution 24-bit
                                            video controller. 2 serial, 1 parallel, 2 PS/2, audio,
                                            headphone/speaker out, and 3 USB 2.0 ports.
                                            Windows XP Embedded Device
                             CLIENT (Tangent WebDT166) Proposed



                                            AMD Geode™ GX 533 or AMD Geode™ LX 800
                                            Processor, Display Resolution is Up to 1600 x
                                            1200(GX) or 1920 x 1440 (LX), RAM is 128MB
                                            Windows CE embedded operating system


                          CLIENT (WYSE Winterm 3230LE) Existing
                          Location: Library, PC Lab
                          Specifications:
                                            Thin client runs Embedded Windows CE
                                            2 USB Ports, Audio, Built in IE, 10/100 Base T
                                            network
Software                  Microsoft Windows 2003 Performance Monitoring Tool
                                 The performance monitoring tools supplied with the
                                 Windows Server 2003 family of operating systems allow
                                 administrators to monitor system performance and the
                                 effects of configuration changes on system throughput.
                          Hosted Applications
                                 Applications that the server will host will include:
                                 Microsoft Office Suite (Word, Excel, PowerPoint,
                                 Publisher).
                                 Renaissance Place (Network web based application for
                                 reading assessment).
                                 Wayang Outpost (Flashed based geometry tutorial
                                 program developed by USC ISI).
                                 Choices (Career planning application).
COTS License              Windows 2003 Performance Tool is licensed by Microsoft
                          Corporation and is part of the system distributed by Tangent
                          Computer.
                          Microsoft Office Suite is licensed by Microsoft Corporation.
                          Renaissance Place is licensed by Renaissance Learning.
                          Wayang Outpost is licensed by USC’s ISI.
                          The Choices program is licensed by Brown University.
Procedure                 Use the Windows 2003 Performance Monitoring tool to measure
                          CPU utilization, memory utilization, and network resource
                          utilization in a classroom of 15-25 students (a monitoring
                          sketch follows Table 31).
                                Observe Classroom & Student Activities (15-25 Students)
                                Measure CPU, Network, and Memory Utilization (Using
                                 Windows Task Manager) for the following software:


                                     MS Office Applications (Word, Excel, PowerPoint)
                                     Choices
                                     Algebra Tutorial
                                     Internet Explorer (Static Web Page Surfing)
                                     Internet Explorer (Flash Animation Page Testing)
                                              www.macromedia.com
                                              www.disney.com
                                     Windows Media Player
                                              Streaming USC Lectures
                             Simulate 11 Users Surfing Flash Intensive Web Sites


                          Table 31: Assessment Facilities
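
As an alternative to reading the counters off Windows Task Manager by hand, the utilization
measurements described in the procedure above could also be sampled programmatically. The sketch
below assumes the third-party psutil package is available on the monitoring machine; it is not one
of the instruments listed in Table 31, only an illustration of how the same CPU, memory, and
network counters could be logged to a CSV file for later analysis.

    # Minimal sketch (Python): periodically sample CPU, memory, and network
    # utilization into a CSV file. Requires the third-party "psutil" package,
    # which is an assumption and not part of the instruments listed in Table 31.
    import csv
    import time
    import psutil

    def sample_utilization(out_path="utilization.csv", samples=60, interval_s=5):
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(
                ["time", "cpu_percent", "memory_percent", "net_bytes_sent", "net_bytes_recv"]
            )
            for _ in range(samples):
                cpu = psutil.cpu_percent(interval=interval_s)  # averaged over the interval
                net = psutil.net_io_counters()
                writer.writerow([
                    time.strftime("%H:%M:%S"),
                    cpu,
                    psutil.virtual_memory().percent,
                    net.bytes_sent,
                    net.bytes_recv,
                ])

    if __name__ == "__main__":
        sample_utilization()  # roughly five minutes of samples at the default settings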






       4.1.3       COTS Assessment
The COTS Assessment Process defines the steps which are necessary to evaluate potential COTS
solutions for this project. From the results of the win-win negotiations, a set of high level
attributes was defined. Each of these attributes was then assigned different weights based on the
client’s requirements and win conditions. Next we refine our market trend analysis to identify
the possible COTS solutions that match the desired attributes. Evaluation criteria were
developed and a test plan was written to test each of these COTS products. The test result for
each criterion is scored on a scale from 1 to 10 (with 10 as the best), and the weighted
result scores will be compiled into a results matrix to serve as the final evaluation result for each
COTS product. The COTS products that score the highest marks will then be used to build the
prototype. Figure 10 below shows the detailed COTS assessment process framework.
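
The weighted scoring described above is simple enough to illustrate with a short sketch. The
attribute weights and raw scores below are hypothetical placeholders rather than the actual
evaluation data; the sketch only shows how the 1-to-10 test scores combine with the attribute
weights into the results matrix.

    # Minimal sketch (Python) of the weighted scoring described above.
    # Weights and raw 1-10 scores are hypothetical placeholders, not real evaluation data.

    weights = {"Performance": 100, "Cost": 90, "Vendor Support": 90, "Flexibility": 80}

    candidates = {  # raw test scores on a 1-10 scale (10 = best) for each COTS candidate
        "COTS Product A": {"Performance": 8, "Cost": 6, "Vendor Support": 9, "Flexibility": 7},
        "COTS Product B": {"Performance": 7, "Cost": 9, "Vendor Support": 6, "Flexibility": 8},
    }

    def weighted_total(scores, weights):
        """Sum of (raw score x attribute weight) over all evaluation attributes."""
        return sum(scores[attr] * weight for attr, weight in weights.items())

    results = {name: weighted_total(scores, weights) for name, scores in candidates.items()}
    for name, total in sorted(results.items(), key=lambda item: item[1], reverse=True):
        print(f"{name}: weighted total = {total}")

The candidate with the highest weighted total is the one carried forward into the prototype, as
shown in Figure 10 below.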

        (Flowchart: Define high-level COTS attributes based on Win-Win negotiations ->
        Identify possible COTS solutions -> Set evaluation criteria -> Define test procedures ->
        Perform tests & generate test results -> Evaluate COTS assessment results ->
        Choose COTS product with highest ranking.)

                              Figure 10 - COTS Assessment Process





 4.2 Complementary Activity
This section describes the complementary activities being performed in our project.

       4.2.1      Market Trend Analysis
                 4.2.1.1 Market Share

This section describes the top vendors of the thin client market, as well as a comparison
between the thin client and standalone PC markets. Figure 11 shows the shipment distribution by
region in 2003 and a forecast for 2007 (source: IDC). Table 32 shows the thin client
market share in 2004, and Table 33 shows the PC market share in 2004 for comparison. As we can
see, the total shipment of thin clients is roughly 3% of the total PC shipment in 2003 and 2004,
but the forecast shows that thin clients will increase to nearly 10% of the total market
by 2008, according to IDC’s data.




        Figure 11 - Worldwide Enterprise Thin Client Shipment by Region, 2003 and 2007





                 Vendor Name                            Market Share (%)
                      Wyse                                       37
                     Neoware                                     19
                Hewlett-Packard                                  15
                       Sun                                      7.5
                      Other                                     21.5

                     Table 32: Worldwide Thin Client Market Share in 2004


   Vendor Name                    Shipments (1,000’s)                  Market Share (%)
        Dell                             8,700                                 18.7
  Hewlett-Packard                        7,100                                 15.3
        IBM                              2,300                                 5.0
   Fujitsu-Siemens                       2,100                                 4.5
        Acer                             1,850                                 4.0
       Apple                             1,070                                 2.3

                        Table 33: Worldwide PC Market Share in 2004





                       4.2.1.2 Business Benefit

This section describes the benefits gained by using thin client technology. The benefits are
as follows:
       • Cost - This is the most commonly cited advantage of thin clients. On a per-unit basis,
       they are generally far less expensive than PCs because of their minimal onboard hardware.
       This also means that thin clients have few or no moving parts, which makes them
       inherently less likely to break down. Other savings come from the centralization of
       computing power and the ease of managing devices. Finally, hardware becomes almost
       entirely interchangeable; one thin client can be switched for another, with the end user
       suffering little or no downtime.
       • Manageability and centralization - Thin clients fit a major trend in corporate
       computing: centralized management of computing resources. As they rely on servers to
       store information and run programs, thin clients are always centralized. Rather than
       working machine by machine, the maintainer can manage software upgrades, user
       authorizations, and corporate group policy for all thin client users.
       • Security - One key place in which this centralization strategy plays out is in security.
       Rather than relying on end users to update their security settings, the maintainer can
       maintain all security software, making sure it is consistent and up-to-date across all thin
       client users. Furthermore, most thin clients do not have hard drives. This means that they
       cannot act as a target for viruses or as a means of losing important intellectual property,
       as often happens when notebook computers are stolen. Finally, because thin clients do
       not function without a centralized server, they are less of a target for thieves.
       • Reliability - Thin clients can increase reliability in a number of ways. Since thin clients
       rely on the server rather than an independent operating system, the clients are guaranteed
       to be up and running provided there is no network or server downtime. Generally this is
       a more reliable option.
From the perspective of end users, thin clients can offer:
       • Simplicity - Thin clients liberate users to get their jobs done without having to deal
       with the unproductive overhead of managing a PC, operating system, and applications.
       Meanwhile, both machines and interfaces can be designed to be task-oriented,
       minimizing employee-training needs.
       • Flexibility of access - Users can work from any location in which a connected thin
       client is available. This means that users can log on from any thin client within a network
       and have access to their own applications and information.
       • Upgrades - Thin clients can help free users from ongoing PC upgrade cycles. By
       centralizing software upgrades, companies can give users access to the computing power
       they need.





       4.2.2      Product Line Analysis
This section describes the network structure of the PHS library and explains how the thin client
solution can fit in for use in the system.

                      4.2.2.1     Server Qualification

Figure 12 below shows the current network structure of the PHS library. In order to handle the
multimedia processing for the thin client network, the server should have the following
qualifications:
            Intel Xeon 3.0 GHz CPU or above
            Hyper-threading support
            Multi-processor support
            2GB memory or more
            120GB hard drive or more

                      4.2.2.2     Client Qualification
The thin client terminal model they currently have is outdated and has been discontinued from the
vendor’s product line. Figure 13 shows a comparison chart of Wyse’s current models. Figure 14 shows a
price comparison between the leading thin client vendors. The client should have the following
qualifications (a screening sketch follows the list):
            Local application execution
            Embedded browser
            300 MHz CPU or above
            128MB flash memory or above
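
These qualification lists can be treated as a simple screening checklist when comparing candidate
thin client models. The sketch below is purely illustrative: the minimum figures mirror the list
above, while the candidate entry is hypothetical and does not describe any particular vendor's
product.

    # Minimal sketch (Python): screen a candidate thin client against the minimum
    # qualifications listed above. The candidate figures are hypothetical.

    CLIENT_MINIMUM = {
        "cpu_mhz": 300,
        "flash_mb": 128,
        "embedded_browser": True,
        "local_execution": True,
    }

    def unmet_qualifications(candidate, minimum=CLIENT_MINIMUM):
        """Return the qualification names the candidate fails to meet."""
        failures = []
        for name, required in minimum.items():
            value = candidate.get(name)
            if isinstance(required, bool):
                if not value:
                    failures.append(name)
            elif value is None or value < required:
                failures.append(name)
        return failures

    # Hypothetical candidate roughly in line with current thin client models.
    candidate = {"cpu_mhz": 533, "flash_mb": 128, "embedded_browser": True, "local_execution": True}
    problems = unmet_qualifications(candidate)
    print("Qualifies" if not problems else "Fails: " + ", ".join(problems))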








               Figure 12 - Current Thin Client Model with Wyse Winterm 3250








              Figure 13 - Wyse Winterm Thin Client Feature Comparison Chart








                    Figure 14 - Price Comparison Chart of Thin Client Vendors




                      4.2.2.3      Updates and COTS Support

Regular updates to the components are achieved by step-wise upgrading, adding or
replacing the old thin client terminals on an annual basis. In the proposed solution, the PHS library
computer lab will be covered by a service contract plan from Tangent, the school’s thin client
provider. The vendor will provide remote administration for as long as the service contract plan
is renewed annually. As a long-term goal of the project, the service plan will no longer
be needed once the PHS library is fully upgraded and only minimal staffing is required
for the regular maintenance of the library.

The life cycle of a thin client terminal (5-8 years) is usually much longer than that of a standalone
PC (3-4 years). Since the thin client is based on SBC (Server Based Computing), the thin client
itself requires only a minimal amount of local processing while the server handles the
rest. Therefore, when new technology arrives, it is usually only necessary to upgrade the server
rather than the individual thin clients, which makes the thin client a viable long-term solution.





 4.3 Risk Management
This section describes the top-n risks of our project.


 Risk Identifier      RK-01
 Risk Name            Single point of failure of the proposed system prototype
 Description          The proposed prototype illustrates a possible single point of failure, in
                      which a failure in the Active Directory server will disconnect the entire
                      thin client network.
 Mitigation Plan      Mirror the AD server to a secondary server that serves as a backup in case
                      the primary server fails, to avoid any client downtime.

            Table 34: Risk-01 Single point of failure of the proposed system prototype


 Risk Identifier      RK-02
 Risk Name            Thin-client capability of handling multimedia application
 Description          The study of thin-client technology shows that thin clients generally have
                      poor multimedia capability, while the client has a strong preference for
                      the thin-client solution.
 Mitigation Plan      Limit the total number of multimedia applications to run on the system.
                      Request ISI to reduce the multimedia content (e.g., continuous loops) in
                      Wayang Outpost. Also suggest a high-end thin client model to improve
                      multimedia support.

            Table 35: Risk-02 Thin-client capability of handling multimedia application


 Risk Identifier      RK-03
 Risk Name            Continuous development of prototypes
 Description          Our life cycle model shows a continuous loop for developing prototypes,
                      which means development will never end if the client is not satisfied with
                      the proposed prototype.
 Mitigation Plan      Limit the total number of prototypes to be developed throughout the life
                      cycle of the project to avoid an infinite loop.

                     Table 36: Risk-03 Continuous development of prototypes






Risk Identifier       RK-04
Risk Name             COTS integration issues
Description           The PHS project requires a high degree of integration between different COTS
                      products, e.g., the existing legacy Wyse thin client terminals and the
                      Citrix thin client solution. Since the existing thin client network (Wyse) is a
                      hardware-based solution, integrating the proposed Citrix software solution
                      can be a major risk in terms of hardware and software incompatibility
                      issues. Interoperability can be limited due to the proprietary
                      interfaces of individual COTS vendors.
Mitigation Plan       We need to contact the COTS vendors about the interoperability between
                      different COTS products. We should also request demos from the vendors
                      to run the different COTS applications together.

                            Table 37: Risk-04 COTS integration issues


Risk Identifier       RK-05
Risk Name             COTS integration and maintenance cost still uncertain
Description           There is a risk involved in figuring out the exact cost of integration (with
                      the legacy systems) and maintenance of COTS.
Mitigation Plan       Try to reach the chosen COTS vendors and finalize the commercial
                      aspects required for each prototype. This will help us in doing
                      the business case analysis for each COTS prototype. Propose different
                      prototypes to the client and act as a consultant for this project.

              Table 38: Risk-05 COTS integration and maintenance cost still uncertain


Risk Identifier       RK-06
Risk Name             Single COTS Vendor
Description           One of the client's win conditions is to stay with the existing COTS
                      provider, Tangent Computer. In this case we will have a very limited
                      number of choices for the COTS solutions we can propose.
Mitigation Plan       We will contact the COTS vendor, give them our project constraints,
                      and ask the COTS vendor to suggest the possible solutions
                      they can provide.

                              Table 39: Risk-06 Single COTS Vendor







Risk Identifier   RK-07
Risk Name         Lack of COTS vendor support
Description       During the COTS prototype evaluation period, we tried contacting the COTS
                  vendor for support and feedback on our proposed prototypes. During the
                  evaluation process we did not receive feedback from the COTS vendor.
Mitigation Plan   We need to develop a closer relationship with the vendors throughout the
                  duration of the project. We also need to make sure to find the right
                  window of time for contact.
                  We have decided to show some "carrot" to the COTS Vendor in the form
                  of future business to entice them to give us more support.

                     Table 40: Risk-07 Lack of COTS vendor support


Risk Identifier   RK-08
Risk Name         Faulty Vendor Claims
Description       Much of the assessment was done by reviewing the specifications provided by
                  the vendors, without hands-on testing. This can result in poor performance
                  or incompatibility issues between the components during the transition and
                  deployment process.
Mitigation Plan   Develop different prototypes and review demonstrations of future upgrades
                  to gain hands-on experience.

                           Table 41: Risk-08 Faulty Vendor Claims


Risk Identifier   RK-09
Risk Name         Team member availability
Description       Since most team members are taking one or two courses in addition to CS577,
                  some team members may not be available for some project activities.
Mitigation Plan   Do project planning and negotiate with team members so that time is
                  allocated to critical project activities first.

                          Table 42: Risk-09 Team member availability






Risk Identifier        RK-10
Risk Name              Budget overruns
Description            PHS has a tight budget for its computer systems, which can limit the
                       number of COTS solutions we can propose.
Mitigation Plan        Filter out the COTS products with high license and maintenance costs.

                                 Table 43: Risk-10 Budget overruns


Risk Identifier        RK-11
Risk Name              Schedule conflict between client and evaluators
Description            Since our client (the PHS librarian) is located off-campus, it is hard
                       for us to schedule meetings with the client.
Mitigation Plan        Prepare a project plan and invite the client to meetings well ahead of
                       time to avoid conflicts.

                  Table 44: Risk-11 Schedule conflict between client and evaluators





5. Resources
This section describes the estimated amount of effort, cost, and calendar time that will be
required to plan, prepare, execute, analyze, document, review, and refine the assessments
indicated in the previous sections.

 5.1 Work Breakdown
This section describes the breakdown of individual and team responsibilities.
   I.      Project Management
           a. Inception Phase Management
                   i. Top-level Evaluation Plan (LCO version of CAP).
                  ii. Inception phase project control and status assessments.
                  iii. Inception phase stakeholder coordination.
                  iv. Elaboration phase commitment package and review (LCO package
                      preparation and ARB review).
           b. Elaboration phase management
                   i. Update CAP with detailed evaluation plan (LCA version of CAP).
                  ii. Elaboration phase project control and status assessments.
                  iii. Elaboration phase stakeholder coordination.
                  iv. COTS Detail Evaluation phase commitment package and review (LCA
                      package preparation and ARB review).
   II.     Environment and Configuration Management (CM)
           a. Inception phase environment/CM scoping and initialization.
           b. Elaboration phase environment/CM
                   i. Development environment installation and administration
                  ii. Elaboration phase CM
                  iii. Development environment integration and custom components
   III.    Requirements
           a. Inception phase requirements development
                   i. COTS Assessment Background and business modeling (LCO version of
                      CAB)
                  ii. Initial stakeholder requirements negotiation
           b. Elaboration phase requirements baselining
                   i. CAB elaboration and baselining (LCA version of CAB)
   IV.     Implementation

             a. Inception phase prototyping
    V.       Assessment
             a. Inception phase assessment
                       i. Initial assessment plan (LCO version; part of CAP)
                       ii. Initial COTS Assessment Report (LCO version of CAR)
                      iii. Business case analysis (part of CAR)
             b. Elaboration phase assessment
                       i. Elaboration of assessment plan (LCA version; part of CAP)
                       ii. Elaboration of assessment report (LCA version of CAR)
                                1. Detailed COTS Assessment Evaluation Process (CAR)
                                      a. Categorize COTS products based on evaluation criteria
                                         (CAR)
                                      b. Develop test cases based on evaluation criteria (CAR)
                                      c. Formulate test results based on test cases (CAR)
                                      d. Develop test results matrix (CAR)
                                2. Generate Business Case Analysis (CAR)
                                3. Propose recommendations for COTS products based on the
                                   results and analysis (CAR)
The following table lists the individual and team responsibilities for the PHS Network Study.


Work Breakdown Items                                           Personnel Responsible
I.a.i, I.a.ii, I.a.iii, V.b.ii.2                               Ajithkumar Kattil
I.a.iv, I.b.iv, II.a, II.b, III.a.ii, V.b.ii.1.d, V.b.ii.3     Chris Yuan
I.b.i, V.a.i, V.b.i, V.b.ii.1.a, V.b.ii.1.b                    Kunal Kadakia
III.a.i, III.b                                                 Ashwin Kusabhadran
IV, I.b.ii, I.b.iii, V.b.ii.1.a, V.b.ii.1.b, V.b.ii.1.c        Andrew Ha
V.a.ii, V.a.iii, V.b.ii                                        Devesh Thanvi

                                 Table 45: Individual Responsibilities





 5.2 Effort Estimation
This section summarizes the effort required to assess the COTS hardware and software used in
the Pasadena High School network infrastructure. We use the COCOTS tool provided to us to
perform this estimation.
Assumptions for the estimation:
1. There are no tailoring or glue-code activities required for our project.
2. A generic component is used to describe the hardware components, such as the thin client
   terminals, that are not listed as available COTS products in the COTS assessment tab of the
   COCOTS Excel sheet.
3. All major assessment activities are charged to the generic component, since it is the major
   part of the project.
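
To make assumption 3 concrete, the sketch below shows one way assessment effort can be tallied
in the spirit of the COCOTS assessment sub-model: a quick filtering pass over all candidate
products plus a detailed assessment of the short-listed candidates against evaluation
attributes. This is only an illustration written in Python; the attribute names, candidate
counts, and per-attribute hours are hypothetical placeholders, not values from our COCOTS
spreadsheet.

    # Illustrative only: a simplified tally of COTS assessment effort in the
    # spirit of the COCOTS assessment sub-model (initial filtering effort plus
    # detailed assessment effort). All names and numbers are placeholders.

    FILTERING_HOURS_PER_CANDIDATE = 2.0   # assumed quick-filtering effort

    # Hypothetical detailed-assessment effort (person-hours) per attribute,
    # applied to each short-listed candidate.
    DETAILED_HOURS_PER_ATTRIBUTE = {
        "functionality": 6.0,
        "interoperability": 8.0,
        "vendor support": 3.0,
        "cost": 2.0,
    }

    def assessment_effort(num_candidates: int, num_short_listed: int) -> float:
        """Return the total assessment effort in person-hours."""
        filtering = num_candidates * FILTERING_HOURS_PER_CANDIDATE
        detailed = num_short_listed * sum(DETAILED_HOURS_PER_ATTRIBUTE.values())
        return filtering + detailed

    if __name__ == "__main__":
        # e.g., six candidate thin client solutions, two short-listed for detail
        total = assessment_effort(num_candidates=6, num_short_listed=2)
        print("Estimated assessment effort: %.1f person-hours" % total)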




                          Figure 15 – COCOTS Project Information

                             Figure 16 – COCOTS Product Used

                           Figure 17 – COCOTS Assessment Effort

                    Figure 18 – Assessment Effort for Generic Component

                       Figure 19 – Assessment Effort for Middleware

                     Figure 20 – Assessment Effort for Network Manager

                     Figure 21 – Assessment Effort for Operating System

                      Figure 22 – Assessment Effort for Word Processing

                           Figure 23 – COCOTS Assessment Result
The final TDEV comes out to 5.19 months, which exceeds the project constraint of the 12-week
schedule described in CAB section 2.4. However, since the nature of our project is more of a
hardware-based COTS assessment, the COCOTS tool does not provide a complete set of components
that fit our project. Therefore the estimated effort may not reflect the actual calendar months
required to complete the project.
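
For readers who want to see how an effort figure turns into calendar time, the sketch below
applies the published COCOMO II.2000 schedule equation, TDEV = C x PM^(D + 0.2 x 0.01 x sum(SF)),
with C = 3.67 and D = 0.28. This is only an approximation of what the COCOTS tool computes
internally; the person-month and scale-factor inputs shown are illustrative placeholders, not
figures taken from our spreadsheet.

    # Illustrative only: converting estimated effort (person-months) into a
    # calendar schedule (TDEV, in months) using the COCOMO II.2000 schedule
    # equation. The COCOTS tool's internal computation may differ, and the
    # inputs below are placeholders rather than our actual project numbers.

    C = 3.67   # COCOMO II.2000 schedule-equation multiplicative constant
    D = 0.28   # COCOMO II.2000 schedule-equation base exponent

    def tdev(person_months: float, scale_factor_sum: float) -> float:
        """TDEV = C * PM ** (D + 0.2 * 0.01 * sum of scale factors)."""
        exponent = D + 0.2 * 0.01 * scale_factor_sum
        return C * person_months ** exponent

    if __name__ == "__main__":
        # Hypothetical inputs: 3 person-months, near-nominal scale factors
        months = tdev(person_months=3.0, scale_factor_sum=19.0)
        print("Estimated schedule: %.2f calendar months" % months)

With these placeholder inputs the equation yields roughly five calendar months, the same order
of magnitude as the TDEV reported above, but the two should not be read as the same calculation.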







Glossary
      1. Active Directory Services:
         Active Directory (codename Cascade) is an implementation of LDAP directory
         services by Microsoft for use in Windows environments. Active Directory allows
         administrators to assign enterprise-wide policies, deploy programs to many
         computers, and apply critical updates to an entire organization. An Active Directory
         stores information and settings relating to an organization in a central, organized,
         accessible database. Active Directory networks can vary from a small installation
         with a few hundred objects, to a large installation with millions of objects.
      2. Black-box Testing
         A software testing technique whereby the internal workings of the item being
         tested are not known to the tester (see the illustrative sketch after this glossary).
      3. Citrix MetaFrame:
         Citrix Presentation Server (formerly Citrix MetaFrame) is a remote access/application
         publishing product built on the Independent Computing Architecture (ICA), Citrix
         Systems' thin client protocol. The Microsoft Remote Desktop Protocol, part of
         Microsoft's Terminal Services, is based on Citrix technology and was licensed from
         Citrix in 1997. Unlike traditional frame-buffered protocols such as VNC, ICA transmits
         high-level window display information, much like the X11 protocol, rather than
         purely graphical information.
      4. COTS (Commercial Off The Shelf)
         COTS software is defined as a software system that has been built as a composition
         of many other COTS software components (Vigder, 1998). Here the developer of the
         software acts as the integrator, who purchases the components from third-party
         vendors and assembles them to build the final product.
      5. COCOTS
         COCOTS is a cost estimation tool designed to capture explicitly the most important
         costs associated with COTS component integration. COCOTS is actually an amalgam
         of four related sub-models, each addressing individually what the authors have
         identified as the four primary sources of COTS software integration costs.
      6. Downward Compatibility
         A product is said to be downward (backward) compatible when it is able to take the
         place of an older product by interoperating with other products that were designed
         for the older product.
      7. Gantt Chart:
         A Gantt chart is a popular type of bar chart that shows the timing of tasks or
         activities as they occur over time. Although the Gantt chart did not initially
         indicate the relationships between activities, showing both the timing of tasks and
         the interdependencies between them has become common in current usage.
      8. ISI (Information Sciences Institute):
         Part of the University of Southern California (USC), ISI is involved in a broad
         spectrum of information processing research and in the development of advanced
         computer and communication technologies.
      9. PHS: Pasadena High School
      10. ROI (Return on Investment)
         A measure of a corporation's profitability, equal to a fiscal year's income divided
         by common stock and preferred stock equity plus long-term debt. ROI measures how
         effectively the firm uses its capital to generate profit; the higher the ROI, the
         better.
      11. SMP:
         Symmetric Multiprocessing, or SMP, is a multiprocessor computer architecture
         where two or more identical processors are connected to a single shared main
         memory. Most common multiprocessor systems today use SMP architecture.
         SMP systems allow any processor to work on any task no matter where the data
         for that task is located in memory; with proper operating system support, SMP
         systems can easily move tasks between processors to balance the work load
         efficiently.
      12. TC-95 : Application server at PHS maintained by Tangent Computer
      13. TC-96 : Authentication server at PHS maintained by Tangent Computer
      14. Thin client:
         A thin client is a computer (client) in a client-server network that has little or
         no application logic, so it depends primarily on the central server for processing
         activities. The word "thin" refers to the small boot image which such clients
         typically require - perhaps no more than what is needed to connect to a network and
         start up a dedicated web browser.
      15. White-box Testing
         A software testing technique whereby explicit knowledge of the internal workings of
         the item being tested is used to select the test data (see the illustrative sketch
         after this glossary).
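
To illustrate the black-box and white-box testing entries above (items 2 and 15), the sketch
below applies both styles to a hypothetical helper, meets_budget, of the kind that might appear
in a COTS filtering script. The function, its default threshold, and the test values are
invented for illustration and are not part of any project deliverable.

    # Illustrative only: the function under test and both test classes are
    # hypothetical examples of the black-box and white-box glossary terms.
    import unittest

    def meets_budget(license_cost: float, maintenance_cost: float,
                     budget: float = 10000.0) -> bool:
        """Return True if the total cost of a COTS candidate fits the budget."""
        return (license_cost + maintenance_cost) <= budget

    class BlackBoxTests(unittest.TestCase):
        """Black-box: exercise inputs and outputs only, ignoring internals."""

        def test_within_budget(self):
            self.assertTrue(meets_budget(6000.0, 3000.0))

        def test_over_budget(self):
            self.assertFalse(meets_budget(9000.0, 2000.0))

    class WhiteBoxTests(unittest.TestCase):
        """White-box: test data chosen from knowledge of the internal logic,
        here the boundary where the summed cost equals the budget exactly."""

        def test_boundary_is_inclusive(self):
            self.assertTrue(meets_budget(7000.0, 3000.0))  # 10000 <= 10000

    if __name__ == "__main__":
        unittest.main()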



