					COTS Assessment Report                                  Version 6.50


           COTS Assessment Report (CAR)
        Pasadena High School Computer Network Study



                               Team #3


        Ajithkumar Kattil           (Project Manager)
        Chris Yuan                  (Tester & Test Reviewer)
        Kunal Kadakia               (Business Process Analyst)
        Ashwin Kusabhadran          (Test Designer)
        Andrew Ha                   (Tester & Prototype)
        Devesh Thanvi               (Requirements Analyst)
        Vincent Chu                 (IV&V)
        Winston Kwong               (IV&V)
        Mrs. Jeanine Foote          (Client)
        Erin Shaw                   (Sponsor & Researcher)
        Pasadena High School        (Customer)




CAR_LCA_F05a_T03_V06.50         I                          12/05/05


Version History

Date       Author               Version  Changes made                                      Rationale
10/10/05   Andrew Ha            1.0      Initial draft release                             Initial draft release of Sections 1 and 2
10/10/05   Chris Yuan           1.1      Added Section 3.2                                 Added Section 3.2
10/10/05   Kunal Kadakia        1.2      Added Section 3.1                                 Added Section 3.1
10/10/05   Chris Yuan           1.3      Added Table of Contents and Table of              Added TOC and revised references
                                          Tables; revised references
10/11/05   Andrew Ha            2.0      Revised Sections 1 and 2                          Incorporated the changes noted in the CAR QR
10/11/05   Kunal Kadakia        2.1      Revised Section 3.1                               Incorporated the changes noted in the CAR QR
10/11/05   Chris Yuan           2.2      Revised Section 3.2.1 and the Table of            Incorporated the changes noted in the CAR QR
                                          Contents
10/19/05   Andrew Ha            3.0      Added Section 3.2.2 (Approach Adopted             Additions from the ARB Review
                                          diagram); added Sections 3.2.3 and 3.2.4
10/22/05   Chris Yuan           3.1      Added TOF; revised TOC and TOT                    Added TOF; revised TOC and TOT
10/23/05   Kunal Kadakia        3.2      Reviewed Sections 1.0, 2.0, 3.1, and 3.2.1        Maintained consistency with the CAB and CAP,
                                                                                            as per the ARB Review
11/17/05   Chris Yuan           4.0      Added Section 4                                   Added Section 4 for the LCA Draft
11/17/05   Ajithkumar Kattil    4.1      Added detail for Section 4.3.3; added             Added detail for Section 4.3.3; added
                                          Section 4.3.4                                     Section 4.3.4
11/17/05   Andrew Ha            4.2      Added detail for Section 4.3.3                    Added detail for Section 4.3.3
11/17/05   Ashwin Kusabhadran   4.3      Added detail for Section 4.3.3                    Added detail for Section 4.3.3
11/18/05   Chris Yuan           4.4      Added Section 4.3.2                               Added Section 4.3.2 for test preparation
11/20/05   Chris Yuan           4.5      Divided Section 4 into server and client          Added Section 5 for the LCA Draft
                                          parts; added Sections 4.1.1-4.1.5 and
                                          4.2.1-4.2.5; added Section 5 with
                                          Sections 5.1 and 5.2
11/21/05   Chris Yuan           4.6      Revised Section 3.2.1; added Glossary             Revised Section 3.2.1; added Glossary
                                          section; updated References section               section; updated References section




11/21/05   Kunal Kadakia        5.0      Revised Sections 3.1, 3.2.1, 4.1.1,               Revised Sections 3.1, 3.2.1, 4.1.1, 4.1.3.3,
           Ashwin Kusabhadran             4.1.3.3, 4.1.4, 4.1.5, 4.2.3, and                 4.1.4, 4.1.5, 4.2.3, and 4.2.3.4.1
           Ajithkumar Kattil              4.2.3.4.1
           Devesh Thanvi
11/27/05   Chris Yuan           5.1      Revised Sections 3.2.1, 4.1.2, 4.1.3.3,           Revised Sections 3.2.1 and 4.1.X for the
                                          4.1.3.4, 4.1.3.5, 4.1.4, 4.2.2,                   change of weight in the Security attribute;
                                          4.2.3.3.2, 4.2.3.4, and 4.2.4                     revised Sections 3.2.1 and 4.2.X for the
                                                                                            change of sub-attribute and weight in the
                                                                                            Cost attribute; added result matrix charts
                                                                                            in Sections 4.1.4 and 4.2.4
11/27/05   Andrew Ha            5.2      Revised Sections 3.2.2 and 3.2.4                  Revised the overall strategy diagram and
                                                                                            added a COTS assessment process detail
                                                                                            diagram in Section 3.2.2; revised the
                                                                                            prototype diagrams in Section 3.2.4
11/27/05   Chris Yuan           5.3      Revised Section 3.1                               Revised Section 3.1
11/27/05   Chris Yuan           5.4      Revised Sections 4.2.3.4 and 4.2.4                Revised Sections 4.2.3.4 and 4.2.4
11/27/05   Ajithkumar Kattil    5.5      Revised Sections 4.2.5.1 and 4.2.5.2              Revised Sections 4.2.5.1 and 4.2.5.2
11/28/05   Devesh Thanvi        5.6      Revised Section 4.2                               Revised Section 4.2
12/04/05   Chris Yuan           6.0      Revised Sections 3.1, 4.1.2, 4.1.3.3.2,           Revised all sections based on LCA ARB
                                          4.1.3.4, 4.1.4, 4.2.3.3, 4.2.3.4,                 comments
                                          and 4.2.4
12/04/05   Ajithkumar Kattil    6.1      Revised Sections 4.1.5 and 4.2.5                  Revised all sections based on LCA ARB
                                                                                            comments



12/04/05   Andrew Ha            6.2      Added Sections 3.2.3.1 and 3.2.3.2                Added Sections 3.2.3.1 and 3.2.3.2 to
                                                                                            clarify the tests performed
12/04/05   Chris Yuan           6.3      Revised Sections 4.1.3.3.2 and 4.2.3.3.2          Revised Sections 4.X.3.3.2 to add test
                                                                                            dependencies
12/04/05   Andrew Ha            6.4      Revised Sections 3.2.3, 3.2.4, 4, and             Revised Sections 3.2.3, 3.2.4, 4, and
           Devesh Thanvi                  4.1.3.3.2                                         4.1.3.3.2 based on IV&V’s comments
12/05/05   Ajithkumar Kattil    6.5      Revised Sections 4.1.5 and 4.2.5                  Revised Sections 4.1.5 and 4.2.5 based on
           Ashwin Kusabhadran                                                               IV&V’s comments for the LCA Draft






Table of Contents
Version History ...........................................................................................................................................................II

Table of Contents ........................................................................................................................................................ V

Table of Tables ......................................................................................................................................................... VII

Table of Figures ........................................................................................................................................................ XI

Preface ......................................................................................................................................................................... 12

1.       Executive Summary ........................................................................................................................................... 17

2.       Purpose, Scope, and Assumptions...................................................................................................................... 18

3.       Assessment Approach ........................................................................................................................................ 19

         3.1      System Objectives and Context ................................................................................................................ 19

         3.2      Assessment Objectives and Approach ...................................................................................................... 21

                  3.2.1         Assessment Objectives ............................................................................................................... 21

                  3.2.2         Approach .................................................................................................................................... 26

                  3.2.3         Network Assessment Summary.................................................................................................. 28

                  3.2.4         Prototypes Proposed ................................................................................................................... 30

4.       Assessment Result.............................................................................................................................................. 36

         4.1      Assessment Results-Part 1 [Server] .......................................................................................................... 36

                  4.1.1         COTS Assessed .......................................................................................................................... 36

                  4.1.2         Evaluation Criteria ..................................................................................................................... 37

                  4.1.3         Test Procedure ............................................................................................................................ 40

                  4.1.4         Evaluation Results Screen Matrix ............................................................................................ 103

                  4.1.5         Business Case Analysis ............................................................................................................ 107

         4.2      Assessment Results-Part 2 [Client] ........................................................................................................ 112

                  4.2.1         COTS Assessed ........................................................................................................................ 112

                  4.2.2         Evaluation Criteria ................................................................................................................... 113

                  4.2.3         Test Procedure .......................................................................................................................... 115





                  4.2.4        Evaluation Results Screen Matrix ............................................................................................ 137

                  4.2.5        Business Case Analysis ............................................................................................................ 140

5.       Conclusion and Recommendation .................................................................................................................... 150

         5.1      Conclusion and Recommendation Part 1 [Server] .................................................................................. 150

         5.2      Conclusion and Recommendation Part 2 [Client] .................................................................................. 152

Glossary .................................................................................................................................................................... 153






Table of Tables
  Table 1 High-level Server Assessment Attributes ................................................................................................... 22

  Table 2 Server Assessment Activities ...................................................................................................................... 23

  Table 3 High-level Client Assessment Attributes .................................................................................................... 24

  Table 4 Client Assessment Activities ....................................................................................................................... 25

  Table 5 Server COTS Assessed ............................................................................................................................... 37

  Table 6 Server Evaluation Criteria ......................................................................................................................... 37

  Table 7 Server Performance Attributes .................................................................................................................. 37

  Table 8 Server Cost Attributes ................................................................................................................................ 38

  Table 9 Server Intercomponent Compatibility Attributes ....................................................................................... 38

  Table 10 Server Interoperability Attributes ............................................................................................................ 38

  Table 11 Server Vendor Support Attributes ............................................................................................................ 38

  Table 12 Server Flexibility Attributes ..................................................................................................................... 39

  Table 13 Server Security Attributes ........................................................................................................................ 39

  Table 14 Hardware Preparation for Server COTS Product ................................................................................... 40

  Table 15 Software Preparation for Server COTS Product ..................................................................................... 41

  Table 16 Other Preparation for Server COTS Product .......................................................................................... 41

  Table 17 Server Flexibility Attribute Result Rationale ........................................................................................... 43

  Table 18 Server Test Procedure Specification 1-1.................................................................................................. 44

  Table 19 Server Test Procedure Specification 1-2.................................................................................................. 45

  Table 20 Server Test Procedure Specification 1-3.................................................................................................. 46

  Table 21 Server Test Procedure Specification 1-4.................................................................................................. 48

  Table 22 Server Test Procedure Specification 1-5.................................................................................................. 50

  Table 23 Server Test Procedure Specification 2-1.................................................................................................. 52

  Table 24 Server Test Procedure Specification 2-2.................................................................................................. 54

  Table 25 Server Test Procedure Specification 2-3.................................................................................................. 56

  Table 26 Server Test Procedure Specification 3-1.................................................................................................. 58




  Table 27 Server Test Procedure Specification 3-2.................................................................................................. 59

  Table 28 Server Test Procedure Specification 3-3.................................................................................................. 61

  Table 29 Server Test Procedure Specification 3-4.................................................................................................. 62

  Table 30 Server Test Procedure Specification 4-1.................................................................................................. 63

  Table 31 Server Test Procedure Specification 4-2.................................................................................................. 64

  Table 32 Server Test Procedure Specification 4-3.................................................................................................. 66

  Table 33 Server Test Procedure Specification 4-4.................................................................................................. 68

  Table 34 Server Test Procedure Specification 4-5.................................................................................................. 70

  Table 35 Server Test Procedure Specification 5-1.................................................................................................. 73

  Table 36 Server Test Procedure Specification 5-2.................................................................................................. 75

  Table 37 Server Test Procedure Specification 5-3.................................................................................................. 78

  Table 38 Server Test Procedure Specification 5-4.................................................................................................. 80

  Table 39 Server Test Procedure Specification 5-5.................................................................................................. 82

  Table 40 Server Test Procedure Specification 7-1.................................................................................................. 84

  Table 41 Server Test Result 1-1 .............................................................................................................................. 85

  Table 42 Server Test Result 1-2 .............................................................................................................................. 86

  Table 43 Server Test Result 1-3 .............................................................................................................................. 87

  Table 44 Server Test Result 1-4 .............................................................................................................................. 88

  Table 45 Server Test Result 1-5 .............................................................................................................................. 89

  Table 46 Server Test Result 2-1 .............................................................................................................................. 90

  Table 47 Server Test Result 2-2 .............................................................................................................................. 91

  Table 48 Server Test Result 2-3 .............................................................................................................................. 92

  Table 49 Server Test Result 3-1 .............................................................................................................................. 93

  Table 50 Server Test Result 3-2 .............................................................................................................................. 94

  Table 51 Server Test Result 3-3 .............................................................................................................................. 95

  Table 52 Server Test Result 3-4 .............................................................................................................................. 95

  Table 53 Server Test Result 4-1 .............................................................................................................................. 96

  Table 54 Server Test Result 4-2 .............................................................................................................................. 96



  Table 55 Server Test Result 4-3 .............................................................................................................................. 97

  Table 56 Server Test Result 4-4 .............................................................................................................................. 98

  Table 57 Server Test Result 4-5 .............................................................................................................................. 98

  Table 58 Server Test Result 5-1 .............................................................................................................................. 99

  Table 59 Server Test Result 5-2 .............................................................................................................................. 99

  Table 60 Server Test Result 5-3 ............................................................................................................................ 100

  Table 61 Server Test Result 5-4 ............................................................................................................................ 100

  Table 62 Server Test Result 5-5 ............................................................................................................................ 101

  Table 63 Server Test Result 7-1 ............................................................................................................................ 101

  Table 64 Server Result Matrix AT-S01 ................................................................................................................. 103

  Table 65 Server Result Matrix AT-S02 ................................................................................................................. 103

  Table 66 Server Result Matrix AT-S03 ................................................................................................................. 104

  Table 67 Server Result Matrix AT-S04 ................................................................................................................. 104

  Table 68 Server Result Matrix AT-S05 ................................................................................................................. 105

  Table 69 Server Result Matrix AT-S06 ................................................................................................................. 105

  Table 70 Server Result Matrix AT-S07 ................................................................................................................. 105

  Table 71 Server Result Matrix Overall Summary ................................................................................................. 106

  Table 72 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase 1 ................................... 110

  Table 73 Client COTS Assessed ............................................................................................................................ 113

  Table 74 Client Evaluation Criteria ..................................................................................................................... 113

  Table 75 Client Cost Attribute .............................................................................................................................. 114

  Table 76 Client Performance Attribute ................................................................................................................. 114

  Table 77 Client Vendor Support Attribute ............................................................................................................ 114

  Table 78 Client Flexibility Attribute ..................................................................................................................... 114

  Table 79 Hardware Preparation for Client COTS Product .................................................................................. 116

  Table 80 Software Preparation for Client COTS Product .................................................................................... 117

  Table 81 Other Preparation for Client COTS Product ......................................................................................... 117

  Table 82 Client Performance Attribute Result Rationale ....................................................................... 119



  Table 83 Client Flexibility Attribute Result Rationale ........................................................................... 120

  Table 84 Client Test Procedure Specification 1-1 ................................................................................................ 122

  Table 85 Client Test Procedure Specification 1-2 ................................................................................................ 124

  Table 86 Client Test Procedure Specification 3-1 ................................................................................................ 127

  Table 87 Client Test Procedure Specification 3-2 ................................................................................................ 129

  Table 88 Client Test Results 1-1 for Tangent Model ............................................................................................ 130

  Table 89 Client Test Results 1-2 for Tangent Model ............................................................................................ 131

  Table 90 Client Test Results 3-1 for Tangent Model ............................................................................................ 132

  Table 91 Client Test Results 3-2 for Tangent Model ............................................................................................ 132

  Table 92 Client Test Results 1-1 for Wyse ............................................................................................................ 133

  Table 93 Client Test Results 1-2 for Wyse ............................................................................................................ 133

  Table 94 Client Test Results 3-1 for Wyse ............................................................................................................ 134

  Table 95 Client Test Results 3-2 for Wyse ............................................................................................................ 135

  Table 96 Client Result Matrix AT-C01 ................................................................................................................. 137

  Table 97 Client Result Matrix AT-C02 ................................................................................................................. 137

  Table 98 Client Result Matrix AT-C03 ................................................................................................................. 138

  Table 99 Client Result Matrix AT-C04 ................................................................................................................. 138

  Table 100 Client Result Matrix Overall Summary ................................................................................................ 138

  Table 101 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase2 (product 1) ............... 143

  Table 102 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase2 (product 2) ............... 148






Table of Figures
  Figure 1 Approach Adopted.................................................................................................................................... 26

  Figure 2 COTS Assessment Process ....................................................................................................................... 27

  Figure 3 Prototype I – Citrix Thin Client Structure................................................................................................ 30

  Figure 4 Prototype I – Citrix Thin Client Network Diagram.................................................................................. 31

  Figure 5 Prototype II, Phase I - New TC Application Server ................................................................................. 32

  Figure 6 Solving Single Point Failure Strategy ...................................................................................................... 34

  Figure 7 Prototype II, Phase II – New Thin Client Terminals ................................................................................ 35

  Figure 8 Server Result Summary .......................................................................................................................... 106

  Figure 9 Server Break-even Analysis on ROI ....................................................................................................... 110

  Figure 10 Client Result Summary ......................................................................................................................... 139

  Figure 11 Client Break-even Analysis on ROI (Product 1) .................................................................................. 143

  Figure 12 Client Break-even Analysis on ROI (Product 2) .................................................................................. 148






Preface
The CAR document is reasonably self-contained, but relies on the CAB for detailed background
on the project and organizational goals and environment. It also relies on the CAP for details on
milestones, budgets, schedules, and risks. Its level of detail is risk-driven, particularly with
respect to budgets, schedules and customer needs.


 References
          Client meeting notes
       http://greenbay.usc.edu/csci577/fall2005/projects/team3/CMN/CMN_09_27_F05_T03.pdf

          COTS Assessment Background (CAB)
       http://greenbay.usc.edu/csci577/fall2005/projects/team3/LCO/CAB_LCO_F05a_T03_V0
       5.20.pdf

          COTS Assessment Process (CAP)
       http://greenbay.usc.edu/csci577/fall2005/projects/team3/LCO/CAP_LCO_F05a_T03_V0
       3.10.pdf

          Easy WinWin Negotiation
       http://greenbay.usc.edu/csci577/fall2005/projects/team3/LCA/EWW_LCA_F05a_T03_V
       03.0.pdf

          Intel Corporation
       http://www.intel.com/

          Microsoft Corporation
       http://www.microsoft.com/

          MS Terminal Services, “Juggling Terminal Service Resources”
       http://www.msterminalservices.org/articles/Juggling-Terminal-Service-Resources.html

          Renaissance Learning
        http://www.renlearn.com/RenaissancePlace/default.htm

          Tangent Computer, Inc.
       http://www.tangent.com/

          Wayang Outpost
       http://www.wayangoutpost.net/



          WebDT
       http://www.dtresearch.com/prod_webDT166.html

          Wyse Technology Inc.
       http://www.wyse.com

          Ye Yang and Barry Boehm, "Guidelines for Producing COTS Assessment
           Background, Process, and Report Documents," USC-CSE Technical Report
      http://greenbay.usc.edu/csci577/fall2005/site/guidelines/CBA-AssessmentIntensive.pdf





 Change Summary
       Version 1.0        Andrew released the initial draft with the Preface, References,
                          Change Summary, and Sections 1 and 2.

      Version 1.1        Chris added section 3.2 for the LCO draft.

      Version 1.2        Kunal added section 3.1 for the LCO draft.

       Version 1.3        Chris added the Table of Contents and Table of Tables. The references
                          section was revised to include references to the previous documents.

       Version 2.0        This version incorporates the comments and suggestions given in the
                          CAR QR report.

                         Andrew revised sections 1 & 2 for better understandability.

      Version 2.1        Kunal revised section 3.1.

      Version 2.2        Chris revised section 3.2.1 to improve the assessment objectives and also
                         revised Table of Contents and Table of Tables.

      Version 3.0        Added Section 3.2.2 Approach Adopted Diagram, Added Section 3.2.3,
                         Added Section 3.2.4. All sections were added based on the ARB review
                         comments.

      Version 3.1        Chris revised Table of Contents, Table of Tables and Added Table of
                         Figures.

      Version 3.2        Kunal reviewed Sections 1.0 & 2.0 to maintain consistency with the
                         CAB and CAP, as per the ARB Review.

                         Kunal reviewed Sections 3.1 & 3.2.1 as per the ARB Review.

      Version 4.0        Chris added section 4 for LCA Draft. Initial test cases added in section
                         4.3.3 for evaluation criteria AT-S03 and AT-S04.

      Version 4.1        Ajithkumar added test cases in section 4.3.3 for evaluation criteria AT-
                         S05. Also added section 4.3.4 for the testing result for AT-S05.

      Version 4.2        Andrew added test cases in section 4.3.3 for evaluation criteria AT-S02
                         and AT-S04.

      Version 4.3        Ashwin added test cases in section 4.3.3 for evaluation criteria AT-S01.

      Version 4.4        Chris added section 4.3.2 for the test preparation.



       Version 4.5        Chris modified the document to complete the LCA Draft requirement.
                          Section 4 was divided into server and client parts to describe the
                          assessment done on the different kinds of COTS products. Sections 4.1
                          and 4.2 were added for the assessment of the server and client parts
                          respectively. Sections 4.X.1 to 4.X.5 were added to describe the
                          detailed assessment activities, result screen matrix, and business case
                          analysis. Section 5 was added for the conclusions and recommendations
                          of the COTS assessment. Sections 5.1 and 5.2 were added for the server
                          and client parts respectively.

      Version 4.6        Chris revised section 3.2.1 to reflect the current assessment attributes and
                         activities. The section is divided into server and client sections based on
                         the actual assessment method. Some of the weights were reassigned
                         based on the new attributes. Also added the glossary section and updated
                         references section.

      Version 5.0        Kunal Kadakia revised 3.1, 3.2.1,
                         Ajithkumar Kattil & Ashwin Kusabhadran revised 4.1.1, 4.1.3.3, 4.1.4,
                         4.1.5, 4.2.3, 4.2.3.4.1
                         Devesh Thanvi revised Section 5.1

      Version 5.1        Chris revised section 3.2.1 evaluation attributes identifier by changing
                         Flexibility to AT-S06 and Security attribute to AT-S07 due to the change
                         of weight. Revised section 4.1.2, 4.1.3.3, 4.1.3.4, 4.1.3.5, 4.1.4, 4.2.2 for
                         the change of weight in Security attribute. Revised section 4.2.2,
                         4.2.3.3.2, 4.2.3.4, 4.2.4 for the change of weight and sub-attribute in Cost
                         attribute (AT-C01). Added result matrix chart to section 4.1.4 and 4.2.4
                         for better readability.

      Version 5.2        Andrew revised the approach adopted diagram and added the COTS
                         assessment process diagram in section 3.2.2. Revised prototype diagrams
                         and description in section 3.2.4.

      Version 5.3        Chris revised system objective and budget constraint from $3500 to
                         $3000 in section 3.1 according to the change in CAB.

      Version 5.4        Chris revised test result for client maintenance cost AT-C01-2 in section
                         4.2.3.4. Also updated results matrix in section 4.2.4 accordingly.





      Version 5.5        Ajithkumar Kattil revised the business case analysis in section 4.2.5.1
                         and 4.2.5.2 to reflect the change of maintenance cost.

      Version 5.6        Devesh Thanvi revised section 4.2 to reflect consistency in the
                         Assessment Results of the client.

      Version 6.0        Chris revised document to incorporate LCA ARB comments. In section
                         3.1 the backup and recovery capability is removed according to the
                         change in CAB. In section 4 the test cases and results for AT-S02, AT-
                         S03, AT-S04, AT-S05, and AT-C01 were revised.

      Version 6.1        Ajithkumar Kattil revised the business case analysis in section 4.1.5 and
                         4.2.5 to incorporate the LCA ARB comments.

      Version 6.2        Andrew added 3.2.3.1, 3.2.3.2 to clarify test performed at PHS.

      Version 6.3        Chris revised section 4.1.3.3.2 and 4.2.3.3.2 to add dependencies to the
                         test cases based on the LCA ARB and IV&V review comments.

      Version 6.4        Andrew and Devesh Thanvi revised section 3.2.3, 4, 4.1.3.3.2, to resolve
                         issues based on the IV&V review comments.

      Version 6.5        Ajithkumar Kattil and Ashwin Kusabhadran revised the business case
                         analysis in section 4.1.5 and 4.2.5 to incorporate the IV&V LCA Draft
                         review comments by adding more details to the cost breakdown.






1. Executive Summary
The project under consideration is the Pasadena High School (PHS) Computer Network Study,
which involves an analysis of the existing thin client network infrastructure at PHS.
Pasadena High School needs a powerful server system to host and serve multimedia-intensive
learning applications (such as Wayang Outpost and Renaissance Place) in its library and
computer lab. The proposed new system will support more than 40 thin clients, allowing
students to work simultaneously on high-end interactive multimedia applications. This system
design, consisting of main servers and multiple thin client stations, will provide a low-cost
solution that facilitates easy deployment and requires minimal maintenance.


The high-level objectives of the COTS assessment are as follows:

       Analyze the current system infrastructure and identify the critical issues, constraints, and
        limitations of the existing system.
       Recommend two or more distinct prototypes, each highlighting a level of change and the
        costs needed to maximize the client's benefits and levels of service.
       Measure the compliance of the proposed COTS-based system with the guidelines set
        forth by the client during the Win-Win negotiations.

The results of the analysis will be published and presented to the client to serve as formal
justification, so that the client can submit requests for software and server upgrades to the
Pasadena Unified School District (PUSD).






2. Purpose, Scope, and Assumptions
This section describes the purpose, scope, and assumptions that underlie the analysis, results,
conclusions, and recommendations.
The purpose of this document is to:
     Summarize the PHS COTS assessment process.
     Present the major COTS assessment results and conclusions.
     Make recommendations to the client based on the COTS assessment results.
     Provide substantiation for the results by using performance-tuning and load-testing
        COTS products to predict the behavior of the COTS system under various test scenarios.

The scope of this document covers the COTS assessment objectives, context, approach, results,
conclusions, recommendations, and supporting data.

The following assumptions underlie the analysis, results, conclusions, and recommendations:

      The client must provide the assessment team with full access to the application server and
       thin client terminals during the evaluation period.

      The maintenance of the computer lab at the PHS library must be covered by the service
       contract plan with Tangent, who is their thin client provider. The vendor will provide
       remote administration as long as the service contract plan is renewed on an annual basis.

      The existing infrastructure and topologies consisting of network connections, PC setups,
       thin client setup, server configuration, and server location must not change drastically
       during the assessment period.

      The purpose and scope of PHS Network Study must not change.






3. Assessment Approach
This section describes the system and assessment objectives and approach.

 3.1 System Objectives and Context
This section briefly summarizes the application system objectives, constraints and priorities
referenced from sections 2, 3 and 4 of CAB.
Our client from Pasadena High School needs a high-performance computer network solution. It
should support multimedia learning applications (such as Wayang Outpost and Renaissance
Place) in the library and help the students and staff members in their interactive learning
process. Our project evaluates the existing network, investigates the feasibility of different
COTS packages, such as the Tangent thin client solution and the Citrix MetaFrame solution
developed by Citrix, and verifies whether the capabilities of the system satisfy the requirements
of the client. Our final COTS recommendation to the client will be a report on different
prototypes (as per the client's request), based on the tests conducted by the team on the system
and on a cost-benefit analysis of the different prototypes. A report showing the degree to which
the capabilities of the implemented system satisfy the requirements of the client will also be
presented.


To reduce the probability of failure, we summarize the major constraints on the project as
follows:
       Limited Budget: The customer can afford a budget of 3,000 USD per year for server
        maintenance. The budget for capital investment has not yet been allocated.

       Limited Time: The project must be completed within 12 weeks.
       Legacy servers and thin-client solution: The proposed solution should integrate properly
        with the existing TC-95 (application server) and TC-96 (authentication server) servers
        and the Wyse thin client solutions.
       Current COTS vendor – Tangent Computers: The client wants to continue with the
        current vendor, Tangent Computers.


The prioritized capabilities of the system are as follows:
       Thin client multimedia capability
        The proposed thin client network solution should be capable of handling high-end,
        Flash-based, multimedia-intensive applications that will allow the students to work
        simultaneously on the Wayang Outpost tutorial (a geometry tutoring application hosted
        by ISI).





       Integrated network solution
        The system should integrate the existing thin client network with the proposed COTS
        solution so that applications such as Accelerated Reader would be available to all users
        across the network.
       Network load balancing
        Network Load Balancing services will enable multiple dual-processor servers to be
        configured as a logical group that balances user sessions across the servers and
        dynamically routes each user to the least busy server.
       Symmetric multiprocessing (SMP) enabled
        Upgrading the existing server to dual processors (SMP) will make use of hardware load
        balancing, which, in turn, will increase the throughput of the user terminals.
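The least-busy routing policy described above can be sketched in a few lines. This is an
illustrative model only; the server names and session counts are invented placeholders, not
measured PHS data.

```python
# Minimal sketch of least-busy session routing, the policy Network Load
# Balancing applies when dispatching a new user session. Server names and
# active session counts are hypothetical placeholders.

def route_session(sessions):
    """Return the name of the server with the fewest active sessions."""
    return min(sessions, key=lambda name: sessions[name])

sessions = {"TC-95": 14, "TC-96": 9}   # hypothetical active session counts
target = route_session(sessions)       # TC-96 is currently least busy
sessions[target] += 1                  # the new user session lands there
```

In practice the group membership and session counts come from the load-balancing service
itself; the point is only that each new session is dynamically sent to the least loaded server.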





 3.2 Assessment Objectives and Approach
This section briefly summarizes the COTS assessment objectives and approach from section 2.1
and section 4 in CAP.

       3.2.1       Assessment Objectives
A list of high-level system attributes is derived from the client's objectives, constraints, and
priorities to serve as the evaluation criteria.
The list is categorized into two parts, one for the server and one for the client, and is ordered
by importance as indicated by the evaluation weights in the tables below, which were agreed
upon by both the client and the evaluators. Each attribute is assigned an importance based on
the results of the Win-Win negotiation and client meetings.
The first part of the list summarizes the evaluation attributes for the server, as in the table
below:

 Identifier   Name Of Attribute        Weight      Rationale
 AT-S01       Performance              270         The performance factor covers the following
                                                   attributes:
                                                   - System login time (Acceptable time is less
                                                   than 30 seconds for each user)
                                                   - Number of concurrent users (acceptable
                                                   number of users is 36)
                                                   - System response time for regular applications
                                                   - System response time for multimedia
                                                   intensive applications
 AT-S02       Cost                     150         The cost is determined by the overall cost of
                                                   the implementation, including the initial and
                                                   future hardware upgrades and maintenance of
                                                   the system. We will provide cost estimation for
                                                   each system prototype that we would propose
                                                   to meet the expectations.
 AT-S03       Intercomponent           140         A list of applications provided by the client
              Compatibility                        should be installed on the server and should be
                                                   made available to all the terminal clients. The
                                                   applications include the following:
                                                   - Wayang Outpost (web-based)
                                                   - MS Office Suite
                                                   - Renaissance Place
                                                   - Choices





                                                Each application will be tested on the server
                                                and on the terminal clients to measure its
                                                compatibility with the system.
AT-S04     Interoperability         130         Interoperability is measured by the following
                                                network services:
                                                - Authentication service
                                                - Application processing
                                                - Active Directory service
                                                - Print service
                                                - User profile/folder
                                                Each service will be tested on how the
                                                information is exchanged between the server
                                                and client on the network.
AT-S05     Vendor Support           120         The vendor support metrics are measured by
                                                the degree of support of the following
                                                attributes:
                                                - Response time for critical problems
                                                - Remote assistance
                                                - Hardware support
                                                - Software upgrades
                                                - Warranty
                                                The possible sources for evaluating the
                                                attributes include vendor communication via
                                                phone and email and overall customer
                                                satisfaction rate, if available.
 AT-S06       Flexibility              100         The flexibility attribute is measured by the
                                                   system's upgradeability as well as its
                                                   downward compatibility with older hardware
                                                   components. Since the hardware is not
                                                   available for testing, we will rely on the
                                                   information provided by the COTS vendor as
                                                   well as the hardware specification data.
 AT-S07       Security                 50          The security metrics are measured by the
                                                   user privilege capabilities.
                                                   The system should be capable of handling
                                                   and assigning different access rights for the
                                                   different types of user access.
                         Table 1 High-level Server Assessment Attributes
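As an illustration of how the weights in Table 1 could roll per-attribute results into a single
figure of merit for a candidate server, the sketch below computes a weighted average. The raw
0–10 attribute scores are invented placeholders, not actual assessment results.

```python
# Weighted-average scoring using the Table 1 evaluation weights.
# The raw 0-10 attribute scores are illustrative placeholders only.

WEIGHTS = {
    "AT-S01": 270,  # Performance
    "AT-S02": 150,  # Cost
    "AT-S03": 140,  # Intercomponent Compatibility
    "AT-S04": 130,  # Interoperability
    "AT-S05": 120,  # Vendor Support
    "AT-S06": 100,  # Flexibility
    "AT-S07": 50,   # Security
}

def weighted_score(scores):
    """Combine raw 0-10 attribute scores into one weighted average."""
    total = sum(WEIGHTS.values())  # 960
    return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS) / total

# A candidate scoring 9 on performance but 5 everywhere else still
# benefits strongly from the 270-point performance weight.
sample = {a: 5.0 for a in WEIGHTS}
sample["AT-S01"] = 9.0
print(round(weighted_score(sample), 2))
```

A uniform set of scores comes back unchanged, so the weights only matter when attributes
score differently, which is exactly the behavior a prioritized criteria list should have.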




The table below summarizes the assessment activities for the server COTS product:

 Identifier Name Of Attribute        Assessment Method
 AT-S01      Performance             Each performance attribute will be tested by having users
                                     log in to the system and run different applications. The
                                     evaluator will record the results at different concurrent
                                     user levels.
 AT-S02      Cost                    A list of quotes should be provided by the vendor,
                                     estimating the initial cost as well as the future cost for
                                     each prototype given in the project.
 AT-S03      Intercomponent          The system will be tested against the required applications
             Compatibility           by first running each application independently and then
                                     running the different applications simultaneously. The
                                     process will be monitored and the result will be recorded
                                     by the evaluators.
 AT-S04      Interoperability        We will test all the network services from the client
                                     terminals to determine whether all the communications
                                     between the server and clients are properly established.
                                     We will also test the availability of various kinds of
                                     network services as part of the assessment activity.
 AT-S05      Vendor Support          We can evaluate the vendor support level by gathering
                                     information about their customers' satisfaction levels in
                                     the market. Since Tangent is a large vendor, there will be
                                     many reviews available about their products and services.
 AT-S06      Flexibility             Flexibility is reflected by the different prototypes
                                     provided in the project. Each prototype will show the ease
                                     of upgrading a particular solution, including
                                     recommendations about future upgrades. The downward
                                     compatibility will be determined by market benchmarks
                                     as well as the information provided by the COTS vendor.
 AT-S07      Security                The security attribute can be evaluated by checking
                                     whether the users are assigned proper rights based on their
                                     roles. The individual profiles and folders should be kept
                                     private from the other users. Also certain password rules
                                     should be enforced on user accounts to ensure the security
                                     level.
                                Table 2 Server Assessment Activities
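The AT-S01 activity in Table 2 (recording results at different concurrent user levels against
the 30-second login acceptance threshold) could be automated along the lines below. This is a
hedged sketch: `do_login` is a hypothetical stand-in whose 0.01-second sleep merely simulates
work; a real test would drive an actual terminal session.

```python
# Sketch of the AT-S01 measurement idea: time a (simulated) login at
# increasing concurrency levels and flag any run whose worst login time
# exceeds the 30-second acceptance threshold. do_login is a hypothetical
# placeholder, not a real terminal-session driver.
import time
from concurrent.futures import ThreadPoolExecutor

LOGIN_LIMIT_SECONDS = 30.0

def do_login(user_id):
    """Placeholder login; returns elapsed wall-clock seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate the login work
    return time.perf_counter() - start

def run_level(n_users):
    """Run n_users concurrent logins; return (worst_time, passed)."""
    with ThreadPoolExecutor(max_workers=n_users) as pool:
        times = list(pool.map(do_login, range(n_users)))
    worst = max(times)
    return worst, worst <= LOGIN_LIMIT_SECONDS

for level in (1, 12, 36):  # stepping up to the 36-user acceptance target
    worst, ok = run_level(level)
    print(f"{level:2d} users: worst login {worst:.3f}s, pass={ok}")
```

Recording the worst login time per concurrency level matches the evaluator's task of noting
results "at different concurrent user levels" rather than a single average.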





The second part of the list summarizes the evaluation attributes for the client COTS products as
in the table below:

 Identifier   Name Of Attribute       Weight       Rationale
 AT-C01       Cost                    120          The cost is determined by the overall cost of
                                                   the implementation, including the initial and
                                                   maintenance of the system. We will provide
                                                   cost estimation for each prototype that we
                                                   would propose to meet the expectations.
 AT-C02       Performance             120          We evaluate the performance of the client by
                                                   its hardware specification and operating system
                                                   capability.
 AT-C03       Vendor Support          90           The vendor support metrics are measured by
                                                   the degree of support of the following
                                                   attributes:
                                                   - Hardware support
                                                   - Warranty
                                                   The possible sources for evaluating the
                                                   attributes include vendor communication via
                                                   phone and email and overall customer
                                                   satisfaction rate, if available.
 AT-C04       Flexibility             80           The flexibility attribute is measured by the
                                                   system's upgradeability to accommodate
                                                   future changes.
                            Table 3 High-level Client Assessment Attributes





The table below summarizes the assessment activities for the client COTS product:

 Identifier Name Of Attribute       Assessment Method
 AT-C01      Cost                   A list of quotes should be provided by the vendor showing
                                    estimates for the initial cost as well as the maintenance
                                    cost for each prototype proposed in the project.
 AT-C02      Performance            We will evaluate the performance by comparing the
                                    hardware specification and operating system capability to
                                    see if they meet the requirements to handle all the
                                    applications required by the librarian.
 AT-C03      Vendor Support         We can evaluate the vendor support level by gathering
                                    information about customers' satisfaction levels in the
                                    market. Since Tangent is a large vendor, many reviews of
                                    their products and services will be available.
 AT-C04      Flexibility            We will evaluate flexibility by determining whether the
                                    client model is upgradeable in both hardware and
                                    software.
                               Table 4 Client Assessment Activities





       3.2.2       Approach
This section summarizes the overall strategy described in Section 2.1 of the CAP and the COTS
assessment process described in Section 4.1.3 of the CAP. The assessment started with a
preliminary network study for PHS, resulting in a list of attributes to serve as evaluation criteria.
Each attribute is assigned a weight based on the results of the Win-Win negotiations and client
meetings. From these criteria, our objective is to research possible COTS candidates and
incorporate these COTS products into an initial prototype. The prototypes are then reviewed for
feasibility and presented to the client for feedback. Figure 1 shows our overall approach to the
COTS assessment, and Figure 2 shows the detailed COTS assessment process framework.



[Figure: flowchart of the approach adopted. The Preliminary Network Assessment and a Review
of the Client's Requirements feed into a Practical Network Assessment and COTS Products
Research; these feed the COTS Assessment Process, which produces a Prototype Proposal and a
Prototype Feasibility Analysis (Identify Risks). Prototypes are then presented for client
feedback, finalized, and the finalized prototype is submitted to the client.]
                                     Figure 1 Approach Adopted






[Figure: flowchart of the COTS assessment process. Define high-level COTS attributes based on
Win-Win negotiations → Identify possible COTS solutions → Set evaluation criteria → Define
test procedures → Perform tests and generate test results → Evaluate COTS assessment results
→ Choose the COTS item with the highest ranking for the prototype.]
                           Figure 2 COTS Assessment Process





       3.2.3       Network Assessment Summary
This section describes the test procedures, tests performed, and results observed during the
practical network assessment of the current system at PHS.


               3.2.3.1        Network Assessment Description
The network assessment test spanned a total of three hours, which Team 3 spent diagnosing the
performance of the server (TC95) at Pasadena High School. Team 3 logged into the TC95 server
as the administrator, which allowed us to monitor each of the thin clients and their activities on
the network and to record how each activity affected the performance of the server. The primary
tools used for this test were the Process, Performance, and Networking monitors of Windows
Server 2003 (Task Manager). Our main objectives were:
      Observe Classroom & Student Activities (15-25 Students)
       Measure CPU, Network, and Memory Utilization (Using Windows Task Manager) for
       the following software:
              MS Office Applications (Word, Excel, PowerPoint)
              Choices
              Algebra Tutorial
              Internet Explorer (Static Web Page Surfing)
              Internet Explorer (Flash Animation Page Testing)
                      www.macromedia.com
                      www.disney.com
              Windows Media Player
                      Streaming USC Lectures
      Simulate 11 Users Surfing Flash Intensive Web Sites


               3.2.3.2        Network Assessment Results
The following lists the results of our practical network test of the current system at PHS.
CPU Utilization per Thin Client
       MS Office Suite (1%)
       Choices (License expired, so we were not able to test)
       Algebra Tutorial (<1%)
       Internet Explorer (Static Pages www.usc.edu) (2-5%)
       Internet Explorer (Flash Intensive Page www.disney.com) (15-25%)
       Windows Media Player (Streaming Video) (2-5%)



Memory Utilization per Thin Client
       MS Office Suite (~25,000K, which is less than 1% of the total 3 GB)
       Choices (could not test)
       Internet Explorer (Static Pages) ~21,000K
       Internet Explorer (Flash Intensive Page) ~22,000K
       Windows Media Player ~17,000K
Network Utilization per Thin Client (Over a 100 Mbps Network)
       MS Office Suite ~ 0.02% – 1% of 100 Mbps
       Choices (could not test)
       Internet Explorer (Static Pages) ~ 1% – 2% during page load and ~0.02% afterwards
       Internet Explorer (Flash Intensive Page) ~ 1% – 2% and ~0.02% afterwards
       Windows Media Player ~ 384K constant streaming ~ 0.03%

The results from our test show that the system performs poorly at loading flash-intensive
webpages. This indicates that the current server does not have the performance capability to
support all forty thin client terminals at the same time.
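The measurements above can be extrapolated to the full lab with a back-of-the-envelope sketch. The per-client midpoints and the 40-terminal count come from our test; the linear scaling is a naive assumption for illustration, not a measured result.

```python
# Naive extrapolation of per-client CPU use to the full 40-terminal lab.
# Midpoints are taken from the utilization ranges measured above.

PER_CLIENT_CPU = {
    "MS Office Suite": 0.01,        # ~1% per client
    "IE (static pages)": 0.035,     # 2-5% per client
    "IE (Flash-intensive)": 0.20,   # 15-25% per client
    "Windows Media Player": 0.035,  # 2-5% per client
}

CLIENTS = 40

def projected_load(app: str, clients: int = CLIENTS) -> float:
    """Assume (naively) that total server load scales linearly with client count."""
    return PER_CLIENT_CPU[app] * clients

for app, per_client in PER_CLIENT_CPU.items():
    print(f"{app:22s} {per_client:5.1%}/client -> {projected_load(app):6.0%} total")
```

Even under this rough model, forty clients on flash-intensive pages would demand several times the server's total CPU capacity, which is consistent with the poor behavior we observed.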






       3.2.4       Prototypes Proposed
This section describes the different prototypes proposed to improve the performance of the
current network at Pasadena High School. Our client, the school librarian, has requested that we
propose a new system that will support 40-plus thin clients by allowing the students to work on
high-end interactive multimedia applications simultaneously. The system being evaluated
already exists and has been implemented by an outside vendor, Tangent Computer. This system
design, consisting of main servers and multiple thin client stations, will provide a low-cost
solution that facilitates easy deployment and requires minimal maintenance. The client currently
has a budget constraint, which will play an important role in the consideration of the new system
design. Prototyping will help achieve the shared vision of the project's key stakeholders.
Prototype I:           Citrix Thin Client Solution
In this prototype, we propose a thin client network based on the Citrix solution. Citrix
MetaFrame will be loaded on the servers, and each client can be a stand-alone PC running the
MetaFrame client software. Thin clients will be integrated with standard PCs that are capable of
handling intensive multimedia applications. MetaFrame includes multimedia and animation
technologies that will make interactive multimedia applications like Wayang Outpost run faster
on thin clients. The old legacy hardware (fat clients) and Wyse thin clients can also be added to
the proposed network solution. This solution aims to gradually change the current thin client
setup into one using stand-alone PCs in a client-server environment.


[Figure: Citrix thin client structure. A Citrix MetaFrame server is connected over the network to
printers, Windows-based terminals, MS Windows-based clients, old x86 DOS ICA clients, and
Web ICA clients.]
                          Figure 3 Prototype I – Citrix Thin Client Structure








                       Figure 4 Prototype I – Citrix Thin Client Network Diagram

This prototype diagram shows a top-level view of the PHS Citrix solution. The current system will be
replaced with three servers, serving as the MetaFrame Server, Application Server, and Authentication
Server. Thin clients and PCs connect to the servers via a gateway (through an access point such as a
router or a switch).

The Citrix solution will give the client the following benefits:

   1. The Citrix Load Balancing Services will allow users to be dynamically routed to the
      least busy Citrix server. This will enable 40-plus users to access the high-end
      mathematical learning applications simultaneously.
   2. Unlike the Wyse thin client solution, this solution doesn't require any proprietary
      hardware, thereby increasing portability and scalability.
   3. Citrix MetaFrame will provide encrypted authentication for the users; the Secure
      ICA package allows 40-, 56-, or 128-bit encryption.
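The least-busy routing behavior in benefit 1 can be sketched as follows. This is an illustrative model only, not the Citrix API; the server names and session counts are hypothetical.

```python
# Illustrative model of least-busy session routing, as performed by the
# Citrix Load Balancing Services: each new session goes to the MetaFrame
# server with the fewest active sessions.

def least_busy(servers: dict) -> str:
    """Return the name of the server with the fewest active sessions."""
    return min(servers, key=servers.get)

def route_session(servers: dict) -> str:
    """Place one new session on the least busy server and return its name."""
    target = least_busy(servers)
    servers[target] += 1
    return target

# Hypothetical farm of three MetaFrame servers handling 5 new student logins.
farm = {"metaframe1": 12, "metaframe2": 9, "metaframe3": 14}
placements = [route_session(farm) for _ in range(5)]
print(placements)  # new sessions land on the least-loaded servers first
```

Under this policy no single server saturates while others sit idle, which is what lets 40-plus users run the multimedia applications at once.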



Prototype II, Phase I:        New TC Application Server




                      Figure 5 Prototype II, Phase I - New TC Application Server

This prototype plans a stepwise refinement of the current network and system infrastructure
at PHS. The main goal is to reuse all the existing legacy products and current servers at PHS and
introduce a plan for gradually replacing the main servers and the multimedia-enabled thin clients.
Its goal is to focus on minimal system change while maximizing performance to meet the
client's requirements. The core change in this prototype is to introduce a new server (TC97) to
serve as the new application server. TC97 has dual 3.02GHz Pentium 4 Xeon processors with
4GB of RAM, which will be sufficient to handle the current 40-plus thin clients. The TC95
server will be reassigned the role of Active Directory Server, handling all student login and
authentication services. Active Directory treats each server as a resource on the network and will
authenticate users based on their group policies. The TC96 server will host the typing tutorial
program and serve as a file server storing the students' personal folders. The Accelerated Reader
server will continue to host the Accelerated Reader application, and the Wayang Outpost server
will continue to host the Wayang Outpost application. TC96 and TC97 will also serve as mirror
servers for TC95.






The benefits of this scenario are:
      Thin clients from the PC lab and library can access all the software from the TC97
       server, and authentication takes place at one location only.
      Since Active Directory is handled in one place, the librarian need not manage a
       separate Active Directory server for TC96.
      The Dell PCs can also access software on TC97, in addition to the Accelerated Reader
       application.
      A centralized application server and thin clients result in minimal management effort
       for the client.
One major drawback of this scenario is the single point of failure, which poses a major threat in
all thin client-server based designs: if a server goes down due to internal errors, all the thin
clients connected to that server become unusable. This design adds backup servers to the system
using the pre-existing servers. The mirroring process allows the administrator to load the Active
Directory tree structure onto TC95, TC96, and TC97. Because TC95, TC96, and the new TC97
all run Windows 2003, each server can also act as an Active Directory server. The administrator
can set any one of these servers (in this case TC95) to be the main Active Directory server.
While TC95 acts as the main server handling Active Directory services, TC96 and TC97 have
the capability of taking over as the domain controller in case TC95 becomes unavailable.
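The takeover rule described above amounts to a simple preference order. The sketch below is a hypothetical model of that rule for illustration; in practice Windows 2003 Active Directory handles domain-controller promotion itself, and the ordering shown is our assumption.

```python
# Hypothetical model of the failover rule: TC95 is the preferred domain
# controller; if it is unreachable, TC96 and then TC97 take over.

PREFERENCE = ["TC95", "TC96", "TC97"]  # all three run Windows 2003 / AD

def active_domain_controller(reachable: set) -> str:
    """Pick the highest-preference server that is currently reachable."""
    for server in PREFERENCE:
        if server in reachable:
            return server
    raise RuntimeError("no domain controller available")

print(active_domain_controller({"TC95", "TC96", "TC97"}))  # normal operation
print(active_domain_controller({"TC96", "TC97"}))          # TC95 is down
```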








                         Figure 6 Solving Single Point Failure Strategy





Prototype II, Phase II: New Thin Client Terminals




                      Figure 7 Prototype II, Phase II – New Thin Client Terminals

During Phase II of the second prototype, the legacy Wyse 3230LE terminals will be upgraded to
thin clients with better multimedia capability. The new thin clients will also support local
application processing to reduce the server load.
The benefits of Phase II are:
      Web-based applications can be processed by the thin clients independently, so they
       need not depend on the TC97 server for CPU cycles.
      The new terminals can be fully integrated with the older Wyse thin clients, which in
       turn allows the network to run a mixed mode of thin clients.
      The solution is fully scalable, so the client can choose to replace or upgrade terminals
       whenever required.







4. Assessment Result
This section presents the assessment scenarios and the corresponding assessment results for the
server and client COTS products, organized around the critical assessment issues. The critical
assessment issues are reflected in the choice of assessment criteria, weights, and rating scales
(refer to Section 2.1.2 of the CAP).
Prototype I was introduced to the client and was immediately deemed infeasible due to the high
cost of implementing a Citrix-based solution. The client thus requested that Team 3 produce a
new low-cost solution while keeping the current system components intact. Team 3 decided to
split the design into two phases, in each of which the client replaces only the needed components
to gradually improve performance based on need. Prototype II, Phases I and II, was the result of
our discussions with the client based on these new requirements.
Part 1 of the Assessment Results covers the server COTS product for the Phase I implementation,
as discussed in Section 3.2.4. Part 2 of the Assessment Results covers the client COTS products
for the Phase II implementation, also discussed in Section 3.2.4.

 4.1 Assessment Results-Part 1 [Server]
This section describes the assessment and test results for the server COTS product.

       4.1.1       COTS Assessed
The COTS product assessed is:
 COTS Product                    Web Address                Description
 Tangent Pillar™ 2750s           http://www.tangent.co      Tangent Pillar 2750s, with a newly
                                 m/products/gen/servers     designed rack mount or tower
                                 /2750s.htm                 chassis, delivers the highest levels
                                                            of power, performance, scalability
                                                            and reliability with dual Intel®
                                                            Xeon™ processors at 3.20GHz with
                                                            533MHz system bus. It supports up
                                                            to 12GB ECC DDR266 SDRAM
                                                            for improved performance.
                                                            Standard features include three
                                                            PCI-X expansion slots, seven hot-
                                                            swap SCSI hard drives with dual
                                                            channel Ultra160 SCSI, and one
                                                            Gigabit and one 10/100 Ethernet
                                                            ports. Onboard PC health
                                                            monitoring proactively helps assure
                                                            continuous operation while optional
                                                            UPS and backup peripherals are



                                                                 added assurances of system
                                                                 integrity.
                                     Table 5 Server COTS Assessed




       4.1.2       Evaluation Criteria
The set of evaluation criteria chosen for the server COTS product is shown in the following
table. The last column presents the corresponding weight, assigned based on the discussion
between the client and the team members. The evaluation weights indicate how important the
client perceives each attribute of the system to be.

 No            Evaluation Criteria – COTS attributes                      Weight
 AT-S01        Performance                                                270
 AT-S02        Cost                                                       150
 AT-S03        Intercomponent Compatibility                               140
 AT-S04        Interoperability                                           130
 AT-S05        Vendor Support                                             120
 AT-S06        Flexibility                                                100
 AT-S07        Security                                                   50
                                    Table 6 Server Evaluation Criteria


The following tables break down each criterion in the table above in more detail, in order to
obtain a better measure of each criterion of the COTS product.
Table 7 below breaks down the Performance (AT-S01) criterion in more detail:

Weight         Features
80             Number of concurrent users
70             System response time (multimedia applications)
65             System login time
35             System response time (regular applications)
20             Network bandwidth
                                  Table 7 Server Performance Attributes





Table 8 below breaks down the Cost (AT-S02) criterion in more detail:

 Weight        Features
 60            Initial purchase cost
 50            Upgrade cost
 40            Annual maintenance cost
                                     Table 8 Server Cost Attributes
Table 9 below breaks down the Intercomponent Compatibility (AT-S03) criterion in more
detail:

 Weight        Features
 50            Wayang Outpost
 45            Renaissance Place
 30            MS Office Suite
 15            Choice
                        Table 9 Server Intercomponent Compatibility Attributes
Table 10 below breaks down the Interoperability (AT-S04) criterion in more detail:

 Weight        Features
 30            Authentication
 30            Application processing
 30            Active Directory service
 20            Print service
 20            User profile/folder
                               Table 10 Server Interoperability Attributes
Table 11 below breaks down the Vendor Support (AT-S05) criterion in more detail:

 Weight        Features
 40            Response time for critical problems
 30            Remote assistance
 20            Hardware support
 20            Software bundling
 10            Warranty
                               Table 11 Server Vendor Support Attributes





Table 12 below breaks down the Flexibility (AT-S06) criterion in more detail:

 Weight        Features
 40            Downward compatibility
 60            Upgradeability
                                 Table 12 Server Flexibility Attributes
Table 13 below breaks down the Security (AT-S07) criterion in more detail:

 Weight        Features
 50            User privileges
                                 Table 13 Server Security Attributes
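The weights in Tables 6 through 13 feed the ranking step of the assessment process (Figure 2): each attribute's rating on the 0-10 scale used in this report is multiplied by its weight, and the candidate with the highest weighted total is chosen for the prototype. The sketch below uses the Table 6 weights; the ratings shown are placeholders for illustration, not our actual assessment results.

```python
# Weighted-sum scoring over the server evaluation criteria of Table 6.

WEIGHTS = {
    "Performance": 270,
    "Cost": 150,
    "Intercomponent Compatibility": 140,
    "Interoperability": 130,
    "Vendor Support": 120,
    "Flexibility": 100,
    "Security": 50,
}

def weighted_score(ratings: dict) -> int:
    """Sum of weight * rating (rating on a 0-10 scale) over all attributes."""
    return sum(WEIGHTS[attr] * rating for attr, rating in ratings.items())

# Placeholder ratings: a candidate scoring 8/10 on every attribute.
example = {attr: 8 for attr in WEIGHTS}
print(weighted_score(example))  # 8 * 960 = 7680
```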






       4.1.3       Test Procedure
This section documents the detailed evaluation process, stating the test set-up, the test
procedures used, and their corresponding results.

                      4.1.3.1            Test Identification
The COTS product that is going to be tested is:
       Tangent Pillar™ 2750s
The server COTS evaluation has 23 test procedures to date, which will be performed by three
team members; the results of our COTS evaluation tests will serve as the basis for our
recommendation to the client.

                      4.1.3.2            Test Preparation

The following sections list the requirements and preparations needed to complete the testing
procedures.

4.1.3.2.1   Hardware Preparation

The following table lists the hardware requirements of the server COTS product:

 ID            COTS             Hardware Requirements            Met or not
               Product
               Model
 HREQ-S1       Tangent          1. Tangent Pillar™ 2750s         1. No. The new server has not
               Pillar™ 2750s       server as the application        been delivered as of the
                                   server                           LCA Package
                                2. One server as the             2. Yes. Currently TC95 is the
                                   authentication server            application and
                                3. One server as the backup,        authentication server in
                                   folder storage server            PHS
                                4. 40+ thin-client terminals     3. Yes. Currently TC96 is the
                                   connected with the server        folder storage server in
                                   with internet connection         PHS
                                                                 4. Yes. PHS currently has 40
                                                                    thin-client terminals
                                                                    connected to TC95
                      Table 14 Hardware Preparation for Server COTS Product






4.1.3.2.2   Software Preparation

The following table lists the software requirements of the server COTS product:

 ID            COTS    Software Requirements     Met or not
               Product
               Model
 SREQ-S1       Tangent 1. Windows 2003 Server or 1. Yes. Tangent Pillar™ 2750s
               Pillar™    Windows 2000 Server       supports both Windows 2003 and
               2750s      operating system          2000 server as per the information
                        2. MS Office Suite           provided on the vendor's website
                       3. Choices                   (http://www.tangent.com/products/
                       4. Renaissance Place         gen/servers/2750s.htm)

                                                          2. Yes. PHS has licensed MS Office
                                                             Suite
                                                          3. No. The current license for Choice
                                                             software has expired
                                                          4. Yes. PHS has licensed
                                                             Renaissance Place on AR server
                       Table 15 Software Preparation for Server COTS Product

4.1.3.2.3   Other Pre-test Preparations

The following table lists other pre-test requirements of the server COTS product:

 ID          COTS Product       Pre-test Preparations             Met or not
             Model
 PREP-S1     Tangent            The testers need to have their    Yes. The librarian at PHS has
             Pillar™ 2750s      authenticated username and        provided usernames and
                                password set-up for access to     passwords to both the
                                all the servers and thin-client   administrator and the regular
                                terminals                         users to gain access to the
                                                                  system
                        Table 16 Other Preparation for Server COTS Product






                      4.1.3.3            Test Procedure Specifications

This section provides the detailed test procedures carried out by the testers in order to rate each
evaluation criterion.
    The testing procedure adopted for evaluating all the COTS attributes AT-S01~AT-S05
       and AT-S07 is black-box testing. The black-box testing techniques widely used for the
       test process are equivalence partitioning and boundary value analysis. By applying these
       black-box techniques, we derived a set of test cases, which are presented below.
For attribute AT-S06 there are no detailed test procedure specifications. The rationale for this
attribute is explained in section 4.1.3.3.1 of the document.






4.1.3.3.1      Rationales for Attributes with No Test Procedures

In case of the Flexibility attribute (AT-S06):


Downward Compatibility:
Downward Compatibility of the Server COTS pertains to its compatibility with an earlier version
of itself.
In the new server, the processor is an Intel Pentium Xeon, which is part of the Intel family and
thus downward compatible with all Intel chips. The server architecture is also standard rather
than proprietary.
Test cases are not applicable here, since the downward compatibility of a future product cannot
be verified directly; we must refer to the specifications and performance results given by the
vendor.
Reference:
http://www.intel.com/performance/desktop/platform_technologies/em64t.htm
Rating: 9/10


Upgradeability:


         Hardware                Current Configuration of              Maximum possible
                                  Tangent Pillar 2750s                   configuration
            Memory                          4 GB                              12 GB
            (DIMM)
             HDD                      120 GB + 36 GB                       4 x 146 GB
                         Table 17 Server Flexibility Attribute Result Rationale
Thus, we see from the table above that the current server configuration can easily be upgraded
in terms of memory and HDD in order to increase its computational capability.
There are also enough expansion slots and bays for future expansion, such as:
One PCI-Express slot (X4)
One 64-bit/133MHz PCI-X slot
One 64-bit/100MHz PCI-X slot
One 32-bit/33MHz PCI slot
Rating: 8/10






4.1.3.3.2      Test Procedures

The following tables indicate the test procedures for the performance attribute (AT-S01):

 Test Case:                1-1
 Identifier:               AT-S01-1
 Test Items:               Number of concurrent users
 Test Description:         This test verifies the system's capability to support simultaneous
                           logins of all 40 users
 Pre-Conditions:           The following functions will be performed before the test:
                              1. All the terminals must be connected to the network, and both
                                  the network and the server should be up and running.
                              2. The user accounts for all 40 users are created.
                              3. All the 40 users are provided with their usernames and
                                  passwords.
 Post-Conditions:          The system should be able to support all the users who are trying to
                           log on to the network without crashing.
 Input Specifications:     All the 40 users enter their Username and Password.
 Expected Output           The users get logged in to the system simultaneously.
 Specifications:
 Pass / Fail Criteria:     The test is successful if all 40 users are able to simultaneously log in
                           to the system without the server crashing or freezing.
                           The test is a failure if the server crashes while all the 40 users are
                           trying to login to the system.
 Test process:             As the terminal boots up and displays the login screen:
                               1. Enter the user name.
                               2. Enter the password.
                               3. Click [OK] when finished entering data.
                               4. Log in to the PHS_LIB domain.
                                  As the users log in to their terminals, monitor how many users
                                  can be logged in at the same time before the system goes down.
 Assumptions and           1. The Pre-Conditions are fulfilled.
 Constraints:
 Dependencies:             None
 Traceability:             CAB Document Section 5 (LOS-1 System Performance)
                          Table 18 Server Test Procedure Specification 1-1
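The concurrency check in test case 1-1 can be sketched in code. The following is a minimal simulation only, not part of the actual test procedure: `attempt_login` and `run_concurrent_login_test` are hypothetical names, and the real test drives 40 physical terminals rather than threads.

```python
import threading

# Hypothetical stand-in for the real login step; in the actual test each
# of the 40 users submits credentials at the PHS_LIB domain login screen.
def attempt_login(username, results, lock):
    with lock:
        results.append(username)  # record a simulated successful login

def run_concurrent_login_test(n_users=40):
    """Launch n_users login attempts at roughly the same time and check
    the AT-S01-1 pass criterion: all users log in without a crash."""
    results, lock = [], threading.Lock()
    threads = [threading.Thread(target=attempt_login,
                                args=("student%02d" % i, results, lock))
               for i in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return len(results) == n_users

print(run_concurrent_login_test())  # prints True
```

In the real test the pass/fail decision is made by observing the server, not by a return value; the sketch only illustrates the all-or-nothing criterion from the table.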







Test Case:               1-2
Identifier:              AT-S01-2
Test Items:              System response time (multimedia applications)
Test Description:        This will test the system's response time (i.e., how long it takes on
                         average to open an instance of the multimedia application on a
                         terminal) for the Wayang Outpost application.
                         This test is performed with different levels of concurrent users, i.e.,
                         with different numbers of users running the multimedia application
                         simultaneously.
Pre-Conditions:          The following functions will be performed before the test:
                            1. All 40 users are logged into the system
                            2. The Wayang Outpost application is accessible to all users.
Post-Conditions:         All users are able to run the application.
Input Specifications:    1. The user enters his username and password to login to the system.
                         2. The user opens up an instance of the Wayang Outpost application
Expected Output          The Wayang Outpost application opens up in all 40 terminals.
Specifications:
Pass / Fail Criteria:    The test is successful if, on average, the users are able to open the
                         Wayang Outpost application within 15 seconds.
                         The test fails if it takes more than 15 seconds to open an instance
                         of the Wayang Outpost application.
Test process:            After users are logged in to the system:
                            1. Each user starts an instance of the Wayang Outpost application
                                on their terminal.
                            2. We will monitor and record the system response time for the
                                application at different levels of concurrent users.
Assumptions and          1. The Pre-Conditions are fulfilled.
Constraints:             2. All users will perform about the same amount of work on the
                             system.
Dependencies:            Number of concurrent users (AT-S01-1)
                         Network bandwidth (AT-S01-5)
Traceability:            CAB Document, Section 4 CAP-1 & CAB Document Section 5
                         LOS-1
                         Table 19 Server Test Procedure Specification 1-2
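The recorded measurements from step 2 of the test process above could be evaluated as follows. This is a sketch under stated assumptions: `evaluate_response_times` is a hypothetical helper, and the timing values are illustrative, not real measurements.

```python
def evaluate_response_times(timings_by_level, threshold=15.0):
    """timings_by_level maps a concurrent-user count to the launch times
    (in seconds) recorded at that level; a level passes the AT-S01-2
    criterion when its average launch time is within 15 seconds."""
    return {level: sum(times) / len(times) <= threshold
            for level, times in timings_by_level.items()}

# Illustrative (hypothetical) recorded timings, not measurements:
timings = {10: [4.1, 5.0, 3.8], 25: [8.9, 10.2], 40: [16.0, 17.2, 15.8]}
print(evaluate_response_times(timings))  # prints {10: True, 25: True, 40: False}
```

Evaluating per concurrency level, as the test process specifies, shows directly at which user load the 15-second budget is first exceeded.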






Test Case:               1-3
Identifier:              AT-S01-3
Test Items:              System login time.
Test Description:        This test measures the time required for users to log in to the system.
                         This test is performed with different levels of concurrent users, i.e.,
                         with different numbers of users trying to log in to the system
                         simultaneously.
Pre-Conditions:          The following functions will be performed before the test:
                               1. All the terminals must be connected to the network and the
                               network should be up and running.
                               2. The username and password for all the 40 users are present in
                               the database.
                               3. All the 40 users are provided with their user name and
                               password
Post-Conditions:         All 40 users get logged into the system
Input Specifications:    The user enters his username and password
Expected Output          The user gets logged into the system. (The user is considered logged
Specifications:          in after the login dialog box disappears and all the desktop icons are
                         loaded on the user desktop.)
Pass / Fail Criteria:    The test is successful if it does not take more than 30 seconds for a
                         single user to log in, and not more than 5 minutes for all 40 users to
                         log in.
                         The test is a failure if it exceeds the above-mentioned time frame.
Test process:            As the terminal boots up and displays the login screen:
                               1. Enter the user name.
                               2. Enter the password.
                               3. Click [OK] when finished entering data.
                               As the users log in to their terminals, monitor the time it takes
                               for a single user to log in to the system and then for all the 40
                               users to log in.
Assumptions and          1. The Pre-Conditions are fulfilled.
Constraints:
Dependencies:            Number of concurrent users (AT-S01-1)
                         Network bandwidth (AT-S01-5)
Traceability:            CAB Document Section 5 LOS-4
                         Table 20 Server Test Procedure Specification 1-3
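The two thresholds in the pass/fail criteria above (30 seconds per user, 5 minutes for all 40) can be combined into a single check. A minimal sketch; `login_test_passed` is a hypothetical name and the sample timings are illustrative only.

```python
def login_test_passed(individual_times, total_elapsed):
    """AT-S01-3 pass criteria: no single login exceeds 30 seconds, and
    all 40 users are logged in within 5 minutes (300 seconds).
    individual_times: per-user login times in seconds.
    total_elapsed: wall-clock time for the whole group, in seconds."""
    return max(individual_times) <= 30.0 and total_elapsed <= 300.0

# Illustrative timings: every user within 30 s, group within 5 min.
print(login_test_passed([12.0, 25.0, 28.5], 240.0))  # prints True
```

Both conditions must hold; a single slow login fails the test even if the group as a whole finishes in time.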





Test Case:               1-4
Identifier:              AT-S01-4
Test Items:              System response time (regular applications)
Test Description:        This will test the system's response time (i.e., how long it takes on
                         average to open an instance of the application on a terminal) for
                         regular applications such as web browsers, word processors,
                         typing programs, etc.
                         This test is performed with different levels of concurrent users, i.e.,
                         with different numbers of users running various applications
                         simultaneously.
Pre-Conditions:          The following functions will be performed before the test:
                               1. All the terminals must be connected to the network and the
                               network should be up and running.
                               2. The username and password for all the 40 users are present in
                               the database.
                               3. All the 40 users are provided with their user names and
                               passwords.
                               4. The users are given access rights to the various
                               applications.
Post-Conditions:         All 40 users are able to open up all desired applications (regular
                         applications) on their terminal. (The application is considered open
                         once the user can interact with it)
Input Specifications:    1. The users enter the username and password to login to the system.
                         2. The user opens up any of the regular applications.
Expected Output          The various regular application desired by all 40 users open up on
Specifications:          their respective terminals.
Pass / Fail Criteria:    The test is successful if, on average, the users are able to open an
                         instance of a regular application within 10 seconds.
                         The test fails if, on average, it takes more than 10 seconds to open
                         an instance of a regular application.
Test process:            After users are logged in to the system:
                               1. Each user starts an instance of various applications on their
                               terminal.
                               2. We will monitor and record the system response time for the
                               different types of applications at different levels of concurrent
                               users.
Assumptions and          1. The Pre-Conditions are fulfilled.
Constraints:             2. All users will perform about the same amount of work on the
                            system.
                         3. The users are not running any other high-end applications in the
                            background.
Dependencies:            Number of concurrent users (AT-S01-1)
                         Network bandwidth (AT-S01-5)
Traceability:            CAB Document Section LOS-1
                         Table 21 Server Test Procedure Specification 1-4







Test Case:               1-5
Identifier:              AT-S01-5
Test Items:              Network bandwidth
Test Description:        This test case evaluates the network bandwidth being used by each
                         thin client in the network while running both high-end multimedia
                         and regular applications.
Pre-Conditions:          The following functions will be performed before the test:
                               1. All the terminals must be connected to the network and the
                               network should be up and running.
                               2. The username and password for all the 40 users are present in
                               the database.
                               3. All the 40 users are provided with their user names and
                               passwords.
                               4. The users are given access rights to the various
                               applications.
Post-Conditions:         The server is still up and running even after all 40 users run different
                         multimedia and regular applications on their terminals.
Input Specifications:    1. The users enter the username and password to login to the system.
                         2. The user opens up an instance of all the different multimedia and
                         regular applications.
Expected Output          All different applications are opened up.
Specifications:
Pass / Fail Criteria:    The test is successful if, on average, the bandwidth consumed by
                         each client is equal to or less than 2 Mbps.
                         The test fails if each client consumes more than 2 Mbps of network
                         bandwidth.
Test process:            After all users are logged in to the system:
                               1. Each user starts an instance of various applications (both high-
                               end multimedia and regular applications) on their terminal.
                               2. Once the users have finished launching all the applications,
                               we will monitor and record the bandwidth usage of all clients
                               and calculate an average.
Assumptions and          1. The Pre-Conditions are fulfilled.
Constraints:             2. All users will perform about the same amount of work on the
                             system, i.e., there are no applications running in the
                             background.
Dependencies:            None
Traceability:            CAB Document Section 4 CAP-3 & CAP-4
                         Table 22 Server Test Procedure Specification 1-5
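The averaging step described in the test process above can be expressed as a small check. This is a sketch only; `bandwidth_test_passed` is a hypothetical helper, and the per-client figures are illustrative, not recorded measurements.

```python
def bandwidth_test_passed(per_client_mbps, limit=2.0):
    """AT-S01-5 pass criterion: after all applications are launched,
    the average bandwidth consumed per client is at or below 2 Mbps.
    per_client_mbps: recorded bandwidth usage (Mbps) for each client."""
    average = sum(per_client_mbps) / len(per_client_mbps)
    return average <= limit

# Illustrative recorded usage for three clients (average 1.8 Mbps):
print(bandwidth_test_passed([1.5, 1.8, 2.1]))  # prints True
```

Note that averaging tolerates individual clients briefly exceeding 2 Mbps, as long as the mean across all 40 clients stays within the limit.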





The following tables indicate the test procedures for the cost attribute (AT-S02):

 Test Case:                2-1
 Identifier:               AT-S02-1
 Test Items:               Initial Purchase Cost
 Test Description:         This test will compare the initial cost of ownership of the server
                           against the client's expected cost, as per the client's budget.
 Pre-conditions:           The following pre-conditions must be met before the following test
                           can be performed:
                                  1. COTS product must be available from COTS vendor
                                  2. COTS vendor must be able to supply price information
                                      for COTS product.
                                  3. A communication channel (phone number or email
                                      address) must be established between the customer and
                                      the COTS vendor prior to the test.
                                  4. Customer needs to set an expected price range for
                                      purchasing COTS items.
 Post-conditions:          The following function will be performed after the test procedure:
                                  1. Obtain a price quote for 1 unit of the Server COTS
                                     product.
 Input Specifications:     The following inputs will be provided to perform the test:
                                 1. Customer calls or emails COTS vendor indicating her
                                    interest in purchasing a COTS product.
                                 2. Customer supplies the part number to the COTS vendor for
                                    product lookup.
 Expected Output           The following information is the expected output:
 Specifications:                  1. COTS vendor will supply price quote (in US dollar
                                     amounts)
 Pass/Fail Criteria:       The following information is the pass/fail criteria for testing:
                                  1. COTS item availability (in stock, back order,
                                     discontinued, etc.)
                                  2. COTS item price falls within the initial cost price range
                                     set by the customer.
 Test Process:             The following lists the test process:
                                 1. Customer contacts COTS vendor via phone or email.
                                 2. Customer checks the availability of COTS items with the
                                    COTS vendor.
                                 3. Customer requests a price quote for one unit of the COTS
                                    item.
                                 4. Customer compares the price against the initial estimated
                                    cost price.
 Assumptions and           The following lists the assumptions and constraints for the test:
 constraints:                    1. COTS vendor is willing to disclose pricing information.
                                 2. Pricing information for the COTS item must not change
                                    dramatically during the duration of the test.
                                 3. The final price quote must not exceed the maximum
                                    expected price range set by the customer.
Dependencies:            None
Traceability:            None
                         Table 23 Server Test Procedure Specification 2-1
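The availability-and-budget decision in the pass/fail criteria above reduces to a simple predicate. A minimal sketch under stated assumptions: `cost_test_passed` is a hypothetical name, and the dollar figures are illustrative, not an actual vendor quote.

```python
def cost_test_passed(quote, budget_min, budget_max, available=True):
    """AT-S02-1 pass criteria: the COTS item is available from the
    vendor AND the quoted price (USD) falls within the customer's
    expected price range [budget_min, budget_max]."""
    return available and budget_min <= quote <= budget_max

# Illustrative: a $4,500 quote against a $3,000-$5,000 expected range.
print(cost_test_passed(4500.0, 3000.0, 5000.0))  # prints True
```

The same shape of check applies to the upgrade-cost (AT-S02-2) and maintenance-cost (AT-S02-3) tests, with the quote and range substituted accordingly.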







Test Case:               2-2
Identifier:              AT-S02-2
Test Items:              Upgrade Cost
Test Description:        This test will compare the upgrade cost of the server against the
                         client's expected upgrade cost, as per the client's budget.
Pre-conditions:          The following pre-conditions must be met before the following test
                         can be performed:
                                1. COTS product upgrade parts must be available from
                                    COTS vendor.
                                2. COTS vendor must be able to supply price information
                                    for COTS product upgrade parts.
                                3. A communication channel (phone number or email
                                    address) must be established between the customer and
                                    the COTS vendor prior to the test.
                                4. Customer needs to set an expected price range for
                                    purchasing COTS items upgrade.
                                5. COTS upgrades must be compatible with current COTS
                                    product.
                                6. Current COTS product evaluated must be able to have
                                    support for upgrades (software or hardware).
Post-conditions:         The following function will be performed after the test procedure:
                                1. Obtain a price quote for upgrade of 1 unit of a particular
                                   COTS product or module.
Input Specifications:    The following inputs will be provided to perform the test:
                                1. Customer calls or emails COTS vendor indicating
                                   interest in purchasing a COTS item upgrade.
                                2. Customer supplies the part number to the COTS vendor for
                                   lookup of the product that is intended to be upgraded.
Expected Output          The following information is the expected output:
Specifications:                 1. COTS vendor will supply price quote (in US dollar
                                   amounts) for COTS upgrade.
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                1. COTS upgrade item availability (in stock, back order,
                                   discontinued, etc.)
                                2. COTS upgrade item price falls within the upgrade cost
                                   price range set by the customer.
                                3. COTS upgrade item compatibility with current COTS
                                   products.
Test Process:            The following lists the test process:
                                1. Customer contacts COTS vendor via phone or email.
                                2. Customer checks the availability of COTS upgrade items
                                   with the COTS vendor, or whether the item has upgrade
                                   options available.
                                3. Customer requests a price quote for upgrades of one unit of
                                   the COTS item.
                                4. Customer compares the upgrade price against the initial
                                   estimated upgrade cost price.


Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. COTS vendor is willing to disclose upgrade pricing
                                   information.
                                2. Pricing information for the COTS upgrade item must not
                                   change dramatically during the duration of the test.
                                3. The final upgrade price quote must not exceed the maximum
                                   expected price range set by the customer.
                                4. COTS upgrades depend on, and must be compatible with,
                                   existing COTS products.
Dependencies:            None
Traceability:            None
                         Table 24 Server Test Procedure Specification 2-2







Test Case:               2-3
Identifier:              AT-S02-3
Test Items:              Annual Maintenance Cost
Test Description:        This test will compare the maintenance cost against the client's
                         expected maintenance budget.
Pre-conditions:          The following pre-conditions must be met before the following test
                         can be performed:
                                1. COTS vendor must have a maintenance plan.
                                2. COTS vendor must be able to supply maintenance price
                                    information for COTS product.
                                3. A communication channel (phone number or email
                                    address) must be established between the customer and
                                    the COTS vendor prior to the test.
                                4. Customer needs to set an expected price range for
                                    COTS item maintenance.
Post-conditions:         The following function will be performed after the test procedure:
                                1. Obtain a price quote for maintenance cost for COTS
                                   system for one academic year.


Input Specifications:    The following inputs will be provided to perform the test:
                                1. Customer calls or emails COTS vendor indicating
                                   interest in purchasing a one year contract for COTS item
                                   maintenance.
Expected Output          The following information is the expected output:
Specifications:                 1. COTS vendor will supply price quote (in US dollar
                                   amounts) for one year maintenance fee.
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                1. Availability of a maintenance plan from COTS vendor
                                   for COTS system.
                                2. COTS system maintenance price falls within the annual
                                   maintenance budget price range set by the customer.
Test Process:            The following lists the test process:
                                1. Customer contacts COTS vendor via phone or email.
                                2. Customer asks about the availability of a COTS
                                   maintenance plan.
                                3. Customer requests a price quote for system maintenance for
                                   one academic year.
                                4. Customer compares the support price against the annual
                                   budget dedicated to supporting the COTS system.





Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                   1. COTS vendor is willing to disclose maintenance pricing
                                  information.
                               2. Pricing information for the COTS maintenance plan must
                                  not change dramatically during the duration of the test.
                               3. The final maintenance price quote must not exceed the
                                  maximum expected price range set by the customer.
                               4. The COTS system support plan may depend upon the
                                  COTS vendor.
                               5. The COTS support plan may require signing a one-year
                                  contract.
Dependencies:            None
Traceability:            CAB Document Section 2.4 PC-1
                         Table 25 Server Test Procedure Specification 2-3





The following tables indicate the test procedures for the intercomponent compatibility attribute
(AT-S03):

 Test Case:                3-1
 Identifier:               AT-S03-1
 Test Items:               Compatibility of Wayang Outpost to run in the system
                           environment
 Test Description:         This test verifies the compatibility of the system with Wayang
                           Outpost, a Flash-based web application.
 Pre-Conditions:           The following conditions have to be met before we can test
                           the item:
                               1. Authentication server is up and running
                               2. Application server is up and running
                               3. Local network is running properly between server and
                                   client machines
                               4. Internet connection is available
                               5. User has logged into the client machine
                               6. Standard browser is available (IE, Mozilla, Netscape)
                               7. Wayang Outpost service website is available
 Post-Conditions:          User is able to use all the available services provided by
                           Wayang Outpost, including sound and animation interactions.
 Input Specifications:     The following tasks will be performed:
                              1. Sound check for the system
                                      The “sound check” button available on the login
                                       screen
                              2. Login to Wayang Outpost
                                      Enter username
                                      Enter password
                              3. Do geometry test
                                      Choose or enter correct answer
                                      Choose or enter incorrect answer
 Expected Output           The following outcome will be expected by the user:
 Specifications:              1. Sound check for the system
                                      A short audio clip will be played
                              2. Login to Wayang Outpost
                                      The welcome screen shows up after user logs in
                              3. Do geometry test




                                    Correct answer mark and time spent on the
                                     question is recorded
                                    Wrong answer mark is recorded and hint is given
Pass / Fail Criteria:    The following information is the pass/fail criteria for testing:
                            1. Sound check for the system
                                    The user should be able to hear audio from the audio
                                     device on the system (speaker or headset)
                            2. Login to Wayang Outpost
                                    The user should be able to log in to Wayang Outpost
                                     with a valid username and password
                            3. Do geometry test
                                    The user should receive correct marks in the record, as
                                     well as the time spent, for each question answered
                                     correctly
                                    The user should receive a hint about the question and
                                     needs to choose or enter another answer to proceed
Test process:               1. Click on the browser icon on desktop
                            2. Type in the URL for Wayang Outpost
                               (http://kulit.isi.edu/#)
                            3. Click on the “LOG IN” button
                            4. Click on “sound check” button
                            5. Click on “ok” to confirm the sound check result
                            6. Enter user name and password in the pop-up window
                               and click on “log in” button
                            7. Click on “nursery adventure” icon
                            8. Enter correct answer and click on “Next” button
                            9. Enter incorrect answer and click on “Next” button
Assumptions and          All users have access to the Internet on the client machine
Constraints:             and will be provided a valid username and password
                         for the Wayang Outpost login.
Dependencies:            Interoperability of application processing (AT-S04-2)
Traceability:            CAB Document Section 4 CAP-1
                         Table 26 Server Test Procedure Specification 3-1







Test Case:               3-2
Identifier:              AT-S03-2
Test Items:              Compatibility of MS Office Suite to run in the system environment
Test Description:        This test verifies the compatibility of the system with the MS Office
                         Suite, including the use of Word, Excel, and PowerPoint.
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. Application server is up and running
                            3. Local network is running properly between server and client
                                machines
                            4. User has logged into the client machine
Post-Conditions:         User is able to use all the standard functions provided by the MS
                         Office Suite, including creating a new document and opening,
                         editing, and saving an existing document.
Input Specifications:    None
Expected Output          None
Specifications:
Pass / Fail Criteria:    None


Test process:            None
Assumptions and          Since all the COTS candidates chosen are Windows-based machines,
Constraints:             the compatibility of the MS Office Suite is not an issue and the
                         detailed test specification is omitted
Dependencies:            Interoperability of application processing (AT-S04-2)
Traceability:            None
                         Table 27 Server Test Procedure Specification 3-2







Test Case:               3-3
Identifier:              AT-S03-3
Test Items:              Compatibility of Renaissance Place (Accelerated Reader) to run in
                         the system environment
Test Description:        This test will verify the compatibility of Renaissance Place with
                         the system environment
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. AR server is up and running
                            3. Local network is running properly between server and client
                                machines
                            4. User has logged into the client machine
                            5. Printer & Printer Services must be installed and functioning
                                properly.
Post-Conditions:         User is able to take quizzes and tests on Accelerated Reader based
                         on the assigned library books and receive a report about the reading
Input Specifications:    The following tasks will be performed:
                           1. Login to Accelerated Reader
                                   Enter username
                                   Enter password
                           2. Do reading test
                                   Enter or choose correct answer
                                   Enter or choose wrong answer
                           3. Generate report
                                   Use the print report button
Expected Output          The following tasks will be performed:
Specifications:            1. Login to Accelerated Reader
                                   Main menu will show up after login to Accelerated
                                    Reader
                           2. Do reading test
                                   Correct answer mark and time spent on the question is
                                    recorded
                                   Wrong answer mark and time spent on the question is
                                    recorded
                           3. Generate report
                                   Report generated for the user





Pass / Fail Criteria:    The following information is the pass/fail criteria for testing:
                            1. Login to Accelerated Reader
                                    User should be able to log in to Accelerated Reader
                                     with a valid username and password
                            2. Do reading test
                                    User should receive a correct mark in the record, along
                                     with the time spent, for each question answered correctly
                                    User should receive a wrong mark in the record, along
                                     with the time spent, for each question answered incorrectly
                            3. Generate report
                                    User should be able to generate the report by using the
                                     print report button
Test process:               1. Click on the Accelerated Reader icon on desktop
                            2. Enter user name and password
                            3. Click on the “LOG IN” button
                            4. Click on “Reading Test” button
                            5. Enter correct answer and click “Next”
                            6. Enter incorrect answer and click “Next”
                            7. Click on “Finish” button
                            8. Click on “Print Report” button
Assumptions and          All users will be provided a proper set of username and password
Constraints:             for the Accelerated Reader login
Dependencies:            Interoperability of application processing (AT-S04-2)
Traceability:            None
                         Table 28 Server Test Procedure Specification 3-3
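The reading-test bookkeeping described above can be sketched as a small check: each answered question must leave a mark and the time spent in the record. The ANSWER_KEY and record layout below are illustrative assumptions, not Accelerated Reader's actual data model.

```python
# Hypothetical sketch of the pass/fail bookkeeping in test 3-3: each
# answered question must leave a (mark, time spent) entry in the record.
# The answer key and record fields are assumptions for illustration.
ANSWER_KEY = {1: "B", 2: "D"}

def answer_question(record, qnum, choice, seconds_spent):
    """Record the mark and the time spent for one answered question."""
    mark = "correct" if ANSWER_KEY[qnum] == choice else "wrong"
    record.append({"question": qnum, "mark": mark, "time": seconds_spent})

record = []
answer_question(record, 1, "B", 12.5)   # correct answer (test-process step 5)
answer_question(record, 2, "A", 30.0)   # incorrect answer (test-process step 6)

# Pass criteria: both the mark and the time spent appear in the record.
assert record[0]["mark"] == "correct" and record[0]["time"] == 12.5
assert record[1]["mark"] == "wrong" and record[1]["time"] == 30.0
```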







Test Case:               3-4
Identifier:              AT-S03-4
Test Items:              Compatibility of Choice to run in the system environment
Test Description:        This test will verify the compatibility of Choice with the system
                         environment
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. Application server is up and running
                            3. Local network is running properly between server and client
                                machines
                            4. User has logged into the client machine
                            5. Obtained proper license for the Choice application
Post-Conditions:         User is able to use all the available functions provided by Choice
Input Specifications:    N/A
Expected Output          N/A
Specifications:
Pass / Fail Criteria:    N/A


Test process:            N/A
Assumptions and          The current license for the Choice application has expired;
Constraints:             therefore no testing could be done at this time
Dependencies:            N/A
Traceability:            N/A
                         Table 29 Server Test Procedure Specification 3-4





The following tables indicate the test procedures for the interoperability attribute (AT-S04):

 Test Case:                 4-1
 Identifier:                AT-S04-1
 Test Items:                Interoperability of authentication process
 Test Description:          This test will test the ability of information being exchanged
                            between server and clients during the authentication process
 Pre-Conditions:            The following conditions have to be met before we can test the item:
                               1. Authentication server is up and running
                               2. Local network is running properly between server and client
                                   machines
                               3. User profile has been pre-entered by system administrator
 Post-Conditions:           User can login to the system
 Input Specifications:      User will enter the username and password to login to the system
 Expected Output            User will be able to login with valid username and password, or be
 Specifications:            prompted with error message to correct the information if username
                            and password are invalid.
 Pass / Fail Criteria:      User login information will be validated and user can log in to the
                            system with a proper username and password
 Test process:                  1. Enter valid username and password and click ok
                                2. Enter invalid username and/or password and click ok
 Assumptions and            None
 Constraints:
 Dependencies:              None
 Traceability:              CAB Document Section 4 CAP-2
                           Table 30 Server Test Procedure Specification 4-1
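The two-step login check above can be sketched as a small automated test. The credential store and the validate()/login() helpers below are hypothetical stand-ins for the authentication server; a real run would exercise Active Directory itself.

```python
# Hedged sketch of test 4-1's two-step login check. The CREDENTIALS
# store and both helpers are illustrative stand-ins, not the actual
# authentication server interface.

# User profiles pre-entered by the system administrator (pre-condition 3).
CREDENTIALS = {"student01": "s3cret", "teacher01": "chalk"}

def validate(username, password):
    """Return True if the username/password pair matches a stored profile."""
    return CREDENTIALS.get(username) == password

def login(username, password):
    """Mimic the expected output: success, or an error prompt to retry."""
    if validate(username, password):
        return "logged in"
    return "error: invalid username or password, please correct"

# Step 1: valid credentials -> user can log in (pass criterion).
assert login("student01", "s3cret") == "logged in"
# Step 2: invalid credentials -> an error message prompts a correction.
assert login("student01", "wrong").startswith("error")
```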







Test Case:               4-2
Identifier:              AT-S04-2
Test Items:              Interoperability of application processing
Test Description:        This test will test the ability of information being exchanged
                         between server and clients when client machines request to run an
                         application hosted on the server
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. Application server is up and running
                            3. Local network is running properly between server and client
                                machines
                            4. User has logged into the client machine
Post-Conditions:         User has access to all the applications hosted on the server
Input Specifications:    User will run the applications hosted on the server from the client
                         machines by clicking the icons on the desktop
Expected Output          User can run all the applications
Specifications:
Pass / Fail Criteria:    User can run all the applications hosted on the server from the
                         client machines
Test process:                1. Click on the icon of each hosted application on the
                                desktop
                             2. Verify that each application launches and runs properly
Assumptions and          None
Constraints:
Dependencies:            Interoperability of Active Directory service (AT-S04-3)
Traceability:            None
                         Table 31 Server Test Procedure Specification 4-2
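The pass/fail check for this test can be sketched as a loop over the hosted applications. The HOSTED_APPS list and the launch() stub below are assumptions standing in for clicking the desktop icons on a client machine.

```python
# Sketch of the intended pass/fail check for test 4-2: every application
# hosted on the server must launch from a client desktop icon. The app
# names and the launch() stub are illustrative assumptions.
HOSTED_APPS = ["MS Word", "MS Excel", "MS PowerPoint", "Choice"]

def launch(app, available=frozenset(HOSTED_APPS)):
    """Stand-in for clicking the desktop icon of a hosted application."""
    return app in available

# Pass criterion: the user can run all the applications.
failures = [app for app in HOSTED_APPS if not launch(app)]
assert failures == []
```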







Test Case:               4-3
Identifier:              AT-S04-3
Test Items:              Interoperability of Active Directory service
Test Description:        This test will test the ability of information being exchanged
                         between server and clients when client machines request any
                         resource on the network
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. Local network is running properly between server and client
                                machines
                            3. User has logged into the client machine
Post-Conditions:         User has access to different network resources according to the user
                         access level assigned in the Active Directory database
Input Specifications:    The following network resources will be requested by the user:
                           1. Application server
                                   Open MS Office from desktop
                                   Open Choice from desktop
                           2. AR server
                                   Open Accelerated Reader from desktop
                           3. Print service
                                   Print a document
                           4. User profile/folder
                                   Open a file on desktop
                                   Save a file on desktop
Expected Output          The following outcome will be expected:
Specifications:            1. Application server
                                   Open MS Office from desktop
                                        i.        MS Office will open if user has the access right
                                       ii.        MS Office will not open if user does not have
                                                  the access right
                                   Open Choice from desktop
                                        i.        Choice will open if user has the access right
                                       ii.        Choice will not open if user does not have the
                                                  access right




                            2. AR server
                                    Open Accelerated Reader from desktop
                                       i.         Accelerated Reader will open if user has the
                                                  access right
                                      ii.         Accelerated Reader will not open if user does not
                                                  have the access right
                            3. Print service
                                    Print a document
                                         i.       User with access right can print out from a
                                                  network printer
                                          ii.      User without access right can't print out from a
                                                   network printer
                            4. User profile/folder
                                    Open a file on desktop
                                            i.        User can open any available files on the desktop
                                    Save a file on desktop
                                            i.        User can save file if logged in as regular user
                                             ii.       User can't save file if logged in as guest
Pass / Fail Criteria:    User will be granted access right to the network resources by the
                         Active Directory service according to the role in the Active
                         Directory profile.
Test process:               1. Click on MS Word icon on the desktop
                            2. Click on Open icon in MS Word
                            3. Select a file from desktop to open
                            4. Click on Print icon in MS Word
                            5. Click on Save icon in MS Word
                            6. Click on Choice icon on the desktop
                            7. Click on Accelerated Reader icon on the desktop
Assumptions and          None
Constraints:
Dependencies:            Interoperability of authentication process (AT-S04-1)
Traceability:            None
                         Table 32 Server Test Procedure Specification 4-3
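The access decisions this test exercises can be sketched as a lookup table keyed by role. The role names and the PERMISSIONS mapping below are illustrative assumptions; the real decisions come from the group policies in each user's Active Directory profile.

```python
# Hedged sketch of test 4-3's access decisions: the role in a user's
# Active Directory profile determines which network resources open.
# Role names and the PERMISSIONS table are illustrative assumptions,
# not PHS's actual group policies.
PERMISSIONS = {
    "student": {"ms_office", "accelerated_reader", "open_file", "save_file"},
    "guest":   {"ms_office", "open_file"},   # guests cannot save files
    "faculty": {"ms_office", "choice", "accelerated_reader",
                "print", "open_file", "save_file"},
}

def has_access(role, resource):
    """Grant access only if the role's profile lists the resource."""
    return resource in PERMISSIONS.get(role, set())

assert has_access("faculty", "print")        # user with access right can print
assert has_access("student", "save_file")    # regular user can save a file
assert not has_access("guest", "save_file")  # guest cannot save a file
```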







Test Case:               4-4
Identifier:              AT-S04-4
Test Items:              Print Service
Test Description:        This test will evaluate COTS support for printer services.
Pre-conditions:          The following pre-conditions must be met before the test can be
                         performed:
                                1. At least one printer is attached to the network, and is
                                    functioning properly.
                                2. Each printer on the network has a unique name.
                                3. System has support for word processing capability with
                                    printer support.
                                4. User has access rights to printer through Active
                                    Directory Services.
                                5. The printer is fully supported by Windows with the
                                    Windows compatible drivers.
Post-conditions:         The following functions will be performed after the test procedure:
                                1. Obtain a test page document printed from any client
                                   terminal.
                                2. Obtain a document printed using MS Office or similar
                                   word processor.
Input Specifications:    The following information will be added to perform the test:
                                1. Print jobs submitted to the print services
Expected Output          The following information is the expected output:
Specifications:                 1. Print Services will output Test Page to printer.
                                2. Print Services will output printed word document to
                                   printer.
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                1. Option to setup print services.
                                2. Printer can be made to share between all clients.
                                 3. Print jobs can be queued or spooled.
                                 4. Print jobs have an assigned priority (High, Medium, Low).
                                 5. Print jobs can be canceled.
Test Process:            The following lists the test process:
                                 1. User opens Windows Print Manager
                                 2. User selects print Test Page
                                 3. User opens MS Office or a word processor
                                 4. User types in sample text “Hello World”
                                 5. User selects File → Print
                                 6. User opens Print Manager
                                 7. User assigns priority to print jobs




Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. Printer must have network capabilities or be connected to
                                   a print server.
                                2. Printer services must comply with Windows Print
                                   Services.
                                3. Printers on the network are set up to have unique names.
Dependencies:            Interoperability of Active Directory service (AT-S04-3)
Traceability:            CAB Document Section 4 CAP-2
                         Table 33 Server Test Procedure Specification 4-4
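The queueing behaviour named in the pass/fail criteria (spooling, High/Medium/Low priority, cancellation) can be sketched with a priority queue. This is a minimal model under stated assumptions, not the Windows print spooler.

```python
import heapq
import itertools

# Hypothetical sketch of the behaviour test 4-4 checks: jobs are
# spooled, carry a High/Medium/Low priority, and can be canceled.
PRIORITY = {"High": 0, "Medium": 1, "Low": 2}

class PrintQueue:
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()   # preserves FIFO within a priority
        self._canceled = set()

    def submit(self, name, priority="Medium"):
        """Spool a job with its priority (pass criteria 3 and 4)."""
        heapq.heappush(self._heap, (PRIORITY[priority], next(self._seq), name))

    def cancel(self, name):
        """Mark a job canceled (pass criterion 5)."""
        self._canceled.add(name)

    def next_job(self):
        """Pop the highest-priority job that has not been canceled."""
        while self._heap:
            _, _, name = heapq.heappop(self._heap)
            if name not in self._canceled:
                return name
        return None

q = PrintQueue()
q.submit("test_page", "Low")            # test-process step 2
q.submit("hello_world.doc", "High")     # step 4: sample text "Hello World"
q.submit("report.doc", "Medium")
q.cancel("report.doc")                   # criterion 5: job canceled
assert q.next_job() == "hello_world.doc"   # High priority served first
assert q.next_job() == "test_page"         # canceled job is skipped
```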







Test Case:               4-5
Identifier:              AT-S04-5
Test Items:              Creating User Profile
Test Description:        This test will create a new user profile in the Active Directory.
Pre-conditions:          The following pre-conditions must be met before the test
                         can be performed:
                                1.   System has been setup to include Microsoft Active
                                     Directory Services, and Active Directory Services has
                                     been started.
                                2.   System must be running Windows Server 2003.
                                3.   Initial new user profile settings are empty.
                                4.   User is a student or faculty member at PHS.
                                5.   Administrative privileges are given to the System
                                     Administrator to create new user profile.
                                 6.   System already has group policies for the following
                                      groups (Student, Faculty, Administrator)
Post-conditions:         The following function will be performed after the test procedure.
                                1. The new user profile (including username, password, and
                                   group policies) appears in Active Directory.
                                2. A folder with the username will be created in the Active
                                    Directory Server or File Server to serve as the user's
                                   home folder.
                                3. Each profile will be assigned a set disk storage quota.
                                 4. Student accounts will be set to expire after one academic year.
                                5. Faculty accounts will only expire if the faculty is no
                                   longer an employee at PHS.
Input Specifications:    The following information will be added to perform the test:
                                1. New profile information including:
                                        Username
                                        Password
                                        User ID
                                        Status (Student or Faculty)
                                        Assigned group policy (Students, Faculty,
                                         Administrator)





Expected Output          The following information is the expected output:
Specifications:                  1. Active Directory Services creates a user profile in the
                                    database.
                                 2. A folder with the username is created to serve as the
                                    home folder for that user.
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                 1. Support for Active Directory Services
                                 2. System resources (storage space) allow for creation of
                                    new user folders for all students and faculty at PHS.
                                 3. Disk quota per user cannot be less than 15 megabytes.
                                 4. User can only view and access their own folders, and
                                    cannot view or access folders of other users.
                                 5. System administrator has the rights to access all folders.
Test Process:            The following lists the test process:
                                 1. System administrator launches Active Directory
                                    Services.
                                 2. System administrator chooses to create a new user
                                    profile.
                                 3. System administrator adds (username, password, user id,
                                    status) information to the new account.
                                 4. System administrator associates the new account with the
                                    current group policies.
                                 5. System administrator confirms the creation of a new user
                                    profile.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                     1. User profile does not currently exist in the system.
                                 2. User is authorized only by the administrator.
                                 3. Only System Administrator can create new user profile.
                                 4. Each user profile must be unique.
                                 5. Disk quota per user is set to be 15 megabytes.
Dependencies:            Interoperability of Active Directory service (AT-S04-3)
Traceability:            None
                         Table 34 Server Test Procedure Specification 4-5
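The profile rules this test enforces (unique usernames, a 15 MB quota, a home folder, and expiry only for student accounts) can be sketched as follows. The 365-day approximation of one academic year and the home-folder path scheme are assumptions for illustration.

```python
from datetime import date, timedelta

# Sketch of the profile rules in test 4-5: unique usernames, a 15 MB
# disk quota, a home folder per user, and expiry only for students.
# The 365-day "academic year" and /home/<username> path are assumptions.
QUOTA_MB = 15
directory = {}

def create_profile(username, password, user_id, status):
    """Create a profile, rejecting duplicates (constraint 4)."""
    if username in directory:
        raise ValueError("user profile already exists")
    expires = date.today() + timedelta(days=365) if status == "Student" else None
    directory[username] = {
        "password": password, "user_id": user_id, "status": status,
        "quota_mb": QUOTA_MB,                 # post-condition 3
        "home_folder": f"/home/{username}",   # post-condition 2
        "expires": expires,                   # post-conditions 4 and 5
    }
    return directory[username]

p = create_profile("jdoe", "pw123", 1001, "Student")
assert p["quota_mb"] == 15 and p["expires"] is not None
f = create_profile("msmith", "pw456", 2001, "Faculty")
assert f["expires"] is None          # faculty accounts do not auto-expire
try:
    create_profile("jdoe", "x", 1002, "Student")   # duplicate is rejected
    assert False
except ValueError:
    pass
```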





The following tables indicate the test procedures for the vendor support attribute (AT-S05):

 Test Case:                5-1
 Identifier:               AT-S05-1
 Test Items:               Response time for critical problems
 Test Description:         This test will find out the average response time taken by the vendor
                           to fix the critical problems faced by the customer.
 Pre-conditions:           The following pre-conditions must be met before the test
                           can be performed:
                                    1. Identify and obtain the contact person's name,
                                       department, telephone number and email address from
                                       the COTS vendor
                                   2. A correspondence must have already occurred and the
                                      communications channel (telephone or email) must have
                                      been established between the customer and the COTS
                                      vendor prior to the test.
                                   3. Both the customer and the COTS vendor must be aware
                                      of the contractual bindings and/or the warranty
                                      obligations of the vendor to the customer. The problem
                                      to be fixed should fall in this scope.
 Post-conditions:          The following functions will be performed after the test procedure:
                                    1. The vendor would have responded, logged the call, and
                                       given the customer a date and time for looking into the
                                       problem.
                                    2. The vendor would have provided the support on the date
                                       and time confirmed with the customer when the call was
                                       logged.
 Input Specifications:     The following information will be added to perform the test:
                                    1. Customer will contact the COTS vendor by phone or
                                       email and 'log a call' indicating that there is a critical
                                       problem which has arisen at the customer site and needs
                                       immediate attention.
                                    2. Customer will supply more information (if required),
                                       such as warranty details and known details of the
                                       problem, to the COTS vendor.
 Expected Output           The following information is the expected output:
 Specifications:                    1. COTS vendor will log the call and give the customer a
                                       call priority number as well as the date and time on
                                       which they will be attending this call.




                                2. COTS vendor will attend to the call on the mentioned
                                   date and time, and the problem will be solved (or advice
                                   will be given to the customer for the future course of
                                   action).
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                 1. The COTS vendor will respond within the timeframe
                                    already discussed and agreed upon in the contract or
                                    warranty card.
                                 2. COTS vendor will solve the problem or take any other
                                    required action at the customer site on the date and time
                                    agreed when the call was logged.
Test Process:            The following lists the test process:
                                 1. Customer contacts the COTS vendor via phone or email.
                                 2. Customer 'logs a call' with the COTS vendor indicating
                                    that there is a critical problem which has arisen at the
                                    customer site and needs immediate attention.
                                 3. Customer compares the time taken for the initial
                                    response with the expected response time as per the
                                    contract or warranty card.
                                 4. Customer compares the date and time on which the
                                    COTS vendor takes the required action at the customer
                                    site and solves the problem against the expected
                                    response time as per the contract or warranty card.
                                 5. Customer also compares that date and time against the
                                    date and time promised by the COTS vendor when the
                                    call was logged.


Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. COTS vendor will not have any abnormal activities
                                   going on in their organization, such as a company
                                   merger or department take-over, which would
                                   normally delay the response time.
                                2. COTS vendor contact is working at his/her regular
                                   workload and there is nothing unusual, such as a greatly
                                   increased workload, that could slow down the response.
                                3. The problem to be fixed should fall in the scope of the
                                   contractual bindings and/or the warranty obligations of
                                   the vendor to the customer.




Dependencies:            None
Traceability:            None
                         Table 35 Server Test Procedure Specification 5-1
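The comparisons in test-process steps 3 to 5 can be sketched as simple timestamp arithmetic: measure the vendor's first response against the contracted window, and the site visit against the promised date and time. The 4-hour SLA figure below is an illustrative assumption, not a value from the actual contract or warranty card.

```python
from datetime import datetime, timedelta

# Hedged sketch of the response-time comparisons in test 5-1. The
# 4-hour SLA and all timestamps are illustrative assumptions.
SLA = timedelta(hours=4)

def response_within_sla(call_logged, first_response, sla=SLA):
    """Step 3: compare the initial response time against the contract."""
    return first_response - call_logged <= sla

call_logged    = datetime(2005, 12, 1, 9, 0)    # customer logs the call
first_response = datetime(2005, 12, 1, 11, 30)  # vendor's initial response
promised_visit = datetime(2005, 12, 2, 10, 0)   # date/time promised at log-in
actual_visit   = datetime(2005, 12, 2, 9, 45)   # vendor arrives on site

assert response_within_sla(call_logged, first_response)  # step 3 passes
assert actual_visit <= promised_visit                    # step 5: promise kept
```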







Test Case:               5-2
Identifier:              AT-S05-2
Test Items:              Remote assistance
Test Description:        This test will find out the effectiveness of the remote assistance
                         provided by the COTS vendor.
Pre-conditions:          The following pre-conditions must be met before the test
                         can be performed:
                                1. Identify and obtain the contact person‟s name,
                                   Department, telephone number and email address from
                                   the COTS vendor
                                2. Both the customer and the COTS vendor must be aware
                                   of the contractual bindings and/or the warranty
                                   obligations of the vendor to the customer. The problem
                                   to be fixed by remote assistance should fall in this scope.
                                3. A time for calling the COTS vendor should be fixed
                                   before itself if it is necessary.
Post-conditions:         The following will hold after the test procedure.
                                1. The vendor will have responded to the call and either
                                   provided support during the same call or given a date
                                   and time at which the problem will be looked into.
                                2. The vendor will have solved the problem remotely, or
                                   provided the customer with an alternative solution for
                                   future action.
Input Specifications:    The following information will be used to perform the test:
                                1. Customer will contact the COTS vendor by phone or
                                   email and 'log a call' indicating that a problem needing
                                   immediate remote attention has arisen at the customer
                                   site.
                                2. Customer will supply additional information (if
                                   required), such as warranty details and known details of
                                   the problem, to the COTS vendor by telephone or email.
                                3. Customer will supply more details on the problem to the
                                   vendor if necessary.
Expected Output          The following are the expected outputs:
Specifications:                 1. COTS vendor will log the call and give the customer
                                   remote assistance in solving the problem.





                                2. The vendor will solve the problem remotely, or provide
                                   the customer with an alternative solution for future
                                   action.
Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. The COTS vendor will respond within the timeframe
                                   discussed and agreed upon in the contract or warranty
                                   card for remote assistance.
                                2. COTS vendor will solve the problem, or take any other
                                   required action on the customer site, via remote support,
                                   demonstrating the effectiveness of the support.
Test Process:            The following lists the test process:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer 'logs a call' with the COTS vendor indicating
                                   that a problem requiring remote assistance has arisen at
                                   the customer site.
                                3. Customer compares the time taken for the response with
                                   the expected response time as per the contract or
                                   warranty card for remote assistance.
                                4. The customer confirms that the problem has been solved
                                   and the support has been effective.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. Pre-conditions are met.
                                2. The problem to be fixed falls within the scope of the
                                   contractual bindings and/or the warranty obligations of
                                   the vendor to the customer.
                                3. The problem identified in the system must be suitable
                                   for resolution through a remote support call.
Dependencies:            None
Traceability:            None
                         Table 36 Server Test Procedure Specification 5-2







Test Case:               5-3
Identifier:              AT-S05-3
Test Items:              Hardware support
Test Description:        This test will evaluate the efficiency of the vendor in fixing
                         hardware-related issues.
Pre-conditions:          The following pre-conditions must be met before the test can be
                         performed:
                                1. Identify and obtain the contact person's name,
                                   department, telephone number and email address from
                                   the COTS vendor.
                                2. A correspondence must have already occurred and the
                                   communications channel (telephone or email) established
                                   between the customer and the COTS vendor prior to the
                                   test.
                                3. Both the customer and the COTS vendor must be aware
                                   of the contractual bindings and/or the warranty
                                   obligations of the vendor to the customer. The problem
                                   to be fixed should fall in this scope.
Post-conditions:         The following will hold after the test procedure.
                                1. The vendor will have responded, logged a call, and
                                   given a date and time to the customer for looking into the
                                   problem.
                                2. The vendor will have provided the support on the date
                                   and time confirmed with the customer when the call was
                                   logged.
                                3. The problem will have been solved.
Input Specifications:    The following information will be used to perform the test:
                                1. Customer will contact the COTS vendor by phone or
                                   email and 'log a call' indicating that a hardware problem
                                   needing immediate attention has arisen at the customer
                                   site.
                                2. Customer will supply additional information (if
                                   required), such as warranty details and known details of
                                   the problem, to the COTS vendor.
Expected Output          The following are the expected outputs:
Specifications:                 1. COTS vendor will log the call and give the customer a
                                   call priority number as well as the date and time on
                                   which they will be attending to this call.




                                2. COTS vendor will attend to the call on the stated date
                                   and time, and the problem will be solved (or advice
                                   given to the customer on the future course of action).
Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. The COTS vendor will respond within the timeframe
                                   discussed and agreed upon in the contract or warranty
                                   card.
                                2. COTS vendor will solve the problem, or take any other
                                   required action on the customer site, on the date and time
                                   agreed upon during the call log-in.
Test Process:            The following lists the test process:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer 'logs a call' with the COTS vendor indicating
                                   that a hardware problem has arisen at the customer site.
                                3. Customer compares the time taken for the initial
                                   response with the expected response time as per the
                                   contract or warranty card.
                                4. Customer compares the date and time on which the
                                   COTS vendor takes the required action on the customer
                                   site and solves the problem to the expected response
                                   time as per the contract or warranty card.
                                5. The customer also compares the date and time on which
                                   the COTS vendor takes the required action on the
                                   customer site and solves the problem to that promised by
                                   the COTS vendor when the call was logged.
                                6. The customer confirms that the hardware problem has
                                   been solved and the support has been effective.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. The COTS vendor does not have any abnormal activity
                                   underway in its organization, such as a company merger
                                   or department takeover, which could delay the response
                                   time.
                                2. The COTS vendor contact is working at his/her regular
                                   workload; an unusually heavy workload could
                                   significantly slow down the response.
                                3. The pre-conditions are met.
                                4. If the hardware is to be replaced, the COTS vendor has
                                   it in stock.
Dependencies:            None
Traceability:            None
                         Table 37 Server Test Procedure Specification 5-3
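The response-time comparisons in the test process above (steps 3 through 5) can be sketched as a simple check. This is only an illustration: the SLA durations, timestamps, and function name below are hypothetical, not actual terms from the contract or warranty card.

```python
from datetime import datetime, timedelta

# Hypothetical contractual SLA values (assumptions, not real contract terms)
SLA_INITIAL_RESPONSE = timedelta(hours=4)   # time allowed for the initial response
SLA_ON_SITE_FIX = timedelta(days=2)         # time allowed to fix the problem on site


def check_vendor_response(call_logged, first_response, problem_fixed, promised_fix):
    """Compare the vendor's actual timings against the contract and the
    date/time the vendor itself promised when the call was logged."""
    return {
        "initial_response_ok": first_response - call_logged <= SLA_INITIAL_RESPONSE,
        "fix_within_contract": problem_fixed - call_logged <= SLA_ON_SITE_FIX,
        "fix_as_promised": problem_fixed <= promised_fix,
    }


# Example: call logged Monday 9:00, first response 11:30 the same day,
# fix promised for Wednesday 17:00 and delivered Wednesday 8:00.
result = check_vendor_response(
    call_logged=datetime(2005, 10, 10, 9, 0),
    first_response=datetime(2005, 10, 10, 11, 30),
    problem_fixed=datetime(2005, 10, 12, 8, 0),
    promised_fix=datetime(2005, 10, 12, 17, 0),
)
print(result)  # all three checks pass in this example
```

The test passes only when every entry in the returned dictionary is true; a false value pinpoints which of steps 3, 4, or 5 failed.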







Test Case:               5-4
Identifier:              AT-S05-4
Test items:              Software bundling
Test Description:        This test will measure the capability of the vendor in supplying
                         software upgrades to the customer.
Pre-conditions:          The following pre-conditions must be met before the test can be
                         performed:
                                1. Identify and obtain the contact person's name,
                                    department, telephone number and email address from
                                    the COTS vendor.
                                2. A correspondence must have already occurred and the
                                    communications channel (telephone or email) established
                                    between the customer and the COTS vendor prior to the
                                    test.
                                3. Both the customer and the COTS vendor must be aware
                                    of the contractual bindings and/or the warranty
                                    obligations of the vendor to the customer. The software
                                    upgrade requested by the customer should fall in this
                                    scope.
                                4. A new software upgrade must have been made available
                                    by the software manufacturer.
Post-conditions:         The following will hold after the test procedure.
                                1. The vendor will have provided the software upgrade
                                   requested by the customer.
Input Specifications:    The following information will be used to perform the test:
                                1. Customer will contact the COTS vendor by phone or
                                   email and 'log a call' indicating that they need to
                                   procure the software upgrade.
                                2. Customer will supply additional information (if
                                   required), such as warranty details, to the COTS vendor.
Expected Output          The following are the expected outputs:
Specifications:                 1. COTS vendor will log the call and give the customer a
                                   call priority number as well as the date and time on
                                   which they will be sending the upgrade.
                                2. COTS vendor will send the software upgrade by email or
                                   by post by the promised date and time, and the customer
                                   can install it.
Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. The COTS vendor will respond within the timeframe
                                   discussed and agreed upon in the contract or warranty
                                   card for software upgrades.



                                2. COTS vendor will send the software upgrade by email or
                                   by post by the promised date and time, and help the
                                   customer with installation.
Test Process:            The following lists the test process:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer 'logs a call' with the COTS vendor indicating
                                   that they need to procure the new software upgrade.
                                3. Customer compares the time taken for the initial
                                   response with the expected response time as per the
                                   contract or warranty card.
                                4. Customer compares the date and time on which the
                                   COTS vendor supplies the upgrade to the expected
                                   response time as per the contract or warranty card.
                                5. The customer also compares the date on which they
                                   receive the software upgrade from the COTS vendor to
                                   that promised by the COTS vendor when the call was
                                   logged.
                                6. The customer confirms that they are offered proper
                                   assistance in installing the upgrade.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. The pre-conditions are met.
                                2. The software upgrade requested has been newly released
                                   by the manufacturer.
Dependencies:            None
Traceability:            None
                         Table 38 Server Test Procedure Specification 5-4







Test Case:               5-5
Identifier:              AT-S05-5
Test Items:              Warranty
Test Description:        This test will evaluate the effectiveness of the warranty support
                         provided by the vendor in fixing problems faced by the customer for
                         items under warranty.
Pre-conditions:          The following pre-conditions must be met before the test can be
                         performed:
                                1. Identify and obtain the name, telephone number and email
                                   address of the contact person in the technical department
                                   of the COTS vendor.
                                2. A correspondence must have already occurred and the
                                   communications channel (telephone or email) established
                                   between the customer and the COTS vendor prior to the
                                   test.
                                3. Both the customer and the COTS vendor must be aware of
                                   the contractual bindings and/or the warranty obligations of
                                   the vendor to the customer. The problem to be fixed should
                                   fall in this scope.
Post-conditions:         The following will hold after the test procedure.
                                1. The vendor will have responded, logged a call, and given a
                                   date and time to the customer for looking into the
                                   problem.
                                2. The vendor will have provided the support on the date and
                                   time confirmed with the customer when the call was
                                   logged.
Input Specifications:    The following information will be used to perform the test:
                                1. Customer will contact the COTS vendor by phone or email
                                   and 'log a call' indicating that a problem needing
                                   immediate attention has arisen at the customer site related
                                   to an item under warranty.
                                2. Customer will supply additional information (if required),
                                   such as warranty details and known details of the problem,
                                   to the COTS vendor.
Expected Output          The following are the expected outputs:
Specifications:                 1. COTS vendor will log the call and give the customer a call
                                   priority number as well as the date and time on which they
                                   will be attending to this call.
                                2. COTS vendor will attend to the call on the stated date and
                                   time.
                                3. The problem will be solved (or advice given to the
                                   customer on the future course of action).




Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. The COTS vendor will respond within the timeframe
                                   discussed and agreed upon in the contract or warranty
                                   card.
                                2. COTS vendor will attend to the call on the stated date and
                                   time.
                                3. COTS vendor will solve the problem or take any other
                                   required action on the customer site on the date and time
                                   agreed upon during the call log-in.
Test Process:            The following lists the test process:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer 'logs a call' indicating that a problem needing
                                   immediate attention has arisen at the customer site related
                                   to an item under warranty.
                                3. Customer compares the time taken for the initial response
                                   with the expected response time as per the contract or
                                   warranty card.
                                4. Customer compares the date and time on which the COTS
                                   vendor takes the required action on the customer site and
                                   solves the problem to the expected response time as per the
                                   contract or warranty card.
                                5. The customer also compares the date and time on which the
                                   COTS vendor takes the required action on the customer site
                                   and solves the problem to that promised by the COTS
                                   vendor when the call was logged.
                                6. The customer checks whether the problem has been solved.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. The COTS vendor does not have any abnormal activity
                                   underway in its organization, such as a company merger or
                                   department takeover, which could delay the response time.
                                2. The COTS vendor contact is working at his/her regular
                                   workload; an unusually heavy workload could significantly
                                   slow down the response.
                                3. The item which has developed a problem is under warranty.
Dependencies:            None
Traceability:            None
                         Table 39 Server Test Procedure Specification 5-5





The following tables indicate the test procedures for the security attribute (AT-S07):

 Test Case:                7-1
 Identifier:               AT-S07-1
 Test Items:               User Privileges
 Test Description:         This test case will measure the capability of the server in enforcing
                           user privileges at administrator, student, faculty and guest levels.
 Pre-conditions:           The following pre-conditions must be met before the following test
                           can be performed:
                                   1. All user groups are defined.
                                   2. All group policies are predefined.
                                   3. Password policies are predefined.
                                   4. All users are assigned to their appropriate groups.
 Post-conditions:          Each user of the system will have only those access rights and
                           privileges granted on the basis of their group policies.
 Input Specifications:     The following information will be added to perform the test:
                           All the users enter their respective username and password to logon
                           to the system.
 Expected Output           The users are only able to access those applications and personalized
 Specifications:           folders for which they have been granted access.
 Pass/Fail Criteria:       The following are the pass/fail criteria for the test:
                                  1. System administrator has full access to all system
                                     resources.
                                  2. Faculty members have access only to student folders and
                                     have rights to install educational software.
                                  3. Students have read/write/delete permissions for their
                                     personal folders.
 Test Process:             The following lists the test process:
                                  1. Log in to the system as Administrator and check that all
                                     administrator rights are granted.
                                  2. Log in to the system as Faculty and check that all faculty
                                     group rights are granted.
                                  3. Log in to the system as Student and check that all student
                                     group rights are granted.
 Assumptions and           The following lists the assumptions and constraints for the test:
 constraints:                     1. The pre-conditions are fulfilled.




                                 2. The user name should be unique for all users.
Dependencies:            Interoperability of Active Directory service (AT-S04-3)
Traceability:            None
                         Table 40 Server Test Procedure Specification 7-1
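The pass/fail criteria above amount to checking a permission matrix: each group may perform only the actions its policy grants on each resource. The sketch below illustrates that check with hypothetical group, resource, and action names; the actual policies are whatever the administrator defines in the server's group policy settings.

```python
# Hypothetical group-policy table (illustrative names, not the real policies)
GROUP_POLICIES = {
    "administrator": {"system": {"read", "write", "delete", "install"},
                      "student_folders": {"read", "write", "delete"}},
    "faculty":       {"student_folders": {"read"},
                      "educational_software": {"install"}},
    "student":       {"personal_folder": {"read", "write", "delete"}},
    "guest":         {},  # guests get no privileges by default
}


def is_allowed(group: str, resource: str, action: str) -> bool:
    """Return True only if the user's group grants the action on the resource."""
    return action in GROUP_POLICIES.get(group, {}).get(resource, set())


# Spot checks mirroring the pass/fail criteria in the table above
assert is_allowed("administrator", "system", "install")
assert is_allowed("student", "personal_folder", "write")
assert not is_allowed("guest", "system", "read")
assert not is_allowed("student", "student_folders", "read")  # no cross-group access
```

The test process then reduces to logging in as each role and verifying that every attempted action agrees with `is_allowed` for that role's group.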






                        4.1.3.4               Test Results

This section lists the test results of all the test procedures described in section 4.1.3.3 of this
document.


The following tables indicate the test result for the performance attribute (AT-S01):

 Test Case:                      1-1
 Identifier:                     AT-S01-1
 Test Items:                     Number of concurrent users
 Test Result Classification      Pass
 (Pass /Fail):
 Problem / Defect Report:        As we are awaiting delivery of the server, we were not able
                                 to perform the tests at the customer's site. These test results
                                 are therefore based on conversations with the COTS vendor
                                 and on the web references provided by him/her.
 Feedback / Comment:             The server has a total of 4 GB RAM.
                                 Assuming that Windows 2003 Server requires 512 MB RAM to
                                 perform optimally and each thin-client requires 64 MB RAM to
                                 perform optimally, then ideally the server can support up to a
                                 maximum of 56 concurrent clients.
                                 Reference: http://www.msterminalservices.org/articles/Juggling-
                                 Terminal-Service-Resources.html
                                 Rating – 10/10
                                       Table 41 Server Test Result 1-1
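The 56-client figure in the comment above follows from simple arithmetic over the stated RAM assumptions:

```python
# Figures taken from the test result above (the per-OS and per-client
# RAM requirements are the stated assumptions, not measured values)
TOTAL_RAM_MB = 4 * 1024      # server has 4 GB RAM in total
SERVER_OS_RAM_MB = 512       # assumed for Windows 2003 Server to perform optimally
PER_CLIENT_RAM_MB = 64       # assumed per thin client

# RAM left after the OS, divided among clients, rounded down
max_clients = (TOTAL_RAM_MB - SERVER_OS_RAM_MB) // PER_CLIENT_RAM_MB
print(max_clients)  # 56
```

Any change to the per-client assumption shifts this ceiling directly; at 128 MB per client the same server would support only 28 concurrent users.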







Test Case:                   1-2
Identifier:                  AT-S01-2
Test Items:                  System response time (multimedia applications)
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we were not able to
                             perform the tests at the customer's site. These test results are
                             therefore based on conversations with the COTS vendor and
                             on the web references provided by him/her.
Feedback / Comment:          The server has two 3.0 GHz Xeon processors that conform to
                             the symmetric multiprocessing (SMP) standard. SMP makes it
                             possible for multimedia applications to use multiple
                             processors when additional processing power is required,
                             increasing the capability of the system.
                             As multimedia applications are processor-intensive, the system
                             response time will greatly benefit from the server‟s support for
                             SMP.
                             Reference:
                             http://www.microsoft.com/windowsserver2003/evaluation/featur
                             es/highlights.mspx#winmedia
                             Rating – 10/10
                                   Table 42 Server Test Result 1-2







Test Case:                   1-3
Identifier:                  AT-S01-3
Test Items:                  System login time
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we were not able to
                             perform the tests at the customer's site. These test results are
                             therefore based on conversations with the COTS vendor and
                             on the web references provided by him/her.
Feedback / Comment:          The simultaneous login processing load is distributed between
                             the two 3.0 GHz Xeon processors in the server, so no single
                             processor is burdened with the entire processing load; this
                             yields a major improvement in the system login time.
                             Reference:
                             http://www.microsoft.com/windowsserver2003/evaluation/featur
                             es/highlights.mspx#winmedia
                             Rating – 10/10
                                   Table 43 Server Test Result 1-3







Test Case:                   1-4
Identifier:                  AT-S01-4
Test Items:                  System response time (regular applications)
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we are currently
                             unable to perform the tests at the customer's site. These test
                             results are therefore based on conversations with the COTS
                             vendor and on web references the vendor provided.
Feedback / Comment:          The server supports two 3.0 GHz Xeon processors that conform
                             to the symmetric multiprocessing (SMP) standard. SMP makes it
                             possible for applications to use multiple processors when
                             additional processing power is required to increase the
                             capability of the system.
                             As many applications are processor-intensive, system response
                             time will benefit greatly from the server's support for SMP.
                             Reference:
                             http://www.microsoft.com/windowsserver2003/evaluation/feature
                             s/highlights.mspx#winmedia
                             Rating – 10/10
                                   Table 44 Server Test Result 1-4
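The SMP behavior claimed above can be illustrated with a small sketch (a hypothetical example, not part of the CAR test procedures; the function name is our own). Under an SMP-capable operating system such as Windows Server 2003, ordinary multi-process code lets the OS schedule processor-intensive work across both CPUs:

```python
from multiprocessing import Pool
import os

def cpu_heavy(n):
    # Stand-in for a processor-intensive task such as media processing.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # One worker per available CPU; on the proposed server, the OS can
    # schedule the workers across both Xeon processors.
    with Pool(processes=os.cpu_count()) as pool:
        results = pool.map(cpu_heavy, [200_000] * 4)
    print(len(results))  # 4
```

The application itself needs no SMP-specific code; the operating system's scheduler distributes the worker processes across the available processors.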







Test Case:                   1-5
Identifier:                  AT-S01-5
Test Items:                  Network bandwidth
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None
Feedback / Comment:          The network at PHS has a bandwidth of 100 Megabits per second,
                             and each thin-client connection was measured to consume a
                             maximum of 2 Megabits per second. The aggregate bandwidth with
                             40 thin-client terminals therefore sums to:
                             2 x 40 = 80 Megabits per second.
                             Reference: As per the tests performed by the developers at the
                             client's site.
                             (Refer to Section 3.2.3 of the CAR)
                             Rating – 8/10
                                   Table 45 Server Test Result 1-5
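The bandwidth arithmetic above can be captured in a quick check (a minimal sketch; the constant names are ours, and the 2 Mbps per-client figure is the measured maximum from the test, not a guarantee):

```python
# Quick check of the aggregate thin-client bandwidth figures cited above.
NETWORK_CAPACITY_MBPS = 100   # PHS network bandwidth
PER_CLIENT_MAX_MBPS = 2       # measured maximum per thin-client connection
NUM_TERMINALS = 40

aggregate = PER_CLIENT_MAX_MBPS * NUM_TERMINALS   # 2 x 40 = 80 Mbps
headroom = NETWORK_CAPACITY_MBPS - aggregate      # 20 Mbps to spare
print(aggregate, headroom)  # 80 20
```

The 20 Mbps of remaining headroom is why the rating is 8/10 rather than 10/10: adding terminals or heavier multimedia use would narrow the margin quickly.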





The following tables indicate the test results for the cost attribute (AT-S02):

 Test Case:                     2-1
 Identifier:                    AT-S02-1
 Test Items:                    Initial purchase cost for Tangent TC Server Model Pillar 2750s
 Test Result Classification     Pass - The initial purchase cost falls within the customer's
 (Pass / Fail)                  budget for the server.
                                The initial cost for the server is $5,211.40 before sales tax.
                                The customer's expected server cost is $5,500.
 Problem / Defect Report        None
 Feedback / Comments:           Reference:
                                Louise O'Sullivan at Tangent Computer
                                Contact # : 800-342-9388 ext. 2137
                                Rating: 10/10
                                      Table 46 Server Test Result 2-1







Test Case:                   2-2
Identifier:                  AT-S02-2
Test Items:                  Upgrade cost for Tangent TC Server Model Pillar 2750s
Test Result Classification   Pass – The upgrade cost is within the customer's upgrade budget.
(Pass / Fail)
Problem / Defect Report      Hardware Upgrade Cost: We are limited to RAM and Storage.
                             Software Upgrade Cost: Depends on price of software.
Feedback / Comments:         Hardware upgrades for this item are limited to DDR-RAM and
                             storage. Because the motherboard supports only the Pentium 4
                             CPU, we do not expect the system to support next-generation
                             CPUs from Intel. Estimated upgrade cost for both memory and
                             storage: $1,100.00.
                             For software upgrades, we do not foresee any problem as long
                             as the software is based on the Microsoft Windows platform. As
                             for operating system updates, the next major Windows Server
                             release, Windows Server "Longhorn", is scheduled for late 2007.
                             At the time of this writing, Microsoft has not released the
                             specifications for this product.
                             The current server hardware specification is built to support
                             future upgrades if necessary; at present, we do not see
                             upgrading as a major issue.
                             Reference:
                             Louise O'Sullivan at Tangent Computer
                             Contact # : 800-342-9388 ext. 2137
                             Rating: 7/10
                                   Table 47 Server Test Result 2-2







Test Case:                   2-3
Identifier:                  AT-S02-3
Test Items:                  Annual maintenance cost for Tangent TC Server Model Pillar
                             2750s
Test Result Classification   Pass
(Pass / Fail)
Problem / Defect Report      None
Feedback / Comments:         Currently PHS has an annual server-maintenance contract with
                             Tangent Computer.
                             The new server being bought from Tangent will also be covered
                             by the same maintenance contract, as agreed between the client
                             and the COTS vendor, so the maintenance cost will be within
                             the customer's budget.
                             Reference:
                             Louise O'Sullivan at Tangent Computer
                             Contact # : 800-342-9388 ext. 2137
                             Rating: 10/10
                                   Table 48 Server Test Result 2-3





The following tables indicate the test results for the intercomponent compatibility attribute (AT-S03):

 Test Case:                    3-1
 Identifier:                   AT-S03-1
 Test Items:                   Compatibility of Wayang Outpost to run in the system
                               environment
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      The quality of the Flash animation was poor because of the
                               client terminals' limited capability to handle heavy
                               multimedia content.
 Feedback / Comment:           Users are able to log in and run the application through
                               Internet Explorer from the client terminals. The application
                               ran smoothly, and the testers were able to complete all the
                               available tests and games provided by the application.
                               The proposed system is simply a server similar to the
                               current one but with higher specifications; since the
                               current server passed the compatibility test, we expect no
                               compatibility issues with the new server.
                               Reference:
                               http://www.wayangoutpost.net/
                               Rating – 8/10
                                     Table 49 Server Test Result 3-1






Test Case:                   3-2
Identifier:                  AT-S03-2
Test Items:                  Compatibility of MS Office Suite to run in the system
                             environment
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None
Feedback / Comment:          This result is expected, since both the server and the clients
                             run a Windows-based operating system. Running the MS Office
                             Suite under Windows is not a problem.
                             Reference:
                             http://www.microsoft.com/office/editions/prodinfo/default.mspx
                             Rating – 10/10
                                   Table 50 Server Test Result 3-2







Test Case:                   3-3
Identifier:                  AT-S03-3
Test Items:                  Compatibility of Renaissance Place (Accelerated Reader) to run
                             in the system environment
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None
Feedback / Comment:          The Accelerated Reader is currently hosted on a separate server
                             on the network. Since it is a web-based application, all that
                             is needed to run it is access to a web browser. The testers
                             were able to complete all the reading tests as well as print
                             out the reports generated by the application.
                             Reference:
                             http://www.renlearn.com/ar/overview/ARSystemRequirements.p
                             df
                             http://www.renlearn.com/ProductSystemRequirements.pdf
                             Rating – 10/10
                                   Table 51 Server Test Result 3-3


Test Case:                   3-4
Identifier:                  AT-S03-4
Test Items:                  Compatibility of Choices to run in the system environment
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     The software's license has currently expired.
Feedback / Comment:          The Choices application runs fine under the current server.
                             Since the new server will be using the same operating system,
                             the application will have no problem running under the new
                             server.
                             Reference:
                             As per the test performed by the developers at the client's site
                             (Refer to Section 3.2.3 of the CAR)
                             Rating – 10/10
                                   Table 52 Server Test Result 3-4





The following tables indicate the test results for the interoperability attribute (AT-S04):

 Test Case:                     4-1
 Identifier:                    AT-S04-1
 Test Items:                    Interoperability of authentication process
 Test Result Classification     Pass
 (Pass /Fail):
 Problem / Defect Report:       As we are awaiting delivery of the server, we are currently
                                unable to perform the tests at the customer's site. These
                                test results are therefore based on conversations with the
                                COTS vendor and on web references the vendor provided.
 Feedback / Comment:            The COTS vendor's support staff claims that the server,
                                running the Windows Server 2003 operating system, will fully
                                support the authentication process with any thin client that
                                runs the Terminal Services client.
                                Reference:
                                Nick Haddad at Tangent Computer
                                Contact # : 800-342-9388
                                Rating – 10/10
                                      Table 53 Server Test Result 4-1

 Test Case:                     4-2
 Identifier:                    AT-S04-2
 Test Items:                    Interoperability of application processing
 Test Result Classification     Pass
 (Pass /Fail):
 Problem / Defect Report:       As we are awaiting delivery of the server, we are currently
                                unable to perform the tests at the customer's site. These
                                test results are therefore based on conversations with the
                                COTS vendor and on web references the vendor provided.
 Feedback / Comment:            The COTS vendor support claims that users will be able to run
                                all the applications available on the server, as well as run any
                                web-based applications.
                                Reference:
                                Nick Haddad at Tangent Computer
                                Contact # : 800-342-9388
                                Rating – 10/10
                                      Table 54 Server Test Result 4-2





Test Case:                   4-3
Identifier:                  AT-S04-3
Test Items:                  Interoperability of Active Directory service
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we are currently
                             unable to perform the tests at the customer's site. These test
                             results are therefore based on conversations with the COTS
                             vendor and on web references the vendor provided.
Feedback / Comment:          The COTS vendor's support staff claims that the server,
                             running the Windows Server 2003 operating system, will fully
                             support the Active Directory service.
                             Reference:
                             Nick Haddad at Tangent Computer
                             Contact # : 800-342-9388
                             http://www.tangent.com/canopy/ActiveDirectory_Benefits.pdf
                             Rating – 10/10
                                   Table 55 Server Test Result 4-3







Test Case:                   4-4
Identifier:                  AT-S04-4
Test Items:                  Print Services
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we are currently
                             unable to perform the tests at the customer's site. These test
                             results are therefore based on conversations with the COTS
                             vendor and on web references the vendor provided.
Feedback / Comment:          When a client connects to the server (which supports Terminal
                             Services), local printers attached to line printer (LPT),
                             communications (COM), and universal serial bus (USB) ports are
                             automatically detected and a local queue is created on the
                             server. The server uses the client computer's settings for the
                             default printer and some properties (such as printing on both
                             sides of the page).
                             Reference:
                             http://download.microsoft.com/download/4/6/b/46bae314-ea7b-
                             4c39-bcb6-defbc907ee54/TSPrint.doc
                             Rating: 10/10
                                   Table 56 Server Test Result 4-4

Test Case:                   4-5
Identifier:                  AT-S04-5
Test Items:                  Creating User Profile
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we are currently
                             unable to perform the tests at the customer's site. These test
                             results are therefore based on conversations with the COTS
                             vendor and on web references the vendor provided.
Feedback / Comment:          The Active Directory service on the server allows the
                             administrator to create user profiles based on policy settings.
                             Reference:
                             http://www.tangent.com/canopy/ActiveDirectory_Benefits.pdf
                             Rating: 10/10
                                   Table 57 Server Test Result 4-5




The following tables indicate the test results for the vendor support attribute (AT-S05):

 Test Case:                    5-1
 Identifier:                   AT-S05-1
 Test Items:                   Response time for critical problems
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      The COTS vendor did not give feedback on the prototypes that
                               the developers sent for review, which may affect the
                               deployment and installation of the server.
 Feedback / Comment:           The COTS vendor has responded to most problems within 24
                               hours in the past. However, since Nick is the only direct
                               support contact for PHS at Tangent, it may take longer for
                               Tangent to solve some problems when he is unavailable.
                               Reference:
                               Nick Haddad at Tangent Computer
                               Contact # : 800-342-9388
                               Rating – 9/10
                                     Table 58 Server Test Result 5-1


 Test Case:                    5-2
 Identifier:                   AT-S05-2
 Test Items:                   Remote assistance
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      None.
 Feedback / Comment:           The COTS vendor has an administrative account through which
                               he can remotely log in to the system and perform remote
                               assistance as required by the client. This is guaranteed by
                               the annual maintenance contract provided by Tangent Computer.
                               Reference:
                               Nick Haddad at Tangent Computer
                               Contact # : 800-342-9388
                               Rating – 10/10
                                     Table 59 Server Test Result 5-2






Test Case:                   5-3
Identifier:                  AT-S05-3
Test Items:                  Hardware support
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None.
Feedback / Comment:          The maintenance contract guarantees hardware technical support
                             for the client via phone.
                             If necessary, Tangent Computer will provide on-site hardware
                             technical support on an as-needed basis.
                             Reference:
                             Nick Haddad at Tangent Computer
                             Contact # : 800-342-9388
                             Rating – 10/10
                                   Table 60 Server Test Result 5-3

Test Case:                   5-4
Identifier:                  AT-S05-4
Test Items:                  Software bundling
Test Result Classification   Fail
(Pass /Fail):
Problem / Defect Report:     Some software applications required by the client are not
                             supported by the COTS vendor under the maintenance contract.
                             The client has to contact the individual software providers
                             for support.
Feedback / Comment:          According to the annual server-maintenance contract, the COTS
                             vendor is obligated to provide any software upgrade (OS & MS
                             Office Suite) to the client when required.
                             Third-party applications (Wayang Outpost, Choices &
                             Renaissance Place) are not covered by the maintenance contract.
                             Reference:
                             Nick Haddad at Tangent Computer
                             Contact # : 800-342-9388
                             Rating: 4/10
                                   Table 61 Server Test Result 5-4







 Test Case:                     5-5
 Identifier:                    AT-S05-5
 Test Items:                    Warranty
 Test Result Classification     Pass
 (Pass /Fail):
 Problem / Defect Report:       None.
 Feedback / Comment:            The COTS vendor provides full hardware warranty support for
                                3 years, as expected by the client.
                                Reference:
                                Nick Haddad at Tangent Computer
                                Contact # : 800-342-9388
                                Rating: 10/10
                                      Table 62 Server Test Result 5-5
The following tables indicate the test results for the security attribute (AT-S07):

 Test Case:                     7-1
 Identifier:                    AT-S07-1
 Test Items:                    User Privileges
 Test Result Classification     Pass
 (Pass /Fail):
 Problem / Defect Report:       As we are awaiting delivery of the server, we are currently
                                unable to perform the tests at the customer's site. These
                                test results are therefore based on conversations with the
                                COTS vendor and on web references the vendor provided.
 Feedback / Comment:            The Active Directory Group Policy system enables the
                                administrator to easily define roles and enforce access
                                rights.
                                Reference:
                                http://www.microsoft.com/windowsserver2003/techinfo/overvie
                                w/security.mspx
                                Rating: 10/10
                                      Table 63 Server Test Result 7-1






                       4.1.3.5            Test Summary

This section summarizes the evaluation tests performed on the server COTS product.

4.1.3.5.1    Summary

The tests were performed on the server COTS product, the Tangent Pillar™ 2750s. Our COTS
evaluation comprises 26 test procedures, performed by three team members; the results of these
tests serve as the basis for our final COTS recommendation.

4.1.3.5.2    Summary of Results and Consequences

The test results and references showed that the Tangent Pillar™ 2750s server covers all the
core capability and level-of-service requirements specified by the client. However, the detailed
test results also revealed some limitations beyond the server's control. For example, the
network infrastructure also affects the performance and functionality the server can provide,
and if additional users or new applications are added in the future, performance will be
significantly affected due to the nature of the thin-client network.

4.1.3.5.3    Evaluation

As the server has not yet been delivered to the client, we needed to make use of benchmarking
and references to substantiate the test results. AT-S01, AT-S03, AT-S04, and AT-S07 were tested
based on the expected performance and outcome of the server. AT-S02, AT-S05, and AT-S06 are
based on information provided by the COTS vendor, such as the price list, hardware
specification data, and vendor feedback. All the detailed evaluations are described in Section
4.1.3 of this document.

4.1.3.5.4    Summary of Activities

Three team members were involved in testing.
The total elapsed time for each of the major testing activities was about 10 hours.
The actual machine time cannot be determined at this moment because the server has not yet been
delivered.





         4.1.4     Evaluation Results Screen Matrix
This section lists the ratings of the evaluation results from Section 4.1.3.4 of the CAR and
calculates the score for each evaluation criterion according to the assigned weights, so as to
obtain an overall view of the level of satisfaction the Tangent Pillar™ 2750s provides.
The weights are described in Section 4.1.2 of this document.
The ratings are assigned in the test results in Section 4.1.3.4 of this document.
The score is calculated using the formula: Score = Weight * Rating
The total score is the sum of all the scores for the product.
The weighted average rating is calculated using the formula:
Weighted Average Rating = Σ ((Weight / Total Weight) * Rating)
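The formulas above can be sketched as a small helper (an illustrative snippet; the function and variable names are our own, and the sample values are taken from Table 64 below):

```python
def score_matrix(weights, ratings):
    """Score = Weight * Rating; weighted average = sum((w / total_w) * r)."""
    scores = [w * r for w, r in zip(weights, ratings)]
    total_weight = sum(weights)
    total_score = sum(scores)
    weighted_avg = sum((w / total_weight) * r for w, r in zip(weights, ratings))
    return scores, total_score, weighted_avg

# AT-S01 Performance (weights and ratings from Table 64)
weights = [80, 70, 65, 35, 20]
ratings = [10, 10, 10, 10, 8]
scores, total, avg = score_matrix(weights, ratings)
print(total)          # 2660
print(round(avg, 2))  # 9.85
```

Note that the weighted average equals the total score divided by the total weight (2660 / 270 = 9.85), which is how the matrix values below can be cross-checked.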


AT-S01 Performance

 Weight                      Evaluation Criteria                          Tangent Pillar™ 2750s
                                                                           Rating          Score
   80      Number of concurrent users                                               10            800
   70      System response time (multimedia applications)                           10            700
   65      System login time                                                        10            650
   35      System response time (regular applications)                              10            350
   20      Network bandwidth                                                         8            160
   270                                                    Total Score                -           2660
                                          Weighted Average Rating                 9.85              -
                                   Table 64 Server Result Matrix AT-S01
AT-S02 Cost

 Weight                      Evaluation Criteria                          Tangent Pillar™ 2750s
                                                                           Rating          Score
   60      Initial purchase cost                                                    10            600
   50      Upgrade cost                                                              7            350
   40      Annual maintenance cost                                                  10            400
   150                                                    Total Score                -           1350
                                          Weighted Average Rating                    9              -
                                   Table 65 Server Result Matrix AT-S02




AT-S03 Intercomponent Compatibility

Weight                     Evaluation Criteria                        Tangent Pillar™ 2750s
                                                                       Rating         Score
   50    Wayang Outpost                                                          8           400
   30    MS Office Suite                                                        10           300
   25    Renaissance Place (Accelerated Reader)                                 10           250
   15    Choices                                                                10           150
  120                                                 Total Score                -          1100
                                      Weighted Average Rating              9.17                -
                               Table 66 Server Result Matrix AT-S03
AT-S04 Interoperability

Weight                     Evaluation Criteria                        Tangent Pillar™ 2750s
                                                                       Rating         Score
   30    Authentication                                                         10           300
   30    Application processing                                                 10           300
   30    Active Directory service                                               10           300
   20    Print service                                                          10           200
   20    User profile/folder                                                    10           200
  130                                                 Total Score                -          1300
                                      Weighted Average Rating                   10             -
                               Table 67 Server Result Matrix AT-S04





AT-S05 Vendor Support

 Weight                     Evaluation Criteria                      Tangent Pillar™ 2750s
                                                                      Rating         Score
   40     Response time for critical problems                                   9           360
   30     Remote assistance                                                    10           300
   20     Hardware support                                                     10           200
   20     Software bundling                                                     4            80
   10     Warranty                                                             10           100
  120                                                 Total Score               -          1040
                                     Weighted Average Rating              8.67                -
                              Table 68 Server Result Matrix AT-S05
AT-S06 Flexibility

 Weight                     Evaluation Criteria                      Tangent Pillar™ 2750s
                                                                      Rating         Score
   40     Downward Compatibility                                                9           360
   60     Extendibility (ease of upgrade of server)                             8           480
  100                                                 Total Score               -           840
                                     Weighted Average Rating               8.4                -
                              Table 69 Server Result Matrix AT-S06
AT-S07 Security

 Weight                     Evaluation Criteria                      Tangent Pillar™ 2750s
                                                                      Rating         Score
   50     User privileges                                                      10           500
   50                                                 Total Score               -           500
                                     Weighted Average Rating                   10             -
                              Table 70 Server Result Matrix AT-S07





Overall Summary


Weight                                Evaluation Criteria                                       Tangent Pillar™ 2750s
                                                                                                 Rating              Score
    270          Performance                                                                           9.85                 2660
    150          Cost                                                                                       9               1350
    120          Intercomponent Compatibility                                                          9.17                 1100
    130          Interoperability                                                                          10               1300
    120          Vendor Support                                                                        8.67                 1040
    100          Flexibility                                                                               8.4                  840
      50         Security                                                                                  10                   500
    940                                                                Total Score                              -           8790
                                                 Weighted Average Rating                               9.35                       -
                                    Table 71 Server Result Matrix Overall Summary
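Each result matrix above uses the same arithmetic: a criterion's score is its weight multiplied by its rating, and the weighted average rating is the total score divided by the total weight. The following minimal sketch reproduces the overall summary in Table 71 (the function and variable names are illustrative, not from the report):

```python
# Weighted scoring as used in the server result matrices:
# score = weight * rating; weighted average = total score / total weight.
criteria = {
    "Performance": (270, 9.85),
    "Cost": (150, 9.0),
    "Intercomponent Compatibility": (120, 9.17),
    "Interoperability": (130, 10.0),
    "Vendor Support": (120, 8.67),
    "Flexibility": (100, 8.4),
    "Security": (50, 10.0),
}

def weighted_average(criteria):
    total_weight = sum(w for w, _ in criteria.values())
    total_score = sum(w * r for w, r in criteria.values())
    return total_score / total_weight

print(round(weighted_average(criteria), 2))  # 9.35, matching Table 71
```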

                                               Server Result Summary Chart

[Bar chart: weighted average rating (scale 0-12) of the Tangent Pillar™ 2750s for each
criterion: Performance 9.85, Cost 9, Intercomponent Compatibility 9.17, Interoperability 10,
Vendor Support 8.67, Flexibility 8.4, Security 10.]

                                          Figure 8 Server Result Summary






       4.1.5       Business Case Analysis
This section describes the business case analysis of the server COTS hardware product in terms
of added value, deployment costs and operational costs. The business case analysis is a
framework prepared for decision makers to show that the proposed project is feasible and makes
sound financial sense, i.e., that it is a technically and commercially viable solution.

                       4.1.5.1            COTS Ownership Cost

COTS ownership cost is the initial investment made in purchasing the COTS product; here the
hardware COTS product purchased is the server. As part of phase 1 of our prototype, Pasadena
High School has already purchased the server from Tangent Computer for $5250, which is a
one-time cost. The other costs incurred by the school are as follows:


Setup and installation cost (one-time)       = $500
Total COTS ownership cost                    = $5750


Hence, the total COTS ownership cost for PHS is $5750.

                       4.1.5.2            Development Cost
This section describes the cost involved in developing the COTS products. This does not apply to
our project, since the project involves only network assessment and no software development. In
addition, the CS577a team is working on this project as part of its course requirements.

                       4.1.5.3            Transition Cost
There is no transition cost here, since the client does not have to spend any extra time on
training. Only a new model of server has been installed; the transition is purely a hardware
upgrade.

                       4.1.5.4            Operational Cost

There is no additional operational cost for the new server. Although user time is involved in
administering the server, this does not add to the existing workload, so operational cost is
excluded when calculating the business case for this server. The new server installation therefore
incurs no operational cost in phase 1 of prototype 2.






                       4.1.5.5             Annual Maintenance Cost

The annual maintenance cost is the cost incurred every year for system maintenance, which
includes server maintenance and software upgrades; it is approximately $3000 per year. The
previous annual maintenance contract (AMC) expired on August 31, 2005, and a new contract
will be made for the newly installed server. The $3000 annual maintenance cost for the new
server covers remote maintenance, backups, and software updates.

                       4.1.5.6             Estimate of Value Added ROI

The client has spent time on phone calls, meetings and emails, as well as commuting to USC for
ARB meetings. We have arrived at a total of 62 person-hours spent by the client altogether, of
which approximately 35 hours were spent on discussions about the server.
Phase 1 of prototype 2, the installation of the server, offers operational and maintenance cost
benefits over the old system.
In the current system the server freezes frequently, and the administrator has to spend many
hours diagnosing the instability, which often ends in rebooting the server. The server is also
unable to handle more than 8 concurrent users for any multimedia application. The new system
will provide much faster user logins, centralized accessibility, and much improved multimedia
application performance. This will greatly reduce the time the librarian and other faculty spend
on the system, and these cost benefits outweigh the cost of the server.
Based on the expected login time of the system and the administrator's feedback on usage, the
amount of time the system administrator spends at users' desks is reduced by 30%.
System login is also much faster. In the old system it took more than 30 minutes for an entire
class of 30-40 users to log in; in the new system the entire class can log in in less than 5 minutes.
Students therefore spend less idle time at the PCs, and the productivity of both students and
faculty will increase. We arrived at a figure of 6 hours of total savings per week for the
administrator and faculty combined.


Total savings per year
In the new system, the total annual savings in operational cost from January 2006, for the
librarian and faculty combined, is approximately 288 person-hours
(3 hours per week for the librarian and 3 hours per week for all faculty together).
These are rough estimates obtained from the client. The time saved translates into salary savings
at an assumed rate of $30/hr (the actual salary estimate was not obtained from the client for
confidentiality reasons).






Thus, the resulting annual salary saving would be:
$30/hr * (3 + 3 hours/week) * (4 weeks/month) * (12 months/year)
= $8640 / year
Total client effort = 62 person-hours (the total time spent by the client in arriving at this
prototype), of which approximately 35 hours were spent discussing the server prototype.
Client effort cost = $30/hr * 35 hours
= $1050
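The savings arithmetic above can be checked with a short sketch (the $30/hr rate and the hour counts are the report's rough estimates, as noted above):

```python
# Rough annual-savings arithmetic from the estimates above.
hourly_rate = 30                 # assumed salary rate, $/hr
hours_saved_per_week = 3 + 3     # librarian + all faculty together
weeks_per_year = 4 * 12          # report assumes 4 weeks/month

annual_savings = hourly_rate * hours_saved_per_week * weeks_per_year
client_effort_cost = hourly_rate * 35   # 35 of the 62 client hours concern the server

print(annual_savings, client_effort_cost)  # 8640 1050
```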


Setup and Installation cost:                 = $500
Server purchase cost                         = $5250
-------------------------------------------------------------
COTS Ownership costs                         = $ 5750


Total Annual Costs for 2006:
         Annual maintenance costs = $3000
         COTS Ownership costs            = $5750
         Client effort cost               =$1050
-----------------------------------------------------------------
         Total annual costs               = $9800
Total estimated annual Savings:           = $8640
Return on investment (ROI) = (Benefits - Costs) / Costs
We use the initial investment cost as the ROI denominator, and pay for maintenance out of the
operational savings every year.
Annual net benefit = savings - maintenance cost = $8640 - $3000 = $5640
Initial investment cost = $9800 - $3000 = $6800 (the maintenance portion is covered by the
operational savings)
ROI at the end of the year (2006)             = (5640 - 6800)/ 6800 = - 0.17
Total benefits realized for the second year = accumulated benefits - maintenance cost
                                              = 5640 + 8640 - 3000 = 11280
ROI at the end of the year (2007)             = (11280 - 6800)/ 6800 = 0.66






Thus, from the available estimates, the ROI can be projected as follows:

 Years after Deployment         Initial investment      Accumulated benefits          ROI
                                costs (USD)             (USD)
             1                               6800                       5640        -0.17
             2                               6800                      11280         0.66
             3                               6800                      16920         1.49
             4                               6800                      22560         2.32
             5                               6800                      28200         3.15
                   Table 72 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase1
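The ROI schedule in Table 72 follows directly from the figures above; the recurrence can be sketched as follows (variable names are illustrative, and the last decimal of each ROI value depends on rounding):

```python
# ROI projection: each year nets $8640 savings minus $3000 maintenance,
# measured against the $6800 initial investment.
initial_cost = 6800              # $9800 first-year costs less $3000 maintenance
annual_net_benefit = 8640 - 3000

for year in range(1, 6):
    accumulated = annual_net_benefit * year
    roi = (accumulated - initial_cost) / initial_cost
    print(year, accumulated, round(roi, 2))
```

The ROI turns positive between the first and second year, which is the break-even point discussed after Figure 9.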

                                                 Breakeven Analysis on ROI

[Line chart: ROI (y-axis) versus years after deployment (x-axis, 0 to 6), plotting the ROI
values from Table 72; the curve crosses zero, i.e., breaks even, between the first and second
year after deployment.]

                                       Figure 9 Server Break-even Analysis on ROI





The return on investment will reach the break-even point for PHS by the second year after
installation of the system. This calculation is based on the rough estimates provided by the
client; the actual return on investment may vary depending on the level of system usage and
savings. There are also major intangible benefits not considered here: less time spent on the
system by the users (students), higher user satisfaction, and a better learning experience for the
students.


Thus, the above business case analysis shows that the proposed project is feasible and makes
sound financial sense, since it has been shown to be economically viable and technically feasible.







 4.2 Assessment Results-Part 2 [Client]
This section describes the assessment and test results for the client COTS product.

       4.2.1      COTS Assessed
The COTS products assessed are:
Tangent WebDT 166
http://www.dtresearch.com/prod_webDT166.html
http://www.dtresearch.com/datasheets/WebDT%20166%20flyer%20081205-2.pdf
    The WebDT 166 integrates the energy-efficient yet powerful AMD Geode™ GX533 and
    LX800 processor technology into a compact and robust enclosure. It runs the Windows CE
    Embedded operating system and provides support for flash-based applications. The WebDT
    166 client model is used for server-based computing and web-based applications in a variety
    of industries, such as education and healthcare. Its operating system supports server- and
    browser-based computing in addition to local embedded applications.

Wyse Winterm V90
http://www.wyse.com/products/winterm/V90/
    The Winterm™ V90 includes a powerful 1 GHz CPU; a smart card slot; a CardBus/PCMCIA
    slot; and serial, parallel, and USB ports. Video performance is fast and crisp, minimizing
    eyestrain and meeting stringent health and ergonomic requirements. The Winterm V90 runs
    the Microsoft® Windows XPe operating system, providing fast boot-up and the ability to
    switch easily and rapidly between a typical PC desktop and a connection manager
    dashboard. Because the V90 is diskless and application installation and execution are fully
    managed, it is inherently secure from viruses and other malicious software attacks. The
    Winterm V90 offers a broad range of mounting options for any work environment via its
    innovative monorail mounting system. It ships with Wyse™ Rapport® (Workgroup edition),
    the enterprise client management tool that leverages the value of your IT infrastructure for
    maximum ROI.

                                  Table 73 Client COTS Assessed


       4.2.2       Evaluation Criteria
The set of evaluation criteria chosen for the client COTS product is shown in the following table.
The last column presents the corresponding weight assigned based on discussion between the
client and the team members. The evaluation weights indicate the importance the client attaches
to each attribute of the system.

     No        Evaluation Criteria – COTS attributes                    Weight
  AT-C01       Cost                                                     120
  AT-C02       Performance                                              120
  AT-C03       Vendor Support                                           90
  AT-C04       Flexibility                                              80
                                 Table 74 Client Evaluation Criteria





The following tables break down each criterion in the table above in more detail, in order to
obtain a better measure of each criterion of the COTS product.


The table below breaks down the Cost (AT-C01) criterion in more detail as follows:

 Weight        Features
 70            Initial purchase cost
 50            Annual Maintenance cost
                                    Table 75 Client Cost Attribute


The table below breaks down the Performance (AT-C02) criterion in more detail as follows:

 Weight        Features
 120           Hardware/Software specification
                                Table 76 Client Performance Attribute


The table below breaks down the Vendor Support (AT-C03) criterion in more detail as follows:

 Weight        Features
 40            Hardware support
 50            Warranty
                             Table 77 Client Vendor Support Attribute


The table below breaks down the Flexibility (AT-C04) criterion in more detail as follows:

 Weight        Features
 80            Upgradeability
                                 Table 78 Client Flexibility Attribute






       4.2.3       Test Procedure
This section documents the detailed evaluation process, stating the test set-up, the test
procedures used, and their corresponding results.

                      4.2.3.1           Test Identification
The COTS products that are going to be tested are:
       Tangent WebDT 166
       Wyse Winterm V90
To date, our COTS evaluation comprises five test procedures, which will be performed by three
team members; the results of these tests will serve as the basis for our recommendation to the
client.






                      4.2.3.2           Test Preparation

During CS577A, various assessments were performed to evaluate the COTS products mentioned
above. These assessments were designed and executed to test the functionality and performance
of critical business scenarios for those products. This section states the minimum hardware,
software and other test preparation requirements and whether they were met.

4.2.3.2.1   Hardware Preparation

The following table lists the hardware requirement of the client COTS product:

 ID           COTS Product       Hardware Requirements           Met or not
              Model
 HREQ-C1      Tangent            1. One server as the            1. Yes. Currently TC95 is the
              WebDT 166             application server              application and
                                 2. One server as the               authentication server in PHS
                                    authentication server        2. Yes. Currently TC95 is the
                                 3. One server as the               application and
                                    backup, folder storage          authentication server in PHS
                                    server                       3. Yes. Currently TC96 is the
                                 4. 40+ thin-client terminals       folder storage server in PHS
                                    connected with the           4. Yes. PHS currently has 40
                                    server with internet            thin-client terminals
                                    connection                      connected to TC95
                                 5. Tangent WebDT 166            5. No. The new thin-client
                                                                    model for Tangent WebDT
                                                                    166 is not available for
                                                                    testing
 HREQ-C2      Wyse Winterm       1. One server as the            1. Yes. Currently TC95 is the
              V90                   application server              application and
                                 2. One server as the               authentication server in PHS
                                    authentication server        2. Yes. Currently TC95 is the
                                 3. One server as the               application and
                                    backup, folder storage          authentication server in PHS
                                    server                       3. Yes. Currently TC96 is the
                                 4. 40+ thin-client terminals       folder storage server in PHS
                                    connected with the           4. Yes. PHS currently has 40
                                    server with internet            thin-client terminals
                                    connection                      connected to TC95
                                 5. Wyse Winterm V90             5. No. The new thin-client
                                                                    model for Wyse Winterm
                                                                    V90 is not available for
                                                                    testing
                      Table 79 Hardware Preparation for Client COTS Product



4.2.3.2.2   Software Preparation

The following table lists the software requirement of the client COTS product:

 ID      COTS Product           Software Requirements              Met or not
         Model
 SREQ-C1 Tangent                1. Windows based operating         1. Yes. Tangent WebDT 166
         WebDT 166                 system embedded to                 supports both Windows
                                   support terminal service           and Linux based OS,
                                   client                             including Windows CE,
                                                                      Windows XP, and
                                                                      Windows XP Professional
 SREQ-C2 Wyse Winterm           1. Windows based operating         1. Yes. Wyse Winterm V90 is
         V90                       system embedded to                 based on Windows XP
                                   support terminal service           embedded operating
                                   client                             system.
                       Table 80 Software Preparation for Client COTS Product



4.2.3.2.3   Other Pre-test Preparation

The following table lists other pre-test requirement of the client COTS product:

 ID          COTS Product       Pre-test Preparations              Met or not
             Model
 PREP-C1     Tangent            The testers need authenticated     Yes. The librarian in PHS has
             WebDT 166          username and password set-         provided both administrator
                                up for access to all the servers   and regular users username and
                                and thin-client terminals          password to access the system
 PREP-C2     Wyse Winterm       The testers need authenticated     Yes. The librarian in PHS has
             V90                username and password set-         provided both administrator
                                up for access to all the servers   and regular users username and
                                and thin-client terminals          password to access the system
                        Table 81 Other Preparation for Client COTS Product






                      4.2.3.3            Test Procedure Specifications

This section provides the detailed test procedures carried out by the testers in order to rate each
evaluation criterion.
    The testing procedure adopted for evaluating the COTS attributes AT-C01 and AT-C03 is
       black-box testing. The black-box testing techniques widely used for the test process are
       equivalence partitioning and boundary value analysis. By applying these black-box
       techniques, we derived a set of test cases, which are presented below.
For attributes AT-C02 and AT-C04 there are no detailed test procedure specifications; the
rationale for these attributes is explained in section 4.2.3.3.1 of the document.





4.2.3.3.1     Rationales for Attributes with No Test Procedures

In the case of the Performance attribute (AT-C02):

The performance of a thin client is measured by its internal processor, the physical memory
embedded in the device, and the operating system loaded onto it. Based on the results of our
market research, CPUs for these devices range from 500 MHz to 1 GHz, and the embedded
memory ranges from 128 MB to 512 MB. The two popular operating systems are Embedded
Windows CE and Embedded Windows XP. Because Team 3 was not able to obtain demo units
for the Wyse Winterm V90 and Tangent WebDT 166, we can only evaluate performance based
on the system specifications; there are therefore no test cases associated with thin client
performance. Our thin client ratings are based on the system specs: thin clients with high
processing power and ample memory running Windows XP Embedded are ranked higher than
thin clients with slower processors and minimal memory running Windows CE.


Performance ratings for the thin clients, based on their system specifications:


 Category                        Tangent WebDT 166                   Wyse Winterm V90
 CPU                             533 MHz AMD Geode                   1.0GHz (x86)
 Memory                          128MB                               256MB
 OS                              Windows CE Embedded                 Windows XP Embedded
 Rating                          5/10                                10/10
                        Table 82 Client Performance Attribute Result Rationale
References:


http://www.dtresearch.com/datasheets/WebDT%20166%20flyer%20081205-2.pdf
http://www.wyse.com/products/winterm/
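The ranking rule described above (faster CPU, more memory, and XP Embedded preferred) can be sketched as a scoring function; the weights and normalization below are hypothetical illustrations, not the report's actual rating method:

```python
# Illustrative spec-based ranking of thin clients (hypothetical formula,
# using the spec figures from Table 82).
clients = {
    "Tangent WebDT 166": {"cpu_mhz": 533, "ram_mb": 128, "os": "Windows CE Embedded"},
    "Wyse Winterm V90": {"cpu_mhz": 1000, "ram_mb": 256, "os": "Windows XP Embedded"},
}

def spec_score(spec):
    # Normalize against the observed market range: 500 MHz-1 GHz CPU, 128-512 MB RAM.
    cpu = (spec["cpu_mhz"] - 500) / (1000 - 500)
    ram = (spec["ram_mb"] - 128) / (512 - 128)
    os_bonus = 1.0 if "XP" in spec["os"] else 0.0
    return round(10 * (0.4 * cpu + 0.3 * ram + 0.3 * os_bonus), 1)

for name, spec in clients.items():
    print(name, spec_score(spec))   # the V90 ranks higher, as in Table 82
```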


In the case of the Flexibility attribute (AT-C04):

Thin clients are embedded devices built to connect to a server, so they generally cannot be
physically upgraded: many components, including the processor and memory modules, are
soldered directly onto the device. Wyse, however, does sell a 256 MB upgrade kit for its
Winterm device. As for the WebDT 166, the manufacturer's documentation states that these
units cannot be upgraded, and the internal OS on this thin client device cannot be upgraded
either.





Flexibility results rating


 Tangent WebDT 166                                      Wyse Winterm V90
 0/10                                                   6/10
                             Table 83 Client Flexibility Attribute Result Rationale
References


http://www.dtresearch.com/datasheets/WebDT%20166%20flyer%20081205-2.pdf
http://www.wyse.com/products/winterm/





4.2.3.3.2      Test Procedures

The following tables indicate the test procedures for the cost attribute (AT-C01):

 Test Case:                1-1
 Identifier:               AT-C01-1
 Test Items:               Initial Purchase Cost
 Test Description:         This test will compare the initial cost of ownership among the
                           different COTS products.
 Pre-conditions:           The following pre-conditions must be met before the test can be
                           performed:
                                  1. COTS products must be available from COTS vendor
                                  2. COTS vendor must be able to supply price information
                                      for COTS products.
                                  3. A communication channel (phone number or email
                                      address) must be established between the customer and
                                      the COTS vendor prior to the test.
                                  4. Customer needs to set an expected price range for
                                      purchasing COTS products.
 Post-conditions:          The following functions will be performed after the test procedure:
                                  1. Obtain a price quote for 1 unit for all COTS products.
                                  2. COTS item will be rated as High, Medium, or Low
                                     Initial Cost based on price.
 Input Specifications:     The following inputs are required to perform the test:
                                  1. Customer phones or emails the COTS vendor indicating
                                      interest in purchasing a COTS product.
                                  2. Customer supplies the part number to the COTS vendor
                                      for product lookup.
 Expected Output           The following is the expected output:
 Specifications:                  1. COTS vendor will supply a price quote (in US dollar
                                      amounts).
                                  2. COTS products will be assigned High, Medium, or Low
                                      status.
 Pass/Fail Criteria:       The following are the pass/fail criteria for testing:
                                  1. COTS product availability (in stock, back order,
                                      discontinued, etc.)
                                  2. COTS product price falls within the initial cost price
                                      range set by the customer.
 Test Process:             The following lists the test process:
                                  1. Customer contacts the COTS vendor via phone or email.
                                  2. Customer checks the availability of COTS products with
                                      the COTS vendor.



                                 3. Customer requests a price quote for one unit of the COTS
                                    product.
                                 4. Customer compares the price against the initial estimated
                                    cost.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. COTS vendor is willing to disclose pricing information
                                2. Pricing information for COTS products must not change
                                   dramatically during the duration of the test.
                                3. Final price quote must not exceed the maximum
                                   expected price range set by the customer.
Dependencies:            None
Traceability:            None
                         Table 84 Client Test Procedure Specification 1-1
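The rating logic in the post-conditions and pass/fail criteria above can be sketched as follows. The $600 ceiling mirrors the customer's expected cost used in the test results later in this document; the boundary between the Low and Medium bands (half the ceiling) is a hypothetical assumption.

```python
# Sketch: classify a quoted unit price against the customer's expected
# price range (post-condition 2 and pass/fail criterion 2 above).
# The $600 ceiling matches the customer's expected cost in the test
# results; the Low/Medium boundary at half the ceiling is hypothetical.

def classify_initial_cost(quote, expected_max=600.0):
    """Return (band, passed): band is the High/Medium/Low initial-cost
    rating, passed is whether the quote falls within the range."""
    if quote <= expected_max * 0.5:
        band = "Low"
    elif quote <= expected_max:
        band = "Medium"
    else:
        band = "High"
    return band, quote <= expected_max

print(classify_initial_cost(250.00))  # Tangent WebDT 166 quote
print(classify_initial_cost(626.00))  # Wyse Winterm V90 list price
```

With these assumptions, the $250 Tangent quote rates Low and passes, while the $626 Wyse list price (before the educational discount) exceeds the ceiling.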







Test Case:               1-2
Identifier:              AT-C01-2
Test Items:              Annual Maintenance Cost
Test Description:        This test will compare the maintenance cost among the different
                         COTS products.
Pre-conditions:          The following pre-conditions must be met before the test can be
                         performed:
                                1. COTS vendor must have a maintenance plan.
                                2. COTS vendor must be able to supply maintenance price
                                    information for COTS product.
                                3. A communication channel (phone number or email
                                    address) must be established between the customer and
                                    the COTS vendor prior to the test.
                                4. Customer needs to set an expected price range for
                                    maintenance of COTS items.
Post-conditions:         The following functions will be performed after the test procedure:
                                1. Obtain a price quote for maintenance cost for COTS
                                   system for one academic year.
Input Specifications:    The following inputs are required to perform the test:
                                1. Customer phones or emails the COTS vendor indicating
                                   interest in purchasing a one-year contract for COTS item
                                   maintenance.
Expected Output          The following is the expected output:
Specifications:                 1. COTS vendor will supply price quote (in US dollar
                                   amounts) for one year maintenance fee.
Pass/Fail Criteria:      The following are the pass/fail criteria for testing:
                                1. Availability of a maintenance plan from COTS vendor
                                   for COTS system.
                                2. COTS system maintenance price falls within the annual
                                   maintenance budget price range set by the customer.
Test Process:            The following lists the test process:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer asks about the availability of a COTS
                                   maintenance plan.
                                3. Customer requests a price quote for system maintenance
                                   for one academic year.
                                4. Customer checks if the support plan offers 24/7 technical
                                   support.
                                5. Customer checks if the plan offers onsite and remote
                                   support.
                                6. Customer checks if support is handled in-house or
                                   outsourced.



                                 7. Customer compares the support price against the annual
                                    budget dedicated to supporting the COTS system.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. COTS vendor is willing to disclose maintenance pricing
                                   information.
                                2. Pricing information for COTS maintenance must not
                                   change dramatically during the duration of the test.
                                3. The final maintenance price quote must not exceed the
                                   maximum expected price range set by the customer.
                                4. The COTS system support plan may depend upon the
                                   COTS vendor.
                                5. The COTS support plan may require signing a one-year
                                   contract.
Dependencies:            None
Traceability:            CAB Document Section 2.4 PC-1
                         Table 85 Client Test Procedure Specification 1-2





The following tables indicate the test procedures for the vendor support attribute (AT-C03):

 Test Case:                3-1
 Identifier:               AT-C03-1
 Test Items:               Hardware support
 Test Description:         This test will determine the effectiveness of the vendor in fixing
                           hardware-related issues.
 Pre-conditions:           The following pre-conditions must be met before the test can be
                           performed:
                                  1. Identify and obtain the contact person's name,
                                     department, telephone number, and email address from
                                     the COTS vendor.
                                  2. A correspondence must have already occurred and the
                                     communications channel (telephone or email) established
                                     between the customer and the COTS vendor prior to the
                                     test.
                                  3. Both the customer and the COTS vendor must be aware
                                     of the contractual bindings and/or the warranty
                                     obligations of the vendor to the customer. The problem
                                     to be fixed should fall in this scope.
 Post-conditions:          The following functions will be performed after the test procedure:
                                  1. The vendor would have responded, logged-in a call, and
                                     given a date and time to the customer for looking into the
                                     problem.
                                  2. The vendor would have provided the support on the date
                                     and time confirmed by them with the customer when the
                                     call was logged in.
                                  3. The problem will have been solved.
                                  4. The vendor response will be rated as good, average, or
                                     poor depending on the degree to which the pass/fail
                                     criteria are met.
 Input Specifications:     The following inputs are required to perform the test:
                                  1. Customer will contact the COTS vendor by phone or
                                     email and log a call indicating that a hardware problem
                                     has arisen at the customer site which needs immediate
                                     attention.
                                  2. Customer will supply further information if required,
                                     such as warranty details and known details of the
                                     problem, to the COTS vendor.





Expected Output          The following is the expected output:
Specifications:                 1. COTS vendor will log the call and give the customer a
                                   call priority number as well as the date and time on
                                   which they will attend to this call.
                                2. COTS vendor will attend to the call on the stated date
                                   and time, and the problem will be solved (or advice
                                   given to the customer on the future course of action).
Pass/Fail Criteria:      The following are the pass/fail criteria for testing:
                                1. The COTS vendor responds within the timeframe
                                   discussed and agreed upon in the contract or warranty
                                   card.
                                2. The COTS vendor solves the problem or takes any other
                                   required action at the customer site on the date and time
                                   agreed upon during the call log-in.
Test Process:            The following lists the test process:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer logs a call with the COTS vendor indicating
                                   that a hardware problem has arisen at the customer site.
                                3. Customer compares the time taken for the initial
                                   response with the expected response time as per the
                                   contract or warranty card.
                                4. Customer compares the date and time on which the
                                   COTS vendor takes the required action at the customer
                                   site and solves the problem with the expected response
                                   time as per the contract or warranty card.
                                5. Customer also compares that date and time with the
                                   date and time promised by the COTS vendor when the
                                   call was logged.
                                6. Customer confirms that the hardware problem has been
                                   solved and the support has been effective.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. The COTS vendor has no abnormal activities going on
                                   in their organization, such as a company merger or
                                   department takeover, which could delay the response
                                   time.
                                 2. The COTS vendor contact is working at his/her regular
                                    workload; an unusually heavy workload could
                                    significantly slow down the response.
                                 3. The pre-conditions are met.
                                 4. If the hardware is to be replaced, the COTS vendor has it
                                    in stock.
Dependencies:            None
Traceability:            None
                         Table 86 Client Test Procedure Specification 3-1
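Steps 3-5 of the test process above are time comparisons; they can be sketched as follows, assuming a hypothetical contractual response window and illustrative timestamps (none of these values appear in the actual contract).

```python
# Sketch of the response-time checks in test process steps 3-5 above.
# The 24-hour response window and all timestamps are hypothetical.
from datetime import datetime, timedelta

contract_response_window = timedelta(hours=24)     # assumed warranty term

call_logged    = datetime(2005, 11, 14, 9, 0)      # customer logs the call
first_response = datetime(2005, 11, 14, 15, 30)    # vendor's initial response
promised_visit = datetime(2005, 11, 15, 10, 0)     # date/time vendor promised
actual_visit   = datetime(2005, 11, 15, 10, 0)     # date/time vendor arrived

# Step 3: was the initial response within the contractual window?
responded_in_time = (first_response - call_logged) <= contract_response_window
# Steps 4-5: was the on-site action taken no later than promised?
kept_promise = actual_visit <= promised_visit

print("responded in time:", responded_in_time)
print("visit as promised:", kept_promise)
```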







Test Case:               3-2
Identifier:              AT-C03-2
Test Items:              Warranty
Test Description:        This test will determine the effectiveness of the warranty support
                         provided by the vendor in fixing problems with items under
                         warranty.
Pre-conditions:          The following pre-conditions must be met before the test can be
                         performed:
                                1. Identify and obtain the name, telephone number, and email
                                   address of the contact person in the technical department
                                   of the COTS vendor.
                                2. A correspondence must have already occurred and the
                                   communications channel (telephone or email) established
                                   between the customer and the COTS vendor prior to the
                                   test.
                                3. Both the customer and the COTS vendor must be aware of
                                   the contractual bindings and/or the warranty obligations of
                                   the vendor to the customer. The problem to be fixed should
                                   fall in this scope.
Post-conditions:         The following functions will be performed after the test procedure:
                                1. The vendor would have responded, logged-in a call, and
                                   given a date and time to the customer for looking into the
                                   problem.
                                2. The vendor would have provided the support on the date
                                   and time confirmed by them with the customer when the
                                   call was logged in.
                                3. The vendor response will be rated as good, average, or
                                   poor depending on the degree to which the pass/fail
                                   criteria are met.
Input Specifications:    The following inputs are required to perform the test:
                                1. Customer will contact the COTS vendor by phone or email
                                   and log a call indicating that a problem needing immediate
                                   attention has arisen at the customer site with an item under
                                   warranty.
                                2. Customer will supply further information if required, such
                                   as warranty details and known details of the problem, to
                                   the COTS vendor.
Expected Output          The following is the expected output:
Specifications:                 1. COTS vendor will log the call and give the customer a call
                                   priority number as well as the date and time on which they
                                   will attend to this call.
                                2. COTS vendor will attend to the call on the mentioned date



                                    and time.
                                 3. The problem will be solved (or advice given to the
                                    customer for the future course of action).
Pass/Fail Criteria:      The following are the pass/fail criteria for testing:
                                1. The COTS vendor responds within the timeframe discussed
                                   and agreed upon in the contract or warranty card.
                                2. COTS vendor will attend to the call on the mentioned date
                                   and time
                                3. COTS vendor will solve the problem or take any other
                                   required action on the customer site on the date and time as
                                   per their agreement during the call log-in.
Test Process:            The following lists the test process:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer logs a call indicating that a problem needing
                                   immediate attention has arisen at the customer site with an
                                   item under warranty.
                                3. Customer compares the time taken for the initial response
                                   with the expected response time as per the contract or
                                   warranty card.
                                4. Customer compares the date and time on which the COTS
                                   vendor takes the required action at the customer site and
                                   solves the problem with the expected response time as per
                                   the contract or warranty card.
                                5. Customer also compares that date and time with the date
                                   and time promised by the COTS vendor when the call was
                                   logged.
                                6. The customer checks whether the problem has been solved.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. The COTS vendor has no abnormal activities going on in
                                   their organization, such as a company merger or department
                                   takeover, which could delay the response time.
                                2. The COTS vendor contact is working at his/her regular
                                   workload; an unusually heavy workload could significantly
                                   slow down the response.
                                3. The item which has developed a problem is under warranty.
Dependencies:            None
Traceability:            None
                         Table 87 Client Test Procedure Specification 3-2





                        4.2.3.4             Test Results

This section lists the test results of all the test procedures described in section 4.2.3.3 of this
document for the two client COTS products.

4.2.3.4.1      Test Results for Tangent WebDT 166

The following tables indicate the test results for the cost attribute (AT-C01):

 Test Case:                      1-1
 Identifier:                     AT-C01-1
 Test Items:                     Initial purchase cost
 Test Result Classification      Pass - The initial cost per Thin Client WebDT 166 is $250.00.
 (Pass / Fail)                   The customer's expected cost: $600
 Problem / Defect Report         None
 Feedback / Comments:            The Tangent Thin Client WebDT 166 doesn't offer the
                                 performance of the Wyse thin client; however, for
                                 price-conscious consumers willing to sacrifice performance, it
                                 is a good alternative.
                                 Reference:
                                 Louise O'Sullivan at Tangent Computer
                                 Contact #: 800-342-9388
                                 Rating: 10/10
                           Table 88 Client Test Results 1-1 for Tangent Model







Test Case:                     1-2
Identifier:                    AT-C01-2
Test Items:                    Annual maintenance cost
Test Result                    Pass
Classification (Pass / Fail)
Problem / Defect Report        None
Feedback / Comments:           This model is covered by the Tangent maintenance contract.
                               Since Tangent is the direct vendor of this product, the
                               maintenance cost is less than for 3rd-party models.
                               Reference:
                               Louise O'Sullivan at Tangent Computer
                               Contact #: 800-342-9388
                               Rating: 10/10
                         Table 89 Client Test Results 1-2 for Tangent Model





The following tables indicate the test results for the vendor support attribute (AT-C03):

 Test Case:                    3-1
 Identifier:                   AT-C03-1
 Test Items:                   Hardware support
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      None.
 Feedback / Comment:           Tangent provides toll-free hardware support.
                               Mon-Fri 6 AM to 4:30 PM PST.
                               Call 1-800-399-8324
                               Reference:
                               http://www.tangent.com/explore/tech.htm
                               Rating : 10/10
                          Table 90 Client Test Results 3-1 for Tangent Model


 Test Case:                    3-2
 Identifier:                   AT-C03-2
 Test Items:                   Warranty
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      None.
 Feedback / Comment:           Tangent Servers come with a 3-year Limited Warranty, which
                               includes Next Business Day Parts Replacement and 1-year On-
                               site Labor Service at the sole discretion of Tangent. Removable
                               media is only covered for the 1st year of the warranty.
                               Reference:
                               http://www.tangent.com/explore/tech/warranty.htm
                               Rating: 10/10
                          Table 91 Client Test Results 3-2 for Tangent Model






4.2.3.4.2      Test Results for Wyse Winterm V90

The following tables indicate the test results for the cost attribute (AT-C01):

 Test Case:                     1-1
 Identifier:                    AT-C01-1
 Test Items:                    Initial cost
 Test Result Classification     Pass - The initial cost per client Winterm V90
 (Pass / Fail)                   (512MB flash / 256MB RAM) with keyboard,
                                 Part No. 902094-06, is (USD) $626.00 (less 25% educational
                                 discount).
                                 The customer's expected cost: $600
 Problem / Defect Report        None
 Feedback / Comments:           The Winterm V90 is currently Wyse's higher-end thin client
                                 model. This model was chosen because of its support for
                                 multimedia applications.
                                 Reference:
                                 General Sales, Wyse
                                 Contact # (800) GET-WYSE (438-9973)
                                 Rating: 8/10
                               Table 92 Client Test Results 1-1 for Wyse

 Test Case:                     1-2
 Identifier:                    AT-C01-2
 Test Items:                    Annual maintenance cost
 Test Result Classification     Pass - The manufacturer Wyse offers a 3-year limited warranty
 (Pass / Fail)                  on all Wyse terminals. In addition, Tangent support for the thin
                                client terminal is covered under Tangent's annual support
                                contract.
 Problem / Defect Report        None
 Feedback / Comments:           This model is covered by the Tangent maintenance contract,
                                but since Tangent does not distribute the Wyse model directly,
                                they will charge an additional fee for supporting products not
                                distributed by Tangent.
                                Rating: 9/10
                               Table 93 Client Test Results 1-2 for Wyse





The following tables indicate the test results for the vendor support attribute (AT-C03):

 Test Case:                    3-1
 Identifier:                   AT-C03-1
 Test Items:                   Hardware support
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      The COTS vendor (Tangent) does not distribute the Wyse model
                               directly, so the level of support may be limited.
 Feedback / Comment:           If the product is under warranty, then the client needs to go
                               through a Return Material Authorization (RMA) process, where
                               the client can request a support service via web, phone or fax.
                               For hardware technical problems, the client can contact the
                               COTS vendor to get direct support.
                               Reference:
                               Nick Haddad at Tangent Computer
                               Contact # : 800-342-9388
                               http://www.wyse.com/serviceandsupport/service/rmaproc.asp#c3
                               Rating: 8/10
                              Table 94 Client Test Results 3-1 for Wyse







Test Case:                   3-2
Identifier:                  AT-C03-2
Test Items:                  Warranty
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None
Feedback / Comment:          Wyse Technology warrants its products to be free from defects
                             in material and workmanship for a period of three years after the
                             date of purchase.
                             Reference:
                             http://www.wyse.com/serviceandsupport/service/prodwarr.asp
                             Rating: 10/10
                             Table 95 Client Test Results 3-2 for Wyse






                       4.2.3.5            Test Summary

This section summarizes the evaluation tests performed on the client COTS products.

4.2.3.5.1    Summary

The tests were performed on two client COTS products, the Wyse Winterm V90 and the Tangent
WebDT 166. Our COTS evaluation comprises four test procedures, performed by three team
members; the results of these evaluation tests serve as the basis for our final COTS
recommendation.

4.2.3.5.2    Summary of Results and Consequences

The test results and references showed that the Wyse Winterm V90 client model provides much
better performance and upgrade capability for the client. However, the per-unit cost of the
Wyse model is considerably higher than that of the Tangent model. Also, hardware support for
the Wyse model may not be as good as for the Tangent model, since Tangent is the client's
current COTS vendor.

4.2.3.5.3    Evaluation

Most of the tests completed successfully; the exceptions involve features of the COTS products
that are unavailable or of low priority. AT-C01 and AT-C03 were tested based on the quote and
service information provided by the COTS vendor; AT-C02 and AT-C04 were based on the
specification data provided by the COTS vendor. All detailed evaluations are described in
section 4.2.3 of this document.

4.2.3.5.4    Summary of Activities

Three team members were involved in testing.
The total elapsed time for each of the major testing activities was about 10 hours.
The actual machine time cannot be determined at this moment because we do not have the client
hardware to test.






         4.2.4      Evaluation Results Screen Matrix
This section lists the ratings of the evaluation results from section 4.2.3.4 of the CAR and
calculates the score of each evaluation criterion according to the assigned weights, so as to
compare the overall level of satisfaction between the Wyse Winterm V90 and the Tangent
WebDT 166.
The weights are described in section 4.2.2 of this document.
The ratings are assigned in the test results in section 4.2.3.4 of this document.
The score is calculated using the formula: Score = Weight * Rating
The total score is the sum of all the scores for the product.
The weighted average rating is calculated using the formula:
Weighted Average Rating = Sum of ((Weight / Total Weight) * Rating)
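As an illustration, the scoring scheme above can be sketched in a few lines of Python. This is
not part of the assessment tooling, only a hypothetical cross-check using the AT-C01 (Cost)
weights and ratings from the tables below.

```python
# Hypothetical sketch of the scoring formulas above; the weights and ratings
# used here are the AT-C01 (Cost) values, not an official calculation tool.

def score_matrix(weights, ratings):
    """Return per-criterion scores, the total score, and the weighted
    average rating, as defined above."""
    scores = [w * r for w, r in zip(weights, ratings)]   # Score = Weight * Rating
    total_weight = sum(weights)
    weighted_avg = sum((w / total_weight) * r
                       for w, r in zip(weights, ratings))
    return scores, sum(scores), round(weighted_avg, 2)

# AT-C01 Cost: weights 70 (initial purchase) and 50 (annual maintenance)
tangent = score_matrix([70, 50], [10, 10])   # ([700, 500], 1200, 10.0)
wyse = score_matrix([70, 50], [8, 9])        # ([560, 450], 1010, 8.42)
```

These values reproduce the AT-C01 rows of the result matrix below.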


AT-C01 Cost


 Weight          Evaluation Criteria            Tangent WebDT 166             Wyse Winterm V90
                                                 Rating         Score         Rating         Score
   70      Initial purchase cost                          10          700               8          560
   50      Annual maintenance cost                        10          500               9          450
   120                       Total Score                   -        1200                -         1010
             Weighted Average Rating                      10              -         8.42             -
                                   Table 96 Client Result Matrix AT-C01
AT-C02 Performance


 Weight          Evaluation Criteria            Tangent WebDT 166             Wyse Winterm V90
                                                 Rating         Score         Rating         Score
   120     Hardware/Software                               5          600              10         1200
           specification
   120                       Total Score                   -          600               -         1200
             Weighted Average Rating                       5              -            10            -
                                   Table 97 Client Result Matrix AT-C02





AT-C03 Vendor Support


 Weight       Evaluation Criteria            Tangent WebDT 166              Wyse Winterm V90
                                              Rating         Score          Rating         Score
   40     Hardware support                             10          400                8          320
   50     Warranty                                     10          500               10          500
   90                       Total Score                 -          900                -          820
           Weighted Average Rating                     10              -        9.11               -
                                Table 98 Client Result Matrix AT-C03
AT-C04 Flexibility


 Weight       Evaluation Criteria            Tangent WebDT 166              Wyse Winterm V90
                                              Rating         Score          Rating         Score
   80     Upgradeability                                0              0              6          480
   80                       Total Score                 -              0              -          480
           Weighted Average Rating                      0              -              6            -
                                Table 99 Client Result Matrix AT-C04
Overall Summary


 Weight       Evaluation Criteria            Tangent WebDT 166              Wyse Winterm V90
                                              Rating         Score          Rating         Score
  120     Cost                                         10        1200           8.42            1010
  120     Performance                                   5          600               10         1200
   90     Vendor support                               10          900          9.11             820
   80     Flexibility                                   0              0              6          480
  410                       Total Score                 -        2700                 -         3510
           Weighted Average Rating                  6.59               -        8.56               -
                           Table 100 Client Result Matrix Overall Summary






[Bar chart: Client Result Summary — weighted average rating by criterion. Tangent WebDT 166:
Cost 10, Performance 5, Vendor support 10, Flexibility 0. Wyse Winterm V90: Cost 8.42,
Performance 10, Vendor support 9.11, Flexibility 6.]

                                 Figure 10 Client Result Summary






4.2.5   Business Case Analysis
This section describes the business case analysis of the client COTS hardware products in terms
of added value, deployment costs, and operational costs. The business case analysis is a
framework prepared for decision makers to show that the proposed project is feasible and makes
sound financial sense, by showing that it is a technically and commercially viable solution.

4.2.5.1 Business Case Analysis for Product 1

This section describes the business case analysis of the Tangent WebDT 166 thin client model.

4.2.5.1.1    COTS Ownership Cost

COTS ownership cost is the initial investment in the purchase of the COTS product; here the
hardware COTS purchased is the thin client terminals.
Phase II consists of replacing the Wyse Winterm 3230LE terminals with Tangent WebDT 166
terminals. The current thin client terminals cannot handle heavy multimedia-intensive
applications (Wayang Outpost), and the old models have become obsolete. The new terminal
purchase will be done in phases; we assume that the client will buy 30 terminals at a time as
part of phase 2 of prototype 2.


   Installation and setup charges for 30 terminals = $800
   Cost of Tangent WebDT 166 terminal              = $250 / terminal

   COTS hardware cost (assuming the client buys 30 Tangent WebDT 166 terminals) = $7500
   Hence, the total COTS ownership cost for PHS is $8300

4.2.5.1.2    Development Cost

This section does not apply to this project, since the project involves only network assessment
and no software development. Also, the CS577a team is working on this project as part of our
course requirements.

4.2.5.1.3    Transition Cost

There is no transition cost, since the client does not have to spend any extra time on training:
the new thin client terminals are simply an upgraded model of the previous ones, so only a
hardware upgrade is involved.






4.2.5.1.4    Operational Cost

There is an operational cost involved, since the administrator will have to spend more of her
time administering the thin client terminals. We estimate an additional 1 hour every week as
the operational cost:
(1 hour/week) x (4 weeks/month) x (12 months/year) x $30/hour = $1440/year

4.2.5.1.5    Annual Maintenance Cost

The maintenance cost of $3000 per annum is for the server alone, and the COTS vendor will
maintain the old clients that are already in place. For the new terminals there is no
maintenance cost for the first three years; from then on there will be a charge of $1000 per
year for maintenance of the 30 terminals.

4.2.5.1.6    Estimate of Value Added ROI

The client has spent time on phone calls, meetings, emails, and commuting to USC for ARB
meetings. We have arrived at a total of 62 person-hours spent by the client altogether, of
which approximately 27 hours were spent on discussions of thin client prototypes. Phase 2 of
prototype 2, the installation of new thin clients, offers operational and maintenance cost
benefits over the old system.
The current thin client model, the Winterm 3230LE, has been discontinued by the manufacturer;
the new model, the Tangent WebDT 166, has an AMD Geode GX 533 embedded processor with
integrated high-speed video. The new system reduces the amount of time the system
administrator spends at the users' desks by more than 40%, because all the students can now
work simultaneously in a single session since there are more terminals. The new system will
enable 40-plus users to access Wayang Outpost (a high-end graphics and multimedia geometry
tutorial) and Renaissance Place (an interactive assessment program) simultaneously over the
thin-client network. The old thin clients were also prone to failure from wear and tear and
already showed signs of it. System login time is much faster, and CPU utilization on the
server will be much lower since the Flash-based application will run locally.
After talking to the customer and faculty, we arrived at a figure of 3 hours total savings per
week for the administrator and faculty together.


Total savings / year
In the new system, the total annual savings in operational cost from January 2006 for the
librarian and faculty is approximately 144 person-hours (1.5 hours for the librarian and 1.5
hours for all faculty together per week).
These are rough estimates obtained from the client. This time saving translates into a salary
saving at $30/hour (the salary estimate was not obtained from the client for confidentiality
reasons; an approximate figure was used).





Thus, the resulting annual salary saving would be:
$30/hour * (1.5 + 1.5 hours/week) * (4 weeks/month) * (12 months/year)
= $4320 / year


Total client effort = 62 person-hours (the total time spent by the client in arriving at this
prototype). Of these 62 hours, approximately 27 were spent discussing the client prototype:
$30/hour * 27 hours
= $810


    Installation and setup charges for 30 terminals = $800
    Cost of Tangent WebDT 166 terminal              = $250 / terminal

    COTS hardware cost (assuming the client buys 30 Tangent WebDT 166 terminals) = $7500
-------------------------------------------------------------
COTS ownership costs = $8300



Total Annual Costs for 2006:
         Annual maintenance costs = $0
         COTS ownership costs     = $8300
         Client effort cost       = $810
-----------------------------------------------------------------
         Total annual costs       = $9110


Total estimated annual Savings:           = $4320


Return on investment (ROI) = (Benefits - Costs) / Costs
The cost is shown as the initial investment in the table.




ROI at the end of the first year (2006) = (4320 - 9110) / 9110 = -0.52
Please note that we pay the annual maintenance cost of the client terminals from the fourth
year onwards out of the benefits realized; thus the denominator in the ROI remains the same.
From the available estimates, the ROI can therefore be estimated as follows:

 Years after Deployment              Initial Investment          Accumulated benefits    ROI
                                     (USD)                       (USD)
                   1                                   9110                      4320                  -0.52
                   2                                   9110                      8640                  -0.05
                   3                                   9110                     12960                  0.422
                   4                                   9110                     16280                   0.78
                   5                                   9110                     20600                   1.26
        Table 101 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase2 (product 1)
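The schedule in Table 101 can be reproduced, up to rounding, with a short sketch. The figures
used here come from the text above (initial cost $9110, annual saving $4320, and a $1000/year
maintenance charge paid out of the benefits from year 4 onwards); the code itself is only our
illustrative reading of that calculation, not a tool used in the assessment.

```python
# Illustrative ROI schedule using the figures stated in the text; the
# maintenance charge is deducted from the benefits from year 4 onwards,
# while the denominator (the initial investment) stays fixed.

def roi_schedule(initial_cost, annual_saving, years,
                 maint_cost=1000, maint_start_year=4):
    """ROI per year: (accumulated net benefits - initial cost) / initial cost."""
    schedule = []
    benefits = 0
    for year in range(1, years + 1):
        benefits += annual_saving
        if year >= maint_start_year:
            benefits -= maint_cost      # maintenance paid out of the benefits
        schedule.append(round((benefits - initial_cost) / initial_cost, 2))
    return schedule

print(roi_schedule(9110, 4320, 5))   # [-0.53, -0.05, 0.42, 0.79, 1.15]
```

Small differences from Table 101 (for example -0.53 versus -0.52 in year 1, and the year-5
value) come down to rounding and to whether the maintenance charge is applied in year 5 as
well as year 4.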

[Line chart: Breakeven Analysis on ROI — ROI (from -0.52 in year 1 to 1.26 in year 5) plotted
against years after deployment.]

                        Figure 11 Client Break-even Analysis on ROI (Product 1)





The return on investment will reach the break-even point for PHS by the first quarter of the
third year after installation of the system. This calculation is based on the rough estimates
provided by the client; the actual return on investment may vary depending on the level of
system usage and savings. There is also a major intangible benefit not considered here: the
users (students) spend less time on the system because of the new thin client terminals, which
gives higher user satisfaction and a better learning experience.
Another interesting observation is that even though the initial investment is low in this case,
the performance is also low, so the benefits realized by the customer are lower than with the
Wyse model. The ROI therefore reaches break-even at almost the same time as for the Wyse
model. The customer has a choice between this model, with a low initial investment and
comparatively lower performance, and the Wyse model, with a higher initial investment and
higher performance.
Thus, the above business case analysis shows that the proposed project is feasible and makes
sound financial sense, since it has been shown to be economically viable and technically
feasible.






4.2.5.2      Business Case Analysis for Product 2

This section describes the business case analysis of the Wyse Winterm V90 thin client model.

4.2.5.2.1    COTS Ownership Cost

COTS ownership cost is the initial investment in the purchase of the COTS product; here the
hardware COTS purchased is the thin client terminals.
Phase II consists of replacing the Wyse Winterm 3230LE terminals with Wyse Winterm V90
terminals. The current thin client terminals cannot handle heavy multimedia-intensive
applications (Wayang Outpost), and the old models have become obsolete. The new terminal
purchase will be done in phases; we assume that the client will buy 30 terminals at a time as
part of phase 2 of the prototype.


   Installation and setup charges for 30 terminals = $1000
   Cost of Wyse Winterm V90 terminal               = $570 / terminal

   COTS hardware cost (assuming the client buys 30 Wyse Winterm V90 terminals) = $17100
   Hence, the total COTS ownership cost for PHS is $18100

4.2.5.2.2    Development Cost

This section does not apply to this project, since the project involves only network assessment
and no software development. Also, the CS577a team is working on this project as part of our
course requirements.

4.2.5.2.3    Transition Cost

There is no transition cost, since the client does not have to spend any extra time on training:
the new thin client terminals are simply an upgraded model of the previous ones, so only a
hardware upgrade is involved.






4.2.5.2.4    Operational Cost

The maintenance cost of $3000 per annum is for the server alone, and the COTS vendor will
maintain the old clients that are already in place. For the new terminals there is no
maintenance cost for the first three years; from then on there will be a charge of $1000 per
year for maintenance of the 30 terminals.

4.2.5.2.5    Annual Maintenance Cost

The maintenance cost of $3000 per annum is for the server alone, and the COTS vendor will
maintain the old clients that are already in place. For the new terminals there is no
maintenance cost for the first three years; from then on there will be a charge of $1000 per
year for maintenance of the 30 terminals.

4.2.5.2.6    Estimate of Value Added ROI

The client has spent time on phone calls, meetings, emails, and commuting to USC for ARB
meetings. We have arrived at a total of 62 person-hours spent by the client altogether, of
which approximately 27 hours were spent on discussions of thin client prototypes. Phase 2 of
prototype 2, the installation of new thin clients, offers operational and maintenance cost
benefits over the old system.
The current thin client model, the Winterm 3230LE, has been discontinued by the manufacturer;
the new model, the Wyse Winterm V90, has a powerful 1 GHz processor with integrated high-speed
video. This delivers superior performance for CPU-emulating architectures, and its
high-resolution video system gives fast display updates and good local application
performance, making this model three times faster than the old one. The enterprise client
management tool leverages the value of PHS's IT infrastructure for maximum ROI.
The new system reduces the amount of time the system administrator spends at the users' desks
by more than 50%, because all the students can now work simultaneously in a single session
since there are more terminals. The new system will enable 40-plus users to access Wayang
Outpost (a high-end graphics and multimedia geometry tutorial) and Renaissance Place (an
interactive assessment program) simultaneously over the thin-client network. The old thin
clients were also prone to failure from wear and tear and already showed signs of it. System
login time is much faster, and CPU utilization on the server will be much lower since the
Flash-based application will run locally.
After talking to the customer and faculty, we arrived at a figure of 5 hours total savings per
week for the administrator and faculty together.
Total savings / year
In the new system, the total annual savings in operational cost from January 2006 for the
librarian and faculty is approximately 240 person-hours (3 hours for the librarian and 2 hours
for all faculty together per week).





These are rough estimates obtained from the client. This time saving translates into a salary
saving at $30/hour (the salary estimate was not obtained from the client for confidentiality
reasons; an approximate figure was used).
Thus, the resulting annual salary saving would be:
$30/hour * (3 + 2 hours/week) * (4 weeks/month) * (12 months/year)
= $7200 / year


Total client effort = 62 person-hours (the total time spent by the client in arriving at this
prototype). Of these 62 hours, approximately 27 were spent discussing the client prototype:
$30/hour * 27 hours
= $810


    Installation and setup charges for 30 terminals = $1000
    Cost of Wyse Winterm V90 terminal               = $570 / terminal

    COTS hardware cost (assuming the client buys 30 Wyse Winterm V90 terminals) = $17100
-------------------------------------------------------------
COTS ownership costs = $18100


Total Annual Costs for 2006:
         Annual maintenance costs = $0
         COTS ownership costs     = $18100
         Client effort cost       = $810
-----------------------------------------------------------------
         Total annual costs       = $18910


Total estimated annual Savings:           = $7200


Return on investment (ROI) = (Benefits - Costs) / Costs
The cost is shown as the initial investment in the table.





ROI at the end of the first year (2006) = (7200 - 18910) / 18910 = -0.6
Please note that we pay the annual maintenance cost of the client terminals from the fourth
year onwards out of the benefits realized; thus the denominator in the ROI remains the same.
From the available estimates, the ROI can therefore be estimated as follows:

 Years after Deployment              Initial investments        Accumulated benefits     ROI
                                     (USD)                      (USD)
                   1                                 18910                       7200                   - 0.6
                   2                                 18910                      14400                  -0.23
                   3                                 18910                      21600                   0.14
                   4                                 18910                      27800                   0.47
                   5                                 18910                      35000                   0.85
        Table 102 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase2 (product 2)
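The break-even quarters quoted in the text for both products can be cross-checked with a small
sketch. It rests on a simplifying assumption of our own, that savings accrue evenly through
each year; the maintenance charge does not affect the result because both products break even
during year 3, before the charge begins in year 4.

```python
# Hypothetical break-even check; the annual figures are taken from the ROI
# tables for product 1 (Tangent WebDT 166) and product 2 (Wyse Winterm V90).

def breakeven_years(initial_cost, annual_saving):
    """Years until accumulated savings equal the initial investment,
    assuming savings accrue evenly within each year."""
    return initial_cost / annual_saving

print(breakeven_years(9110, 4320))    # ~2.11: first quarter of year 3
print(breakeven_years(18910, 7200))   # ~2.63: third quarter of year 3
```

Both values land in the third year, matching the "first quarter" and "third quarter"
statements made for products 1 and 2 respectively.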

[Line chart: Breakeven Analysis on ROI — ROI (from -0.6 in year 1 to 0.85 in year 5) plotted
against years after deployment.]

                        Figure 12 Client Break-even Analysis on ROI (Product 2)





The return on investment will reach the break-even point for PHS by the third quarter of the
third year after installation of the system. This calculation is based on the rough estimates
provided by the client; the actual return on investment may vary depending on the level of
system usage and savings. There is also a major intangible benefit not considered here: the
users (students) spend less time on the system because of the new thin client terminals, which
gives higher user satisfaction and a better learning experience.
Comparing product 2 with product 1, break-even ROI is reached by the third quarter of the third
year for product 2 and by the first quarter of the third year for product 1. The growth of the
ROI also dips after the third year because of the maintenance charge levied, but it picks up
again from the following year onwards.
Thus, the above business case analysis shows that the proposed project is feasible and makes
sound financial sense, since it has been shown to be economically viable and technically
feasible.







5. Conclusion and Recommendation
This section summarizes and concludes the results of the COTS assessment for the PHS network,
and gives recommendations on the selected COTS products.

 5.1 Conclusion and Recommendation Part 1 [Server]
The following are the conclusions derived from the COTS assessment and corresponding
recommendations for the server COTS product:


C1. The Tangent Pillar™ 2750s server satisfied all the required performance levels under the
    current size and infrastructure of the network. The network bandwidth available right now
    is sufficient to support all user activities, but may not be sufficient to support more
    users or new applications.
R1. The client may need to upgrade the network bandwidth if additional new applications need
    to run regularly or additional users need to be supported.
C2. The cost of the Tangent Pillar™ 2750s server is within the client's budget, including the
    server purchase, necessary upgrades, and the annual maintenance cost.
R2. Currently no hardware or software upgrade is necessary to perform all the required
    functionality. However, if an upgrade becomes necessary in the future, focus mainly on
    memory and storage upgrades to help boost performance.
C3. The Tangent Pillar™ 2750s server is compatible with all the software required by the
    client. The only software that did not pass the compatibility test was Choices, because
    its license has expired. However, based on the software vendor's website, the Choices
    application should be compatible with the Tangent Pillar™ 2750s server specification.
R3. The client should renew the Choices license to utilize all the applications available on
    the server.
C4. The Tangent Pillar™ 2750s server is able to communicate with all the required hardware
    components on the PHS network, as well as pass data to and from the thin client terminals.
    One limitation of the thin client network is that it requires a persistent connection
    between the server and the clients at all times.
R4. The client should make sure that all the middleware and network components, such as
    routers, switches, and cables, are working properly so as to keep the connection between
    the server and the clients.
C5. The COTS vendor, Tangent, which supplied the Tangent Pillar™ 2750s server, satisfied most
    of the vendor support evaluation tests. In general, the vendor provided decent support to
    the client and to the developers during the assessment period, although for some technical
    questions the vendor did not provide enough feedback to the developers. Since the client
    has developed a good relationship with the COTS vendor over the past 4 years, this should
    not be a concern for future support. The only problem the client may have is support for
    third-party software applications.
R5. Since the COTS vendor does not support the third-party software applications required by
    the client, the client should contact the individual software providers for support in the
    future.
C6. Security is usually not a problem on a thin client network. The COTS vendor has also
    confirmed that the server supports the full security capability provided by the Active
    Directory and Group Policy services in the Windows operating system.
R6. The system administrator (librarian) should determine user access rights properly, and the
    system maintainer (Tangent) should be responsible for creating user profiles according to
    the access rights determined by the system administrator.
C7. The Tangent Pillar™ 2750s server satisfied all the flexibility capabilities required by
    the client. It is very flexible in adopting feature upgrades as well as in supporting
    older hardware components from the same manufacturer.
R7. The client is free to choose her future upgrade plan when new applications or additional
    users are required.







 5.2 Conclusion and Recommendation Part 2 [Client]
The following are the conclusions derived from the COTS assessment and corresponding
recommendations for the client COTS products:


C1. The initial cost of the Tangent WebDT 166 is far less than that of the Wyse Winterm V90,
    as its per-unit cost is only half that of the Wyse model. However, the Wyse model provides
    better performance as well as room for future hardware upgrades.
R1. The client should consider implementing phase 2 of the prototype by purchasing new thin
    client terminals when new applications or additional users are required that would exceed
    the workload the server can handle. If the budget allows, we highly recommend the Wyse
    Winterm V90 model, as it offers much higher performance capability.
C2. The current COTS vendor (Tangent) provides a decent level of support for both the Tangent
    and Wyse models. However, hardware support for the Wyse model may be somewhat limited,
    since Tangent is not the direct distributor of Wyse.
R2. The client is currently using an older model of Wyse thin client terminals, and the COTS
    vendor has been supporting the PHS thin client network for the past 4 years. Therefore,
    the lack of support for the Wyse model should not be a big concern in the future. The
    client should still choose the Wyse model if the budget allows.







Glossary
      1. Active Directory Services:
          Active Directory (codename Cascade) is an implementation of LDAP directory
          services by Microsoft for use in Windows environments. Active Directory allows
          administrators to assign enterprise-wide policies, deploy programs to many
          computers, and apply critical updates to an entire organization. An Active Directory
          stores information and settings relating to an organization in a central, organized,
          accessible database. Active Directory networks can vary from a small installation
          with a few hundred objects, to a large installation with millions of objects.
      2. Black-box Testing:
          A software testing technique whereby the internal workings of the item being
          tested are not known by the tester.
      3. Citrix MetaFrame:
          Citrix Presentation Server (formerly Citrix MetaFrame) is a remote access/application
          publishing product built on the Independent Computing Architecture (ICA), Citrix
          Systems' thin client protocol. The Microsoft Remote Desktop Protocol, part of
          Microsoft's Terminal Services, is based on Citrix technology and was licensed from
          Citrix in 1997. Unlike traditional frame buffered protocols like VNC, ICA transmits
          high-level window display information, much like the X11 protocol, as opposed to
          purely graphical information.
      4. COTS (Commercial Off-The-Shelf):
          COTS software is defined as a software system that has been built as a
          composition of many other COTS software components (Vigder, 1998). Here the
          developer of the software acts as the integrator who purchases the components
          from third-party vendors and assembles them to build the final product.
      5. COCOTS:
          COCOTS is a cost estimation tool designed to capture explicitly the most
          important costs associated with COTS component integration. COCOTS is an
          amalgam of four related sub-models, each individually addressing what the
          authors have identified as the four primary sources of COTS software
          integration costs.
      6. Gantt Chart:
          A Gantt chart is a popular type of bar chart that shows the timing of tasks
          or activities as they occur over time. Although the Gantt chart did not
          initially indicate the relationships between activities, this has become
          common in current usage, as both timing and interdependencies between tasks
          can be identified.





      7. ISI:
          Part of the University of Southern California (USC), ISI is involved in a broad
          spectrum of information processing research and in the development of advanced
          computer and communication technologies.
      8. PHS: Pasadena High School
      9. ROI (Return on Investment):
          A measure of a corporation's profitability, equal to a fiscal year's income
          divided by common and preferred stock equity plus long-term debt. ROI
          measures how effectively the firm uses its capital to generate profit; the
          higher the ROI, the better.
      10. SMP:
          Symmetric Multiprocessing, or SMP, is a multiprocessor computer architecture
          where two or more identical processors are connected to a single shared main
          memory. Most common multiprocessor systems today use SMP architecture.
          SMP systems allow any processor to work on any task no matter where the data
          for that task is located in memory; with proper operating system support, SMP
          systems can easily move tasks between processors to balance the work load
          efficiently.
      11. TC-95: Application Server at PHS maintained by Tangent Computer
      12. TC-96: Authentication Server at PHS maintained by Tangent Computer
      13. Thin clients:
          A thin client is a computer (client) in a client-server network that has
          little or no application logic and therefore depends primarily on the
          central server for processing. The word "thin" refers to the small boot
          image such clients typically require - perhaps no more than is needed to
          connect to a network and start up a dedicated web browser.
      14. White-box Testing:
          A software testing technique whereby explicit knowledge of the internal
          workings of the item being tested is used to select the test data.
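The black-box and white-box testing entries can be made concrete with a short sketch. The function under test below is a made-up example, not part of the assessed PHS system.

```python
def clamp(value, low, high):
    """Function under test: restrict value to the range [low, high]."""
    return max(low, min(value, high))

# Black-box tests: cases are chosen from the specification alone,
# without looking at how clamp() is implemented.
assert clamp(5, 0, 10) == 5      # value inside the range
assert clamp(-3, 0, 10) == 0     # value below the range
assert clamp(42, 0, 10) == 10    # value above the range

# A white-box test, by contrast, would use knowledge of the max/min
# structure to pick inputs that exercise each branch explicitly
# (e.g. the boundary cases value == low and value == high).
```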
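The ROI definition in the glossary can likewise be illustrated with a small calculation; the figures below are invented for illustration and are not drawn from the PHS budget.

```python
def roi(annual_income, equity, long_term_debt):
    """Return on investment: fiscal-year income divided by
    common and preferred stock equity plus long-term debt."""
    return annual_income / (equity + long_term_debt)

# Illustrative numbers only (not PHS data):
print(roi(50_000, 400_000, 100_000))  # 0.1, i.e. a 10% return
```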





				