
									COTS Assessment Report                                  Version 5.0


           COTS Assessment Report (CAR)
        Pasadena High School Computer Network Study



                               Team #3


        Ajithkumar Kattil           (Project Manager)
        Chris Yuan                  (Tester & Test Reviewer)
        Kunal Kadakia               (Business Process Analyst)
        Ashwin Kusabhadran          (Test Designer)
        Andrew Ha                   (Tester & Prototyper)
        Devesh Thanvi               (Requirements Analyst)
        Vincent Chu                 (IV&V)
        Winston Kwong               (IV&V)
        Mrs. Jeanine Foote          (Client)
        Erin Shaw                   (Sponsor & Researcher)
        Pasadena High School        (Customer)




CAR_LCA_F05a_T03_V05.0          I                          11/21/05


Version History
Date       Author          Version   Changes made                          Rationale
10/10/05   Andrew Ha       1.0        Initial Draft Release                Initial draft release sections 1, 2
10/10/05   Chris Yuan      1.1        Added Section 3.2                    Added section 3.2
10/10/05   Kunal Kadakia   1.2        Added Section 3.1                    Added section 3.1
10/10/05   Chris Yuan      1.3        Added Table of Contents, Table of    Added TOC and revised reference
                                       Tables, revised reference
10/11/05   Andrew Ha       2.0        Revised Sections 1 & 2               Revised Section 1 & 2 to incorporate
                                                                             the changes mentioned in CAR QR
10/11/05   Kunal Kadakia   2.1        Revised Section 3.1                  Revised Section 3.1 to incorporate
                                                                             the changes mentioned in CAR QR
10/11/05   Chris Yuan      2.2        Revised Section 3.2.1 and also       Revised Section 3.2.1 to incorporate
                                       revised Table of Contents             the changes mentioned in CAR QR
10/19/05   Andrew Ha       3.0        Added Section 3.2.2 Approach         Added Section 3.2.2 Approach
                                       Adopted Diagram                       Adopted Diagram (ARB Review)
                                      Added Section 3.2.3                  Added Section 3.2.3 (ARB Review)
                                      Added Section 3.2.4                  Added Section 3.2.4 (ARB Review)
10/22/05   Chris Yuan      3.1        Added TOF                            Added TOF
                                      Revised TOC, TOT                     Revised TOC, TOT
10/23/05   Kunal Kadakia   3.2        Reviewed Sections 1.0 & 2.0          Reviewed Sections 1.0 & 2.0 to
                                      Reviewed Sections 3.1 & 3.2.1         maintain consistency with CAB and
                                                                             CAP, as per the ARB Review
                                                                            Reviewed Section 3.1 & 3.2.1 (ARB
                                                                             review)
11/17/05   Chris Yuan      4.0        Added Section 4                      Added section 4 for LCA Draft
11/17/05   Ajithkumar      4.1        Added detail for Section 4.3.3       Added detail for section 4.3.3
           Kattil
                                      Added Section 4.3.4                  Added section 4.3.4
11/17/05   Andrew Ha       4.2        Added detail for Section 4.3.3       Added detail for section 4.3.3
11/17/05   Ashwin          4.3        Added detail for Section 4.3.3       Added detail for section 4.3.3
           Kusabhadran
11/18/05   Chris Yuan      4.4        Added Section 4.3.2                  Added section 4.3.2 for test
                                                                             preparation
11/20/05   Chris Yuan      4.5        Divided Section 4 into server and    Divided Section 4 into server and
                                       client part                           client part
                                      Added Section 4.1.1 to 4.1.5         Added Section 4.1.1 to 4.1.5
                                      Added Section 4.2.1 to 4.2.5         Added Section 4.2.1 to 4.2.5
                                      Added Section 5                      Added Section 5 for LCA Draft
                                      Added Section 5.1 and 5.2            Added Section 5.1 and 5.2
11/21/05   Chris Yuan      4.6        Revised Section 3.2.1                Revised section 3.2.1
                                      Added Glossary section               Added Glossary section
                                      Updated References section           Updated References section




11/21/05   Kunal Kadakia   5.0        Revised Sections 3.1, 3.2.1, 4.1.1,        Revised Sections 3.1, 3.2.1, 4.1.1,
                                       4.1.3.3, 4.1.4, 4.1.5, 4.2.3, 4.2.3.4.1     4.1.3.3, 4.1.4, 4.1.5, 4.2.3, 4.2.3.4.1
           Ashwin
           Kusabhadran

           Ajithkumar
           Kattil

           Devesh Thanvi






Table of Contents
Version History ...........................................................................................................................................................II

Table of Contents ...................................................................................................................................................... IV

Table of Tables .......................................................................................................................................................... VI

Table of Figures .......................................................................................................................................................... X

Preface ......................................................................................................................................................................... 11

1.       Executive Summary ........................................................................................................................................... 15

2.       Purpose, Scope, and Assumptions...................................................................................................................... 16

3.       Assessment Approach ........................................................................................................................................ 17

         3.1      System Objectives and Context ................................................................................................................ 17

         3.2      Assessment Objectives and Approach ...................................................................................................... 19

                  3.2.1         Assessment Objectives ............................................................................................................... 19

                  3.2.2         Approach .................................................................................................................................... 24

                  3.2.3         Network Assessment Summary.................................................................................................. 25

                  3.2.4         Prototypes Proposed ................................................................................................................... 26

4.       Assessment Result.............................................................................................................................................. 31

         4.1      Assessment Results-Part 1 [Server] .......................................................................................................... 31

                  4.1.1         COTS Assessed .......................................................................................................................... 31

                  4.1.2         Evaluation Criteria ..................................................................................................................... 32

                  4.1.3         Test Procedure ............................................................................................................................ 35

                  4.1.4         Evaluation Results Screen Matrix ............................................................................................ 100

                  4.1.5         Business Case Analysis ............................................................................................................ 104

         4.2      Assessment Results-Part 2 [Client] ........................................................................................................ 109

                  4.2.1         COTS Assessed ........................................................................................................................ 109

                  4.2.2         Evaluation Criteria ................................................................................................................... 110

                  4.2.3         Test Procedure .......................................................................................................................... 112





                  4.2.4        Evaluation Results Screen Matrix ............................................................................................ 137

                  4.2.5        Business Case Analysis ............................................................................................................ 139

5.       Conclusion and Recommendation .................................................................................................................... 149

         5.1      Conclusion and Recommendation Part 1 [Server] .................................................................................. 149

         5.2      Conclusion and Recommendation Part 2 [Client] .................................................................................. 151

Glossary .................................................................................................................................................................... 152






Table of Tables
  Table 1 High-level Server Assessment Attributes ................................................................................................... 20

  Table 2 Server Assessment Activities ...................................................................................................................... 21

  Table 3 High-level Client Assessment Attributes .................................................................................................... 22

  Table 4 Client Assessment Activities ....................................................................................................................... 23

  Table 5 Server COTS Assessed ............................................................................................................................... 31

  Table 6 Server Evaluation Criteria ......................................................................................................................... 32

  Table 7 Server Performance Attributes .................................................................................................................. 32

  Table 8 Server Cost Attributes ................................................................................................................................ 33

  Table 9 Server Intercomponent Compatibility Attributes ....................................................................................... 33

  Table 10 Server Interoperability Attributes ............................................................................................................ 33

  Table 11 Server Vendor Support Attributes ............................................................................................................ 33

  Table 12 Server Security Attributes ........................................................................................................................ 34

  Table 13 Server Flexibility Attributes ..................................................................................................................... 34

  Table 14 Hardware Preparation for Server COTS Product ................................................................................... 35

  Table 15 Software Preparation for Server COTS Product ..................................................................................... 36

  Table 16 Other Preparation for Server COTS Product .......................................................................................... 36

  Table 17 Server Flexibility Attribute Result Rationale ........................................................................................... 38

  Table 18 Server Test Procedure Specification 1-1.................................................................................................. 39

  Table 19 Server Test Procedure Specification 1-2.................................................................................................. 40

  Table 20 Server Test Procedure Specification 1-3.................................................................................................. 41

  Table 21 Server Test Procedure Specification 1-4.................................................................................................. 43

  Table 22 Server Test Procedure Specification 1-5.................................................................................................. 45

  Table 23 Server Test Procedure Specification 2-1.................................................................................................. 47

  Table 24 Server Test Procedure Specification 2-2.................................................................................................. 49

  Table 25 Server Test Procedure Specification 2-3.................................................................................................. 51

  Table 26 Server Test Procedure Specification 3-1.................................................................................................. 53




  Table 27 Server Test Procedure Specification 3-2.................................................................................................. 54

  Table 28 Server Test Procedure Specification 3-3.................................................................................................. 56

  Table 29 Server Test Procedure Specification 3-4.................................................................................................. 57

  Table 30 Server Test Procedure Specification 4-1.................................................................................................. 58

  Table 31 Server Test Procedure Specification 4-2.................................................................................................. 59

  Table 32 Server Test Procedure Specification 4-3.................................................................................................. 61

  Table 33 Server Test Procedure Specification 4-4.................................................................................................. 63

  Table 34 Server Test Procedure Specification 4-5.................................................................................................. 65

  Table 35 Server Test Procedure Specification 5-1.................................................................................................. 68

  Table 36 Server Test Procedure Specification 5-2.................................................................................................. 70

  Table 37 Server Test Procedure Specification 5-3.................................................................................................. 73

  Table 38 Server Test Procedure Specification 5-4.................................................................................................. 75

  Table 39 Server Test Procedure Specification 5-5.................................................................................................. 77

  Table 40 Server Test Procedure Specification 6-1.................................................................................................. 79

  Table 41 Server Test Result 1-1 .............................................................................................................................. 80

  Table 42 Server Test Result 1-2 .............................................................................................................................. 81

  Table 43 Server Test Result 1-3 .............................................................................................................................. 82

  Table 44 Server Test Result 1-4 .............................................................................................................................. 83

  Table 45 Server Test Result 1-5 .............................................................................................................................. 84

  Table 46 Server Test Result 2-1 .............................................................................................................................. 85

  Table 47 Server Test Result 2-2 .............................................................................................................................. 86

  Table 48 Server Test Result 2-3 .............................................................................................................................. 87

  Table 49 Server Test Result 3-1 .............................................................................................................................. 88

  Table 50 Server Test Result 3-2 .............................................................................................................................. 88

  Table 51 Server Test Result 3-3 .............................................................................................................................. 89

  Table 52 Server Test Result 3-4 .............................................................................................................................. 89

  Table 53 Server Test Result 4-1 .............................................................................................................................. 90

  Table 54 Server Test Result 4-2 .............................................................................................................................. 91



  Table 55 Server Test Result 4-3 .............................................................................................................................. 92

  Table 56 Server Test Result 4-4 .............................................................................................................................. 93

  Table 57 Server Test Result 4-5 .............................................................................................................................. 93

  Table 58 Server Test Result 5-1 .............................................................................................................................. 94

  Table 59 Server Test Result 5-2 .............................................................................................................................. 94

  Table 60 Server Test Result 5-3 .............................................................................................................................. 95

  Table 61 Server Test Result 5-4 .............................................................................................................................. 96

  Table 62 Server Test Result 5-5 .............................................................................................................................. 97

  Table 63 Server Test Result 6-1 .............................................................................................................................. 98

  Table 64 Server Result Matrix AT-S01 ................................................................................................................. 100

  Table 65 Server Result Matrix AT-S02 ................................................................................................................. 100

  Table 66 Server Result Matrix AT-S03 ................................................................................................................. 101

  Table 67 Server Result Matrix AT-S04 ................................................................................................................. 101

  Table 68 Server Result Matrix AT-S05 ................................................................................................................. 102

  Table 69 Server Result Matrix AT-S06 ................................................................................................................. 102

  Table 70 Server Result Matrix AT-S07 ................................................................................................................. 102

  Table 71 Server Result Matrix Overall Summary ................................................................................................. 103

  Table 72 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase1 .................................... 107

  Table 73 Client COTS Assessed ............................................................................................................................ 110

  Table 74 Client Evaluation Criteria ..................................................................................................................... 110

  Table 75 Client Cost Attribute .............................................................................................................................. 111

  Table 76 Client Performance Attribute ................................................................................................................. 111

  Table 77 Client Vendor Support Attribute ............................................................................................................ 111

  Table 78 Client Flexibility Attribute ..................................................................................................................... 111

  Table 79 Hardware Preparation for Client COTS Product .................................................................................. 113

  Table 80 Software Preparation for Client COTS Product .................................................................................... 114

  Table 81 Other Preparation for Client COTS Product ......................................................................................... 114

  Table 82 Client Performance Attribute Result Rationale ...................................................................... 116



  Table 83 Client Flexibility Attribute Result Rationale ........................................................................... 117

  Table 84 Client Test Procedure Specification 1-1 ................................................................................................ 119

  Table 85 Client Test Procedure Specification 1-2 ................................................................................................ 121

  Table 86 Client Test Procedure Specification 1-3 ................................................................................................ 123

  Table 87 Client Test Procedure Specification 3-1 ................................................................................................ 126

  Table 88 Client Test Procedure Specification 3-2 ................................................................................................ 128

  Table 89 Client Test Results 1-1 for Tangent Model ............................................................................................ 129

  Table 90 Client Test Results 1-2 for Tangent Model ............................................................................................ 130

  Table 91 Client Test Results 1-3 for Tangent Model ............................................................................................ 130

  Table 92 Client Test Results 3-1 for Tangent Model ............................................................................................ 131

  Table 93 Client Test Results 3-2 for Tangent Model ............................................................................................ 131

  Table 94 Client Test Results 1-1 for Wyse ............................................................................................................ 132

  Table 95 Client Test Results 1-2 for Wyse ............................................................................................................ 133

  Table 96 Client Test Results 1-3 for Wyse ............................................................................................................ 133

  Table 97 Client Test Results 3-1 for Wyse ............................................................................................................ 134

  Table 98 Client Test Results 3-2 for Wyse ............................................................................................................ 135

  Table 99 Client Result Matrix AT-C01 ................................................................................................................. 137

  Table 100 Client Result Matrix AT-C02 ............................................................................................................... 137

  Table 101 Client Result Matrix AT-C03 ............................................................................................................... 138

  Table 102 Client Result Matrix AT-C04 ............................................................................................................... 138

  Table 103 Client Result Matrix Overall Summary ................................................................................................ 138

  Table 104 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase2 (product 1) ............... 142

  Table 105 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase2 (product 2) ............... 147






Table of Figures
  Figure 1 Approach Adopted.................................................................................................................................... 24

  Figure 2 Prototype I – Citrix Thin Client Structure................................................................................................ 26

  Figure 3 Prototype I – Citrix Thin Client Network Diagram.................................................................................. 27

  Figure 4 Prototype II, Phase I - New TC Application Server ................................................................................. 28

  Figure 5 Prototype II, Phase II – New Thin Client Terminals ................................................................................ 30

  Figure 6 Server Break-even Analysis on ROI ....................................................................................................... 107

  Figure 7 Client Break-even Analysis on ROI (Product 1) .................................................................................... 142

  Figure 8 Client Break-even Analysis on ROI (Product 2) .................................................................................... 147






Preface
The CAR document is reasonably self-contained, but relies on the CAB for detailed background
on the project and organizational goals and environment. It also relies on the CAP for details on
milestones, budgets, schedules, and risks. Its level of detail is risk-driven, particularly with
respect to budgets, schedules, and customer needs.


 References
          Client meeting notes
       http://greenbay.usc.edu/csci577/fall2005/projects/team3/CMN/CMN_09_27_F05_T03.pdf

          COTS Assessment Background (CAB)
       http://greenbay.usc.edu/csci577/fall2005/projects/team3/LCO/CAB_LCO_F05a_T03_V0
       5.20.pdf

          COTS Assessment Process (CAP)
       http://greenbay.usc.edu/csci577/fall2005/projects/team3/LCO/CAP_LCO_F05a_T03_V0
       3.10.pdf

          Intel Corporation
       http://www.intel.com/

          Microsoft Corporation
       http://www.microsoft.com/

          MS Terminal Services, “Juggling Terminal Service Resources”
       http://www.msterminalservices.org/articles/Juggling-Terminal-Service-Resources.html

          Renaissance Learning
        http://www.renlearn.com/RenaissancePlace/default.htm

          Tangent Computers, Inc
       http://www.tangent.com/

          Wayang Outpost
       http://www.wayangoutpost.net/

          WebDT
       http://www.dtresearch.com/prod_webDT166.html




         WinWin Negotiation
      http://greenbay.usc.edu/csci577/fall2005/projects/team3/LCO/EWW_LCO_F05a_T03_V
      01.10.pdf

         Wyse Technology Inc
      http://www.wyse.com

          Ye Yang and Barry Boehm, "Guidelines for Producing COTS Assessment
           Background, Process, and Report Documents," USC-CSE technical report
       http://greenbay.usc.edu/csci577/fall2005/site/guidelines/CBA-AssessmentIntensive.pdf





 Change Summary
      Version 1.0        Andrew released the initial draft with Preface, References,
                         Change Summary, and Sections 1 and 2.

      Version 1.1        Chris added section 3.2 for the LCO draft.

      Version 1.2        Kunal added section 3.1 for the LCO draft.

      Version 1.3        Chris added Table of Contents and Table of Tables. The references
                         section was revised to include references to the previous documents.

      Version 2.0        This version incorporates the comments and suggestions given in the
                         CAR QR report.

                         Andrew revised sections 1 & 2 for better understandability.

      Version 2.1        Kunal revised section 3.1.

      Version 2.2        Chris revised section 3.2.1 to improve the assessment objectives and also
                         revised Table of Contents and Table of Tables.

      Version 3.0        Added Section 3.2.2 Approach Adopted Diagram, Added Section 3.2.3,
                         Added Section 3.2.4. All sections were added based on the ARB review
                         comments.

      Version 3.1        Chris revised Table of Contents, Table of Tables and Added Table of
                         Figures.

      Version 3.2        Kunal reviewed Sections 1.0 & 2.0 to maintain consistency with the
                         CAB and CAP, as per the ARB Review.

                         Kunal reviewed Sections 3.1 & 3.2.1 as per the ARB Review.

      Version 4.0        Chris added section 4 for LCA Draft. Initial test cases added in section
                         4.3.3 for evaluation criteria AT-S03 and AT-S04.

      Version 4.1        Ajithkumar added test cases in section 4.3.3 for evaluation criteria AT-
                         S05. Also added section 4.3.4 for the testing result for AT-S05.

      Version 4.2        Andrew added test cases in section 4.3.3 for evaluation criteria AT-S02
                         and AT-S04.

      Version 4.3        Ashwin added test cases in section 4.3.3 for evaluation criteria AT-S01.

      Version 4.4        Chris added section 4.3.2 for the test preparation.



      Version 4.5        Chris modified the document to complete the LCA Draft requirement.
                         Section 4 was divided into server and client parts to describe the
                         assessment done on the different kinds of COTS products. Sections 4.1
                         and 4.2 were added for the assessment of the server and client parts
                         respectively. Sections 4.X.1 to 4.X.5 were added to describe the detailed
                         assessment activities, results, screening matrix, and business case
                         analysis. Section 5 was added for the conclusions and recommendations
                         of the COTS assessment. Sections 5.1 and 5.2 were added for the server
                         and client parts respectively.

      Version 4.6        Chris revised section 3.2.1 to reflect the current assessment attributes and
                         activities. The section is divided into server and client sections based on
                         the actual assessment method. Some of the weights were reassigned
                         based on the new attributes. Also added the glossary section and updated
                         references section.

      Version 5.0        Kunal Kadakia revised 3.1, 3.2.1,
                         Ajithkumar Kattil & Ashwin Kusabhadran revised 4.1.1, 4.1.3.3, 4.1.4,
                         4.1.5, 4.2.3, 4.2.3.4.1
                         Devesh Thanvi revised Section 5.1






1. Executive Summary
The project under consideration is Pasadena High School (PHS) Computer Network Study
which involves an analysis of the existing thin client network infrastructure at PHS.
Pasadena High School needs a powerful server system to host and serve multimedia intensive
learning applications (such as Wayang Outpost and Renaissance Place) in their library and
computer lab. The proposed new system will provide support for 40 plus thin clients, allowing
students to work simultaneously on high-end multimedia interactive applications. This
system design, consisting of main servers and multiple thin client stations, will provide a low cost
solution that would facilitate easy deployment and necessitate minimal maintenance.


The high-level objectives of the COTS assessment are as follows:

      Analyze the current system infrastructure and identify the critical issues, constraints, and
       limitations of the existing system.
      Recommend two or more distinct prototypes, each highlighting the level of change and the
       cost needed to best meet the client’s expectations and levels of service.
      Measure the compliance of the proposed COTS-based system with the guidelines set
       forth by the client during the WinWin negotiations.

The results of the analysis will be published and presented to the client to serve as formal
justifications, so that the client can submit requests for software and server upgrades to the
Pasadena Unified School District (PUSD).






2. Purpose, Scope, and Assumptions
This section describes the purpose, scope, and assumptions that underlie the analysis, results,
conclusion and recommendations.
The purpose of this document is to
    Summarize PHS COTS assessment process.
    Present the major COTS assessment results and conclusions.
    Make recommendations to the client based on the COTS assessment results.
     Provide substantiation for the results by using performance tuning and load testing of the
        COTS products to predict system behavior under various test scenarios.

The scope of this document covers the COTS assessment objectives, context, approach, results,
conclusions, recommendations, and supporting data.

The following assumptions underlie the analysis, results, conclusions, and recommendations:

      The client must provide the assessment team with full access to the application server and
       thin client terminals during the evaluation period.

      The maintenance of the computer lab at the PHS library must be covered by the service
       contract plan with Tangent, the school's thin client provider. The vendor will provide
       remote administration as long as the service contract plan is renewed on an annual basis.

      The existing infrastructure and topologies consisting of network connections, PC setups,
       thin client setup, server configuration, and server location must not change drastically
       during the assessment period.

      The purpose and scope of PHS Network Study must not change.






3. Assessment Approach
This section describes the system and assessment objectives and approach.

 3.1 System Objectives and Context
This section briefly summarizes the application system objectives, constraints and priorities
referenced from sections 2, 3 and 4 of CAB.
Our client from Pasadena High School needs a high performance thin client network solution. It
should support multimedia learning applications (such as Wayang Outpost, and Renaissance
Place) in their library and also help the students and staff members in their interactive learning
process. Our project would evaluate the existing thin-client network and investigate the
feasibility of different COTS packages, such as the Citrix MetaFrame solution, to verify
whether the capabilities of the system satisfy the requirements of the client.
Our final COTS recommendation to the client would be a report of different prototypes (as per
the client’s request) on the basis of the different tests performed by the team on the system. The
team would study and evaluate the market trend analysis, the cost benefit analysis of the different
prototypes and then present a report showing the degree to which the capabilities of the
implemented system satisfy the requirements of the client.


In order to reduce the probability of failure, we need to summarize the major constraints on the
project, which are given as follows:
      Limited Budget: The customer can afford a budget of 3,500 USD per year for server
       maintenance. The budget for capital investment has not yet been allocated.

      Limited Time: The project must be completed within 12 weeks.
      Legacy servers and thin-client solution: The proposed solution should integrate properly
       with the existing TC-95 and TC-96 servers and the Wyse thin client solutions.


The prioritized capabilities of the system are as follows:
      Thin client multimedia capability
The proposed thin client network solution should be capable of handling high-end Flash-based
multimedia intensive applications that will allow the students to work simultaneously on the
Wayang Outpost tutorial (a geometry tutoring application hosted by ISI).
      Integrated network solution
The system should integrate the existing thin client network with the proposed COTS solution so
that applications like Accelerated reader and Renaissance Place would be available to all users
across the network.







      Network Load balancing
Network Load Balancing services will enable multiple dual-processor servers to be configured as
a logical group that balances user sessions across the servers and dynamically routes each user
to the server that is least busy.
      Backup and recovery services
The system should be able to provide backup and recovery services that will ease the system
administrator’s work in case of a system crash or failure. All records of students and faculty
shall be either archived to a backup system or deleted from the system after the last day of the
academic year.
      Symmetric multiprocessing enabled (SMP)
Upgrading the existing server to dual processors (SMP) will make use of hardware load
balancing, which in turn, will increase the throughput of the user terminals.
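The least-busy routing behavior described under Network Load Balancing above can be sketched minimally as follows. The server names and session counts are hypothetical illustrations, not part of the assessed COTS products:

```python
# Minimal sketch of least-busy session routing across a logical server group.
# Server names and session counts below are hypothetical, for illustration only.

def route_session(servers):
    """Return the name of the server with the fewest active sessions."""
    return min(servers, key=servers.get)

# Hypothetical group of dual-processor servers with current session counts.
group = {"TC95": 18, "TC96": 11, "TC97": 14}

target = route_session(group)
group[target] += 1  # the new user session is assigned to the least-busy server
```

An actual Network Load Balancing service would track real session state and server health, but the dispatch decision reduces to this least-busy selection.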





 3.2 Assessment Objectives and Approach
This section briefly summarizes the COTS assessment objectives and approach from section 2.1
and section 4 in CAP.

       3.2.1       Assessment Objectives
A list of high-level system attributes is derived from the client’s objectives, constraints, and
priorities to serve as the evaluation criteria.
The list is categorized into two parts, one for the server and one for the client, and is ordered by
importance as indicated by the evaluation weights in the table below, agreed upon by both the
client and the evaluators. Each attribute is assigned a weight based on the results of the
WinWin negotiation and client meetings.
The first part of the list summarizes the evaluation attributes for the server as in the table below:

 Identifier   Name Of Attribute        Weight      Rationale
 AT-S01       Performance              270         The performance factor covers the following
                                                   attributes:
                                                   - System login time (Acceptable time is less
                                                   than 30 seconds for each user)
                                                   - Number of concurrent users (minimum
                                                   acceptable number of users is 35-36)
                                                   - System response time for regular applications
                                                   - System response time for multimedia
                                                   intensive applications
 AT-S02       Cost                     150         The cost is determined by the overall cost of
                                                   the implementation, including the initial and
                                                   future hardware upgrades and maintenance of
                                                   the system. We will provide cost estimation for
                                                   each system prototype that we would propose
                                                   to meet the expectations.
 AT-S03       Intercomponent           140         A list of applications provided by the client
              Compatibility                        should be installed on the server and should be
                                                   made available to all the terminal clients. The
                                                   applications include the following:
                                                   - Wayang Outpost (web-based)
                                                   - MS Office Suite
                                                   - Renaissance Place
                                                   - Choices





                                                Each application will be tested on the server
                                                and on the terminal clients to measure its
                                                compatibility with the system.
AT-S04     Interoperability         130         Interoperability is measured by the following
                                                network services:
                                                - Authentication service
                                                - Application processing
                                                - Active Directory service
                                                - Print service
                                                - User profile/folder
                                                Each service will be tested on how the
                                                information is exchanged between the server
                                                and client on the network.
AT-S05     Vendor Support           120         The vendor support metrics are measured by
                                                the degree of support of the following
                                                attributes:
                                                - Response time for critical problems
                                                - Remote assistance
                                                - Hardware support
                                                - Software upgrades
                                                - Warranty
                                                The possible sources for evaluating the
                                                attributes include vendor communication via
                                                phone and email and overall customer
                                                satisfaction rate, if available.
AT-S06     Security                 110         The security metrics are measured on the user
                                                privilege capability.
                                                The system should be capable of handling and
                                                assigning different access rights as per the
                                                different types of user access.
 AT-S07     Flexibility              100         The flexibility attribute is measured by the
                                                system’s upgradeability as well as its
                                                downward compatibility with older hardware
                                                components. Since the hardware is not
                                                available for testing, we will rely on the
                                                information provided by the COTS vendor as
                                                well as the hardware specification data.
                         Table 1 High-level Server Assessment Attributes




The table below summarizes the assessment activities for the server COTS product:

 Identifier Name Of Attribute         Assessment Method
 AT-S01      Performance              Each performance attribute will be tested by having
                                      users log in to the system and run different applications.
                                      The evaluator will record the results at different
                                      concurrent user levels.
 AT-S02      Cost                     A list of quotes should be provided by the vendor to show
                                      the estimation for the initial cost as well as the future cost,
                                      depending on the different prototype given in the project.
 AT-S03      Intercomponent           The system will be tested against the required applications
             Compatibility            by first running each application independently and then
                                      running the different applications simultaneously. The
                                      process will be monitored and the result will be recorded
                                      by the evaluators.
 AT-S04      Interoperability         We will test all the network services using the client
                                      terminals to determine if all the communications between
                                      the server and clients are properly established. We will
                                      also test the availability of various kinds of network
                                      services as part of the assessment activity.
 AT-S05      Vendor Support           We can evaluate the vendor support level by gathering the
                                      information about their customers’ satisfaction levels in
                                      the market. Since Tangent is a big vendor there will be
                                      many reviews available about their products and services.
 AT-S06      Security                 The security attribute can be evaluated by checking
                                      whether the users are assigned proper rights based on their
                                      roles. The individual profiles and folders should be kept
                                      private from the other users. Also certain password rules
                                      should be enforced on user accounts to ensure the security
                                      level.
 AT-S07      Flexibility              Flexibility is reflected by the different prototypes
                                      provided in the project. Each prototype will show the ease
                                      of upgrade of a particular solution, including the
                                      recommendations about the future upgrades. The
                                      downward compatibility will be determined by the
                                      benchmark on the market as well as the information
                                      provided by COTS vendor.
                                Table 2 Server Assessment Activities
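The AT-S01 method above records login times at different concurrent-user levels against the 30-second acceptance threshold from Table 1. A minimal sketch of how such results could be tabulated (the timing figures below are hypothetical placeholders, not measured values):

```python
# Sketch of tabulating AT-S01 login-time results against the 30-second
# acceptance threshold. All timing figures here are hypothetical.

ACCEPTABLE_LOGIN_SECONDS = 30

def evaluate_logins(results):
    """results maps a concurrent-user level to a list of observed login
    times in seconds; returns pass/fail per level based on the worst case."""
    summary = {}
    for users, times in results.items():
        worst = max(times)
        summary[users] = {
            "worst_login_s": worst,
            "passes": worst <= ACCEPTABLE_LOGIN_SECONDS,
        }
    return summary

# Hypothetical observations at three concurrency levels.
sample = {10: [8.2, 9.1], 35: [24.0, 28.5], 40: [29.0, 33.7]}
print(evaluate_logins(sample))
```

In this hypothetical data the 35-user level passes while the 40-user level fails, which is the kind of break point the evaluator would record.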





The second part of the list summarizes the evaluation attributes for the client as in the table
below:

 Identifier   Name Of Attribute       Weight       Rationale
 AT-C01       Cost                    150          The cost is determined by the overall cost of
                                                   the implementation, including the initial and
                                                   future hardware upgrades and maintenance of
                                                   the system. We will provide cost estimation for
                                                   each system prototype that we would propose
                                                   to meet the expectations.
 AT-C02       Performance             120          The performance for the client is determined
                                                   by the hardware specification and operating
                                                   system capability. Since the majority of
                                                   software applications are running on the server
                                                   machine, the performance of the client machine
                                                   is less important.
 AT-C03       Vendor Support          90           The vendor support metrics are measured by
                                                   the degree of support of the following
                                                   attributes:
                                                   - Hardware support
                                                   - Warranty
                                                   The possible sources for evaluating the
                                                   attributes include vendor communication via
                                                   phone and email and overall customer
                                                   satisfaction rate, if available.
 AT-C04       Flexibility             80           The flexibility attribute is measured by the
                                                   system’s upgradeability in order to adopt the
                                                   changes in the future.
                            Table 3 High-level Client Assessment Attributes





The table below summarizes the assessment activities for the client COTS product:

 Identifier Name Of Attribute         Assessment Method
 AT-C01      Cost                     A list of quotes should be provided by the vendor to show
                                      the estimation for the initial cost as well as the future cost,
                                      depending on the different prototype given in the project.
 AT-C02      Performance              We will evaluate the performance by comparing the
                                      hardware specification to see if it meets the requirements
                                      to handle all the required applications by the librarian.
 AT-C03      Vendor Support           We can evaluate the vendor support level by gathering the
                                      information about their customers’ satisfaction levels in
                                      the market. Since Tangent is a big vendor there will be
                                      many reviews available about their products and services.
 AT-C04      Flexibility              We will evaluate the flexibility by determining whether
                                      the client model is upgradeable in both hardware and
                                      software.
                                 Table 4 Client Assessment Activities





       3.2.2       Approach
Since our project involves both hardware and software COTS products, the evaluation is
separated into the hardware and software sections. For hardware assessment, each hardware
component is evaluated against the minimum system requirements of each software COTS
product to see if it meets all the required system specifications. For software assessment, each
COTS product is tested based on the evaluation criteria and weights agreed by both the
evaluation team and the client. The assessment activities are summarized in section 3.2.1. The
test results for each criterion will be on a scale from 1 to 10 (with 10 as the best), and the total of
the weighted result scores will be the final evaluation result for the COTS product.
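The weighted scoring described above can be sketched as follows, using the server weights from Table 1. The per-attribute scores below are hypothetical, not actual assessment results:

```python
# Sketch of the weighted scoring method: each criterion is scored 1-10
# (10 best), multiplied by its agreed weight, and the products are summed.
# The scores below are hypothetical placeholders.

def weighted_total(weights, scores):
    """Sum of weight * score over all attribute identifiers."""
    return sum(weights[a] * scores[a] for a in weights)

# Server weights as agreed with the client (Table 1).
server_weights = {"AT-S01": 270, "AT-S02": 150, "AT-S03": 140, "AT-S04": 130,
                  "AT-S05": 120, "AT-S06": 110, "AT-S07": 100}

# Hypothetical 1-10 scores for one COTS candidate.
hypothetical_scores = {"AT-S01": 8, "AT-S02": 6, "AT-S03": 9, "AT-S04": 7,
                       "AT-S05": 8, "AT-S06": 7, "AT-S07": 5}

print(weighted_total(server_weights, hypothetical_scores))  # 7460
```

The candidate with the highest weighted total becomes the final evaluation result for the COTS comparison.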



 Approach Adopted

[Figure: assessment workflow - the Preliminary Network Assessment and Practical Network
Assessment feed into the Review of the Client's Requirements and COTS Products Research;
COTS Risk Management leads to the Prototype Proposal and Prototype Feasibility Analysis
(Identify Risks); prototypes are presented for client feedback, finalized, and the finalized
prototype is submitted to the client.]

                                     Figure 1 Approach Adopted





       3.2.3       Network Assessment Summary
This section describes the test procedures, tests performed, and results observed during the
practical network assessment. During our visit to PHS we performed the following tests:
      Observe Classroom & Student Activities (15-25 Students)
       Measure CPU, Network, and Memory Utilization (Using Windows Task Manager) for
       the following software:
              MS Office Applications (Word, Excel, PowerPoint)
              Choices
              Algebra Tutorial
              Internet Explorer (Static Web Page Surfing)
              Internet Explorer (Flash Animation Page Testing)
                      www.macromedia.com
                      www.disney.com
              Windows Media Player
                      Streaming USC Lectures
      Simulate 11 Users Surfing Flash Intensive Web Sites
The following lists the results of our practical network test.
CPU Utilization per Thin Client
       MS Office Suite (1%)
       Choices (license expired, so we were not able to test)
       Algebra Tutorial (<1%)
       Internet Explorer (Static Pages www.usc.edu) (2-5%)
       Internet Explorer (Flash Intensive Page) (15-25%)
       Windows Media Player (Streaming Video) (2-5%)
Memory Utilization per Thin Client
       MS Office Suite ~25,000K (less than 1% of the total 3 Gigabytes)
       Choices (could not test)
       Internet Explorer (Static Pages) ~21,000K
       Internet Explorer (Flash Intensive Page) ~22,000K
       Windows Media Player ~17,000K
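The per-client figures above can be extrapolated into a rough server-load estimate for the target 40-client deployment. This is a simplifying sketch that treats the measured percentages as additive and ignores contention effects:

```python
# Rough extrapolation of the measured per-thin-client CPU utilization to a
# 40-client load, assuming utilization scales linearly (a simplifying
# assumption; contention and scheduling overhead are ignored).

def worst_case_cpu(per_client_pct, clients=40):
    """Estimated aggregate CPU demand (in percent of one CPU) if every
    client runs the same workload simultaneously."""
    return per_client_pct * clients

# Upper bounds of the ranges measured during the practical network test.
flash_pages = worst_case_cpu(25)   # Flash-intensive pages: 15-25% per client
static_pages = worst_case_cpu(5)   # static pages: 2-5% per client

print(flash_pages, static_pages)
```

Even as a crude estimate, this shows why Flash-intensive workloads (up to 1000% of one CPU at 40 clients) dominate the server sizing, while static browsing remains comparatively light.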





       3.2.4      Prototypes Proposed
This section describes the different prototypes proposed to improve the performance of the
current network at Pasadena High School. Our client, the school librarian, has asked us to
propose a new system that will support 40 plus thin clients, allowing students to work on
high-end multimedia interactive applications simultaneously. The system being evaluated
already exists and has been implemented by an outside vendor, Tangent Computers.
This system design, consisting of main servers and multiple thin client stations, will provide a
low cost solution that would facilitate easy deployment and necessitate minimal maintenance.
The client currently has a budget constraint, which will play an important role in the
consideration of the new system design. Prototyping will help achieve the shared vision of the
project's key stakeholders.
Prototype I:          Citrix Thin Client Solution
In this prototype, we are proposing a thin client network based on the Citrix solution. Citrix
MetaFrame will be loaded on the servers as well as the clients. MetaFrame includes multimedia
and animation technologies that will make interactive multimedia applications like Wayang
Outpost run faster on the thin clients. Also, the old legacy hardware (fat clients) and the
Wyse thin clients can be integrated into the proposed network solution.




[Figure: a Citrix MetaFrame server connected over the network to Windows Based Terminals,
MS Windows Based Clients, old x86 DOS ICA clients, and Web ICA clients.]
                          Figure 2 Prototype I – Citrix Thin Client Structure








                      Figure 3 Prototype I – Citrix Thin Client Network Diagram

The Citrix solution will give the client the following benefits:

   1. The Citrix Load Balancing Services will dynamically route each user to the Citrix
      server that is least busy. This will enable 40-plus users to access the high-end
      mathematical learning applications simultaneously.

   2. Unlike the Wyse thin client solution, this solution does not require any proprietary
      hardware, thereby increasing portability and scalability.

   3. Citrix MetaFrame will provide encrypted authentication for the users.
      The Secure ICA package allows for 40-, 56-, or 128-bit encryption.
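The least-busy routing in benefit 1 can be illustrated with a minimal sketch. This is only a model of the load-balancing idea; the function and session counts are hypothetical, not the actual Citrix API:

```python
def route_user(server_loads):
    """Return the index of the least-busy server (a toy model of the
    Citrix Load Balancing idea described above)."""
    return min(range(len(server_loads)), key=lambda i: server_loads[i])

# Simulate 42 users being routed across two MetaFrame servers.
loads = [0, 0]  # active sessions per server
for _ in range(42):
    target = route_user(loads)
    loads[target] += 1

print(loads)  # sessions end up evenly spread: [21, 21]
```

Because each new session always goes to the server with fewer active sessions, no single server is overloaded while the other sits idle.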





Prototype II, Phase I:        New TC Application Server




                      Figure 4 Prototype II, Phase I - New TC Application Server

This prototype plans a stepwise refinement of the current network and system infrastructure
at PHS. The main goal is to reuse all the existing legacy products and current servers at PHS and
to introduce a plan for gradually replacing the main servers and the multimedia-enabled thin clients.
It focuses on minimal system change while maximizing performance to meet the
client's requirements. The core change in this prototype is the introduction of a new server (TC97)
to serve as the new application server. TC97's specifications, a dual 3.02GHz
Pentium 4 Xeon processor with 4GB of RAM, will be sufficient to handle the current 40-plus
thin clients. The TC95 server will be reassigned the role of Active Directory server, storing all
student logins and providing authentication services. Active Directory treats each server as a
resource on the network and authenticates users based on their group policies. The TC96 server
will host the typing tutorial program and serve as a file server storing student personal folders.
The Accelerated Reader server will continue to host the Accelerated Reader application, and the
Wayang Outpost server will continue to host the Wayang Outpost application. TC96 will also
serve as a mirror server for TC95.
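The authentication flow described above can be modeled as a minimal sketch, assuming hypothetical usernames, passwords, and policy fields; none of these values come from the report:

```python
# Toy model of the flow above: the Active Directory server (TC95)
# validates credentials and returns the user's group policy.
# All names and policy fields here are illustrative placeholders.
USERS = {"student01": {"password": "s3cret", "group": "students"}}
GROUP_POLICY = {"students": {"can_install": False, "home_server": "TC96"}}

def authenticate(username, password):
    """Return the user's group policy on success, None otherwise."""
    user = USERS.get(username)
    if user is None or user["password"] != password:
        return None
    return GROUP_POLICY[user["group"]]

policy = authenticate("student01", "s3cret")
print(policy["home_server"])  # prints TC96: student folders live there
```

A real deployment would query Active Directory over the network rather than an in-memory dictionary, but the success/failure and policy-lookup structure is the same.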




The benefits of this scenario are:
       Thin clients in the PC lab and library can access all the software from the TC97
        server, and authentication takes place in one location only.
       Since Active Directory is handled in one place, the librarian need not manage a
        separate Active Directory server for TC96.
       Dell PCs can also access software on TC97, in addition to the Accelerated Reader
        application.
       A centralized application server and thin clients require minimal management from the
        client.






Prototype II, Phase II: New Thin Client Terminals




                      Figure 5 Prototype II, Phase II – New Thin Client Terminals

During Phase II of the second prototype, the legacy Wyse 3230LE terminals will be upgraded to
the more versatile Winterm V90, a 1GHz x86 processor-based thin client with support for 24-bit
color display. The Winterm V90 has a built-in Internet Explorer and can therefore process
web-based applications locally on the thin client.
The benefits of Phase II are:
       Web-based applications can be processed by the thin clients (Wyse Winterm V90)
        independently, so they need not depend on the TC97 server for CPU cycles.
       The V90 can be fully integrated with the older Wyse thin clients, which in turn allows
        the network to run a mixed mode of thin clients.
       It is fully scalable, so the client can choose to replace or upgrade the terminals
        whenever required.







4. Assessment Results
This section presents the assessment scenarios and the corresponding assessment results for
the Server and Client COTS products, organized around the critical assessment issues.
The critical assessment issues are reflected in the choice of assessment criteria, weights, and
rating scales (refer to Section 2.1.2 of the CAP).
Part 1 of the assessment results covers the Server COTS product for the Phase I implementation,
as discussed in the prototype in Section 3.2.4.
Part 2 of the assessment results covers the Client COTS products for the Phase II
implementation, as discussed in the prototype in Section 3.2.4.

 4.1 Assessment Results-Part 1 [Server]
This section describes the assessment and test results for the server COTS product.

       4.1.1       COTS Assessed
The COTS product assessed is:
 COTS Product                    Web Address                Description
 Tangent Pillar™ 2750s           http://www.tangent.co      Tangent Pillar 2750s, with a newly
                                 m/products/gen/servers     designed rackmount or tower
                                 /2750s.htm                 chassis, delivers the highest levels
                                                            of power, performance, scalability
                                                            and reliability with dual Intel®
                                                            Xeon™ processors at 3.20GHz with
                                                            533MHz system bus. It supports up
                                                            to 12GB ECC DDR266 SDRAM
                                                            for improved performance.
                                                            Standard features include three
                                                            PCI-X expansion slots, seven hot-
                                                            swap SCSI hard drives with dual
                                                            channel Ultra160 SCSI, and one
                                                            Gigabit and one 10/100 Ethernet
                                                            ports. Onboard PC health
                                                            monitoring proactively helps assure
                                                            continuous operation while optional
                                                            UPS and backup peripherals are
                                                            added assurances of system
                                                            integrity.
                                   Table 5 Server COTS Assessed





       4.1.2       Evaluation Criteria
The set of evaluation criteria chosen for the Server COTS product is shown in the following
table. The last column presents the corresponding weight assigned based on discussion between
the client and the team members. The evaluation weights indicate the importance the client
attaches to each attribute of the system.

 No            Evaluation Criteria – COTS attributes                      Weight
 AT-S01        Performance                                                270
 AT-S02        Cost                                                       150
 AT-S03        Intercomponent Compatibility                               140
 AT-S04        Interoperability                                           130
 AT-S05        Vendor Support                                             120
 AT-S06        Security                                                   110
 AT-S07        Flexibility                                                100
                                    Table 6 Server Evaluation Criteria
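The weights in Table 6 can be combined with per-criterion ratings into a single score. The sketch below assumes a simple normalized weighted average with placeholder ratings on a 0-10 scale; the report's actual scoring procedure may differ:

```python
# Weights taken directly from Table 6; ratings are placeholders,
# not assessment results.
WEIGHTS = {
    "Performance": 270, "Cost": 150, "Intercomponent Compatibility": 140,
    "Interoperability": 130, "Vendor Support": 120,
    "Security": 110, "Flexibility": 100,
}

def weighted_score(ratings):
    """Normalized weighted average of per-criterion ratings (0-10)."""
    total_weight = sum(WEIGHTS.values())  # 1020 for Table 6
    return sum(WEIGHTS[c] * r for c, r in ratings.items()) / total_weight

sample = {c: 8 for c in WEIGHTS}  # uniform rating of 8 as an example
print(weighted_score(sample))     # a uniform rating yields 8.0
```

Normalizing by the total weight keeps the final score on the same 0-10 scale as the individual ratings, which makes products directly comparable.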


The following tables break down each criterion in the table above into finer-grained features
in order to obtain a better measure of each criterion of the COTS product.
Table 7 below breaks down the Performance (AT-S01) criterion as follows:

Weight         Features
80             Number of concurrent users
70             System response time (multimedia applications)
65             System login time
35             System response time (regular applications)
20             Network bandwidth
                                  Table 7 Server Performance Attributes





Table 8 below breaks down the Cost (AT-S02) criterion as follows:

 Weight        Features
 60            Initial purchase cost
 50            Upgrade cost
 40            Maintenance cost
                                     Table 8 Server Cost Attributes
Table 9 below breaks down the Intercomponent Compatibility (AT-S03) criterion as follows:

 Weight        Features
 50            Wayang Outpost
 45            Renaissance Place
 30            MS Office Suite
 15            Choice
                        Table 9 Server Intercomponent Compatibility Attributes
Table 10 below breaks down the Interoperability (AT-S04) criterion as follows:

 Weight        Features
 30            Authentication
 30            Application processing
 30            Active Directory service
 20            Print service
 20            User profile/folder
                               Table 10 Server Interoperability Attributes
Table 11 below breaks down the Vendor Support (AT-S05) criterion as follows:

 Weight        Features
 40            Response time for critical problems
 30            Remote assistance
 20            Hardware support
 20            Software upgrades
 10            Warranty
                               Table 11 Server Vendor Support Attributes





Table 12 below breaks down the Security (AT-S06) criterion as follows:

 Weight        Features
 110           User privileges
                                 Table 12 Server Security Attributes
Table 13 below breaks down the Flexibility (AT-S07) criterion as follows:

 Weight        Features
 40            Downward compatibility
 60            Upgradeability
                                 Table 13 Server Flexibility Attributes






       4.1.3       Test Procedure
This section documents the detailed evaluation process, stating the test set-up, the test
procedures used, and their corresponding results.

                      4.1.3.1            Test Identification
The COTS product that is going to be tested is:
       Tangent Pillar™ 2750s
The server COTS evaluation has 23 test procedures to date, which will be performed by
three team members; the results of these COTS evaluation tests will serve as the basis for our
recommendation to the client.

                      4.1.3.2            Test Preparation

The following sections list the requirements and preparations needed to complete the
testing procedures.

4.1.3.2.1   Hardware Preparation

The following table lists the hardware requirements of the server COTS product:

 ID            COTS             Hardware Requirements            Met or not
               Product
               Model
 HREQ-S1       Tangent          1. Tangent Pillar™ 2750s         1. No. The new server has not
               Pillar™ 2750s       server as the application        been delivered as of the
                                   server                           LCA Package
                                2. One server as the             2. Yes. Currently TC95 is the
                                   authentication server            application and
                                3. One server as the backup,        authentication server in
                                   folder storage server            PHS
                                4. 40+ thin-client terminals     3. Yes. Currently TC96 is the
                                   connected with the server        folder storage server in
                                   with internet connection         PHS
                                                                 4. Yes. PHS currently has 40
                                                                    thin-client terminals
                                                                    connected to TC95
                      Table 14 Hardware Preparation for Server COTS Product






4.1.3.2.2   Software Preparation

The following table lists the software requirements of the server COTS product:

 ID            COTS Product     Software Requirements          Met or not
               Model
 SREQ-S1       Tangent          1. Windows 2003 Server or      1. Yes. Tangent Pillar™ 2750s
               Pillar™ 2750s       Windows 2000 Server            supports both Windows 2003 and
                                   operating system               2000 Server, as per the information
                                2. MS Office Suite                provided on the vendor's website
                                3. Choices                        (http://www.tangent.com/products/
                                4. Renaissance Place              gen/servers/2750s.htm)
                                                               2. Yes. PHS has licensed MS Office
                                                                  Suite
                                                               3. No. The current license for the
                                                                  Choice software has expired
                                                               4. Yes. PHS has licensed
                                                                  Renaissance Place on the AR server
                       Table 15 Software Preparation for Server COTS Product

4.1.3.2.3   Other Pre-test Preparations

The following table lists other pre-test requirements of the server COTS product:

 ID          COTS Product       Pre-test Preparations             Met or not
             Model
 PREP-S1     Tangent            The testers need to have their    Yes. The librarian at PHS has
             Pillar™ 2750s      authenticated username and        provided usernames and
                                password set-up for access to     passwords to both the
                                all the servers and thin-client   administrator and the regular
                                terminals                         users to gain access to the
                                                                  system
                        Table 16 Other Preparation for Server COTS Product






                      4.1.3.3            Test Procedure Specifications

This section provides the detailed test procedures carried out by the testers in order to rate each
evaluation criterion.
    The testing procedure adopted for evaluating COTS attributes AT-S01 through AT-S06 is
       black-box testing. The black-box testing techniques used widely in the test process are
       equivalence partitioning and boundary value analysis. By applying these black-box
       techniques, we derived the set of test cases presented below.
For attribute AT-S07 there are no detailed test procedure specifications; the rationale is
explained in Section 4.1.3.3.1 of this document.






4.1.3.3.1      Rationale for Omitted Test Procedures

In the case of the Flexibility attribute (AT-S07):
Downward compatibility of the Server COTS pertains to its compatibility with earlier versions
of itself.
The new server's processor is an Intel Xeon, part of the Intel x86 family, and is thus downward
compatible with software targeting earlier Intel chips. The server architecture is also standard
rather than proprietary.
Test cases are not applicable here, since the downward compatibility of a future product cannot
be verified directly; we must instead refer to the specifications and performance results given by
the vendor.
Reference:
http://www.intel.com/performance/desktop/platform_technologies/em64t.htm
Rating: 9/10


Upgradeability:
         Hardware                Current Configuration of              Maximum possible
                                  Tangent Pillar 2750s                   configuration
            Memory                          4 GB                              12 GB
            (DIMM)
             HDD                      120 GB + 36 GB                       4 x 146 GB
                         Table 17 Server Flexibility Attribute Result Rationale
Thus, we see from the above table that the current server configuration can easily be upgraded in
terms of memory and HDD capacity in order to increase its computational capability.
The server also provides enough expansion slots and bays for future expansion:
One PCI-Express slot (x4)
One 64-bit/133MHz PCI-X slot
One 64-bit/100MHz PCI-X slot
One 32-bit/33MHz PCI slot
Rating: 8/10






4.1.3.3.2      Test Procedures

The following tables indicate the test procedures for the performance attribute (AT-S01):

 Test Case:                1-1
 Identifier:               AT-S01-1
 Test Items:               Number of concurrent users
 Test Description:         This test verifies the capability of the system to support
                           simultaneous logins by all 40 users
 Pre-Conditions:           The following functions will be performed before the test:
                              1. All the terminals must be connected to the network and the
                                  network and the server should be up and running.
                              2. The user accounts for all 40 users are created.
                              3. All the 40 users are provided with their usernames and
                                  password.
 Post-Conditions:          The system should be able to support all the users who are trying to
                           logon to the network without crashing.
 Input Specifications:     All the 40 users enter their Username and Password.
 Expected Output           The users get logged in to the system simultaneously.
 Specifications:
 Pass / Fail Criteria:     The test is successful if all 40 users are able to simultaneously log in
                           to the system without the server crashing or freezing.
                           The test is a failure if the server crashes while all the 40 users are
                           trying to login to the system.
 Test process:             As the terminal boots up and displays the login screen
                               1. Enter user name.
                               2. Enter the password.
                               3. Click [OK] when finished entering data.
                               4. Login to PHS_LIB domain
                                 As the users login to their terminals, monitor how many users
                                 can be logged in at the same time before the system goes down.
 Assumptions and               1. The Pre-Conditions are fulfilled.
 Constraints:
 Dependencies:             None
 Traceability:             LOS-1
                          Table 18 Server Test Procedure Specification 1-1
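The concurrent-login scenario in test case 1-1 can be approximated in a small harness. This is a sketch only: `login()` is a hypothetical stand-in, since the real test drives the actual thin client terminals:

```python
import threading

# Illustrative harness for test case 1-1: fire 40 simultaneous logins
# and count successes. login() is a placeholder for the real terminal
# login against the PHS_LIB domain.
successes = []
lock = threading.Lock()

def login(username, password):
    # Placeholder: pretend every well-formed login succeeds.
    return bool(username and password)

def worker(n):
    if login(f"user{n:02d}", "password"):
        with lock:  # protect the shared list from concurrent appends
            successes.append(n)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(40)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Pass criterion from Table 18: all 40 users logged in, no crash.
print(len(successes))  # prints 40
```

A real run would replace `login()` with the actual login action and record whether the server stays up while all 40 sessions are established.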







Test Case:               1-2
Identifier:              AT-S01-2
Test Items:              System response time (multimedia applications)
Test Case:               1-2
Test Description:        This tests the system's response time (i.e. how long it takes on
                         average to open an instance of the multimedia application on a
                         terminal) for the Wayang Outpost application.
                         This test is performed with different levels of concurrency, i.e.
                         with different numbers of users running the multimedia application
                         simultaneously.
Pre-Conditions:          The following functions will be performed before the test:
                            1. All 40 users are logged into the system
                            2. The Wayang Outpost application is accessible to all users.
Post-Conditions:         All users are able to run the application.
Input Specifications:    1. The user enters his username and password to login to the system.
                         2. The user opens up an instance of the Wayang Outpost application
Expected Output          The Wayang Outpost application opens up in all 40 terminals.
Specifications:
Pass / Fail Criteria:    The test is successful if, on average, the users are able to open the
                         Wayang Outpost application within 15 seconds.
                         The test fails if, on average, it takes more than 15 seconds to open
                         an instance of the Wayang Outpost application.
Test process:            After users are logged in to the system:
                            1. Each user starts an instance of Wayang Outpost application
                                on their terminal.
                            2. We will monitor and record the system response time for the
                                application at different level of concurrent users.
Assumptions and             1. The Pre-Conditions are fulfilled.
Constraints:                2. All users will perform about the same amount of work on the
                               system.
Dependencies:            None
Traceability:            CAP-1 & LOS-1
                         Table 19 Server Test Procedure Specification 1-2
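The measurement loop in test case 1-2 can be sketched as follows. `launch_wayang()` is a hypothetical stand-in for opening the real application; in the sketch it just sleeps briefly:

```python
import time

# Sketch of test case 1-2: time how long the application takes to open
# and check the average against the 15-second pass/fail threshold.
THRESHOLD_SECONDS = 15.0

def launch_wayang():
    time.sleep(0.01)  # placeholder for opening the real application

def timed_launch():
    start = time.perf_counter()
    launch_wayang()
    return time.perf_counter() - start

samples = [timed_launch() for _ in range(5)]
average = sum(samples) / len(samples)
print("PASS" if average <= THRESHOLD_SECONDS else "FAIL")
```

In the real test the samples would come from the 40 terminals at different levels of concurrency, and the average would be recorded per concurrency level.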







Test Case:               1-3
Identifier:              AT-S01-3
Test Items:              System login time.
Test Description:        This test measures the time required for users to log in to the system.
                         This test is performed with different levels of concurrency, i.e.
                         with different numbers of users trying to log in to the system
                         simultaneously.
Pre-Conditions:          The following functions will be performed before the test:
                               1. All the terminals must be connected to the network and the
                               network should be up and running.
                               2. The username and password for all the 40 users are present in
                               the database.
                               3. All the 40 users are provided with their user name and
                               password
Post-Conditions:         All 40 users get logged into the system
Input Specifications:    The user enters his username and password
Expected Output          The user gets logged into the system.
Specifications:
Pass / Fail Criteria:    The test is successful if it takes no more than 30 seconds for a
                         single user to log in, and no more than 5 minutes for all 40 users to
                         log in.
                         The test fails if either of these time limits is exceeded.
Test process:            As the terminal boots up and displays the login screen:
                               1. Enter the user name.
                               2. Enter the password.
                               3. Click [OK] when finished entering data.
                               As the users log in to their terminals, monitor the time it takes
                               for a single user to log in to the system and then for all 40
                               users to log in.
Assumptions and                1. The Pre-Conditions are fulfilled.
Constraints:
Dependencies:            None
Traceability:            LOS-4
                         Table 20 Server Test Procedure Specification 1-3
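The two time limits in test case 1-3 can be expressed as a simple check. The measured times below are illustrative placeholders, not recorded results:

```python
# Sketch of the pass/fail check for test case 1-3: no single login may
# exceed 30 seconds, and all 40 logins together must finish within
# 5 minutes of wall-clock time.
PER_USER_LIMIT = 30.0
TOTAL_LIMIT = 5 * 60.0

def login_test_passes(per_user_times, total_elapsed):
    """per_user_times: seconds per login; total_elapsed: wall-clock
    seconds for the whole batch of 40 logins."""
    return (max(per_user_times) <= PER_USER_LIMIT
            and total_elapsed <= TOTAL_LIMIT)

measured = [6.5] * 40          # hypothetical: each login took 6.5 s
print(login_test_passes(measured, 240.0))  # prints True
```

Separating the per-user times from the total elapsed time matters because logins overlap: the batch can finish well before the sum of the individual times.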







Test Case:               1-4
Identifier:              AT-S01-4
Test Items:              System response time (regular applications)
Test Description:        This tests the system's response time (i.e. how long it takes on
                         average to open an instance of the application on a terminal) for
                         regular applications such as web browsers and media players.
                         This test is performed with different levels of concurrency, i.e.
                         with different numbers of users running various applications
                         simultaneously.
Pre-Conditions:          The following functions will be performed before the test:
                               1. All the terminals must be connected to the network, and the
                               network should be up and running.
                               2. The usernames and passwords for all 40 users are present in
                               the database.
                               3. All 40 users are provided with their user name and password.
                               4. The users are given access rights to the various applications.
Post-Conditions:         All 40 users are able to open up all desired applications (regular
                         applications) on their terminal.
Input Specifications:    1. The users enter the username and password to login to the system.
                         2. The user opens up any of the regular applications.
Expected Output          The various regular application desired by all 40 users open up on
Specifications:          their respective terminals.
Pass / Fail Criteria:    The test is successful if, on average, the users are able to open an
                         instance of a regular application within 10 seconds.
                         The test fails if, on average, it takes more than 10 seconds to open
                         an instance of a regular application.
Test process:            After users are logged in to the system:
                               1. Each user starts an instance of various applications on their
                               terminal.
                               2. We will monitor and record the system response time for the
                               different types of applications at different levels of
                               concurrency.




Assumptions and             1. The Pre-Conditions are fulfilled.
Constraints:                2. All users will perform about the same amount of work on the
                               system.
                            3. The users are not running any other high-end applications in
                               the background.
Dependencies:            None
Traceability:            LOS-1
                         Table 21 Server Test Procedure Specification 1-4







Test Case:               1-5
Identifier:              AT-S01-5
Test Items:              Network bandwidth
Test Description:        This test case evaluates the network bandwidth being used by each
                         thin client in the network while running both high-end multimedia
                         and regular applications.
Pre-Conditions:          The following functions will be performed before the test:
                               1. All the terminals must be connected to the network and the
                               network should be up and running.
                               2. The usernames and passwords for all 40 users are present in
                               the database.
                               3. All 40 users are provided with their user name and password.
                               4. The users are given access rights to the various applications.
Post-Conditions:         The server is still up and running even after all 40 users run different
                         multimedia and regular applications on their terminals.
Input Specifications:    1. The users enter the username and password to login to the system.
                         2. The user opens up an instance of all the different multimedia and
                         regular applications.
Expected Output          All different applications are opened up.
Specifications:
Pass / Fail Criteria:    The test is successful if, on average, the bandwidth consumed by
                         each client is equal to or less than 2 Mbps.
                         The test fails if each client consumes more than 2 Mbps of network
                         bandwidth.
Test process:            After all users are logged in to the system:
                               1. Each user starts an instance of various applications (both
                               high-end multimedia and regular applications) on their terminal.
                               2. Once the users have finished launching all the applications,
                               we will monitor and record the bandwidth usage of all clients
                               and calculate an average.

Assumptions and                1. The Pre-Conditions are fulfilled.
Constraints:                   2. All users will perform about the same amount of work on
                                  the system, i.e. there are no applications running in the
                                  background.
Dependencies:            None
Traceability:            NA
                         Table 22 Server Test Procedure Specification 1-5
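The pass/fail evaluation in the procedure above reduces to averaging the recorded per-client bandwidth and comparing it against the 2 Mbps threshold from the criteria. A minimal sketch (the threshold comes from the table; the sample readings are hypothetical):

```python
# Sketch of the pass/fail evaluation for test 1-5: average the recorded
# per-client bandwidth usage and compare it against the 2 Mbps threshold
# stated in the pass/fail criteria. The readings below are hypothetical.

THRESHOLD_MBPS = 2.0  # per-client limit from the pass/fail criteria

def evaluate_bandwidth_test(usage_mbps):
    """Return (average, passed) for a list of per-client bandwidth readings."""
    average = sum(usage_mbps) / len(usage_mbps)
    return average, average <= THRESHOLD_MBPS

# Example: 40 clients, most running regular applications, a few running
# high-end multimedia applications (values are placeholders).
readings = [1.5] * 35 + [3.0] * 5
avg, passed = evaluate_bandwidth_test(readings)
print(f"average = {avg:.3f} Mbps, pass = {passed}")
```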





The following tables indicate the test procedures for the cost attribute (AT-S02):

 Test Case:                2-1
 Identifier:               AT-S02-1
 Test Items:               Initial Purchase Cost
 Test Description:         This test will compare the initial cost of ownership of the server
                           against the client’s expected cost, as per the client’s budget
 Pre-conditions:           The following pre-conditions must be met before the test can be
                           performed:
                                  1. COTS product must be available from COTS vendor
                                  2. COTS vendor must be able to supply price information
                                      for COTS product.
                                  3. A communication channel (phone number or email
                                      address) must be established between the customer and
                                      the COTS vendor prior to the test.
                                  4. Customer needs to set an expected price range for
                                      purchasing COTS items.
 Post-conditions:          The following will be performed after the test procedure:
                                  1. Obtain a price quote for 1 unit of the Server COTS
                                     product.
 Input Specifications:     The following inputs are required to perform the test:
                                  1. Customer calls or emails COTS vendor indicating her
                                     interest in purchasing a COTS product.
                                  2. Customer supplies the part number to COTS vendor for
                                     product lookup.
 Expected Output           The following information is the expected output:
 Specifications:                  1. COTS vendor will supply price quote (in US dollar
                                     amounts)
                                  2. COTS items will be assigned High, Medium, Low status.
 Pass/Fail Criteria:       The following information is the pass/fail criteria for testing:
                                  1. COTS item availability (in stock, back order,
                                     discontinued, etc.)
                                  2. COTS item price falls within the initial cost price range
                                     set by the customer
 Test Process:             The following lists the test process:
                                  1. Customer contacts COTS vendor via phone or email.
                                  2. Customer checks the availability of COTS items with
                                     COTS vendor.
                                  3. Customer requests a price quote for one unit of COTS item.
                                  4. Customer compares price against initial estimated cost
                                     price.






Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. COTS vendor is willing to disclose pricing information
                                2. Pricing information for COTS item must not change
                                   dramatically during the duration of the test.
                                3. Final price quote must not exceed the maximum
                                   expected price range set by the customer.
Dependencies:            None
Traceability:            NA
                         Table 23 Server Test Procedure Specification 2-1
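Step 4 of the test process, together with the High/Medium/Low status assignment in the expected output, can be sketched as a small helper. The band boundaries below are hypothetical assumptions; the table only requires the quoted price to fall within the customer's expected price range:

```python
# Sketch of the initial-cost evaluation for test 2-1. The High/Medium/Low
# band boundaries (thirds of the budget range) are hypothetical; the table
# only requires the quote to fall within the customer's expected range.

def assess_quote(quote, budget_min, budget_max):
    """Classify a vendor quote against the customer's expected price range."""
    if not (budget_min <= quote <= budget_max):
        return "fail", None
    position = (quote - budget_min) / (budget_max - budget_min)
    if position < 1 / 3:
        status = "Low"
    elif position < 2 / 3:
        status = "Medium"
    else:
        status = "High"
    return "pass", status

# Example with a hypothetical quote and budget range (US dollars).
result, status = assess_quote(quote=1200.0, budget_min=900.0, budget_max=1500.0)
print(result, status)
```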







Test Case:               2-2
Identifier:              AT-S02-2
Test Items:              Upgrade Cost
Test Description:        This test will compare the upgrade cost of the server against the
                         client’s expected upgrade cost, as per the client’s budget
Pre-conditions:          The following pre-conditions must be met before the following test
                         can be performed:
                                1. COTS product upgrade parts must be available from
                                    COTS vendor.
                                2. COTS vendor must be able to supply price information
                                    for COTS product upgrade parts.
                                3. A communication channel (phone number or email
                                    address) must be established between the customer and
                                    the COTS vendor prior to the test.
                                4. Customer needs to set an expected price range for
                                    purchasing COTS items upgrade.
                                5. COTS upgrades must be compatible with current COTS
                                    product.
                                6. Current COTS product evaluated must be able to have
                                    support for upgrades (software or hardware).
Post-conditions:         The following will be performed after the test procedure:
                                1. Obtain a price quote for upgrade of 1 unit of a particular
                                   COTS product or module.
Input Specifications:    The following inputs are required to perform the test:
                                1. Customer calls or emails COTS vendor indicating
                                   interest in purchasing a COTS item upgrade.
                                2. Customer supplies the part number to COTS vendor for
                                   lookup of the product that is intended to be upgraded.
Expected Output          The following information is the expected output:
Specifications:                 1. COTS vendor will supply price quote (in US dollar
                                   amounts) for COTS upgrade.
                                2. COTS upgrade items will be assigned High, Medium,
                                   Low status.
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                1. COTS upgrade item availability (in stock, back order,
                                   discontinued, etc.)
                                2. COTS upgrade item price falls within the upgrade cost
                                   price range set by the customer.
                                3. COTS upgrade item compatibility with current COTS
                                   products.
Test Process:            The following lists the test process:



                                 1. Customer contacts COTS vendor via phone or email.
                                 2. Customer checks the availability of COTS upgrade items
                                    with COTS vendor, or whether the item has upgrade options
                                    available.
                                 3. Customer requests a price quote for upgrades of one unit
                                    of COTS item.
                                 4. Customer compares the upgrade price against the initial
                                    estimated upgrade cost price.


Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. COTS vendor is willing to disclose upgrade pricing
                                   information
                                2. Pricing information for COTS upgrade item must not
                                   change dramatically during the duration of the test.
                                3. Final upgrade price quote must not exceed the maximum
                                   expected price range set by the customer.
Dependencies:            COTS upgrades depend on the existing COTS products and must be
                         compatible with them.
Traceability:            NA
                         Table 24 Server Test Procedure Specification 2-2







Test Case:               2-3
Identifier:              AT-S02-3
Test Items:              Maintenance Cost
Test Description:        This test will compare the maintenance cost against the client’s
                         expected maintenance budget.
Pre-conditions:          The following pre-conditions must be met before the following test
                         can be performed:
                                1. COTS vendor must have a maintenance plan.
                                2. COTS vendor must be able to supply maintenance price
                                    information for COTS product.
                                3. A communication channel (phone number or email
                                    address) must be established between the customer and
                                    the COTS vendor prior to the test.
                                4. Customer needs to set an expected price range for
                                    COTS item maintenance.
Post-conditions:         The following will be performed after the test procedure:
                                1. Obtain a price quote for maintenance cost for COTS
                                   system for one academic year.


Input Specifications:    The following inputs are required to perform the test:
                                1. Customer calls or emails COTS vendor indicating
                                   interest in purchasing a one year contract for COTS item
                                   maintenance.
Expected Output          The following information is the expected output:
Specifications:                 1. COTS vendor will supply price quote (in US dollar
                                   amounts) for one year maintenance fee.
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                1. Availability of a maintenance plan from COTS vendor
                                   for COTS system.
                                2. COTS system maintenance price falls within the annual
                                   maintenance budget price range set by the customer.
                                3. Maintenance plan offers 24/7 customer phone and email
                                   support.
                                4. Maintenance plan offers on-site support, remote support,
                                   or both.
                                5. Customer support is provided in the US or is outsourced
                                   to other countries (for example, India).







Test Process:            The following lists the test process:
                                1. Customer contacts COTS vendor via phone or email.
                                2. Customer asks about the availability of a COTS
                                   maintenance plan.
                                3. Customer requests a price quote for system maintenance
                                   for one academic year.
                                4. Customer checks if the support plan offers 24/7 technical
                                   support.
                                5. Customer checks if the plan offers onsite and remote
                                   support.
                                6. Customer checks if support is done in-house or
                                   outsourced.
                                7. Customer compares support price against the annual
                                   budget dedicated to supporting COTS system.


Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. COTS vendor is willing to disclose maintenance pricing
                                   information.
                                2. Pricing information for COTS maintenance must not
                                   change dramatically during the duration of the test.
                                3. Final maintenance price quote must not exceed the
                                   maximum expected price range set by the customer.
Dependencies:            The following lists the dependencies for the test:
                                 1. COTS system support plan may depend upon the
                                 COTS vendor.
                                 2. COTS support plan may require signing a one-year
                                 contract.
Traceability:            PC-1
                         Table 25 Server Test Procedure Specification 2-3
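The pass/fail criteria of test 2-3 combine a budget check with several yes/no attributes of the vendor's maintenance plan. A sketch of that checklist (the field names and the example plan are hypothetical; the criteria themselves come from the table):

```python
# Sketch of the pass/fail checklist for test 2-3. Field names and the
# example plan are hypothetical placeholders; the criteria (plan
# availability, annual price within budget, 24/7 support, on-site
# support, support location) come from the table above.

def check_maintenance_plan(plan, annual_budget):
    """Return a dict mapping each criterion to True/False for a vendor plan."""
    return {
        "plan_available": plan.get("available", False),
        "within_budget": plan.get("annual_price", float("inf")) <= annual_budget,
        "support_24x7": plan.get("support_24x7", False),
        "onsite_support": plan.get("onsite_support", False),
        "us_based_support": plan.get("support_location") == "US",
    }

example_plan = {              # hypothetical vendor response
    "available": True,
    "annual_price": 4500.0,   # US dollars per academic year
    "support_24x7": True,
    "onsite_support": False,
    "support_location": "US",
}
print(check_maintenance_plan(example_plan, annual_budget=5000.0))
```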





The following tables indicate the test procedures for the intercomponent compatibility attribute
(AT-S03):

 Test Case:                3-1
 Identifier:               AT-S03-1
 Test Items:               Compatibility of Wayang Outpost to run in the system
                           environment
 Test Description:         This test will verify the compatibility of the system with Wayang
                           Outpost, a Flash-based web application
 Pre-Conditions:           The following conditions have to be met before we can test
                           the item:
                               1. Authentication server is up and running
                               2. Application server is up and running
                               3. Local network is running properly between server and
                                   client machines
                               4. Internet connection is available
                               5. User has logged into the client machine
                               6. Standard browser is available (IE, Mozilla, Netscape)
                               7. Wayang Outpost service website is available
 Post-Conditions:          User is able to use all the available services provided by
                           Wayang Outpost, including sound and animation interactions.
 Input Specifications:     The following tasks will be performed:
                              1. Sound check for the system
                                      The “sound check” button available on the login
                                       screen
                              2. Login to Wayang Outpost
                                      Enter username
                                      Enter password
                              3. Do geometry test
                                      Choose or enter correct answer
                                      Choose or enter incorrect answer
 Expected Output           The following outcome will be expected by the user:
 Specifications:              1. Sound check for the system
                                      A short audio clip will be played
                              2. Login to Wayang Outpost
                                      The welcome screen shows up after user logs in
                              3. Do geometry test




                                    Correct answer mark and time spent on the
                                     question is recorded
                                    Wrong answer mark is recorded and hint is given
Pass / Fail Criteria:    The following information is the pass/fail criteria for testing:
                            1. Sound check for the system
                                    User should be able to hear the audio from the audio
                                     device on the system (speaker or headset)
                            2. Login to Wayang Outpost
                                    User should be able to login to Wayang Outpost
                                     with valid username and password
                            3. Do geometry test
                                    User should receive correct marks in the record as
                                     well as the time he spent for question he answered
                                     correctly
                                    User should receive hint about the question and
                                     need to choose or enter another answer to proceed
Test process:               1. Click on the browser icon on desktop
                            2. Type in the URL for Wayang Outpost
                               (http://kulit.isi.edu/#)
                            3. Click on the “LOG IN” button
                            4. Click on “sound check” button
                            5. Click on “ok” to confirm the sound check result
                            6. Enter user name and password in the pop-up window
                               and click on “log in” button
                            7. Click on “nursery adventure” icon
                            8. Enter correct answer and click on “Next” button
                            9. Enter incorrect answer and click on “Next” button
Assumptions and          All users have access to the internet on the client machine
Constraints:             and will be provided a valid username and password
                         for the Wayang Outpost login
Dependencies:            None
Traceability:            CAP-1
                         Table 26 Server Test Procedure Specification 3-1







Test Case:               3-2
Identifier:              AT-S03-2
Test Items:              Compatibility of MS Office Suite to run in the system environment
Test Description:        This test will test the compatibility of the system with the MS
                         Office Suite, including the use of Word, Excel, and PowerPoint
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. Application server is up and running
                            3. Local network is running properly between server and client
                                machines
                            4. User has logged into the client machine
Post-Conditions:         User is able to use all the standard functions provided by MS Office
                         Suite, including creating a new document, open, edit and save an
                         existing document.
Input Specifications:    None
Expected Output          None
Specifications:
Pass / Fail Criteria:    None


Test process:            None
Assumptions and          Since all the COTS candidates chosen are Windows-based machines,
Constraints:             the compatibility of the MS Office Suite is not an issue and the
                         detailed test specification is omitted
Dependencies:            None
Traceability:            NA
                         Table 27 Server Test Procedure Specification 3-2







Test Case:               3-3
Identifier:              AT-S03-3
Test Items:              Compatibility of Renaissance Place (Accelerated Reader) to run in
                         the system environment
Test Description:        This test will test the compatibility of the system with
                         Renaissance Place
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. AR server is up and running
                            3. Local network is running properly between server and client
                                machines
                            4. User has logged into the client machine
Post-Conditions:         User is able to take quizzes and tests in Accelerated Reader based
                         on the assigned library books and receive a report on the reading
Input Specifications:    The following tasks will be performed:
                           1. Login to Accelerated Reader
                                   Enter username
                                   Enter password
                           2. Do reading test
                                   Enter or choose correct answer
                                   Enter or choose wrong answer
                           3. Generate report
                                   Use the print report button
Expected Output          The following outcomes are expected:
Specifications:            1. Login to Accelerated Reader
                                   Main menu will show up after login to Accelerated
                                    Reader
                           2. Do reading test
                                   Correct answer mark and time spent on the question is
                                    recorded
                                   Wrong answer mark and time spent on the question is
                                    recorded
                           3. Generate report
                                   Report generated for the user
Pass / Fail Criteria:    The following information is the pass/fail criteria for testing:





                            1. Login to Accelerated Reader
                                    User should be able to log in to Accelerated Reader with
                                     valid username and password
                            2. Do reading test
                                    User should receive correct marks in the record as well
                                     as the time spent on each question answered correctly
                                    User should receive wrong marks in the record as well as
                                     the time spent on each question answered incorrectly
                            3. Generate report
                                    User should be able to generate the report by using the
                                     print report button
Test process:               1. Click on the Accelerated Reader icon on desktop
                            2. Enter user name and password
                            3. Click on the “LOG IN” button
                            4. Click on “Reading Test” button
                            5. Enter correct answer and click “Next”
                            6. Enter incorrect answer and click “Next”
                            7. Click on “Finish” button
                            8. Click on “Print Report” button
Assumptions and          All users will be provided a valid username and password
Constraints:             for the Accelerated Reader login
Dependencies:            None
Traceability:            NA
                         Table 28 Server Test Procedure Specification 3-3
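The expected outputs of the reading test above amount to recording, per question, whether the answer was correct and the time spent, then generating a summary report. A minimal sketch of that record-keeping (the class and field names are hypothetical):

```python
# Sketch of the per-question record-keeping checked by test 3-3: each
# answer is recorded with its correctness and the time spent, and a
# simple report is generated at the end. All names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class QuizSession:
    answers: list = field(default_factory=list)

    def record_answer(self, correct, seconds_spent):
        """Record one answer's correctness and the time spent on it."""
        self.answers.append({"correct": correct, "seconds": seconds_spent})

    def report(self):
        """Summarize the session, as the 'print report' step would."""
        right = sum(1 for a in self.answers if a["correct"])
        total_time = sum(a["seconds"] for a in self.answers)
        return {"questions": len(self.answers), "correct": right,
                "total_seconds": total_time}

session = QuizSession()
session.record_answer(correct=True, seconds_spent=42)    # correct answer
session.record_answer(correct=False, seconds_spent=60)   # wrong answer
print(session.report())
```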







Test Case:               3-4
Identifier:              AT-S03-4
Test Items:              Compatibility of Choice to run in the system environment
Test Description:        This test will test the compatibility of the system with Choice
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. Application server is up and running
                            3. Local network is running properly between server and client
                                machines
                            4. User has logged into the client machine
                            5. Obtained proper license for the Choice application
Post-Conditions:         User is able to use all the available functions provided by Choice
Input Specifications:    None
Expected Output          None
Specifications:
Pass / Fail Criteria:    None


Test process:            None
Assumptions and          The current license for the Choice application has expired;
Constraints:             therefore, no testing can be done until the license is renewed
Dependencies:            None
Traceability:            NA
                         Table 29 Server Test Procedure Specification 3-4





The following tables indicate the test procedures for the interoperability attribute (AT-S04):

 Test Case:                 4-1
 Identifier:                AT-S04-1
 Test Items:                Interoperability of authentication process
 Test Description:          This test will verify that information can be exchanged
                            between the server and clients during the authentication process
 Pre-Conditions:            The following conditions have to be met before we can test the item:
                               1. Authentication server is up and running
                               2. Local network is running properly between server and client
                                   machines
                               3. User profile has been pre-entered by system administrator
 Post-Conditions:           User can login to the system
 Input Specifications:      User will enter the username and password to login to the system
 Expected Output            User will be able to login with valid username and password, or be
 Specifications:            prompted with error message to correct the information if username
                            and password are invalid.
 Pass / Fail Criteria:      User login information will be validated and the user can log in
                            to the system with a proper username and password
 Test process:                  1. Enter valid username and password and click ok
                                2. Enter invalid username and/or password and click ok
 Assumptions and            None
 Constraints:
 Dependencies:              None
 Traceability:              CAP-2
                           Table 30 Server Test Procedure Specification 4-1
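The login behavior exercised by test 4-1, accepting a valid username/password pair and otherwise prompting an error, can be sketched against profiles pre-entered by the system administrator. The profile store and messages below are hypothetical placeholders:

```python
# Sketch of the authentication check exercised by test 4-1: credentials
# are validated against user profiles pre-entered by the system
# administrator (pre-condition 3). The profile store and messages are
# hypothetical; a real server would not store plain-text passwords.

USER_PROFILES = {"student01": "s3cret", "teacher01": "chalk"}

def login(username, password):
    """Return (success, message) for a login attempt."""
    if USER_PROFILES.get(username) == password:
        return True, "login successful"
    return False, "invalid username or password, please try again"

print(login("student01", "s3cret"))   # valid credentials
print(login("student01", "wrong"))    # invalid credentials prompt an error
```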







Test Case:               4-2
Identifier:              AT-S04-2
Test Items:              Interoperability of application processing
Test Description:        This test will verify that information can be exchanged
                         between the server and clients when client machines request to
                         run an application hosted on the server
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. Application server is up and running
                            3. Local network is running properly between server and client
                                machines
                            4. User has logged into the client machine
Post-Conditions:         User has access to all the applications hosted on the server
Input Specifications:    User will run the applications hosted on the server from the client
                         machines by clicking the icons on the desktop
Expected Output          User can run all the applications
Specifications:
Pass / Fail Criteria:    User can run all the applications hosted on the server from the
                         client machine
Test process:                1. Click the desktop icon of each application hosted on the
                                server
                             2. Verify that each application opens and runs
Assumptions and          None
Constraints:
Dependencies:            None
Traceability:            NA
                         Table 31 Server Test Procedure Specification 4-2







Test Case:               4-3
Identifier:              AT-S04-3
Test Items:              Interoperability of Active Directory service
Test Description:        This test will verify that information can be exchanged
                         between the server and clients when client machines request any
                         resource on the network
Pre-Conditions:          The following conditions have to be met before we can test the item:
                            1. Authentication server is up and running
                            2. Local network is running properly between server and client
                                machines
                            3. User has logged into the client machine
Post-Conditions:         User has access to different network resources according to the user
                         access level assigned in the Active Directory database
Input Specifications:    The following network resources will be requested by the user:
                           1. Application server
                                   Open MS Office from desktop
                                   Open Choice from desktop
                           2. AR server
                                   Open Accelerated Reader from desktop
                           3. Print service
                                   Print a document
                           4. User profile/folder
                                   Open a file on desktop
                                   Save a file on desktop
Expected Output          The following outcome will be expected:
Specifications:            1. Application server
                                   Open MS Office from desktop
                                        i.        MS Office will open if user has the access right
                                       ii.        MS Office will not open if user does not have
                                                  the access right
                                   Open Choice from desktop
                                        i.        Choice will open if user has the access right
                                       ii.        Choice will not open if user does not have the
                                                  access right




                            2. AR server
                                    Open Accelerated Reader from desktop
                                       i.         Accelerated Reader will open if user has the
                                                  access right
                                      ii.         Accelerated Reader will not open if user does not
                                                  have the access right
                            3. Print service
                                    Print a document
                                         i.       User with access right can print out from a
                                                  network printer
                                         ii.      User without access right can’t print out from a
                                                  network printer
                            4. User profile/folder
                                    Open a file on desktop
                                            i.        User can open any available files on the desktop
                                    Save a file on desktop
                                            i.        User can save file if logged in as regular user
                                            ii.       User can’t save file if logged in as guest
Pass / Fail Criteria:    User will be granted access right to the network resources by the
                         Active Directory service according to the role in the Active
                         Directory profile.
Test Process:               1. Click on MS Word icon on the desktop
                            2. Click on Open icon in MS Word
                            3. Select a file from desktop to open
                            4. Click on Print icon in MS Word
                            5. Click on Save icon in MS Word
                            6. Click on Choice icon on the desktop
                            7. Click on Accelerated Reader icon on the desktop
Assumptions and          None
Constraints:
Dependencies:            None
Traceability:            NA
                         Table 32 Server Test Procedure Specification 4-3
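The role-based access checks in Test Case 4-3 can be sketched as a small Python model. This is an illustrative stand-in, not a real Active Directory query: the role names, resource names, and the `request_resource` helper are all hypothetical, and the access matrix simply mirrors the expected outputs listed above.

```python
# Hypothetical model of the Test Case 4-3 access checks. The roles,
# resource names, and ACCESS_MATRIX below are illustrative only; a real
# deployment would query the user's Active Directory profile instead.

ACCESS_MATRIX = {
    "regular_user": {"ms_office", "choice", "accelerated_reader",
                     "print", "open_file", "save_file"},
    # Per the expected output, a guest may open files but cannot save them.
    "guest":        {"ms_office", "choice", "accelerated_reader",
                     "print", "open_file"},
}

def request_resource(role: str, resource: str) -> bool:
    """Grant the request only if the role's profile includes the resource."""
    return resource in ACCESS_MATRIX.get(role, set())
```

A request from a role without the corresponding access right is simply denied, matching the "will not open if user does not have the access right" outcomes in the table.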







Test Case:               4-4
Identifier:              AT-S04-4
Test Items:              Print Service
Test Description:        This test will evaluate COTS support for printer services.
Pre-conditions:          The following pre-conditions must be met before the following test
                         can be performed:
                                1. At least one printer is attached to the network, and is
                                    functioning properly.
                                2. Each printer on the network has a unique name.
                                3. System has support for word processing capability with
                                    printer support.
                                4. User has access rights to printer through Active
                                    Directory Services.
                                5. The printer is fully supported by Windows with the
                                    Windows compatible drivers.
Post-conditions:         The following function will be performed after the test procedure:
                                1. Obtain a test page document printed from any client
                                   terminal.
                                2. Obtain a document printed using MS Office or similar
                                   word processor.
Input Specifications:    The following information will be added to perform the test:
                                1. Print jobs submitted to the print services
Expected Output          The following information is the expected output:
Specifications:                 1. Print Services will output Test Page to printer.
                                2. Print Services will output printed word document to
                                   printer.
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                               1. Option to set up print services.
                               2. Printer can be shared between all clients.
                               3. Print jobs can be queued or spooled.
                               4. Print jobs have assigned priorities (High, Medium, Low).
                               5. Print jobs can be canceled.
Test Process:            The following lists the test process:
                               1. User opens Windows Print Manager
                               2. User selects Print Test Page
                               3. User opens MS Office or another word processor
                               4. User types in sample text “Hello World”
                               5. User selects File → Print
                               6. User opens Print Manager
                               7. User assigns priority to print jobs




Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. Printer must have network capabilities or be connected
                                   to a print server.
                                2. Printer services must comply with Windows Print
                                   Services.
                                3. Printers on the network are set up to have unique names.
Dependencies:            The following lists the dependencies for the test:
                                 1. Print Services must depend on Windows drivers for
                                    printers.
Traceability:            CAP-2
                         Table 33 Server Test Procedure Specification 4-4
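Pass/fail criteria 3 through 5 above (queueing, priorities, cancellation) can be illustrated with a minimal spooler model. The `PrintSpooler` class and its methods are hypothetical; this sketch does not drive Windows Print Services, it only demonstrates the queue behaviour being tested.

```python
import heapq
import itertools

# Illustrative model of the spooling behaviour in the pass/fail criteria:
# jobs are queued, each carries a priority (High/Medium/Low), and a queued
# job can be cancelled before it is dispatched.

PRIORITY = {"High": 0, "Medium": 1, "Low": 2}

class PrintSpooler:
    def __init__(self):
        self._heap = []                 # (priority rank, sequence, job id)
        self._seq = itertools.count()   # ties broken by submission order
        self._cancelled = set()

    def submit(self, job_id: str, priority: str = "Medium") -> None:
        heapq.heappush(self._heap, (PRIORITY[priority], next(self._seq), job_id))

    def cancel(self, job_id: str) -> None:
        self._cancelled.add(job_id)

    def next_job(self):
        """Dispatch the highest-priority job that has not been cancelled."""
        while self._heap:
            _, _, job_id = heapq.heappop(self._heap)
            if job_id not in self._cancelled:
                return job_id
        return None
```

Submitting a Low-priority test page and a High-priority document, then cancelling a third job, should dispatch the High-priority job first and skip the cancelled one.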







Test Case:               4-5
Identifier:              AT-S04-5
Test Items:              Creating User Profile
Test Description:        This test will create a new user profile in the Active Directory.
Pre-conditions:          The following pre-condition must be met before the following test
                         can be performed:
                                1.   System has been setup to include Microsoft Active
                                     Directory Services, and Active Directory Services has
                                     been started.
                                2.   System must be running Windows Server 2003.
                                3.   Initial new user profile settings are empty.
                                4.   User is a student or faculty member at PHS.
                                5.   Administrative privileges are given to the System
                                     Administrator to create new user profile.
                                6.   System already has group policies for the following
                                     groups (Student, Faculty, Administrator)
Post-conditions:         The following function will be performed after the test procedure.
                                1. The new user profile (including username, password, and
                                   group policies) appears in Active Directory.
                                2. A folder with the username will be created in the Active
                                   Directory Server or File Server to serve as the user’s
                                   home folder.
                                3. Each profile will be assigned a set disk storage quota.
                                4. Student accounts will be set to expire after one
                                   academic year.
                                5. Faculty accounts will only expire if the faculty is no
                                   longer an employee at PHS.
Input Specifications:    The following information will be added to perform the test:
                                1. New profile information including:
                                    Username
                                    Password
                                    User ID
                                    Status (Student or Faculty)
                                    Assigned group policy (Students, Faculty, Administrator)
Expected Output          The following information is the expected output:
Specifications:                 1. Active Directory Services created a user profile in the
                                   database.
                                2. A folder with the username is created to serve as the
                                   home folder for that user.
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                 1. Support for Active Directory Services
                                 2. System resources (storage space) allows for creation of
                                    new user folders for all students and faculty at PHS.
                                 3. Disk quota per user cannot be less than 15 megabytes.
                                 4. User can only view and access their own folders, and
                                    cannot view or access folders of other users.
                                 5. System administrator has the rights to access all folders.
Test Process:            The following lists the test process:
                                1. System administrator launches Active Directory
                                   Services.
                                2. System administrator chooses to create a new user
                                   profile.
                                3. System administrator adds (username, password, user id,
                                   status) information to the new account.
                                4. System administrator associates the new account with the
                                   current group policies.
                                5. System administrator confirms the creation of a new user
                                   profile.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                     1. User profile does not currently exist in the system.
                                 2. User is authorized only by the administrator.
                                 3. Only System Administrator can create new user profile.
                                 4. Each user profile must be unique.
                                 5. Disk quota per user is set to be 15 megabytes.
Dependencies:            User profile is dependent upon Active Directory Services.
Traceability:            NA
                         Table 34 Server Test Procedure Specification 4-5
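The profile-creation constraints above (unique profiles, a 15-megabyte minimum disk quota, one-year student expiry) can be made concrete with a short sketch. `ProfileStore` and its fields are hypothetical stand-ins for Active Directory, and the 365-day expiry is an approximation of one academic year.

```python
from datetime import date, timedelta

MIN_QUOTA_MB = 15   # pass/fail criterion 3: quota may not fall below 15 MB

class ProfileStore:
    """Hypothetical stand-in for Active Directory profile creation (Test Case 4-5)."""

    def __init__(self):
        self.profiles = {}

    def create(self, username, status, quota_mb=MIN_QUOTA_MB, today=None):
        if username in self.profiles:
            raise ValueError("each user profile must be unique")
        if quota_mb < MIN_QUOTA_MB:
            raise ValueError("disk quota below the 15 MB minimum")
        today = today or date.today()
        # Student accounts expire after one academic year (approximated here
        # as 365 days); faculty accounts carry no expiry date.
        expires = today + timedelta(days=365) if status == "Student" else None
        self.profiles[username] = {
            "status": status,
            "quota_mb": quota_mb,
            "home_folder": f"/home/{username}",   # illustrative path only
            "expires": expires,
        }
        return self.profiles[username]
```

Attempting to create a duplicate profile, or one with a quota below 15 MB, is rejected, matching the assumptions and pass/fail criteria above.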





The following tables indicate the test procedures for the vendor support attribute (AT-S05):

 Test Case:                5-1
 Identifier:               AT-S05-1
 Test Items:               Response time for critical problems
 Test Description:         This test will measure the average time taken by the vendor to
                           respond to and fix critical problems faced by the customer.
 Pre-conditions:           The following pre-condition must be met before the following test
                           can be performed:
                                   1. Identify and obtain the contact person’s name,
                                      Department, telephone number and email address from
                                      the COTS vendor
                                   2. A correspondence must have already occurred and the
                                      communications channel (telephone or email) must have
                                      been established between the customer and the COTS
                                      vendor prior to the test.
                                   3. Both the customer and the COTS vendor must be aware
                                      of the contractual bindings and/or the warranty
                                      obligations of the vendor to the customer. The problem
                                      to be fixed should fall in this scope.
 Post-conditions:          The following function will be performed after the test procedure.
                                   1. The vendor would have responded, logged-in a call, and
                                      given a date and time to the customer for looking into the
                                      problem.
                                   2. The vendor would have provided the support on the date
                                      and time confirmed by them with the customer when the
                                      call was logged in.
 Input Specifications:     The following information will be added to perform the test:
                                  1. Customer will contact the COTS vendor by phone or
                                     email and ‘log a call’ indicating that a critical
                                     problem has arisen at the customer site and needs
                                     immediate attention.
                                  2. Customer will supply more information (if required),
                                     such as warranty details and known details of the
                                     problem, to the COTS vendor.
 Expected Output           The following information is the expected output:
 Specifications:                  1. COTS vendor will log the call and give the customer a
                                     call priority number as well as the date and time on
                                     which they will be attending this call.
                                  2. COTS vendor will attend to the call on the mentioned
                                     date and time, and the problem will be solved (or
                                     advice will be given to the customer for the future
                                     course of action).
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                1. The COTS vendor will respond within a timeframe
                                   already being discussed and agreed upon in the contract
                                   or warranty card.
                                2. COTS vendor will solve the problem or take any other
                                   required action on the customer site on the date and time
                                   as per their agreement during the call log-in.
Test Process:            The following lists the test process:
                                1. Customer contacts COTS vendor via phone or email.
                                2. Customer ‘logs a call’ with the COTS vendor indicating
                                   that a critical problem has arisen at the customer site
                                   and needs immediate attention.
                                3. Customer compares the time taken for the initial
                                   response with the expected response time as per the
                                   contract or warranty card.
                                4. Customer compares the date and time on which the
                                   COTS vendor takes the required action at the customer
                                   site and solves the problem with the expected response
                                   time as per the contract or warranty card.
                                5. Customer also compares that date and time with the date
                                   and time promised by the COTS vendor when the call
                                   was logged.


Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. The COTS vendor has no abnormal activities under way
                                   in their organization, such as a company merger or
                                   departmental takeover, which would normally delay the
                                   response time.
                                2. The COTS vendor contact is working at his/her regular
                                   workload; an unusually increased workload can
                                   significantly slow down the response.
                                3. The problem to be fixed falls within the scope of the
                                   contractual bindings and/or the warranty obligations of
                                   the vendor to the customer.




Dependencies:            None
Traceability:            NA
                         Table 35 Server Test Procedure Specification 5-1
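The timing comparisons in steps 3 through 5 of the test process reduce to checking a measured interval against the window agreed in the contract or warranty card. A minimal sketch follows, assuming a hypothetical four-hour response window; the actual figure would come from the contract.

```python
from datetime import datetime, timedelta

# Sketch of the response-time check in Test Case 5-1. The 4-hour default
# SLA is an assumed figure; in practice the window is taken from the
# contract or warranty card.

def within_sla(logged_at: datetime, responded_at: datetime,
               sla: timedelta = timedelta(hours=4)) -> bool:
    """True if the vendor's first response arrived inside the agreed window."""
    return (responded_at - logged_at) <= sla
```

The same comparison applies both to the initial response (step 3) and to the date and time on which the vendor actually takes action (steps 4 and 5), with the appropriate window substituted in each case.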







Test Case:               5-2
Identifier:              AT-S05-2
Test Items:              Remote assistance
Test Description:        This test will evaluate the effectiveness of the remote assistance
                         provided by the COTS vendor.
Pre-conditions:          The following pre-condition must be met before the following test
                         can be performed:
                                1. Identify and obtain the contact person’s name,
                                   Department, telephone number and email address from
                                   the COTS vendor
                                2. Both the customer and the COTS vendor must be aware
                                   of the contractual bindings and/or the warranty
                                   obligations of the vendor to the customer. The problem
                                   to be fixed by remote assistance should fall in this scope.
                                3. If necessary, a time for calling the COTS vendor should
                                   be fixed in advance.
Post-conditions:         The following function will be performed after the test procedure.
                                1. The vendor would have responded to the call and either
                                   provided support during the same call or given a date
                                   and time to call back about the problem.
                                2. The vendor would have solved the problem remotely, or
                                   provided an alternate solution to the customer for
                                   future actions.
Input Specifications:    The following information will be added to perform the test:
                                1. Customer will contact the COTS vendor by phone or
                                   email and ‘log a call’ indicating that a problem to be
                                   solved remotely has arisen at the customer site and
                                   needs immediate attention.
                                2. Customer will supply more information (if required),
                                   such as warranty details and known details of the
                                   problem, to the COTS vendor by telephone or email.
                                3. Customer will supply more details on the problem to the
                                   vendor if necessary.
Expected Output          The following information is the expected output:
Specifications:                 1. COTS vendor will log the call and give the customer
                                   remote assistance in solving the problem.
                                2. The vendor will solve the problem remotely, or provide
                                   an alternate solution to the customer for future
                                   actions.
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                 1. The COTS vendor will respond within a timeframe
                                    already being discussed and agreed upon in the contract
                                    or warranty card for remote assistance.
                                 2. COTS vendor will solve the problem or take any other
                                    required action on the customer site by doing remote
                                    support to prove the effectiveness of support.
Test Process:            The following lists the test process:
                                1. Customer contacts COTS vendor via phone or email.
                                2. Customer ‘logs a call’ with the COTS vendor indicating
                                   that a problem requiring remote assistance has arisen
                                   at the customer site.
                                3. Customer compares the time taken for the response with
                                   the expected response time as per the contract or
                                   warranty card for remote assistance.
                                4. The customer confirms that the problem has been solved
                                   and the support has been effective.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. Pre-conditions are met.
                                2. The problem to be fixed falls within the scope of the
                                   contractual bindings and/or the warranty obligations of
                                   the vendor to the customer.
                                3. The problem identified in the system for this test case
                                   should be suitable for resolution via a remote support
                                   call.
Dependencies:            None
Traceability:            NA
                         Table 36 Server Test Procedure Specification 5-2







Test Case:               5-3
Identifier:              AT-S05-3
Test Items:              Hardware support
Test Description:        This test will evaluate the vendor's efficiency in fixing
                         hardware-related issues.
Pre-conditions:          The following pre-condition must be met before the following test
                         can be performed:
                                1. Identify and obtain the contact person’s name,
                                   Department, telephone number and email address from
                                   the COTS vendor
                                2. A correspondence must have already occurred and the
                                   communications channel (telephone or email) established
                                   between the customer and the COTS vendor prior to the
                                   test.
                                3. Both the customer and the COTS vendor must be aware
                                   of the contractual bindings and/or the warranty
                                   obligations of the vendor to the customer. The problem
                                   to be fixed should fall in this scope.
Post-conditions:         The following function will be performed after the test procedure.
                                1. The vendor would have responded, logged-in a call, and
                                   given a date and time to the customer for looking into the
                                   problem.
                                2. The vendor would have provided the support on the date
                                   and time confirmed by them with the customer when the
                                   call was logged in.
                                3. The problem will have been solved.
Input Specifications:    The following information will be added to perform the test:
                                1. Customer will contact the COTS vendor by phone or
                                   email and ‘log a call’ indicating that a hardware
                                   problem has arisen at the customer site and needs
                                   immediate attention.
                                2. Customer will supply more information (if required),
                                   such as warranty details and known details of the
                                   problem, to the COTS vendor.
Expected Output          The following information is the expected output:
Specifications:                 1. COTS vendor will log the call and give the customer a
                                   call priority number as well as the date and time on
                                   which they will be attending this call.
                                2. COTS vendor will attend to the call on the mentioned
                                   date and time, and the problem will be solved (or
                                   advice will be given to the customer for the future
                                   course of action).
Pass/Fail Criteria:      The following information is the pass/fail criteria for testing:
                                 1. The COTS vendor will respond within a timeframe
                                    already being discussed and agreed upon in the contract
                                    or warranty card.
                                 2. COTS vendor will solve the problem or take any other
                                    required action on the customer site on the date and time
                                    as per their agreement during the call log-in.
Test Process:            The following lists the test process:
                                1. Customer contacts COTS vendor via phone or email.
                                2. Customer ‘logs a call’ with the COTS vendor indicating
                                   that a hardware problem has arisen at the customer
                                   site.
                                3. Customer compares the time taken for the initial
                                   response with the expected response time as per the
                                   contract or warranty card.
                                4. Customer compares the date and time on which the
                                   COTS vendor takes the required action at the customer
                                   site and solves the problem with the expected response
                                   time as per the contract or warranty card.
                                5. Customer also compares that date and time with the date
                                   and time promised by the COTS vendor when the call
                                   was logged.
                                6. The customer confirms that the hardware problem has
                                   been solved and the support has been effective.
Assumptions and          The following lists the assumptions and constraints for the test:
constraints:                    1. The COTS vendor has no abnormal activities under way
                                   in their organization, such as a company merger or
                                   departmental takeover, which would normally delay the
                                   response time.
                                2. The COTS vendor contact is working at his/her regular
                                   workload; an unusually increased workload can
                                   significantly slow down the response.
                                 3. The pre-conditions are met.
                                4. If the hardware is to be replaced, the COTS vendor has
                                   it in stock.
Dependencies:            Whether the faulty hardware part is under warranty; if it is, it
                         may be replaced free of cost.
Traceability:            NA
                         Table 37 Server Test Procedure Specification 5-3







Test Case:               5-4
Identifier:              AT-S05-4
Test items:              Software upgrades
Test Description:        This test measures the vendor's capability to supply software
                         upgrades to the customer.
Pre-conditions:          The following pre-conditions must be met before the test can
                         be performed:
                                1. Identify and obtain the contact person's name,
                                    department, telephone number and email address from
                                    the COTS vendor.
                                2. A correspondence must have already occurred and the
                                    communications channel (telephone or email) established
                                    between the customer and the COTS vendor prior to the
                                    test.
                                3. Both the customer and the COTS vendor must be aware
                                    of the contractual bindings and/or the warranty
                                    obligations of the vendor to the customer. The software
                                    upgrade requested by the customer should fall within this
                                    scope.
                                4. A new software upgrade must have been made available
                                    by the software manufacturer.
Post-conditions:         The following will hold after the test procedure:
                                1. The vendor would have provided the software upgrade
                                   requested by the customer.
Input Specifications:    The following inputs are required to perform the test:
                                1. Customer will contact the COTS vendor by phone or
                                   email and 'log a call' indicating that they need to
                                   procure the software upgrade.
                                2. Customer will supply further information (if required),
                                   such as warranty details, to the COTS vendor.
Expected Output          The following are the expected outputs:
Specifications:                 1. COTS vendor will log the call and give the customer a
                                   call priority number as well as the date and time on
                                   which they will be sending the upgrade.
                                2. COTS vendor will send the software upgrade by email or
                                   by post by the promised date and time, and the customer
                                   can install it.
Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. The COTS vendor will respond within the timeframe
                                   discussed and agreed upon in the contract or warranty
                                   card for software upgrades.



                                 2. COTS vendor will send the software upgrade by email or
                                    by post by the promised date and time and help the
                                    customer in installation.
Test Process:            The following lists the test process:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer logs a call with the COTS vendor indicating
                                   that they need to procure the new software upgrade.
                                3. Customer compares the time taken for the initial
                                   response with the expected response time as per the
                                   contract or warranty card.
                                4. Customer compares the date and time on which the
                                   COTS vendor supplies the upgrade with the expected
                                   response time as per the contract or warranty card.
                                5. Customer also compares the date on which they receive
                                   the software upgrade from the COTS vendor with that
                                   promised by the COTS vendor when the call was logged.
                                6. The customer confirms that they are offered proper
                                   assistance in installing the upgrade.
Assumptions and          The following are the assumptions and constraints for the test:
constraints:                    1. The preconditions are met.
                                2. The software upgrade requested has been newly released
                                   by the manufacturer.
Dependencies:            None
Traceability:            NA
                         Table 38 Server Test Procedure Specification 5-4







Test Case:               5-5
Identifier:              AT-S05-5
Test Items:              Warranty
Test Description:        This test evaluates the effectiveness of the warranty support
                         provided by the vendor in fixing problems faced by the customer
                         for items under warranty.
Pre-conditions:          The following pre-conditions must be met before the test can be
                         performed:
                                1. Identify and obtain the name, telephone number and
                                   email address of the contact person in the technical
                                   department of the COTS vendor.
                                2. A correspondence must have already occurred and the
                                   communications channel (telephone or email) established
                                   between the customer and the COTS vendor prior to the
                                   test.
                                3. Both the customer and the COTS vendor must be aware of
                                   the contractual bindings and/or the warranty obligations of
                                   the vendor to the customer. The problem to be fixed should
                                   fall in this scope.
Post-conditions:         The following will hold after the test procedure:
                                1. The vendor would have responded, logged a call, and
                                   given the customer a date and time for looking into the
                                   problem.
                                2. The vendor would have provided the support on the date
                                   and time confirmed with the customer when the call was
                                   logged.
Input Specifications:    The following inputs are required to perform the test:
                                1. Customer will contact the COTS vendor by phone or email
                                   and 'log a call' indicating that a problem needing
                                   immediate attention has arisen at the customer site,
                                   related to an item under warranty.
                                2. Customer will supply further information (if required),
                                   such as warranty details and known details of the
                                   problem, to the COTS vendor.
Expected Output          The following are the expected outputs:
Specifications:                 1. COTS vendor will log the call and give the customer a call
                                   priority number as well as the date and time on which they
                                   will be attending to this call.
                                2. COTS vendor will attend to the call on the mentioned date
                                   and time.
                                3. The problem will be solved (or advice given to the
                                   customer for the future course of action).




Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. The COTS vendor will respond within the timeframe
                                   discussed and agreed upon in the contract or warranty
                                   card.
                                2. COTS vendor will attend to the call on the mentioned date
                                   and time
                                3. COTS vendor will solve the problem or take any other
                                   required action on the customer site on the date and time as
                                   per their agreement during the call log-in.
Test Process:            The following lists the test process:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer logs a call indicating that a problem needing
                                   immediate attention has arisen at the customer site,
                                   related to an item under warranty.
                                3. Customer compares the time taken for the initial response
                                   with the expected response time as per the contract or
                                   warranty card.
                                4. Customer compares the date and time on which the COTS
                                   vendor takes the required action at the customer site and
                                   solves the problem with the expected response time as per
                                   the contract or warranty card.
                                5. Customer also compares the date and time on which the
                                   COTS vendor takes the required action at the customer site
                                   and solves the problem with that promised by the COTS
                                   vendor when the call was logged.
                                6. The customer checks whether the problem has been solved.
Assumptions and          The following are the assumptions and constraints for the test:
constraints:                    1. The COTS vendor has no abnormal activity under way in
                                   its organization, such as a company merger or department
                                   takeover, that could delay the response time.
                                2. The COTS vendor contact is working at his/her regular
                                   workload; an unusually heavy workload could significantly
                                   slow down the response.
                                3. The item which has developed a problem is under warranty.
Dependencies:            None
Traceability:            NA
                         Table 39 Server Test Procedure Specification 5-5





The following tables indicate the test procedures for the security attribute (AT-S06):

 Test Case:                6-1
 Identifier:               AT-S06-1
 Test Items:               User Privileges
 Test Description:         This test case will measure the capability of the server in enforcing
                           user privileges at administrator, student, faculty and guest levels.
 Pre-conditions:           The following pre-conditions must be met before the following test
                           can be performed:
                                   1. All user groups are defined.
                                   2. All group policies are predefined.
                                   3. Password policies are predefined.
                                   4. All users are assigned to their appropriate groups.
 Post-conditions:          Each user of the system will have only those access rights and
                           privileges granted on the basis of their group policies.
 Input Specifications:     The following inputs are required to perform the test:
                           All users enter their respective username and password to log on
                           to the system.
 Expected Output           The users are only able to access those applications and personalized
 Specifications:           folders for which they have been granted access.
 Pass/Fail Criteria:       The following are the pass/fail criteria for the test:
                                   1. System administrator has full access to all system
                                      resources.
                                   2. Faculty members have access to student folders and the
                                      right to install educational software.
                                   3. Students have read/write/delete permissions for their
                                      personal folders.
 Test Process:             The following lists the test process:
                                   1. Log in to the system as Administrator and check that all
                                      administrator rights are granted.
                                   2. Log in to the system as Faculty and check that all faculty
                                      group rights are granted.
                                   3. Log in to the system as Student and check that all student
                                      group rights are granted.
 Assumptions and           The following are the assumptions and constraints for the test:
 constraints:                      1. The Pre-Conditions are fulfilled.




                                 2. The username must be unique for each user.
Dependencies:            None
Traceability:            NA
                         Table 40 Server Test Procedure Specification 6-1
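
The three-step test process above could be automated along the following lines. This is a minimal sketch under assumed group names and permission labels; the actual PHS group policies are defined on the server and may differ.

```python
# Hypothetical group policies, modelled as permission sets. The names
# below are illustrative; the real policies live on the server.
GROUP_POLICIES = {
    "Administrator": {"full_access", "install_software", "manage_users"},
    "Faculty": {"access_student_folders", "install_educational_software"},
    "Student": {"read_personal_folder", "write_personal_folder",
                "delete_personal_folder"},
}

def check_rights(group, granted_rights):
    """Pass only if the rights granted at login exactly match the policy."""
    return granted_rights == GROUP_POLICIES[group]

# Simulated result for one test-process step: the Student login is
# granted exactly the student-group rights, so the check passes.
assert check_rights("Student", {"read_personal_folder",
                                "write_personal_folder",
                                "delete_personal_folder"})
# A Faculty login granted full access would fail the check.
assert not check_rights("Faculty", {"full_access"})
```

Comparing permission sets for exact equality catches both missing rights and rights granted in excess of the group policy.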






                        4.1.3.4               Test Results

This section lists the test results of all the test procedures described in section 4.1.3.3 of this
document.


The following tables indicate the test result for the performance attribute (AT-S01):

 Test Case:                      1-1
 Identifier:                     AT-S01-1
 Test Items:                     Number of concurrent users
 Test Result Classification      Pass
 (Pass /Fail):
 Problem / Defect Report:        As we are awaiting the delivery of the server, we are
                                 currently unable to perform the tests at the customer's site.
                                 Therefore these test results are based on conversations with
                                 the COTS vendor and on the web references given by him/her.
 Feedback / Comment:             The server has a total of 4 GB of RAM.
                                 Assuming that Windows 2003 Server requires 512 MB of RAM
                                 to perform optimally and each thin client requires 64 MB, the
                                 server can ideally support a maximum of 56 concurrent clients.
                                 Reference: http://www.msterminalservices.org/articles/Juggling-
                                 Terminal-Service-Resources.html
                                 Rating – 10/10
                                       Table 41 Server Test Result 1-1
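
The concurrent-user figure quoted above can be reproduced with a short calculation. This is an illustrative sketch, not part of the vendor's material; the function name is ours, and the RAM figures (4 GB total, 512 MB for Windows 2003 Server, 64 MB per thin client) are those stated in the result above.

```python
# Estimate how many concurrent thin-client sessions a terminal server
# can host, given total RAM, RAM reserved for the OS, and RAM per client.
def max_concurrent_clients(total_ram_mb, os_ram_mb, per_client_mb):
    return (total_ram_mb - os_ram_mb) // per_client_mb

# Figures from the test result: 4 GB total, 512 MB for Windows 2003
# Server, 64 MB per thin-client session.
print(max_concurrent_clients(4096, 512, 64))  # → 56
```

The integer division deliberately rounds down, since a partially provisioned session would not perform optimally.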







Test Case:                   1-2
Identifier:                  AT-S01-2
Test Items:                  System response time (multimedia applications)
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting the delivery of the server, we are
                             currently unable to perform the tests at the customer's site.
                             Therefore these test results are based on conversations with
                             the COTS vendor and on the web references given by him/her.
Feedback / Comment:          The server has two 3.0 GHz Xeon processors that conform to
                             the symmetric multiprocessing (SMP) standard. SMP makes it
                             possible for multimedia applications to use multiple processors
                             when additional processing power is required, increasing the
                             capability of the system.
                             As multimedia applications are processor-intensive, the system
                             response time will greatly benefit from the server’s support for
                             SMP.
                             Reference:
                             http://www.microsoft.com/windowsserver2003/evaluation/featur
                             es/highlights.mspx#winmedia
                             Rating – 10/10
                                   Table 42 Server Test Result 1-2







Test Case:                   1-3
Identifier:                  AT-S01-3
Test Items:                  System login time
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting the delivery of the server, we are
                             currently unable to perform the tests at the customer's site.
                             Therefore these test results are based on conversations with
                             the COTS vendor and on the web references given by him/her.
Feedback / Comment:          The simultaneous login processing load is distributed between
                             the two 3.0 GHz Xeon processors in the server, so a single
                             processor is not burdened with the entire load; this yields a
                             major improvement in system login time.
                             Reference:
                             http://www.microsoft.com/windowsserver2003/evaluation/featur
                             es/highlights.mspx#winmedia
                             Rating – 10/10
                                   Table 43 Server Test Result 1-3







Test Case:                   1-4
Identifier:                  AT-S01-4
Test Items:                  System response time (regular applications)
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting the delivery of the server, we are
                             currently unable to perform the tests at the customer's site.
                             Therefore these test results are based on conversations with
                             the COTS vendor and on the web references given by him/her.
Feedback / Comment:          The server has two 3.0 GHz Xeon processors that conform to
                             the symmetric multiprocessing (SMP) standard. SMP makes it
                             possible for applications to use multiple processors when
                             additional processing power is required, increasing the
                             capability of the system.
                             As many applications are processor-intensive, the system
                             response time will benefit greatly from the server's support for
                             SMP.
                             Reference:
                             http://www.microsoft.com/windowsserver2003/evaluation/feature
                             s/highlights.mspx#winmedia
                             Rating – 10/10
                                   Table 44 Server Test Result 1-4







Test Case:                   1-5
Identifier:                  AT-S01-5
Test Items:                  Network bandwidth
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None
Feedback / Comment:          The network at PHS has a bandwidth of 100 Mbit/s and each
                             thin-client connection was measured to consume a maximum of
                             2 Mbit/s, so the overall bandwidth demand with 40 thin-client
                             terminals sums to:
                             2 x 40 = 80 Mbit/s.
                             Reference: As per the tests performed by the developers at the
                             client’s site.
                             (Refer Section 3.2.3 of CAR)
                             Rating – 8/10
                                   Table 45 Server Test Result 1-5
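
The headroom calculation above can be restated as a short check, using the figures from the test (100 Mbit/s network capacity, 2 Mbit/s per client, 40 terminals); the function and constant names are ours.

```python
# Total bandwidth demand of the thin-client terminals, in Mbit/s.
def total_client_bandwidth(clients, per_client_mbit):
    return clients * per_client_mbit

NETWORK_CAPACITY_MBIT = 100  # PHS network bandwidth from the test
demand = total_client_bandwidth(40, 2)
print(demand)                           # → 80
print(demand <= NETWORK_CAPACITY_MBIT)  # → True: 20 Mbit/s of headroom
```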





The following tables indicate the test result for the cost attribute (AT-S02):

 Test Case:                     2-1
 Identifier:                    AT-S02-1
 Test Items:                    Initial purchase cost for Tangent TC Server Model Pillar 2750s
 Test Result Classification     Pass – The initial purchase cost falls within the client's budget
 (Pass / Fail)                  for the server.
                                The initial cost of the server is $5,211.40 before sales tax.
                                The client's budgeted server cost is $5,500.
 Problem / Defect Report        None
 Feedback / Comments:           Reference:
                                Louise O’Sullivan at Tangent Computers
                                Contact # : 800-342-9388 ext. 2137
                                Rating: 10/10
                                      Table 46 Server Test Result 2-1







Test Case:                   2-2
Identifier:                  AT-S02-2
Test Items:                  Upgrade cost for Tangent TC Server Model Pillar 2750S
Test Result Classification   Pass – The upgrade cost is within the client’s upgrade budget.
(Pass / Fail)
Problem / Defect Report      Hardware upgrade cost: upgrades are limited to RAM and
                             storage.
                             Software upgrade cost: depends on the price of the software.
Feedback / Comments:         The upgrade options for this particular item are limited to
                             DDR-RAM and storage. Because the motherboard supports
                             only Pentium 4-class CPUs, we do not expect the system to
                             support next-generation CPUs from Intel. Estimated upgrade
                             cost of both memory and storage: $1,100.00.
                             For software upgrades, we do not foresee any problem as long
                             as the software is based on the Microsoft Windows platform.
                             As for operating system updates, the next major Windows
                             Server release, Windows Server "Longhorn", is scheduled for
                             late 2007. At the time of this writing, Microsoft has not
                             released the specifications for this product.
                             The current server hardware specification is built to support
                             future upgrades if necessary. However, we do not currently see
                             upgrading as a major issue.
                             Reference:
                             Louise O’Sullivan at Tangent Computers
                             Contact # : 800-342-9388 ext. 2137
                             Rating: 7/10
                                   Table 47 Server Test Result 2-2







Test Case:                   2-3
Identifier:                  AT-S02-3
Test Items:                  Maintenance cost for Tangent TC Server Model Pillar 2750s
Test Result Classification   Pass
(Pass / Fail)
Problem / Defect Report      None
Feedback / Comments:         Currently PHS has an annual server-maintenance contract with
                             Tangent Computers.
                             The new server that is being bought from Tangent will also be
                             under the same maintenance contract, as agreed between the
                             client and the COTS vendor and thus the maintenance cost will
                             be within the client’s budget.
                             Reference:
                             Louise O’Sullivan at Tangent Computers
                             Contact # : 800-342-9388 ext. 2137
                             Rating: 10/10
                                   Table 48 Server Test Result 2-3





The following tables indicate the test result for the intercomponent capability attribute (AT-S03):

 Test Case:                    3-1
 Identifier:                   AT-S03-1
 Test Items:                   Compatibility of Wayang Outpost to run in the system
                               environment
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      None
 Feedback / Comment:           Users are able to log in and run the application through Internet
                               Explorer from the client terminals. The quality of the Flash
                               animation was poor because of the client terminals' limited
                               capability for handling heavy multimedia content, but the
                               application ran smoothly and the testers were able to complete
                               all the available tests and games provided by the application.
                               Reference:
                               http://www.wayangoutpost.net/
                               Rating – 8/10
                                     Table 49 Server Test Result 3-1


 Test Case:                    3-2
 Identifier:                   AT-S03-2
 Test Items:                   Compatibility of MS Office Suite to run in the system
                               environment
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      None
 Feedback / Comment:           This result is expected, since both the server and the clients
                               run Windows-based operating systems. Running the MS Office
                               Suite under Windows is not a problem.
                               Reference:
                               http://www.microsoft.com/office/editions/prodinfo/default.mspx
                               Rating – 10/10
                                     Table 50 Server Test Result 3-2







Test Case:                   3-3
Identifier:                  AT-S03-3
Test Items:                  Compatibility of Renaissance Place (Accelerated Reader) to run
                             in the system environment
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None
Feedback / Comment:          The Accelerated Reader is currently hosted on a separate server
                             on the network. Since it is a web-based application, all it needs
                             to run is access to a web browser. The testers were able to
                             complete all the reading tests as well as print out the reports
                             generated by the application.
                             Reference:
                             http://www.renlearn.com/ar/overview/ARSystemRequirements.p
                             df
                             http://www.renlearn.com/ProductSystemRequirements.pdf
                             Rating – 10/10
                                   Table 51 Server Test Result 3-3


Test Case:                   3-4
Identifier:                  AT-S03-4
Test Items:                  Compatibility of Choice to run in the system environment
Test Result Classification   Fail
(Pass /Fail):
Problem / Defect Report:     The software license has expired.
Feedback / Comment:          The testers were not able to run the application since the
                             software license has expired. The librarian will renew the
                             license in the near future so that users will have access to the
                             software.
                             Reference:
                             As per the test performed by the developers at the client’s site
                             (Refer Section 3.2.3 of CAR)
                             Rating – 0/10
                                   Table 52 Server Test Result 3-4





The following tables indicate the test result for the interoperability attribute (AT-S04):

 Test Case:                     4-1
 Identifier:                    AT-S04-1
 Test Items:                    Interoperability of authentication process
 Test Result Classification     Pass
 (Pass /Fail):
 Problem / Defect Report:       As we are awaiting the delivery of the server, we are
                                currently unable to perform the tests at the customer's site.
                                Therefore these test results are based on conversations with
                                the COTS vendor and on the web references given by him/her.
                                Occasionally the system rejects a user login even when a valid
                                username and password are entered.
 Feedback / Comment:            As per the analysis performed by the developers on the current
                                system, we have come across the following worst-case scenario:
                                “Users are able to log in to the system with a valid username
                                and password entered at the client terminal 98% of the time.
                                Only about 2% of the time will the system reject the user login
                                even with the correct username and password given.”
                                The problem may occur when the client machine temporarily
                                loses its connection to the server.
                                Thus, with the specifications of the new server, we have given
                                the following rating:
                                Rating – 9/10
                                      Table 53 Server Test Result 4-1







Test Case:                   4-2
Identifier:                  AT-S04-2
Test Items:                  Interoperability of application processing
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we are currently
                             unable to perform the tests at the customer's site. Therefore,
                             these test results are based on conversations with the COTS
                             vendor and on the web references the vendor provided.
Feedback / Comment:          The COTS vendor support claims that users will be able to run
                             all the applications available on the server, as well as run any
                             web-based applications.
                             Reference:
                             Nick Haddad at Tangent Computers
                             Contact # : 800-342-9388
                             Rating – 10/10
                                   Table 54 Server Test Result 4-2







Test Case:                   4-3
Identifier:                  AT-S04-3
Test Items:                  Interoperability of Active Directory service
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we are currently
                             unable to perform the tests at the customer's site. Therefore,
                             these test results are based on conversations with the COTS
                             vendor and on the web references the vendor provided.
Feedback / Comment:          Users with a regular user login are able to access all the network
                             resources provided by Active Directory, including running
                             applications on the application server and AR server, getting
                             printouts from the network printer, and accessing all the
                             personal files and folders previously saved on the computer.
                             Users with a guest login are denied access to any personal
                             files and folders; all other network resources remain
                             available as provided by the AD service.
                             Reference:
                             http://www.tangent.com/canopy/ActiveDirectory_Benefits.pdf
                             Rating – 9/10
                                   Table 55 Server Test Result 4-3







Test Case:                   4-4
Identifier:                  AT-S04-4
Test Items:                  Print Services
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we are currently
                             unable to perform the tests at the customer's site. Therefore,
                             these test results are based on conversations with the COTS
                             vendor and on the web references the vendor provided.
Feedback / Comment:          When a client connects to the server (which has support for
                             Terminal Services), local printers attached to line printer
                             (LPT), communications (COM), and universal serial bus (USB)
                             ports are automatically detected, and a local queue is created
                             on the server. The client computer's printer settings for the
                             default printer and some properties (such as printing on both
                             sides of the page) are used by the server.
                             Reference:
                             http://download.microsoft.com/download/4/6/b/46bae314-ea7b-
                             4c39-bcb6-defbc907ee54/TSPrint.doc
                             Rating: 10/10
                                   Table 56 Server Test Result 4-4

Test Case:                   4-5
Identifier:                  AT-S04-5
Test Items:                  Creating User Profile
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     As we are awaiting delivery of the server, we are currently
                             unable to perform the tests at the customer's site. Therefore,
                             these test results are based on conversations with the COTS
                             vendor and on the web references the vendor provided.
Feedback / Comment:          Active Directory services on the server allow the administrator
                             to create user profiles based on policy settings.
                             Reference:
                             http://www.tangent.com/canopy/ActiveDirectory_Benefits.pdf
                             Rating: 10/10
                                   Table 57 Server Test Result 4-5




The following tables indicate the test results for the vendor support attribute (AT-S05):

 Test Case:                    5-1
 Identifier:                   AT-S05-1
 Test Items:                   Response time for critical problems
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      The COTS vendor did not provide feedback on the prototypes
                               that the developers sent for review, which may affect the
                               deployment and installation of the server.
 Feedback / Comment:           The COTS vendor support was able to answer most of our
                               queries in a timely manner, except for the feedback on our
                               prototypes.
                               Reference:
                               Nick Haddad at Tangent Computers
                               Contact # : 800-342-9388
                               Rating – 7/10
                                     Table 58 Server Test Result 5-1


 Test Case:                    5-2
 Identifier:                   AT-S05-2
 Test Items:                   Remote assistance
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      None.
 Feedback / Comment:           The COTS vendor has an administrative account through which
                               he can remotely log in to the system and perform the necessary
                               remote assistance required by the client. This is guaranteed
                               by the annual maintenance contract provided by Tangent
                               Computers.
                               Reference:
                               Nick Haddad at Tangent Computers
                               Contact # : 800-342-9388
                               Rating – 10/10
                                     Table 59 Server Test Result 5-2







Test Case:                   5-3
Identifier:                  AT-S05-3
Test Items:                  Hardware support
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None.
Feedback / Comment:          The maintenance contract guarantees 24 x 7 hardware technical
                             support via phone to the client.
                             If necessary, Tangent Computers will provide on-site hardware
                             technical support on a needed basis.
                             Reference:
                             Nick Haddad at Tangent Computers
                             Contact # : 800-342-9388
                             Rating – 10/10
                                   Table 60 Server Test Result 5-3


Test Case:                   5-4
Identifier:                  AT-S05-4
Test Items:                  Software upgrades
Test Result Classification   Fail
(Pass /Fail):
Problem / Defect Report:     Some required software applications by the client are not
                             supported by the COTS vendor under the maintenance contract.
                             Client has to contact the individual software provides for
                             support.
Feedback / Comment:          According to the annual server- maintenance contract, the COTS
                             vendor is liable to provide any software upgrade (OS & MS
                             Office Suite) to the client when required.
                             Third-party applications (Wayang Outpost, Choices &
                             Renaissance Place) are not supported in the maintenance
                             contract.
                             Reference:
                             Nick Haddad at Tangent Computers
                             Contact # : 800-342-9388
                             Rating: 4/10



                         Table 61 Server Test Result 5-4







Test Case:                   5-5
Identifier:                  AT-S05-5
Test Items:                  Warranty
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None.
Feedback / Comment:          The COTS vendor provides a warranty support for 3 years.
                             Reference:
                             Nick Haddad at Tangent Computers
                             Contact # : 800-342-9388
                             Rating: 10/10
                                   Table 62 Server Test Result 5-5





The following tables indicate the test results for the security attribute (AT-S06):

 Test Case:                     6-1
 Identifier:                    AT-S06-1
 Test Items:                    User Privileges
 Test Result Classification     Pass
 (Pass /Fail):
 Problem / Defect Report:       As we are awaiting delivery of the server, we are currently
                                unable to perform the tests at the customer's site. Therefore,
                                these test results are based on conversations with the COTS
                                vendor and on the web references the vendor provided.
 Feedback / Comment:            The Active Directory group policy system enables the
                                administrator to easily define roles and enforce access rights.
                                Reference:
                                http://www.microsoft.com/windowsserver2003/techinfo/overvie
                                w/security.mspx
                                Rating: 10/10
                                      Table 63 Server Test Result 6-1






                       4.1.3.5            Test Summary

This section summarizes the evaluation tests performed on the server COTS product.

4.1.3.5.1    Summary

The tests were performed on the server COTS product, the Tangent Pillar™ 2750s. Our COTS
evaluation comprises 26 test procedures, performed by three team members; the results of these
tests serve as the basis for our final COTS recommendation.

4.1.3.5.2    Summary of Results and Consequences

The test results and references show that the Tangent Pillar™ 2750s server covers all the core
capabilities and level-of-service requirements specified by the client. However, the detailed
test results also revealed some limitations beyond the server's control. For example, the
network infrastructure also affects the performance and functionality the server can provide.
Also, if additional users or new applications are required in the future, performance will be
significantly affected due to the nature of the thin-client network.

4.1.3.5.3    Evaluation

As the server has not yet been delivered to the client, we need to make use of benchmarking and
references to substantiate the test results. AT-S01, AT-S03, AT-S04, and AT-S06 were tested
based on the expected performance and outcome of the server. AT-S02, AT-S05, and AT-S07 are
based on information provided by the COTS vendor, such as the price list, hardware
specification data, or vendor feedback. All the detailed evaluations are described in section
4.1.3 of this document.

4.1.3.5.4    Summary of Activities

Three team members were involved in testing.
Total elapsed time used for each of the major testing activities is about 10 hours.
The actual machine time cannot be determined at this moment because the server has not been
delivered.





         4.1.4     Evaluation Results Screen Matrix
This section lists the rating of the evaluation results from Section 4.1.3.4 of CAR and calculates
the score of each evaluation criteria according to the weights assigned so as to obtain an overall
view of the level of satisfaction the Tangent Pillar™ 2750s provides.
The weight is described in section 4.1.2 of this document.
The rating is assigned in the test results in section 4.1.3.4 of this document.
The score is calculated using the formula: Score = Weight * Rating
The total score is the sum of all scores for the product.
The weighted average rating is calculated using the formula:
Weighted Average Rating = Σ ((Weight / Total Weight) * Rating)
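As a sanity check, the scoring scheme above can be sketched in a few lines of Python. This is only an illustration: `score_matrix` is a hypothetical helper name, and the sample figures are the weights and ratings from the AT-S01 Performance matrix (Table 64).

```python
# Minimal sketch of the scoring scheme described above.
def score_matrix(criteria):
    """criteria: list of (weight, rating) pairs for one attribute."""
    total_weight = sum(w for w, _ in criteria)
    # Score = Weight * Rating, summed over all criteria.
    total_score = sum(w * r for w, r in criteria)
    # Weighted Average Rating = sum((Weight / Total Weight) * Rating).
    weighted_avg = sum((w / total_weight) * r for w, r in criteria)
    return total_score, round(weighted_avg, 2)

# Weights and ratings from the AT-S01 Performance matrix (Table 64):
at_s01 = [(80, 10), (70, 10), (65, 10), (35, 10), (20, 8)]
total, avg = score_matrix(at_s01)
print(total, avg)  # 2660 9.85
```

Note that the weighted average rating equals Total Score / Total Weight, which is how the matrix totals below can be cross-checked by hand.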


AT-S01 Performance

 Weight                      Evaluation Criteria                          Tangent Pillar™ 2750s
                                                                           Rating          Score
   80      Number of concurrent users                                               10            800
   70      System response time (multimedia applications)                           10            700
   65      System login time                                                        10            650
   35      System response time (regular applications)                              10            350
   20      Network bandwidth                                                         8            160
   270                                                    Total Score                -           2660
                                          Weighted Average Rating                 9.85              -
                                   Table 64 Server Result Matrix AT-S01
AT-S02 Cost

 Weight                      Evaluation Criteria                          Tangent Pillar™ 2750s
                                                                           Rating          Score
   60      Initial purchase cost                                                    10            600
   50      Upgrade cost                                                              7            350
   40      Maintenance cost                                                         10            400
   150                                                    Total Score                -           1350
                                          Weighted Average Rating                    9              -
                                   Table 65 Server Result Matrix AT-S02




AT-S03 Intercomponent Compatibility

Weight                     Evaluation Criteria                        Tangent Pillar™ 2750s
                                                                       Rating         Score
   50    Wayang Outpost                                                          8           400
   30    MS Office Suite                                                        10           300
   25    Renaissance Place (Accelerated Reader)                                 10           250
   15    Choices                                                                 0             0
  120                                                 Total Score                -           950
                                      Weighted Average Rating              7.92                -
                               Table 66 Server Result Matrix AT-S03
AT-S04 Interoperability

Weight                     Evaluation Criteria                        Tangent Pillar™ 2750s
                                                                       Rating         Score
   30    Authentication                                                          9           270
   30    Application processing                                                 10           300
   30    Active Directory service                                                9           270
   20    Print service                                                          10           200
   20    User profile/folder                                                    10           200
  130                                                 Total Score                -          1240
                                      Weighted Average Rating              9.54                -
                               Table 67 Server Result Matrix AT-S04





AT-S05 Vendor Support

 Weight                     Evaluation Criteria                      Tangent Pillar™ 2750s
                                                                      Rating         Score
   40     Response time for critical problems                                   7           280
   30     Remote assistance                                                    10           300
   20     Hardware support                                                     10           200
   20     Software upgrades                                                     4            80
   10     Warranty                                                             10           100
  120                                                 Total Score               -           960
                                     Weighted Average Rating                    8             -
                              Table 68 Server Result Matrix AT-S05
AT-S06 Security

 Weight                     Evaluation Criteria                      Tangent Pillar™ 2750s
                                                                      Rating         Score
  110     User privileges                                                      10          1100
  110                                                 Total Score               -          1100
                                     Weighted Average Rating                   10             -
                              Table 69 Server Result Matrix AT-S06
AT-S07 Flexibility

 Weight                     Evaluation Criteria                      Tangent Pillar™ 2750s
                                                                      Rating         Score
   40     Downward Compatibility                                                9           360
   60     Extendibility (ease of upgrade of server)                             8           480
  100                                                 Total Score               -           840
                                     Weighted Average Rating               8.4                -
                              Table 70 Server Result Matrix AT-S07





Overall Summary


Weight                       Evaluation Criteria                        Tangent Pillar™ 2750s
                                                                            Rating         Score
  270    Performance                                                            9.85             2660
  150    Cost                                                                         9          1350
  120    Intercomponent Compatibility                                           7.92              950
  130    Interoperability                                                       9.54             1240
  120    Vendor Support                                                               8           960
  110    Security                                                                    10          1100
  100    Flexibility                                                             8.4              840
  1000                                                  Total Score                   -          9100
                                        Weighted Average Rating                  9.1                -
                            Table 71 Server Result Matrix Overall Summary






       4.1.5       Business Case Analysis
This section describes the business case analysis of the server COTS hardware product in terms
of added value, deployment costs, and operational costs. The business case analysis is a
framework prepared for decision makers to show that the proposed project is feasible and makes
sound financial sense, i.e., that it is a techno-commercially viable solution.

                       4.1.5.1            COTS Ownership Cost

COTS ownership cost is the initial investment made in purchasing the COTS product. Here, the
hardware COTS purchased is the server. As per phase 1 of our prototype, Pasadena High
School has already purchased the server from Tangent Computers for $5250, which is a one-time
cost. The other costs incurred by the school are as follows:


Server purchase cost (one-time)             = $5250
Setup and installation cost (one-time)      = $500
TOTAL COTS Ownership cost                   = $5750


Hence, the total COTS ownership cost for PHS is $5750.

                       4.1.5.2            Development Cost
This section describes the cost involved in developing the COTS products. This does not apply
to our project, since the project involves only network assessment and there is no software
development involved. Also, the CS577a team is working on this project as part of its course
requirements.

                       4.1.5.3            Transition Cost
There is no transition cost here, since the client does not have to spend any extra time on
training. Also, this is just a new model of server being installed; only a hardware upgrade is
involved.

                       4.1.5.4            Operational Cost

There is no additional operational cost for this new server. Even though user time is involved
in administering the server, there is no additional workload. Thus, operational cost is not
counted when calculating the business case for this server. The new server installation does
not incur any operational cost in phase 1 of prototype 2.






                       4.1.5.5             Maintenance Cost

The maintenance cost is the cost incurred every year for system maintenance, which includes
server maintenance and software upgrades; it is approximately $3000 per year. The previous
annual maintenance contract (AMC) expired on August 31, 2005; a new contract will be made for
the newly installed server. The $3000 annual maintenance cost for the new server includes
remote maintenance, backups, and software updates.

                       4.1.5.6             Estimate of Value Added ROI

The client has spent time on phone calls, meetings, emails, and commuting to USC for ARB
meetings. We have arrived at a total of 62 person-hours spent by the client altogether, of
which approximately 35 hours were spent on discussions about the server.
Phase 1 of prototype 2, the installation of the server, offers operational and maintenance
cost benefits over the old system.
In the current system the server freezes frequently, and the administrator has to spend many
hours diagnosing system instability, which often ends in rebooting the server. Also, the
server cannot handle more than 8 concurrent users for any multimedia application. The new
system will have much faster user logins, centralized accessibility, and much improved
multimedia application performance. This will greatly reduce the time the librarian and other
faculty spend on the system. These cost benefits outweigh the cost of the server.
From the expected login time of the system and the administrator's feedback on usage, the
amount of time the system administrator spends at users' desks is reduced by 30%.
System login time is much faster: in the old system it took more than 30 minutes for an entire
class of 30-40 users to log in, whereas in the new system the entire class can log in within
5 minutes. This means students spend less idle time at the PCs, and the productivity of both
students and faculty will increase. We arrived at a figure of 6 hours of total savings per
week, administrator and faculty combined.


Total savings / year
In the new system, the total annual savings in operational cost, every year from January 2006,
for the librarian and faculty = 288 person-hours (approx.)
(3 hours for the librarian and 3 hours for all faculty together, per week).
These are rough estimates obtained from the client. The time saved translates into a salary
saving at $30/hr (the salary estimate was not obtained from the client for confidentiality
reasons; an approximate figure is used).






Thus, the resulting annual salary saving would be
30 (salary/hr) * (3 + 3) hours/week * 4 weeks/month * 12 months
= $8640 / year
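The annual-savings arithmetic above can be restated as a small calculation; all figures are the rough client estimates quoted in this section, not confirmed values.

```python
# Rough annual-savings estimate, using the figures quoted above.
salary_per_hour = 30          # approximate USD/hr (estimate, not client-confirmed)
hours_saved_per_week = 3 + 3  # librarian + faculty, person-hours per week
weeks_per_year = 4 * 12       # 4 weeks/month * 12 months

annual_savings = salary_per_hour * hours_saved_per_week * weeks_per_year
print(annual_savings)  # 8640
```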


Total client effort = 62 person-hours (the total number of person-hours spent by the client in
arriving at this prototype).
Out of these 62 hours, approximately 35 have been spent discussing the server prototype.
Client effort cost = 30 (salary/hr) * 35 hours
                   = $1050


Setup and installation cost                  = $500
Server purchase cost                         = $5250
-------------------------------------------------------------
COTS Ownership costs                         = $5750


Total Annual Costs for 2006:
         Annual maintenance costs            = $3000
         COTS Ownership costs                = $5750
         Client effort cost                  = $1050
-----------------------------------------------------------------
         Total annual costs                  = $9800


Total estimated annual savings               = $8640


Return on investment (ROI) = (Benefits – Costs) / Costs


ROI at the end of year 1 (2006) = (8640 – 9800) / 9800 = –0.12






Thus, from the available estimates, the ROI can be projected as follows:

 Years after Deployment    Accumulated costs (USD)    Accumulated benefits (USD)      ROI
            1                        9800                        8640                –0.12
            2                       12800                       17280                 0.35
            3                       15800                       25920                 0.64
            4                       18800                       34560                 0.84
            5                       21800                       43200                 0.98
                   Table 72 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase 1
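The ROI column follows directly from the cost and savings figures above. As a sketch of the recurrence behind the table, the constants and the helper name `roi_after` are ours, taken from this section's estimates:

```python
# Accumulated-cost/benefit recurrence behind the ROI projection.
FIRST_YEAR_COST = 9800     # ownership $5750 + maintenance $3000 + client effort $1050
ANNUAL_MAINTENANCE = 3000  # each year after the first
ANNUAL_SAVINGS = 8640      # estimated salary savings per year

def roi_after(years):
    """Accumulated costs, accumulated benefits, and ROI after `years` years."""
    costs = FIRST_YEAR_COST + ANNUAL_MAINTENANCE * (years - 1)
    benefits = ANNUAL_SAVINGS * years
    return costs, benefits, round((benefits - costs) / costs, 2)

for year in range(1, 6):
    print(year, roi_after(year))
# e.g. year 2 -> (12800, 17280, 0.35); year 5 -> (21800, 43200, 0.98)
```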

[Figure: line chart of ROI (y-axis, –0.2 to 1.2) versus years after deployment (x-axis, 0 to 6), showing ROI crossing zero early in the second year.]

                                       Figure 6 Server Break-even Analysis on ROI





The return on investment will reach the break-even point for PHS by the first quarter of the
second year after installation of the system. This calculation is based on rough estimates
provided by the client; the actual return on investment may vary depending on the level of
system usage and savings. There are also major intangible benefits not considered here: less
time spent on the system by the users (students), higher user satisfaction, and a better
learning experience for the students.


Thus, the above business case analysis shows that the proposed project makes sound financial
sense: it is economically viable and technically feasible.







 4.2 Assessment Results-Part 2 [Client]
This section describes the assessment and test results for the client COTS product.

       4.2.1      COTS Assessed
The COTS products assessed are:
 COTS Products                    Web Address               Description
 Tangent WebDT 166                http://www.dtresearch. The WebDT 166 features the
                                  com/prod_webDT166.h integration of the energy efficient,
                                  tml                    yet powerful AMD Geode™
                                                         GX533 and LX800 processor
                                                         technology into a most compact and
                                                         robust enclosure.
                                                            The space saving package
                                                            accommodates accessories and
                                                            peripherals to address a range of
                                                            usage requirements.
                                                            The software operating systems
                                                            support server- and browser-based
                                                            computing in addition to local,
                                                            embedded applications.
                                                            The versatility in hardware and
                                                            software makes the device
                                                            compelling for numerous vertical
                                                            deployments leveraging network
                                                            access and custom software.
 Wyse Winterm V90                 http://www.wyse.com/      The Winterm™ V90 includes a
                                  products/winterm/V90/     powerful 1GHz CPU; smart card
                                                            slot; Card Bus/PCMCIA slot; and
                                                            serial, parallel, and USB ports.
                                                            Video performance is fast and crisp,
                                                            minimizing eyestrain and meeting
                                                            stringent health and ergonomic
                                                            requirements.
                                                            The Winterm V90 runs the
                                                            Microsoft® Windows XPe
                                                            operating system providing fast
                                                            boot-up functionality and the ability
                                                            to easily and rapidly switch
                                                            between a typical PC desktop and a



                                                              connection manager dashboard.
                                                              As the model V90 is diskless and
                                                              application installation and
                                                              execution is fully managed, it is
                                                              inherently secure from viruses and
                                                              other malicious software attacks.
                                                              The Winterm V90 offers a broad
                                                              range of mounting options for any
                                                              work environment via its innovative
                                                            monorail mounting system.
                                                              It has Wyse™ Rapport®
                                                              (Workgroup edition), the enterprise
                                                              client management tool that
                                                              leverages the value of your IT
                                                              infrastructure for maximum ROI.
                                  Table 73 Client COTS Assessed


       4.2.2       Evaluation Criteria
The set of evaluation criteria chosen for the client COTS product is shown in the following
table. The last column presents the corresponding weight assigned based on discussion between
the client and the team members. The evaluation weights indicate the importance the client
assigns to each attribute of the system.

     No        Evaluation Criteria – COTS attributes                    Weight
  AT-C01       Cost                                                     150
  AT-C02       Performance                                              120
  AT-C03       Vendor Support                                           90
  AT-C04       Flexibility                                              80
                                 Table 74 Client Evaluation Criteria





The following tables break down each criterion in the above table into finer-grained features
in order to obtain a better measure of each criterion of the COTS product.


The table below breaks down the Cost (AT-C01) criterion into its component features:

 Weight        Features
 60            Initial purchase cost
 50            Upgrade cost
 40            Maintenance cost
                                    Table 75 Client Cost Attribute


The table below breaks down the Performance (AT-C02) criterion into its component features:

 Weight        Features
 120           Hardware specification
                                Table 76 Client Performance Attribute


The table below breaks down the Vendor Support (AT-C03) criterion into its component features:

 Weight        Features
 40            Hardware support
 50            Warranty
                              Table 77 Client Vendor Support Attribute


The table below breaks down the Flexibility (AT-C04) criterion into its component features:

 Weight        Features
 80            Upgradeability
                                 Table 78 Client Flexibility Attribute
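The weights in Tables 74 through 78 suggest a weighted-sum scoring scheme. The report does not state the exact aggregation formula in this section, so the 0-10 rating scale and the sum(weight × rating) rule below are assumptions made only for illustration; note that the sub-attribute weights sum to each parent criterion's weight in Table 74.

```python
# Hedged sketch of a weighted-sum aggregation over the client evaluation
# criteria. The 0-10 rating scale and the scoring rule are assumptions;
# the weights themselves come from Tables 75-78.
WEIGHTS = {
    # AT-C01 Cost (parent weight 150 = 60 + 50 + 40)
    "Initial purchase cost": 60, "Upgrade cost": 50, "Maintenance cost": 40,
    # AT-C02 Performance (parent weight 120)
    "Hardware specification": 120,
    # AT-C03 Vendor Support (parent weight 90 = 40 + 50)
    "Hardware support": 40, "Warranty": 50,
    # AT-C04 Flexibility (parent weight 80)
    "Upgradeability": 80,
}

def weighted_score(ratings):
    """Combine per-feature ratings (0-10 scale, hypothetical) into one score."""
    return sum(WEIGHTS[feature] * rating for feature, rating in ratings.items())
```

With this rule, two candidate products can be compared by feeding each one's per-feature ratings into `weighted_score` and comparing the totals.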






       4.2.3       Test Procedure
This section documents the detailed evaluation process stating the test set-up, test procedures
used and their corresponding results.

                      4.2.3.1           Test Identification
The COTS products that are going to be tested are:
       Tangent WebDT 166
       Wyse Winterm V90
Our COTS evaluation comprises five test procedures to date, which will be performed by three
team members; the results of these tests will serve as the basis for our recommendation to
the client.






                      4.2.3.2           Test Preparation

During CS577A, various assessments were performed to evaluate the COTS products mentioned
above. The assessments were designed and executed to test the functionality and performance
of critical business scenarios of the above-mentioned COTS products. This section states the
minimum hardware, software, and other test preparation requirements and whether they were
met.

4.2.3.2.1   Hardware Preparation

The following table lists the hardware requirements of the client COTS products:

 ID           COTS Product       Hardware Requirements           Met or not
              Model
 HREQ-C1      Tangent            1. One server as the            1. Yes. Currently TC95 is the
              WebDT 166             application server              application and
                                 2. One server as the               authentication server in PHS
                                    authentication server        2. Yes. Currently TC95 is the
                                 3. One server as the               application and
                                    backup, folder storage          authentication server in PHS
                                    server                       3. Yes. Currently TC96 is the
                                 4. 40+ thin-client terminals       folder storage server in PHS
                                    connected with the           4. Yes. PHS currently has 40
                                    server with internet            thin-client terminals
                                    connection                      connected to TC95
                                 5. Tangent WebDT 166            5. No. The new thin-client
                                                                    model for Tangent WebDT
                                                                    166 is not available for
                                                                    testing
 HREQ-C2      Wyse Winterm       1. One server as the            1. Yes. Currently TC95 is the
              V90                   application server              application and
                                 2. One server as the               authentication server in PHS
                                    authentication server        2. Yes. Currently TC95 is the
                                 3. One server as the               application and
                                    backup, folder storage          authentication server in PHS
                                    server                       3. Yes. Currently TC96 is the
                                 4. 40+ thin-client terminals       folder storage server in PHS
                                    connected with the           4. Yes. PHS currently has 40
                                    server with internet            thin-client terminals
                                    connection                      connected to TC95
                                 5. Wyse Winterm V90             5. No. The new thin-client
                                                                    model for Wyse Winterm
                                                                    V90 is not available for
                                                                    testing
                      Table 79 Hardware Preparation for Client COTS Product



4.2.3.2.2   Software Preparation

The following table lists the software requirements of the client COTS products:

 ID      COTS Product           Software Requirements              Met or not
         Model
 SREQ-C1 Tangent                1. Windows based operating         1. Yes. Tangent WebDT 166
         WebDT 166                 system embedded to                 supports both Windows
                                   support terminal service           and Linux based OS,
                                   client                             including Windows CE,
                                                                      Windows XP, and
                                                                      Windows XP Professional
 SREQ-C2 Wyse Winterm           1. Windows based operating         1. Yes. Wyse Winterm V90 is
         V90                       system embedded to                 based on Windows XP
                                   support terminal service           embedded operating
                                   client                             system.
                       Table 80 Software Preparation for Client COTS Product



4.2.3.2.3   Other Pre-test Preparation

The following table lists other pre-test requirements of the client COTS products:

 ID          COTS Product       Pre-test Preparations              Met or not
             Model
 PREP-C1     Tangent            The testers need authenticated     Yes. The librarian in PHS has
             WebDT 166          username and password set-         provided both administrator
                                up for access to all the servers   and regular users username and
                                and thin-client terminals          password to access the system
 PREP-C2     Wyse Winterm       The testers need authenticated     Yes. The librarian in PHS has
             V90                username and password set-         provided both administrator
                                up for access to all the servers   and regular users username and
                                and thin-client terminals          password to access the system
                        Table 81 Other Preparation for Client COTS Product






                      4.2.3.3            Test Procedure Specifications

This section provides the detailed test procedures carried out by the testers in order to rate each
evaluation criterion.
    The testing procedure adopted for evaluating the COTS attributes AT-C01 and AT-C03 is
       black-box testing. The black-box testing techniques widely used for the test process are
       equivalence partitioning and boundary value analysis. By applying these black-box
       techniques, we derived a set of test cases, which are presented below.
For attributes AT-C02 and AT-C04 there are no detailed test procedure specifications; the
rationale for these attributes is explained in section 4.2.3.3.1 of this document.





4.2.3.3.1     Rationales for Non Test Procedures

In the case of the Performance attribute (AT-C02):

The performance of a thin client is typically measured by three attributes: its internal
processor, the physical memory embedded in the device, and the operating system loaded onto
it. Based on the results of our market research, CPUs for these devices range from 500 MHz to
1 GHz, embedded memory ranges from 128 MB to 512 MB, and the two popular operating systems
are Embedded Windows CE and Embedded Windows XP. Because Team 3 was not able to obtain demo
units for the Wyse Winterm V90 and the Tangent WebDT 166, we can only evaluate performance
based on the published system specifications; thus there are no test cases for thin-client
performance. Our thin-client ratings are based on the system specs: thin clients with higher
processing power and more memory running Windows XP Embedded are ranked higher than thin
clients with a slower processor and minimal memory running Windows CE.


Performance results for the thin clients based on their system specifications:


 Category                         Tangent WebDT 166                  Wyse Winterm V90
 CPU                              533 MHz AMD Geode                  1.0GHz (x86)
 Memory                           128MB                              256MB
 OS                               Windows CE Embedded                Windows XP Embedded
 Rating                           5/10                               10/10
                        Table 82 Client Performance Attribute Result Rationale
References:


http://www.dtresearch.com/datasheets/WebDT%20166%20flyer%20081205-2.pdf
http://www.wyse.com/products/winterm/
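The spec-based ranking rule described above can be sketched as follows. The spec values are taken from Table 82; the lexicographic comparison key (OS first, then CPU clock, then memory) is our reading of the report's stated rule, not a formula the report itself gives.

```python
# Sketch of spec-based ranking: thin clients with a faster CPU, more memory,
# and Windows XP Embedded rank ahead of slower Windows CE devices.
OS_RANK = {"Windows CE Embedded": 0, "Windows XP Embedded": 1}

clients = {
    "Tangent WebDT 166": {"cpu_mhz": 533,  "mem_mb": 128, "os": "Windows CE Embedded"},
    "Wyse Winterm V90":  {"cpu_mhz": 1000, "mem_mb": 256, "os": "Windows XP Embedded"},
}

ranked = sorted(
    clients,
    key=lambda name: (OS_RANK[clients[name]["os"]],
                      clients[name]["cpu_mhz"],
                      clients[name]["mem_mb"]),
    reverse=True,
)
print(ranked)  # ['Wyse Winterm V90', 'Tangent WebDT 166']
```

Under this rule the Winterm V90 ranks first on every attribute, which is consistent with the 10/10 versus 5/10 ratings in Table 82.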


In the case of the Flexibility attribute (AT-C04):

Thin clients are embedded devices built to connect to a server. These devices generally
cannot be physically upgraded because many components, including the processor and memory
modules, are soldered directly onto the device. Wyse, however, does sell a 256MB upgrade kit
for its Winterm device. As for the WebDT 166, the manufacturer's documentation states that
these units cannot be upgraded. The internal OS on this thin-client device also cannot be
upgraded.





Flexibility result ratings:


 Tangent WebDT 166                                      Wyse Winterm V90
 0/10                                                   6/10
                             Table 83 Client Flexibility Attribute Result Rationale
References:


http://www.dtresearch.com/datasheets/WebDT%20166%20flyer%20081205-2.pdf
http://www.wyse.com/products/winterm/





4.2.3.3.2      Test Procedures

The following tables indicate the test procedures for the cost attribute (AT-C01):

 Test Case:                1-1
 Identifier:               AT-C01-1
 Test Items:               Initial Cost
 Test Description:         This test will compare the initial cost of ownership among the
                           different COTS products.
 Pre-conditions:           The following pre-conditions must be met before the test can be
                           performed:
                                  1. COTS products must be available from COTS vendor
                                  2. COTS vendor must be able to supply price information
                                      for COTS products.
                                  3. A communication channel (phone number or email
                                      address) must be established between the customer and
                                      the COTS vendor prior to the test.
                                  4. Customer needs to set an expected price range for
                                      purchasing COTS products.
 Post-conditions:          The following functions will be performed after the test procedure:
                                  1. Obtain a price quote for 1 unit for all COTS products.
                                  2. COTS item will be rated as High, Medium, or Low
                                     Initial Cost based on price.
 Input Specifications:     The following inputs are needed to perform the test:
                                  1. Customer phones or emails the COTS vendor indicating
                                     interest in purchasing a COTS product.
                                  2. Customer supplies the part number to the COTS vendor
                                     for product lookup.
 Expected Output           The following are the expected outputs:
 Specifications:                  1. COTS vendor will supply price quote (in US dollar
                                     amounts)
                                  2. COTS products will be assigned High, Medium, Low
                                     status.
 Pass/Fail Criteria:       The following are the pass/fail criteria for the test:
                                  1. COTS products availability (in stock, back order,
                                     discontinued, etc.)
                                  2. COTS product price falls within the initial cost price
                                     range set by the customer
 Test Process:             The test process is as follows:
                                  1. Customer contacts the COTS vendor via phone or email.
                                  2. Customer checks the availability of the COTS products
                                     with the COTS vendor.



                                 3. Customer requests a price quote for one unit of the
                                    COTS product.
                                 4. Customer compares the price against the initial
                                    estimated cost price.
Assumptions and          The following are the assumptions and constraints for the test:
constraints:                    1. COTS vendor is willing to disclose pricing information
                                2. Pricing information for COTS products must not change
                                   dramatically during the duration of the test.
                                3. Final price quote must not exceed the maximum
                                   expected price range set by the customer.
Dependencies:            None
Traceability:            NA
                         Table 84 Client Test Procedure Specification 1-1
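The High/Medium/Low rating step in test case 1-1 can be sketched as a simple comparison against the customer's expected price range. The customer's actual range is not stated in the report, so the thresholds in the usage example below are hypothetical placeholders.

```python
# Hedged sketch of the High/Medium/Low initial-cost rating in test case 1-1.
# The low/high thresholds are supplied by the customer; the values used in
# the example call are hypothetical.
def rate_initial_cost(quote_usd, expected_low, expected_high):
    """Rate a vendor's per-unit quote against the customer's expected range."""
    if quote_usd < expected_low:
        return "Low"
    if quote_usd > expected_high:
        return "High"
    return "Medium"

# Example with placeholder thresholds:
print(rate_initial_cost(450, expected_low=300, expected_high=600))  # Medium
```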







Test Case:               1-2
Identifier:              AT-C01-2
Test Items:              Upgrade Cost
Test Description:        This test will compare the upgrade cost among the different COTS
                         products.
Pre-conditions:          The following pre-conditions must be met before the test can be
                         performed:
                                1. COTS product upgrades parts must be available from
                                    COTS vendor.
                                2. COTS vendor must be able to supply price information
                                    for COTS product upgrade parts.
                                3. A communication channel (phone number or email
                                    address) must be established between the customer and
                                    the COTS vendor prior to the test.
                                4. Customer needs to set an expected price range for
                                    purchasing COTS products upgrade.
                                5. COTS upgrades must be compatible with current COTS
                                    products.
                                6. Current COTS products evaluated must have support for
                                    upgrades (software or hardware).
Post-conditions:         The following functions will be performed after the test procedure:
                                1. Obtain a price quote for upgrade of 1 unit for all COTS
                                   products or modules.
                                2. COTS upgrade products will be rated as High, Medium,
                                   or Low Upgrade Cost based on price.
Input Specifications:    The following inputs are needed to perform the test:
                                1. Customer phones or emails the COTS vendor indicating
                                   interest in purchasing a COTS product upgrade.
                                2. Customer supplies the part number of the product intended
                                   for upgrade to the COTS vendor for lookup.
Expected Output          The following are the expected outputs:
Specifications:                 1. COTS vendor will supply price quote (in US dollar
                                   amounts) for COTS upgrade.
                                2. COTS upgrade items will be assigned High, Medium,
                                   Low status.
Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. COTS upgrade item availability (in stock, back order,
                                   discontinued, etc.)
                                2. COTS upgrade product price falls within the upgrade
                                   cost price range set by the customer.
                                3. COTS upgrades compatibility with current COTS



                                     products.
Test Process:            The test process is as follows:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer checks the availability of COTS upgrade items
                                   with the COTS vendor, or whether the item has upgrade
                                   options available.
                                3. Customer requests a price quote for the upgrade of one
                                   unit of the COTS item.
                                4. Customer compares the upgrade price against the initial
                                   estimated upgrade cost price.
Assumptions and          The following are the assumptions and constraints for the test:
constraints:                    1. COTS vendor is willing to disclose upgrade pricing
                                   information
                                2. Pricing information for COTS upgrade item must not
                                   change dramatically during the duration of the test.
                                3. Final upgrade price quote must not exceed the maximum
                                   expected price range set by the customer.
Dependencies:            COTS upgrades depend on, and must be compatible with, the existing
                         COTS products.
Traceability:            NA
                         Table 85 Client Test Procedure Specification 1-2







Test Case:               1-3
Identifier:              AT-C01-3
Test Items:              Maintenance Cost
Test Description:        This test will compare the maintenance cost among the different
                         COTS products.
Pre-conditions:          The following pre-conditions must be met before the test can be
                         performed:
                                1. COTS vendor must have a maintenance plan.
                                2. COTS vendor must be able to supply maintenance price
                                    information for COTS product.
                                3. A communication channel (phone number or email
                                    address) must be established between the customer and
                                    the COTS vendor prior to the test.
                                4. Customer needs to set an expected price range for
                                    maintenance of COTS items.
Post-conditions:         The following function will be performed after the test procedure:
                                1. Obtain a price quote for maintenance cost for COTS
                                   system for one academic year.
Input Specifications:    The following input is needed to perform the test:
                                1. Customer phones or emails the COTS vendor indicating
                                   interest in purchasing a one-year contract for COTS item
                                   maintenance.
Expected Output          The following is the expected output:
Specifications:                 1. COTS vendor will supply a price quote (in US dollars)
                                   for a one-year maintenance fee.
Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. Availability of a maintenance plan from the COTS vendor
                                   for the COTS system.
                                2. The COTS system maintenance price falls within the annual
                                   maintenance budget range set by the customer.
                                3. The maintenance plan offers 24x7 customer phone and email
                                   support.
                                4. The maintenance plan offers on-site support, remote
                                   support, or both.
                                5. Customer support is done in the US or is outsourced to
                                   other countries (for example, India).
Test Process:            The test process is as follows:
                                1. Customer contacts the COTS vendor via phone or email.
                                2. Customer asks about the availability of a COTS
                                   maintenance plan.
                                3. Customer requests a price quote for system maintenance for



                                      one academic year.
                                 4.   Customer checks if the support plan offers 24x7
                                      technical support.
                                 5.   Customer checks if the plan offers on-site and remote
                                      support.
                                 6.   Customer checks if support is done in-house or
                                      outsourced.
                                 7.   Customer compares the support price against the annual
                                      budget dedicated to supporting the COTS system.
Assumptions and          The following are the assumptions and constraints for the test:
constraints:                    1. COTS vendor is willing to disclose maintenance pricing
                                   information.
                                2. Pricing information for the COTS maintenance plan must not
                                   change dramatically during the duration of the test.
                                3. The final maintenance price quote must not exceed the
                                   maximum expected price range set by the customer.
Dependencies:            The following are the dependencies for the test:
                                1. The COTS system support plan may vary depending on the
                                   COTS vendor.
                                2. The COTS support plan may require signing a one-year
                                   contract.
Traceability:            PC-1
                         Table 86 Client Test Procedure Specification 1-3
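The budget comparison in the final step of the test process above can be sketched as a simple check. The figures below, and the decision to treat each check as a pass criterion, are illustrative assumptions, not actual PHS budget data:

```python
def maintenance_plan_check(quote, annual_budget, has_24x7, has_onsite_and_remote):
    """Evaluate a vendor maintenance quote against the checks in steps 4-7.

    Returns ("pass"/"fail", list of failed checks). Treating every check
    as a pass criterion is an assumption for illustration only.
    """
    checks = {
        "24x7 technical support": has_24x7,
        "on-site and remote support": has_onsite_and_remote,
        "within annual support budget": quote <= annual_budget,
    }
    failed = [name for name, ok in checks.items() if not ok]
    return ("pass" if not failed else "fail", failed)

# Hypothetical quote of $1,200 against a $1,500 annual support budget:
print(maintenance_plan_check(1200.0, 1500.0, True, True))  # ('pass', [])
```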





The following tables indicate the test procedures for the vendor support attribute (AT-C03):

 Test Case:                3-1
 Identifier:               AT-C03-1
 Test Items:               Hardware support
 Test Description:         This test determines the effectiveness of the vendor in fixing
                           hardware-related issues.
 Pre-conditions:           The following pre-conditions must be met before the test
                           can be performed:
                                  1. Identify and obtain the contact person's name,
                                     department, telephone number, and email address from
                                     the COTS vendor.
                                  2. A correspondence must already have occurred, and a
                                     communications channel (telephone or email) been
                                     established between the customer and the COTS vendor,
                                     prior to the test.
                                  3. Both the customer and the COTS vendor must be aware
                                     of the contractual bindings and/or the warranty
                                     obligations of the vendor to the customer. The problem
                                     to be fixed should fall within this scope.
 Post-conditions:          The following will hold after the test procedure:
                                  1. The vendor will have responded, logged a call, and
                                     given the customer a date and time for looking into the
                                     problem.
                                  2. The vendor will have provided support on the date
                                     and time confirmed with the customer when the
                                     call was logged.
                                  3. The problem will have been solved.
                                  4. The vendor response will be rated as good, average, or
                                     poor depending on the degree to which the pass/fail
                                     criteria are met.
 Input Specifications:     The following inputs are required to perform the test:
                                  1. The customer contacts the COTS vendor by phone or
                                     email and logs a call indicating that a hardware
                                     problem has arisen at the customer site and needs
                                     immediate attention.
                                  2. The customer supplies further information (if required),
                                     such as warranty details and known details of the
                                     problem, to the COTS vendor.





Expected Output          The following are the expected outputs:
Specifications:                 1. The COTS vendor logs the call and gives the customer a
                                   call priority number as well as the date and time on
                                   which they will attend to the call.
                                2. The COTS vendor attends to the call on the stated
                                   date and time, and the problem is solved (or advice is
                                   given to the customer on the future course of action).
Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. The COTS vendor responds within the timeframe
                                   discussed and agreed upon in the contract or warranty
                                   card.
                                2. The COTS vendor solves the problem, or takes any other
                                   required action at the customer site, on the date and
                                   time agreed upon when the call was logged.
Test Process:            The test process consists of the following steps:
                                1. The customer contacts the COTS vendor via phone or email.
                                2. The customer logs a call with the COTS vendor indicating
                                   that a hardware problem has arisen at the customer
                                   site.
                                3. The customer compares the time taken for the initial
                                   response with the expected response time as per the
                                   contract or warranty card.
                                4. The customer compares the date and time on which the
                                   COTS vendor takes the required action at the customer
                                   site and solves the problem with the expected response
                                   time as per the contract or warranty card.
                                5. The customer also compares that date and time with the
                                   date and time promised by the COTS vendor when the
                                   call was logged.
                                6. The customer confirms that the hardware problem has
                                   been solved and that the support has been effective.
Assumptions and          The following are the assumptions and constraints for the test:
constraints:                    1. The COTS vendor does not have any abnormal activities
                                   going on in their organization, such as a company merger
                                   or department takeover, which could delay the response
                                   time.
                                2. The COTS vendor contact is working at his/her regular
                                   workload, with nothing unusual, such as a greatly
                                   increased workload, that could slow down the response.
                                3. The pre-conditions are met.
                                4. If the hardware is to be replaced, the COTS vendor has it
                                   in stock.
Dependencies:            Whether the faulty hardware part is under warranty: if it is,
                         it may be replaced free of cost.
Traceability:            NA
                         Table 87 Client Test Procedure Specification 3-1
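The response-time comparison in the test process above, and the good/average/poor rating named in the post-conditions, can be sketched as follows. The thresholds (on-time = good, up to 50% late = average) are illustrative assumptions, not values from any actual contract or warranty card:

```python
from datetime import timedelta

def rate_vendor_response(agreed, actual):
    """Rate the vendor's response time against the agreed window.

    agreed: response window from the contract or warranty card.
    actual: time the vendor actually took to respond.
    Thresholds are illustrative assumptions only.
    """
    if actual <= agreed:
        return "good"
    if actual <= agreed * 1.5:   # up to 50% late
        return "average"
    return "poor"

# Agreed next-business-day response (24h); vendor responded in 20h:
print(rate_vendor_response(timedelta(hours=24), timedelta(hours=20)))  # good
```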







Test Case:               3-2
Identifier:              AT-C03-2
Test Items:              Warranty
Test Description:        This test determines the effectiveness of the warranty support
                         provided by the vendor in fixing problems faced by the customer
                         for items under warranty.
Pre-conditions:          The following pre-conditions must be met before the test can
                         be performed:
                                1. Identify and obtain the name, telephone number, and
                                   email address of the contact person in the COTS vendor's
                                   technical department.
                                2. A correspondence must already have occurred, and a
                                   communications channel (telephone or email) been
                                   established between the customer and the COTS vendor,
                                   prior to the test.
                                3. Both the customer and the COTS vendor must be aware of
                                   the contractual bindings and/or the warranty obligations of
                                   the vendor to the customer. The problem to be fixed should
                                   fall within this scope.
Post-conditions:         The following will hold after the test procedure:
                                1. The vendor will have responded, logged a call, and
                                   given the customer a date and time for looking into the
                                   problem.
                                2. The vendor will have provided support on the date
                                   and time confirmed with the customer when the
                                   call was logged.
                                3. The vendor response will be rated as good, average, or
                                   poor depending on the degree to which the pass/fail
                                   criteria are met.
Input Specifications:    The following inputs are required to perform the test:
                                1. The customer contacts the COTS vendor by phone or email
                                   and logs a call indicating that a problem related to an
                                   item under warranty has arisen at the customer site and
                                   needs immediate attention.
                                2. The customer supplies further information (if required),
                                   such as warranty details and known details of the problem,
                                   to the COTS vendor.
Expected Output          The following are the expected outputs:
Specifications:                 1. The COTS vendor logs the call and gives the customer a call
                                   priority number as well as the date and time on which they
                                   will attend to the call.
                                2. The COTS vendor attends to the call on the stated date
                                   and time.
                                3. The problem is solved (or advice is given to the
                                   customer on the future course of action).
Pass/Fail Criteria:      The following are the pass/fail criteria for the test:
                                1. The COTS vendor responds within the timeframe
                                   discussed and agreed upon in the contract or warranty
                                   card.
                                2. The COTS vendor attends to the call on the stated date
                                   and time.
                                3. The COTS vendor solves the problem, or takes any other
                                   required action at the customer site, on the date and
                                   time agreed upon when the call was logged.
Test Process:            The test process consists of the following steps:
                                1. The customer contacts the COTS vendor by phone or email
                                   and logs a call indicating that a problem related to an
                                   item under warranty has arisen at the customer site and
                                   needs immediate attention.
                                2. The customer compares the time taken for the initial
                                   response with the expected response time as per the
                                   contract or warranty card.
                                3. The customer compares the date and time on which the
                                   COTS vendor takes the required action at the customer
                                   site and solves the problem with the expected response
                                   time as per the contract or warranty card.
                                4. The customer also compares that date and time with the
                                   date and time promised by the COTS vendor when the call
                                   was logged.
                                5. The customer checks whether the problem has been solved.
Assumptions and          The following are the assumptions and constraints for the test:
constraints:                    1. The COTS vendor does not have any abnormal activities
                                   going on in their organization, such as a company merger
                                   or department takeover, which could delay the response
                                   time.
                                2. The COTS vendor contact is working at his/her regular
                                   workload, with nothing unusual, such as a greatly
                                   increased workload, that could slow down the response.
                                3. The item which has developed a problem is under warranty.
Dependencies:            None
Traceability:            NA
                         Table 88 Client Test Procedure Specification 3-2





                        4.2.3.4             Test Results

This section lists the test results of all the test procedures described in section 4.2.3.3 of this
document for the two client COTS products.

4.2.3.4.1      Test Results for Tangent WebDT 166

The following tables indicate the test results for the cost attribute (AT-C01):

 Test Case:                      1-1
 Identifier:                     AT-C01-1
 Test Items:                     Initial cost
 Test Result Classification      Pass - The initial cost per Thin Client WebDT 166 is $250.00
 (Pass / Fail)                   The client’s expected cost: $600
 Problem / Defect Report         None
 Feedback / Comments:            The Tangent Thin Client WebDT 166 does not offer the
                                 performance of the Wyse thin client; however, for
                                 price-conscious consumers willing to sacrifice performance,
                                 it is a good alternative.
                                 Reference:
                                 Louise O’Sullivan at Tangent Computers
                                 Contact # : 800-342-9388
                                 Rating-10/10
                           Table 89 Client Test Results 1-1 for Tangent Model







Test Case:                     1-2
Identifier:                    AT-C01-2
Test Items:                    Upgrade cost
Test Result Classification     Fail - This item cannot be upgraded
(Pass / Fail)
Problem / Defect Report        None
Feedback / Comments:           This item has a maximum memory capacity of 128MB, which
                               cannot be upgraded.
                               Reference:
                               Nick Haddad at Tangent Computers
                               Contact # : 800-342-9388
                               Rating-0/10
                         Table 90 Client Test Results 1-2 for Tangent Model


Test Case:                     1-3
Identifier:                    AT-C01-3
Test Items:                    Maintenance cost
Test Result Classification     Pass
(Pass / Fail)
Problem / Defect Report        None
Feedback / Comments:           This item has a 1-year manufacturer's warranty and is also
                               covered under Tangent's annual maintenance contract.
                               Rating-9/10
                         Table 91 Client Test Results 1-3 for Tangent Model





The following tables indicate the test results for the vendor support attribute (AT-C03):

 Test Case:                    3-1
 Identifier:                   AT-C03-1
 Test Items:                   Hardware support
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      None.
 Feedback / Comment:           Tangent provides toll-free hardware support.
                               Mon-Fri 6 AM to 4:30 PM PST.
                               Call 1-800-399-8324
                               Reference:
                               http://www.tangent.com/explore/tech.htm
                               Rating : 10/10
                          Table 92 Client Test Results 3-1 for Tangent Model


 Test Case:                    3-2
 Identifier:                   AT-C03-2
 Test Items:                   Warranty
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      None.
 Feedback / Comment:           Tangent Servers come with a 3-year Limited Warranty, which
                               includes Next Business Day Parts Replacement and 1-year On-
                               site Labor Service at the sole discretion of Tangent. Removable
                               media is only covered for the 1st year of the warranty.
                               Reference:
                               http://www.tangent.com/explore/tech/warranty.htm
                               Rating: 10/10
                          Table 93 Client Test Results 3-2 for Tangent Model






4.2.3.4.2      Test Results for Wyse Winterm V90

The following tables indicate the test results for the cost attribute (AT-C01):

 Test Case:                     1-1
 Identifier:                    AT-C01-1
 Test Items:                    Initial cost
 Test Result Classification     Pass -The initial cost per client Winterm V90
 (Pass / Fail)                   (512 flash / 256MB RAM) with keyboard
                                 Part No. 902094-06 is (USD) $626.00 (Less 25% Educational
                                Discount)
                                The client’s expected cost: $600
 Problem / Defect Report        None
 Feedback / Comments:           Winterm V90 is currently Wyse’s higher end thin client model.
                                This model was chosen because of its support for multimedia
                                applications.
                                Reference:
                                General Sales, Wyse
                                Contact # (800) GET-WYSE (438-9973)
                                Rating-8/10
                               Table 94 Client Test Results 1-1 for Wyse







Test Case:                   1-2
Identifier:                  AT-C01-2
Test Items:                  Upgrade cost
Test Result Classification   Pass -Hardware Upgrade: The upgrade cost for this particular
(Pass / Fail)                item is limited to only DDR-RAM. WYSE 256MB Memory
                             Upgrade Kit (920223-07) (Item #: 920223-07) (USD) $129.00
                             Software Upgrade: The thin client’s internal OS runs on
                             Windows XP Embedded platform, and cannot be upgraded.
                             Application software is executed on the server.
Problem / Defect Report      Thin clients are not built for extensive hardware and software
                             upgrades
Feedback / Comments:         Hardware Upgrade Cost: Upgrades are limited to RAM;
                             due to the nature of this device, it supports only one
                             memory upgrade slot.
                             Reference:
                             Technical Support, Wyse
                             Contact # (800) 800-WYSE (9973)
                             Rating-8/10
                             Table 95 Client Test Results 1-2 for Wyse


Test Case:                   1-3
Identifier:                  AT-C01-3
Test Items:                  Maintenance cost
Test Result Classification   Pass - The manufacturer Wyse offers a 3-year limited
(Pass / Fail)                warranty on all Wyse terminals. In addition, Tangent
                             support for the thin client terminal is covered under
                             Tangent's annual support contract.
Problem / Defect Report      None
Feedback / Comments:         Thin client support is part of the annual support contract
                             between the client and COTS vendor (Tangent Computer)
                             Reference:
                             Technical Support, Wyse
                             Contact # (800) 800-WYSE (9973)
                             Rating-10/10
                             Table 96 Client Test Results 1-3 for Wyse




The following tables indicate the test results for the vendor support attribute (AT-C03):

 Test Case:                    3-1
 Identifier:                   AT-C03-1
 Test Items:                   Hardware support
 Test Result Classification    Pass
 (Pass /Fail):
 Problem / Defect Report:      None
 Feedback / Comment:           If the product is under warranty, then the client needs to go
                               through a Return Material Authorization (RMA) process, where
                               the client can request a support service via web, phone or fax.
                               For hardware technical problems, the client can contact the
                               COTS vendor to get direct support. Since the COTS vendor
                               (Tangent) does not distribute the Wyse model directly, the level
                               of support may be limited.
                               Reference:
                               Nick Haddad at Tangent Computers
                               Contact # : 800-342-9388
                               http://www.wyse.com/serviceandsupport/service/rmaproc.asp#c3
                               Rating: 8/10
                              Table 97 Client Test Results 3-1 for Wyse







Test Case:                   3-2
Identifier:                  AT-C03-2
Test Items:                  Warranty
Test Result Classification   Pass
(Pass /Fail):
Problem / Defect Report:     None
Feedback / Comment:          Wyse Technology warrants its products to be free from defects
                             in material and workmanship for a period of one or three years
                             after the date of purchase.
                             Wyse also offers the Buyer Protection Plan which extends all the
                             benefits that you receive during the first year of ownership from
                             the Manufacturer’s warranty, through the second and third years
                             of ownership.
                             Reference:
                             http://www.wyse.com/serviceandsupport/service/prodwarr.asp
                             Rating: 10/10
                             Table 98 Client Test Results 3-2 for Wyse






                       4.2.3.5            Test Summary

This section summarizes the evaluation test performed on the client COTS product.

4.2.3.5.1    Summary

The tests were performed on two client COTS products, the Wyse Winterm V90 and the Tangent
WebDT 166. Our COTS evaluation comprised five test procedures, performed by three team
members; the results of these tests serve as the basis for our final COTS recommendation.

4.2.3.5.2    Summary of Results and Consequences

The test results and references showed that the Wyse Winterm V90 client model provides much
better performance and upgrade capability for the client. However, the per-unit cost of the
Wyse model is considerably higher than that of the Tangent model. Also, hardware support for
the Wyse model may not be as good as for the Tangent model, since Tangent is the client's
current COTS vendor.

4.2.3.5.3    Evaluation

Most of the tests completed successfully, except for some unavailable and low-priority features
of the COTS products. AT-C01 and AT-C03 were tested based on the quote and service information
provided by the COTS vendor. AT-C02 and AT-C04 are based on the specification data provided by
the COTS vendor. All the detailed evaluations are described in section 4.2.3 of this document.

4.2.3.5.4    Summary of Activities

Three team members were involved in testing.
The total elapsed time for each of the major testing activities was about 10 hours.
The actual machine time cannot be determined at this moment because we do not have the client
hardware to test.






         4.2.4      Evaluation Results Screen Matrix
This section lists the ratings of the evaluation results from section 4.2.3.4 of this document
and calculates the score of each evaluation criterion according to the weights assigned, so as
to compare the overall level of satisfaction between the Wyse Winterm V90 and the Tangent
WebDT 166.
The weights are described in section 4.2.2 of this document.
The ratings are assigned in the test results in section 4.2.3.4 of this document.
The score is calculated using the formula: Score = Weight × Rating
The total score is the sum of all the scores for the product.
The weighted average rating is calculated using the formula:
 Weighted Average Rating = Σ ((Weight / Total Weight) × Rating)
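As a sanity check, the score and weighted-average formulas can be applied to the AT-C01 ratings reported below (weights 60/50/40; ratings 10/0/9 for the Tangent model and 8/8/10 for the Wyse model):

```python
def score_matrix(weights, ratings):
    """Per-criterion scores, total score, and weighted average rating.

    Score = Weight * Rating
    Weighted Average = sum((Weight / Total Weight) * Rating)
    """
    total_weight = sum(weights)
    scores = [w * r for w, r in zip(weights, ratings)]
    weighted_avg = sum(w / total_weight * r for w, r in zip(weights, ratings))
    return scores, sum(scores), round(weighted_avg, 2)

# AT-C01 cost attribute: initial purchase, upgrade, maintenance
print(score_matrix([60, 50, 40], [10, 0, 9]))   # Tangent: ([600, 0, 360], 960, 6.4)
print(score_matrix([60, 50, 40], [8, 8, 10]))   # Wyse: ([480, 400, 400], 1280, 8.53)
```

The same function reproduces the figures in the other attribute matrices.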


AT-C01 Cost


 Weight          Evaluation Criteria            Tangent WebDT 166             Wyse Winterm V90
                                                 Rating         Score         Rating         Score
   60      Initial purchase cost                          10          600               8          480
   50      Upgrade cost                                    0              0             8          400
   40      Maintenance cost                                9          360              10          400
   150                       Total Score                   -          960               -         1280
             Weighted Average Rating                    6.4               -        8.53              -
                                   Table 99 Client Result Matrix AT-C01
AT-C02 Performance


 Weight          Evaluation Criteria            Tangent WebDT 166             Wyse Winterm V90
                                                 Rating         Score         Rating         Score
   120     Hardware specification                          5          600              10         1200
   120                       Total Score                   -          600               -         1200
             Weighted Average Rating                       5              -            10            -
                                Table 100 Client Result Matrix AT-C02





AT-C03 Vendor Support


 Weight       Evaluation Criteria            Tangent WebDT 166              Wyse Winterm V90
                                              Rating         Score          Rating         Score
   40     Hardware support                             10          400                8          320
   50     Warranty                                     10          500               10          500
   90                       Total Score                 -          900                -          820
           Weighted Average Rating                     10              -        9.11               -
                               Table 101 Client Result Matrix AT-C03
AT-C04 Flexibility


 Weight       Evaluation Criteria            Tangent WebDT 166              Wyse Winterm V90
                                              Rating         Score          Rating         Score
   80     Upgradeability                                0              0              6          480
   80                       Total Score                 -              0              -          480
           Weighted Average Rating                      0              -              6            -
                               Table 102 Client Result Matrix AT-C04
Overall Summary


 Weight       Evaluation Criteria            Tangent WebDT 166              Wyse Winterm V90
                                              Rating         Score          Rating         Score
  150     Cost                                       6.4           960          8.53            1280
  120     Performance                                   5          600               10         1200
   90     Vendor support                               10          900          9.11             820
   80     Flexibility                                   0              0              6          480
  440                       Total Score                 -        2460                 -         3780
           Weighted Average Rating                  5.59               -        8.59               -
                           Table 103 Client Result Matrix Overall Summary
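The overall weighted average ratings in Table 103 can be reproduced by applying the same weighted-average formula to the per-attribute ratings:

```python
def overall_rating(weights, ratings):
    """Weighted average rating across attributes (Table 103 roll-up)."""
    total = sum(weights)
    return round(sum(w / total * r for w, r in zip(weights, ratings)), 2)

# Attribute weights: cost, performance, vendor support, flexibility
weights = [150, 120, 90, 80]
print(overall_rating(weights, [6.4, 5, 10, 0]))      # Tangent WebDT 166: 5.59
print(overall_rating(weights, [8.53, 10, 9.11, 6]))  # Wyse Winterm V90: 8.59
```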




CAR_LCA_F05a_T03_V05.0                      138                                      11/21/05
COTS Assessment Report                                                          Version 5.0


       4.2.5       Business Case Analysis
This section describes the business case analysis of the client COTS hardware products in terms
of added value, deployment costs, and operational costs. The business case analysis is a
framework prepared for decision makers to show that the proposed project is feasible and makes
sound financial sense, i.e., that it is a technically and commercially viable solution.

                       4.2.5.1            Business Case Analysis for Product 1

This section describes the business case analysis of the Tangent WebDT 166 thin client model.

4.2.5.1.1    COTS Ownership Cost

The COTS ownership cost is the initial investment in the purchase of the COTS product; here,
the hardware COTS purchased is the set of thin client terminals.
Phase II consists of replacing the Wyse Winterm 3230LE terminals with Tangent WebDT 166
terminals. The current thin client terminals cannot handle multimedia-intensive applications
(Wayang Outpost), and the old models have become obsolete. The new terminals will be purchased
in phases; we assume that the client will buy 30 terminals at a time as part of phase 2 of
prototype 2.


   Installation and setup charges for 30 terminals = $800
   Cost of Tangent WebDT 166 terminal               = $250 / terminal


   COTS Hardware cost (assuming the client buys 30 Tangent WebDT 166 terminals)
   = $7500


   Hence, the total COTS Ownership cost for PHS is $8300
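The figures above reduce to a two-line calculation; a trivial sketch using the report's constants:

```python
UNITS = 30
UNIT_COST = 250       # $ per Tangent WebDT 166 terminal
SETUP_CHARGE = 800    # installation and setup for 30 terminals

hardware_cost = UNITS * UNIT_COST        # $7500
ownership_cost = SETUP_CHARGE + hardware_cost
print(f"total COTS ownership cost: ${ownership_cost}")
```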

4.2.5.1.2    Development Cost

This section does not apply to this project, since the project involves only network assessment
and no software development. Also, the CS577 team is working on this project as part of its
course requirements.

4.2.5.1.3    Transition Cost

There is no transition cost, since the client does not have to spend any extra time on training:
the new thin client terminal is simply an upgraded model of the previous one, so only a
hardware upgrade is involved.






4.2.5.1.4    Operational Cost

There is an operational cost involved, since the administrator will have to spend more of her
time administering the thin client terminals. We estimate an additional 1 hour every week as
the operational cost,
i.e., 1 hr/week x 4 weeks/month x 12 months x $30/hr = $1440 / year

4.2.5.1.5    Maintenance Cost

Maintenance cost applies only to the server. The 30 new thin clients will connect to the same
server, so there is no additional maintenance cost.

4.2.5.1.6    Estimate of Value Added ROI

The client has spent time on phone calls, meetings, and emails, and on commuting to USC for
ARB meetings. We estimate a total of 62 person-hours spent by the client, of which
approximately 27 hours were spent on discussions of the thin client prototypes.
Phase 2 of prototype 2, the installation of the new thin clients, offers operational and
maintenance cost benefits over the old system.
The current thin client model, the Winterm 3230LE, has been discontinued by the manufacturer;
the new model, the Tangent WebDT 166, has an AMD Geode GX 533 embedded processor with
integrated high-speed video. The new system reduces the amount of time the system
administrator spends at users' desks by more than 40%, because with more terminals all
students can work simultaneously in a single session. The new system will enable 40-plus users
to access Wayang Outpost (a high-end graphics and multimedia geometry tutorial) and
Renaissance Place (an interactive assessment program) simultaneously over the thin-client
network. The old thin clients were also prone to failures from wear and tear, and were already
showing signs of it. System log-in time is much faster, and server CPU utilization will be
much lower since the Flash-based application will run locally.
After talking to the customer and faculty, we arrived at a figure of 3 hours of total savings
per week for the administrator and faculty combined.


Total savings / year
In the new system, the total annual savings in operational cost, starting January 2006, for
the librarian and faculty = 144 person-hours approx.
(1.5 hours for the librarian and 1.5 hours for all faculty combined per week).
These are rough estimates obtained from the client. The time saved translates into a salary
saving at $30/hr (the salary estimate was not obtained from the client for confidentiality
reasons; an approximate figure is used).






Thus, the resulting annual salary saving would be
$30 (salary/hr) * (1.5 + 1.5) hrs/week * 4 weeks/month * 12 months
= $4320 / year


Total client effort = 62 person-hours (our estimate of the total time the client spent in
arriving at this prototype).
Of these 62 hours, approximately 27 hours were spent discussing the client prototype:
$30 (salary/hr) * 27 hours = $810


    Installation and setup charges for 30 terminals = $800
    Cost of Tangent WebDT 166 terminal               = $250 / terminal


    COTS Hardware cost (assuming the client buys 30 Tangent WebDT 166 terminals) = $7500
-------------------------------------------------------------
COTS Ownership costs                     = $8300



Total Annual Costs for 2006:
         Annual maintenance costs = $0
         COTS Ownership costs     = $8300
         Client effort cost       = $810
-----------------------------------------------------------------
         Total annual costs       = $9110


Total estimated annual savings    = $4320


Return on investment (ROI) = (Benefits – Costs)/ Costs





ROI at the end of the first year (2006) = (4320 - 9110)/9110 ≈ -0.52
Thus, from the available estimates, the ROI over time can be projected as follows:

 Years after Deployment             Accumulated costs               Accumulated benefits    ROI
                                    (USD)                           (USD)
                   1                                     9110                        4320                  -0.52
                   2                                     9110                        8640                  -0.05
                   3                                     9110                       12960                  0.422
                   4                                     9110                       17280                   0.89
                   5                                     9110                       21600                   1.37
        Table 104 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase2 (product 1)
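The rows of Table 104 follow directly from the ROI formula above, under the report's model in which the $9110 is a one-time cost and the $4320 benefit accrues each year. A minimal sketch:

```python
TOTAL_COST = 9110       # one-time: COTS ownership ($8300) + client effort ($810)
ANNUAL_BENEFIT = 4320   # estimated annual salary savings

def roi_after(years):
    """Accumulated ROI = (accumulated benefits - accumulated costs) / costs."""
    return (ANNUAL_BENEFIT * years - TOTAL_COST) / TOTAL_COST

for y in range(1, 6):
    print(f"year {y}: ROI = {roi_after(y):+.2f}")

# ROI crosses zero at TOTAL_COST / ANNUAL_BENEFIT years:
breakeven_years = TOTAL_COST / ANNUAL_BENEFIT   # ~2.11
```

The break-even point of about 2.11 years matches the table's sign change between years 2 and 3.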

                                     Breakeven Analysis on ROI

        [Line chart: ROI (y-axis, -1 to 1.5) vs. years after deployment (x-axis, 0 to 6),
        crossing zero shortly after year 2.]

                     Figure 7 Client Break-even Analysis on ROI (Product 1)





Based on these estimates, the return on investment for PHS reaches break-even shortly after
the second year of installation (early in the third year). This calculation is based on rough
estimates provided by the client; the actual return on investment may vary with the level of
system usage and savings. There is also a major intangible benefit not considered here:
students spend less time waiting on the system because of the new thin client terminals,
leading to higher user satisfaction and a better learning experience.
Another interesting observation is that although the initial investment is low for this model,
its performance is also low. The benefits realized by the customer are therefore lower than
with the Wyse model, and the ROI reaches break-even at almost the same time. The customer can
thus choose between this model, with its low initial investment and comparatively lower
performance, and the Wyse model, with its higher initial investment and higher performance.
Thus, the business case analysis above shows that the proposed project is feasible and makes
sound financial sense, having been shown to be economically viable and technically feasible.






                       4.2.5.2            Business Case Analysis for Product 2

This section describes the business case analysis of the Wyse Winterm V90 thin client model.

4.2.5.2.1    COTS Ownership Cost

The COTS ownership cost is the initial investment in the purchase of the COTS product; here,
the hardware COTS purchased is the set of thin client terminals.
Phase II consists of replacing the Wyse Winterm 3230LE terminals with Wyse Winterm V90
terminals. The current thin client terminals cannot handle multimedia-intensive applications
(Wayang Outpost), and the old models have become obsolete. The new terminals will be purchased
in phases; we assume that the client will buy 30 terminals at a time as part of phase 2 of the
prototype.


   Installation and setup charges for 30 terminals = $1000
   Cost of Wyse Winterm V90 terminal                = $570 / terminal


   COTS Hardware cost (assuming the client buys 30 Wyse Winterm V90 terminals) = $17100


   Hence, the total COTS Ownership cost for PHS is $18100

4.2.5.2.2    Development Cost

This section does not apply to this project, since the project involves only network assessment
and no software development. Also, the CS577 team is working on this project as part of its
course requirements.

4.2.5.2.3    Transition Cost

There is no transition cost, since the client does not have to spend any extra time on training:
the new thin client terminal is simply an upgraded model of the previous one, so only a
hardware upgrade is involved.






4.2.5.2.4    Operational Cost

There is an operational cost involved, since the administrator will have to spend more of her
time administering the thin client terminals. We estimate an additional 1 hour every week as
the operational cost,
i.e., 1 hr/week x 4 weeks/month x 12 months x $30/hr = $1440 / year

4.2.5.2.5    Maintenance Cost

Maintenance cost applies only to the server. The 30 new thin clients will connect to the same
server, so there is no additional maintenance cost.

4.2.5.2.6    Estimate of Value Added ROI

The client has spent time on phone calls, meetings, and emails, and on commuting to USC for
ARB meetings. We estimate a total of 62 person-hours spent by the client, of which
approximately 27 hours were spent on discussions of the thin client prototypes.
Phase 2 of prototype 2, the installation of the new thin clients, offers operational and
maintenance cost benefits over the old system.
The current thin client model, the Winterm 3230LE, has been discontinued by the manufacturer;
the new model, the Wyse Winterm V90, has a powerful 1 GHz processor with integrated high-speed
video. This delivers superior performance through its CPU-emulating architecture and
high-resolution video system, which provide fast display updates and good local application
performance; this model is about 3 times faster than the old one. Its enterprise client
management tool leverages the value of PHS's IT infrastructure for maximum ROI.
The new system reduces the amount of time the system administrator spends at users' desks by
more than 50%, because with more terminals all students can work simultaneously in a single
session. The new system will enable 40-plus users to access Wayang Outpost (a high-end
graphics and multimedia geometry tutorial) and Renaissance Place (an interactive assessment
program) simultaneously over the thin-client network. The old thin clients were also prone to
failures from wear and tear, and were already showing signs of it. System log-in time is much
faster, and server CPU utilization will be much lower since the Flash-based application will
run locally.
After talking to the customer and faculty, we arrived at a figure of 5 hours of total savings
per week for the administrator and faculty combined.
Total savings / year
In the new system, the total annual savings in operational cost, starting January 2006, for
the librarian and faculty = 240 person-hours approx.
(3 hours for the librarian and 2 hours for all faculty combined per week).
These are rough estimates obtained from the client. The time saved translates into a salary
saving at $30/hr (the salary estimate was not obtained from the client for confidentiality
reasons; an approximate figure is used).




Thus, the resulting annual salary saving would be
$30 (salary/hr) * (3 + 2) hrs/week * 4 weeks/month * 12 months
= $7200 / year


Total client effort = 62 person-hours (our estimate of the total time the client spent in
arriving at this prototype).
Of these 62 hours, approximately 27 hours were spent discussing the client prototype:
$30 (salary/hr) * 27 hours = $810


    Installation and setup charges for 30 terminals = $1000
    Cost of Wyse Winterm V90 terminal                = $570 / terminal


    COTS Hardware cost (assuming the client buys 30 Wyse Winterm V90 terminals) = $17100
-------------------------------------------------------------
COTS Ownership costs                     = $18100



Total Annual Costs for 2006:
         Annual maintenance costs = $0
         COTS Ownership costs     = $18100
         Client effort cost       = $810
-----------------------------------------------------------------
         Total annual costs       = $18910


Total estimated annual savings    = $7200


Return on investment (ROI) = (Benefits – Costs)/ Costs


ROI at the end of the first year (2006) = (7200 - 18910)/18910 ≈ -0.62






Thus, from the available estimates, the ROI over time can be projected as follows:

 Years after Deployment             Accumulated costs            Accumulated benefits    ROI
                                    (USD)                        (USD)
                   1                                18910                        7200                   - 0.6
                   2                                18910                       14400                  -0.23
                   3                                18910                       21600                      0.14
                   4                                18910                       28800                      0.52
                   5                                18910                       36000                       0.9
        Table 105 ROI Statistics for Breakeven Analysis after implementing prototype 2, phase2 (product 2)
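Applying the same model to both candidates gives a quick side-by-side break-even comparison (figures taken from the two business cases above; the script is only an illustrative sketch):

```python
# Assumes costs are incurred once and benefits accrue annually, per the report's model.
products = {
    "Tangent WebDT 166": {"cost": 9110, "benefit_per_year": 4320},
    "Wyse Winterm V90":  {"cost": 18910, "benefit_per_year": 7200},
}

for name, p in products.items():
    breakeven = p["cost"] / p["benefit_per_year"]   # years until ROI = 0
    print(f"{name}: break-even after {breakeven:.2f} years")
```

The Wyse model breaks even about half a year later, but accrues larger absolute savings in every subsequent year.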

                                     Breakeven Analysis on ROI

        [Line chart: ROI (y-axis, -0.8 to 1) vs. years after deployment (x-axis, 0 to 6),
        crossing zero during the third year.]

                     Figure 8 Client Break-even Analysis on ROI (Product 2)





Based on these estimates, the return on investment for PHS reaches break-even around the
middle of the third year after installation of the system. This calculation is based on rough
estimates provided by the client; the actual return on investment may vary with the level of
system usage and savings. There is also a major intangible benefit not considered here:
students spend less time waiting on the system because of the new thin client terminals,
leading to higher user satisfaction and a better learning experience.
Thus, the business case analysis above shows that the proposed project is feasible and makes
sound financial sense, having been shown to be economically viable and technically feasible.






5. Conclusion and Recommendation
This section summarizes the results of the COTS assessment for the PHS network and gives
recommendations on the selected COTS products.

 5.1 Conclusion and Recommendation Part 1 [Server]
The following are the conclusions derived from the COTS assessment and corresponding
recommendations for the server COTS product:


C1. The Tangent Pillar™ 2750s server satisfies all required performance levels under the
    current size and infrastructure of the network. The network bandwidth currently available
    is sufficient to support all user activities, but may not be sufficient to support more
    users or new applications.
R1. The client may need to upgrade the network bandwidth if additional new applications need
    to be run regularly or additional users must be supported.
C2. The cost of the Tangent Pillar™ 2750s server is within the client's budget, including the
    server purchase, necessary upgrades, and the annual maintenance cost.
R2. Currently, no hardware or software upgrade is necessary to provide all the required
    functionality. However, if an upgrade becomes necessary in the future, focus mainly on
    memory and storage upgrades to boost performance.
C3. The Tangent Pillar™ 2750s server is compatible with all the software required by the
    client. The only software that did not pass the compatibility test was Choices, because
    its license has expired. However, based on the software vendor's website, the Choices
    application should be compatible with the Tangent Pillar™ 2750s server specification.
R3. The client should renew the Choices license to utilize all the applications available on
    the server.
C4. The Tangent Pillar™ 2750s server is able to communicate with all the required hardware
    components on the PHS network, and to pass data to and from the thin client terminals.
    One limitation of the thin client network is that it requires a persistent connection
    between the server and clients at all times.
R4. The client should make sure that all the middleware and network components, such as
    routers, switches, and cables, are working properly so as to maintain the connection
    between the server and clients.
C5. The COTS vendor Tangent, which supplied the Tangent Pillar™ 2750s server, satisfied most
    of the vendor support evaluation tests. In general, the COTS vendor provided decent
    support to the client and to the developers during the assessment period, although for
    some technical questions the vendor did not give the developers enough feedback. Since
    the client has developed a good relationship with the COTS vendor over the past 4 years,
    this should not be a concern for future support. The only potential problem for the
    client is support for third-party software applications.



R5. Since the COTS vendor does not support the third-party software applications required by
    the client, the client should contact the individual software providers for support in the
    future.
C6. Security is usually not a problem on a thin client network. The COTS vendor has also
    confirmed that the server supports the full security capabilities provided by the Active
    Directory and Group Policy services in the Windows operating system.
R6. The system administrator (librarian) should determine user access rights properly, and
    the system maintainer (Tangent) should be responsible for creating user profiles
    according to the access rights determined by the system administrator.
C7. The Tangent Pillar™ 2750s server satisfies all the flexibility capabilities required by
    the client. It is very flexible in adopting feature upgrades as well as in supporting
    older hardware components from the same manufacturer.
R7. The client is free to choose her future upgrade plan when new applications or additional
    users are required.







 5.2 Conclusion and Recommendation Part 2 [Client]
The following are the conclusions derived from the COTS assessment and corresponding
recommendations for the client COTS products:


C1. The initial cost of the Tangent WebDT 166 is far less than that of the Wyse Winterm V90,
    as its per-unit cost is only half that of the Wyse model. However, the Wyse model
    provides better performance as well as room for hardware upgrades in the future.
R1. The client should consider implementing phase 2 of the prototype, purchasing new thin
    client terminals, when new applications or additional users are required that would
    exceed the workload the server can handle. If the budget allows, the Wyse Winterm V90
    model is highly recommended, as it offers much higher performance.
C2. The current COTS vendor (Tangent) provides a decent level of support for both the Tangent
    and Wyse models. However, hardware support for the Wyse model may be somewhat limited,
    since Tangent is not the direct distributor of Wyse.
R2. The client is currently using an older model of Wyse thin client terminal, and the COTS
    vendor has supported the PHS thin client network for the past 4 years. Therefore,
    possibly limited support for the Wyse model in the future should not be a big concern;
    the client should still choose the Wyse model if the budget allows.







Glossary
      1. Active Directory Services:
          Active Directory (codename Cascade) is an implementation of LDAP directory
          services by Microsoft for use in Windows environments. Active Directory allows
          administrators to assign enterprise wide policies, deploy programs to many
          computers, and apply critical updates to an entire organization. An Active Directory
          stores information and settings relating to an organization in a central, organized,
          accessible database. Active Directory networks can vary from a small installation
          with a few hundred objects, to a large installation with millions of objects.
      2. Black-box Testing:
          A software testing technique whereby the internal workings of the item being
          tested are not known by the tester.
      3. Citrix MetaFrame:
          Citrix Presentation Server (formerly Citrix MetaFrame) is a remote access/application
          publishing product built on the Independent Computing Architecture (ICA), Citrix
          Systems' thin client protocol. The Microsoft Remote Desktop Protocol, part of
          Microsoft's Terminal Services, is based on Citrix technology and was licensed from
          Citrix in 1997. Unlike traditional framebuffered protocols like VNC, ICA transmits
          high-level window display information, much like the X11 protocol, as opposed to
          purely graphical information.
      4. COTS (Commercial Off-The-Shelf):
          COTS software is defined as a software system that has been built as a
          composition of many other COTS software components (Vigder, 1998). Here the
          developer of the software acts as the integrator, purchasing the components from
          third-party vendors and assembling them to build the final product.
      5. COCOTS:
          COCOTS is a cost estimation tool designed to capture explicitly the most important
          costs associated with COTS component integration. COCOTS is actually an amalgam
          of four related sub-models, each addressing individually what the authors have
          identified as the four primary sources of COTS software integration costs.
      6. Gantt Chart:
          A Gantt chart is a popular type of bar chart that shows the timing of tasks or
          activities as they occur over time. Although the Gantt chart did not initially
          indicate the relationships between activities, this has become more common in
          current usage, as both timing and interdependencies between tasks can be
          identified.
      7. ISI:





          Part of the University of Southern California (USC), ISI is involved in a broad
          spectrum of information processing research and in the development of advanced
          computer and communication technologies.
      8. PHS: Pasadena High School
      9. ROI (Return on Investment):
          A measure of a corporation's profitability, equal to a fiscal year's income
          divided by common stock and preferred stock equity plus long-term debt. ROI
          measures how effectively the firm uses its capital to generate profit; the
          higher the ROI, the better.
      10. SMP:
          Symmetric Multiprocessing, or SMP, is a multiprocessor computer architecture
          where two or more identical processors are connected to a single shared main
          memory. Most common multiprocessor systems today use SMP architecture.
          SMP systems allow any processor to work on any task no matter where the data
          for that task is located in memory; with proper operating system support, SMP
          systems can easily move tasks between processors to balance the work load
          efficiently.
      11. TC-95: Authentication server at PHS, maintained by Tangent Computers
      12. TC-96: Database server at PHS, maintained by Tangent Computers
      13. Thin client:
          A thin client is a computer (client) in client-server architecture networks which
          has little or no application logic, so it has to depend primarily on the central server
          for processing activities. The word "thin" refers to the small boot image which
          such clients typically require - perhaps no more than required to connect to a
          network and start up a dedicated web browser.
      14. White-box Testing:
          A software testing technique whereby explicit knowledge of the internal workings
          of the item being tested is used to select the test data.



