

      Guideline Document for:
    Agile System Implementation
 Department of Human Services
 Office of Systems and Technology
 Agile System Implementation Guidelines




 SECTION 1: PLANNING AGILE PROJECTS

 1.1  Establish and Utilize a Single Product Backlog

 1.2  Recommended Unique Requirement Priorities

 1.3  Size Estimation

 1.4  Function Point Estimate

 1.5  System Roadmap: Initial Plan of Work


 SECTION 2: SYSTEM MANAGEMENT AND ENGINEERING APPROACH

 2.1  Defined Project Engineering Strategy

 2.2  Scrum Management Approach
   2.2.1  Planning and Executing the Four Week Engineering Sprints
     2.2.1.1  Sprint Planning Part 1
     2.2.1.2  Sprint Planning Part 2
       2.2.1.2.1  FEATURE TEAM SPRINT BACKLOGS & BURN-DOWN CHARTS
     2.2.1.3  Sprint Charters
     2.2.1.4  Daily Stand-up Scrum Meetings
     2.2.1.5  Sprint Review Meetings
     2.2.1.6  Sprint User Acceptance
     2.2.1.7  Retrospectives
       2.2.1.7.1  FEATURE TEAM SPRINT RETROSPECTIVES
       2.2.1.7.2  SOLUTION RETROSPECTIVES
     2.2.1.8  Pre-Sprint Preparation
   2.2.2  Graphical Depiction of Planned Sprint Cycle

 2.3  Sprint 1: Project Startup Activities

 2.4  Solution Engineering Approach
   2.4.1  Collective Ownership of the Solution
   2.4.2  Sprint Execution User Validation
   2.4.3  Training Material Production
   2.4.4  Continuous Integration
     2.4.4.1  Software Builds: Code Compilation
       2.4.4.1.1  SOFTWARE VERSION CONTROL
       2.4.4.1.2  DEDICATED BUILD MACHINE/SERVER & CONTINUOUS INTEGRATION SERVER
       2.4.4.1.3  AUTOMATED BUILDS
       2.4.4.1.4  PRIVATE DEVELOPER BUILDS
       2.4.4.1.5  FAST/QUICK BUILD CYCLES
     2.4.4.2  Continuous Database Integration
       2.4.4.2.1  VERSION CONTROLLED DATA DEFINITION LANGUAGE (DDL) SCRIPTS
       2.4.4.2.2  AUTOMATED DATABASE INTEGRATION
       2.4.4.2.3  DATABASE ADMINISTRATION AND TUNING
     2.4.4.3  Test Driven Development & Continuous Testing
AR DHS Agile System Implementation Guidelines v0_13.docx                                                                                                   Page 2



       2.4.4.3.1  AUTOMATED UNIT TESTING
       2.4.4.3.2  MAINTAINING ROBUST UNIT TESTS
       2.4.4.3.3  CREATION AND MAINTENANCE OF TEST DATA
       2.4.4.3.4  UNIT TESTING FOR IDENTIFIED DEFECTS
       2.4.4.3.5  AUTOMATED DATA CONVERSION TESTING
     2.4.4.4  Code Inspection and Review
       2.4.4.4.1  AUTOMATED CODE VERIFICATION: CODING STANDARDS
       2.4.4.4.2  MANAGING CODE COVERAGE
       2.4.4.4.3  ELIMINATION OF DUPLICATE CODE
     2.4.4.5  Documentation Compilation
       2.4.4.5.1  AUTOMATED API COMPILATION: CODE SPECIFICATIONS
       2.4.4.5.2  AUTOMATIC CREATION OF A SOLUTION DATA DICTIONARY
       2.4.4.5.3  PRODUCT DASHBOARD PUBLISHING
   2.4.5  Periodic Automated Environmental Refresh
     2.4.5.1  Automated Environmental Refresh Approach and Strategy
     2.4.5.2  Temporary Environmental Archive
     2.4.5.3  Reestablish Base System Configuration
     2.4.5.4  Code Migration
     2.4.5.5  Structural Database Modifications
     2.4.5.6  Configuration Data Load
     2.4.5.7  Loading of Test Data
     2.4.5.8  Shake-down Testing
   2.4.6  Integration Test and Regression Test
     2.4.6.1  Verification
   2.4.7  Pair Programming
   2.4.8  Design Approach
     2.4.8.1  Configuration vs. Customization
   2.4.9  Database Customization
   2.4.10  Solution Refactoring
     2.4.10.1  Load and Performance Testing

 2.5  Executing a Production Release Sprint
   2.5.1  Release Planning
     2.5.1.1  Release Sprint Staffing
   2.5.2  Planning End-User Training
   2.5.3  Stakeholder Involvement and Communication
   2.5.4  Technical Implementation
   2.5.5  Load and Performance Testing
   2.5.6  Formal User Revalidation & Acceptance
   2.5.7  Execution of Data Conversion in Production


 SECTION 3: SUPPORT PROCESSES

 3.1  Continuous Process Improvement

 3.2  Project Configuration Management

 3.3  Risk and Issues Management
   3.3.1  Contingency Planning




 3.4  Assumption and Constraint Management

 3.5  Performance Metrics
   3.5.1  Scope & Schedule Management
     3.5.1.1  Function Point Earn Rate
     3.5.1.2  Hours per Function Point
   3.5.2  Team Performance Metrics
     3.5.2.1  Planned vs. Actual Feature Team Velocity by Iteration
     3.5.2.2  Actual Feature Team Member Velocity by Iteration
     3.5.2.3  Remaining Product Backlog
   3.5.3  Quality Metrics
     3.5.3.1  Iteration Unit Test Density
     3.5.3.2  Solution Unit Test Density
     3.5.3.3  UAT Defect Density
     3.5.3.4  Failed Builds
     3.5.3.5  Failed Environmental Refreshes

 3.6  End User Training

 3.7  Stakeholder Involvement and Communications Management


 SECTION 4: PERSONNEL

 4.1  Agile Project Organization
   4.1.1  Subject Matter Experts Involvement
   4.1.2  Project Owner
   4.1.3  Area Product Owners
   4.1.4  Master Scrum Master
   4.1.5  Scrum Masters

 4.2  Feature Teams
   4.2.1  Feature Team Ramp-up


 SECTION 5: PROJECT FACILITIES AND RESOURCES








 Section 1: Planning Agile Projects
 Planning an Agile software development project is similar in many ways to planning projects that use
 alternative system implementation lifecycles. The main difference is that Agile projects treat initial plans
 as estimates of the work to be accomplished, not final commitments that fully describe every activity,
 the exact effort each activity will take, and the exact dates on which it will happen. In other words,
 Agile approaches are more flexible, and that flexibility begins during the planning process. Agile software
 development approaches also focus on addressing the most important and most challenging components of the
 project as early as possible to reduce risk; this is accomplished through the use of unique priorities for
 each story/requirement.

 1.1 Establish and Utilize a Single Product Backlog
 The most important asset that is developed and regularly evolved on an Agile project is the Product
 Backlog. The backlog, as the name suggests, is the list of work that must be completed to deliver the
 intended system and functionality to end users. Each project should have one, and only one, product
 backlog file. Project teams should use the established backlog template,
 DHS_TP001_Requirements_Product_Backlog (see
 https://ardhs.sharepointsite.net/guidelines/Shared%20Documents/TP001_Requirements_Product_Backlo
 g.xlsx), as the starting point for their specific project/product backlog.
 The assigned Product Owner should meet with business users to establish the initial version of the
 project/product backlog and an overall vision for the solution. Note that the backlog will change and grow
 with the project as additional tasks are identified and added; the intent of the initial backlog is to
 establish a good enough understanding to estimate the amount of time and resources needed to complete the
 project successfully.
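As a sketch of the structure described above, a backlog entry can be modeled as a small record type; the field names and story identifiers below are illustrative assumptions, not the columns of the DHS template.

```python
from dataclasses import dataclass

@dataclass
class BacklogItem:
    """One row of the product backlog (field names are illustrative)."""
    story_id: str      # identifier for the requirement/story
    description: str   # short statement of the needed functionality
    priority: int      # unique priority across the whole backlog (see section 1.2)
    size: int          # story-point size estimate (see section 1.3)

# One, and only one, product backlog per project.
product_backlog = [
    BacklogItem("US-002", "Search existing cases", priority=20, size=5),
    BacklogItem("US-001", "Register a new client", priority=10, size=8),
]

# Work is always drawn from the backlog in priority order.
product_backlog.sort(key=lambda item: item.priority)
print([item.story_id for item in product_backlog])  # -> ['US-001', 'US-002']
```

Keeping the backlog as a single ordered collection, rather than several per-team lists, is what makes the "one and only one backlog" rule enforceable.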

 1.2 Recommended Unique Requirement Priorities
 Teams must indicate the unique priority/importance of each requirement in the backlog; no two
 requirements may share the same priority number. Teams should determine what scale will be used
 for priorities (e.g. from 1 to N, where N is ten times the total number of requirements in the Product
 Backlog) for each and every requirement included in the Product Backlog. In the example provided,
 priorities would be assigned in increments of 10, which readily enables the team to insert new
 requirements between existing ones as the project progresses. If the team believes that alternative
 approaches would be more effective (based on past experience), they are free to adopt those practices
 once the project is underway and opportunities for improvement are identified during the Sprint
 retrospective meetings.
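The increment-of-10 scheme above can be sketched as follows; the requirement names are hypothetical and stand in for real backlog entries.

```python
# Unique priorities assigned in increments of 10, leaving gaps so that new
# requirements can be slotted in later without renumbering the whole backlog.
priorities = {
    "Verify client eligibility": 10,
    "Record case notes": 20,
    "Generate monthly report": 30,
}

# A newly identified requirement lands between the first two existing items.
priorities["Flag duplicate applications"] = 15

ordered = sorted(priorities, key=priorities.get)
print(ordered[1])  # -> Flag duplicate applications
```

Because every priority stays unique, sorting the backlog always produces one unambiguous order of work.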
 The team's initial priorities serve as the starting point for the Product Owner to establish overall
 priorities for the solution during project startup (see section 4.1.2 Project Owner for information about
 the Product Owner's role). The assigned priorities are also expected to heavily influence the proposed
 development roadmap defined in section 1.5 System Roadmap: Initial Plan of Work and the 'themes' of
 each resulting Sprint.

 1.3 Size Estimation
 During the initial planning stages, the team is also required to assign each requirement a size estimate.
 DHS uses a generic size unit, such as "Story Points", based on the Fibonacci sequence included in the
 table below and in the product backlog template. Teams should establish definitions or guidance for each
 numeric size that will be used by the Feature Teams on the project, to establish a common understanding
 of size expectations and promote more uniform estimation across teams (on multi-team projects). The
 definitions must include information from all of the software engineering disciplines relevant to the
 project, including data conversion, interface, database, user interface, and architectural elements, and
 any other relevant components/competencies.



 Note that DHS does not intend to control projects by means of the generic size estimates described
 above; size estimates are used for planning Sprints, specifically for judging how much scope can be
 addressed within each Sprint, and are NOT considered commitments.
 Teams should also define which sizing metric would be most beneficial on a given project: Ideal Days,
 an arbitrary generic size unit, or perhaps even Function Points directly. Whatever the choice, the
 approach should be a simple mechanism that all project staff can use to quickly and effectively track
 software engineering tasks.

 Size Component            Project Definition and Guidance About Size

 0 – Zero

 1 – Very Tiny

 2 – Tiny

 3 – Almost Tiny

 5 – Very Small

 8 – Small

 13 – Medium

 21 – Large

 34 – Very Large

 55 – Huge

 89 – Massive

 144 – Need more          <Note that DHS expects Requirements/Stories of this size to be broken down into
 information              smaller units of work before being allocated to a particular Sprint.>
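As a sketch, a team could enforce the scale above when recording estimates; per the table, anything at 144 or beyond must be decomposed before Sprint allocation. The function name is illustrative.

```python
# The story-point values permitted by the table above.
ALLOWED_SIZES = {0, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89}

def validate_size(points: int) -> int:
    """Accept only estimates that appear on the project's size scale.

    Estimates of 144 or more signal that the story must be broken down
    into smaller units of work before it can be allocated to a Sprint.
    """
    if points not in ALLOWED_SIZES:
        raise ValueError(f"{points} is off the scale; split the story first")
    return points

print(validate_size(13))       # -> 13
try:
    validate_size(144)
except ValueError as err:
    print(f"rejected: {err}")
```

A check like this is easy to bolt onto whatever spreadsheet export or tooling the team uses to keep estimates on the agreed scale.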







 1.4 Function Point Estimate
 Before being approved by DHS, all candidate Agile system implementation projects are required to
 estimate the size of the project in Function Points. The Function Point estimate does not have to be tied
 to individual requirements in the initial Product Backlog, though it is expected that the planning team will
 use the requirements during estimation. Because many of DHS' projects are required to track and cost-
 allocate funds across various Federally funded programs, teams may be required to provide Function
 Point estimates for sets of requirements for the included program areas (e.g. Medicaid and SNAP). DHS
 will use the total supplied Function Point estimate as an obligation/commitment governing the amount of
 functionality delivered on the project.



        Overall Function Point Estimate for Solution:

        Area 1 Function Point Estimate:

        Area 2 Function Point Estimate:
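Where per-program-area estimates are required, the form above can be filled in as follows; the area names and figures are placeholders, and for this sketch the overall figure is treated simply as the sum of the areas.

```python
# Hypothetical per-program-area Function Point estimates.
area_estimates = {"Area 1": 1200, "Area 2": 800}

# For this sketch the solution-level figure is the sum of the areas.
overall = sum(area_estimates.values())
print(f"Overall Function Point Estimate for Solution: {overall}")
# -> Overall Function Point Estimate for Solution: 2000
```

Keeping the area-level figures alongside the total makes the later cost allocation across Federally funded programs straightforward.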



 1.5 System Roadmap: Initial Plan of Work
 During project planning, project teams should create a roadmap that includes a high-level list of the
 activities expected to be completed during the project. Project teams should use DHS' standard
 roadmap template, TP001_Agile_Project_Roadmap_Template.pptx (see
 https://ardhs.sharepointsite.net/guidelines/Shared%20Documents/TP002_Agile_Project_Roadmap_Temp
 late.pptx), when constructing their roadmap. The roadmap should include the initially identified/desired
 software releases, as well as the intended theme of each Sprint in the project.
 Sprint themes on the roadmap will likely correlate with the eventual Sprint goals and identify the
 prioritized features aligned with each Sprint in a way that addresses both business priorities and the
 greatest risk early in the project.








 Section 2: System Management and Engineering
 Approach
 DHS' preferred approach to new system development projects is an Agile approach based on Scrum and
 Extreme Programming practices. With this approach and industry-standard practices, teams are able to
 iteratively evolve software solutions while addressing the highest-risk components early (see the
 following diagram for an illustration of DHS' approach). Project teams are free to explore alternative Agile
 approaches (e.g. Crystal Clear and Unified Process) and to enhance the Agile approach shown below;
 however, project teams must clearly demonstrate the justification for, and benefit of, any alternative
 approach before it is approved for use.







 2.1 Defined Project Engineering Strategy
 Before starting a system development project, project teams should define a project engineering strategy
 and describe how the strategy will be used to deliver the intended solution and meet the stated goals and
 needs of the project. A key component of the defined engineering strategy should be definitions of
 'done' at three (3) different levels:
 - The definition of done at the solution level, used to determine when the system is finished and
   meets DHS' intended need(s);
 - The definition of done at the Sprint level, indicating when a Sprint goal is achieved (see section
   2.2.1 Planning and Executing the Four Week Engineering Sprints);
 - The definition of done for the various coding-related activities, such as development, data
   conversion, training material creation, and interface development.
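The three levels above might be captured as a simple checklist structure that the team reviews at each boundary; the criteria shown are illustrative examples, not DHS-mandated checklists.

```python
# Illustrative three-level definition of done (criteria are examples only).
definition_of_done = {
    "solution": ["intended DHS need(s) met", "all planned releases accepted"],
    "sprint": ["sprint goal achieved", "sprint review and user acceptance passed"],
    "activity": {
        "development": ["unit tests pass", "code reviewed"],
        "data conversion": ["converted records verified against source"],
        "training material": ["reviewed for the planned release"],
        "interface development": ["end-to-end exchange tested"],
    },
}

print(sorted(definition_of_done))  # -> ['activity', 'solution', 'sprint']
```

Writing the criteria down in one shared artifact keeps "done" from meaning different things to different Feature Teams.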

 2.2 Scrum Management Approach
 As a general practice, DHS understands and respects the Agile and Scrum principles of self-organizing
 teams and shared ownership of project processes and results. During project planning, the project team
 provides details of the project management and project control methods that it intends to use. The intent
 of these details is not to identify fixed processes and approaches for the project, but rather to identify
 practices that the project team believes are well suited to the project. In other words, the details are
 considered initial conventions for the first few iterations; once the project is underway, teams identify
 and implement process improvements throughout the project.

 2.2.1       Planning and Executing the Four Week Engineering Sprints
 When using Scrum, a good practice is to use four-week engineering Sprints/iterations. During project
 planning, the project team should define the initial Sprint duration and configuration that will be used to
 manage the development activities. This guideline document provides an initial standard and structure for
 how a Sprint cycle should be executed.
 As a general guideline, a good approach is one that efficiently and effectively integrates the system in a
 timely fashion. Project teams are encouraged to work collaboratively with DHS management and involved
 stakeholders throughout the project to establish a project approach that is best suited to meeting the
 specified needs of the project champion/sponsor.

 2.2.1.1     Sprint Planning Part 1
 In Part 1 of Sprint planning, the Project Owner (and Area Product Owners on large projects), executive
 management, and users meet with representatives from each of the Feature Teams and the Scrum
 Masters to establish a goal for the Sprint. These meetings are also used to identify what functionality will
 be built in the coming Sprint.
 The part 1 Sprint planning meeting lasts no more than 4 hours. During the meeting, business priorities
 are discussed and compared against the priorities in the Product Backlog. If necessary, the backlog
 priorities may be changed to reflect new information or changes in business priorities. As the top
 priorities are selected for inclusion in the scope of the Feature Teams' Sprints, careful attention is paid to
 the size of the work taken on by each team in light of that team's velocity (for more information about
 Feature Team velocity, see section 2.2.1.3 Sprint Charters).
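The capacity check described above, taking top-priority items until a team's velocity is exhausted, can be sketched as follows; the story identifiers and sizes are hypothetical, and this sketch skips over items that do not fit rather than stopping at the first one.

```python
def select_sprint_scope(backlog, velocity):
    """Take items from a priority-ordered backlog until the team's
    velocity (in story points) would be exceeded.

    `backlog` is a list of (story, size) pairs, already sorted by priority.
    """
    scope, remaining = [], velocity
    for story, size in backlog:
        if size <= remaining:       # item fits in the remaining capacity
            scope.append(story)
            remaining -= size
    return scope

# Hypothetical backlog, already in priority order.
backlog = [("US-101", 8), ("US-102", 13), ("US-103", 5), ("US-104", 21)]
print(select_sprint_scope(backlog, velocity=20))  # -> ['US-101', 'US-103']
```

In practice the Feature Team representatives make this call themselves, since they alone are authorized to assign work to their team; the arithmetic only frames the conversation.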
 It is important to note that, for large projects with multiple Feature Teams, the representatives from each
 Feature Team are fully authorized and responsible for the assignment of work to their associated team.
 At the end of part 1 planning, the Feature Team is fully committed to delivering the work it has agreed to
 take on. Additionally, as a good practice, DHS uses an approach that openly allows all Feature Team
 members the opportunity to participate as the team representative in the part 1 planning meeting over
 time; establishing a rotating schedule of team representatives is a good way to accomplish this.
 During project planning, the project team defines the accepted convention for how Feature Team
 members are identified to attend the part 1 planning meeting and how the teams will maintain shared
 commitment for the selected scope. Other conventions about how and when the part 1 planning meeting
 is conducted (such as whether it occurs at the beginning or end of the Sprint cycle) should also be noted.
 Teams should also consider how the part 1 planning meeting can be structured and managed to prevent
 overrunning the allocated 4-hour duration.

 2.2.1.2     Sprint Planning Part 2
 The second part of Sprint planning involves each individual Feature Team meeting to discuss the work it
 has committed to complete and how it intends to build that functionality into a product increment during
 the Sprint. For large multi-team projects, Area Product Owners should be available to assist the Feature
 Teams in the detailed planning of their Sprint activities; however, an Area Product Owner may be
 required to split their time across multiple Feature Teams and hence may not be available to a specific
 Feature Team for the full duration of the part 2 planning meeting. As with the first Sprint planning
 meeting, the second planning meeting lasts no more than four hours.
 During project planning, the team decides how it will conduct the second part of the Sprint planning
 meeting, while recognizing that self-organizing teams may alter those practices at a future time.
 2.2.1.2.1 FEATURE TEAM SPRINT BACKLOGS & BURN-DOWN CHARTS
 The main output of the part 2 planning meeting is the Sprint Backlog, which includes the tasks the
 Feature Team will complete in building the required features, estimates of the amount of time it will take
 to realize those features (preferably in Ideal Days), and a Sprint burn-down chart template. The specific
 media used for the Sprint backlog items is chosen by each self-organizing Feature Team. Teams are
 encouraged to use physical Sprint backlog media to track progress unless other specific practices are
 justified.
 During project planning, the project team should define how the burn-down chart will be updated each
 day and how records of daily progress will be kept (e.g. pictures of the team backlog are taken and
 electronically filed). Additionally, the project team should define how tasks are identified and estimated
 and how unexpected complexity is to be handled.
 As a general guideline, each requirement should be broken down into smaller tasks during the part 2
 planning meeting; during this decomposition, each task should be assigned an Ideal Day estimate, which
 is maintained by the Feature Team throughout the Sprint.
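To make the burn-down mechanics concrete, the following sketch (illustrative only; the task names and Ideal Day figures are invented) totals the remaining Ideal Days recorded each day, which is exactly the series a Sprint burn-down chart plots:

```python
# Sketch: computing Sprint burn-down data from daily task estimates.
# The Sprint Backlog holds each task's remaining Ideal Days, re-estimated daily.

def burn_down(daily_remaining):
    """Given a list of per-day task estimates (Ideal Days remaining),
    return the total remaining work recorded for each day."""
    return [sum(day.values()) for day in daily_remaining]

# Hypothetical 4-day excerpt of a Sprint Backlog (task -> Ideal Days left).
sprint_backlog_history = [
    {"build login form": 3.0, "validate SSN field": 2.0, "write unit tests": 1.0},
    {"build login form": 2.0, "validate SSN field": 2.0, "write unit tests": 1.0},
    {"build login form": 1.0, "validate SSN field": 0.5, "write unit tests": 1.0},
    {"build login form": 0.0, "validate SSN field": 0.0, "write unit tests": 0.5},
]

remaining_by_day = burn_down(sprint_backlog_history)
print(remaining_by_day)  # [6.0, 5.0, 2.5, 0.5]
```

Plotting these daily totals against the Sprint calendar produces the descending line the team reviews at each daily Scrum.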

 2.2.1.3     Sprint Charters
 One of the main outputs of both of the Sprint planning meetings is a one-page charter that indicates the
 goal of the Sprint, its start and end dates, the time and location of the project's daily Scrum meeting, and
 the dates, times, and locations of the Sprint Review meeting and the solution retrospective meeting. For
 large, multi-team projects, the solution Sprint Charter is made available to all stakeholders interested in
 the project. Additionally, the Feature Teams create auxiliary charters that include relevant details about
 the scope of work they have selected along with their team's plans for the Sprint.
 As a general guideline, the Feature Team representatives bring a partially completed team charter to the
 part 1 planning meeting. This partially completed charter includes the team’s targeted velocity (in generic
 size count) and the Feature Team’s expected availability over the Sprint (e.g. accounting for known
 vacations and holidays). While the content of the charters may evolve over time, project teams are
 required to use the established template, TP003_Sprint_Charters_Template (see
 https://ardhs.sharepointsite.net/guidelines/Shared%20Documents/TP003_Sprint_Charters_Template.doc
 x), at the beginning of the project.

 2.2.1.4       Daily Stand-up Scrum Meetings
 For large multi-team projects, Daily Scrum meetings are conducted at two levels each day; at the Feature
 Team level for each team and at the Enterprise level where representatives from each Feature Team
 attend with the Product Owner and Area Product Owners; small single team projects only require one
 Daily Scrum meeting. Regardless of which level the daily Scrum meeting is for, it never lasts more than
 15 minutes, requires that all team members attend, stand, and provide answers to the following 3
 questions:
           •      What was completed since the last daily Scrum Meeting?
           •      What will be finished by the next daily Stand-up Meeting?
           •      What is preventing work from being completed?
 During project planning, the project team defines how it will keep the daily meetings under the 15-minute
 limit, how it will handle participation (as opposed to attendance) by uninvited, non-Feature Team
 observers, and how it will schedule the Feature Team level meetings to facilitate a consolidated and
 accurate solution-level meeting.

 2.2.1.5       Sprint Review Meetings
 Simply put, the Sprint review meeting is a four-hour demonstration of the new products and features that
 were built during the latest four-week Sprint. During the meeting, representatives from the Feature
 Team(s) present the results of their work to the Product Owner, Area Product Owners, management,
 users, and other stakeholders. The big-picture intent of the review meetings is to validate the solution's
 progress against DHS needs and priorities and to mitigate project risk.
 Presenters at the review meeting should do as little preparation as possible; the meeting is not intended
 to be a polished presentation requiring additional project resources to prepare, but rather a functional
 view of the solution and its progress. During project planning, the project team should define how the
 meeting will be organized, how Feature Teams might identify presenters, and how feedback will be
 collected.
 As a general guideline, DHS uses the review meetings as an involvement and communication forum for
 many of the project stakeholders. During project planning, project teams should define how DHS can
 involve the broadest audience possible (video conference, webinar, etc.) while completing the meeting
 within the four-hour limit.

 2.2.1.6       Sprint User Acceptance
 Traditional User Acceptance Testing (UAT) is not readily possible when using Agile project approaches.
 As a result, DHS has adopted a three-pronged approach to addressing the need for formal acceptance of
 the developed solution:
       •       Sprint Execution User Validation – see section 2.4.2 Sprint Execution User Validation.
       •       Sprint Review User Acceptance – (this section) at the end of each Sprint, during the Sprint
               Review Meeting, DHS reviews and accepts the developed components presented in the
               review.
       •       Release Sprint User Revalidation – while executing a Release Sprint (see section 2.5.6
               Formal User Revalidation & Acceptance requirements), DHS requires that users revalidate
               the functionality being released.
 As a general guideline, DHS provides acceptance of the developed features and components
 demonstrated in the Sprint Review Meetings. All feedback that requires a system change is incorporated
 into the Product Backlog for future Sprints. During project planning, the project team should define how
 features will be demonstrated, how feedback will be collected and how acceptance will be documented.

 2.2.1.7     Retrospectives
 Retrospection is a key activity that drives continuous improvement during the project. During project
 planning, the project team should provide teams with a brief summary of the techniques they will use
 during retrospective meetings. Additionally, project teams should define when retrospective meetings are
 conducted within the Sprint cycle. A recommended guideline is to conduct retrospective meetings at the
 end of a Sprint, while any opportunities for improvement are still fresh in the teams' minds. Note that for
 large, multi-Feature-Team projects, retrospective meetings are conducted at two levels.
 2.2.1.7.1 FEATURE TEAM SPRINT RETROSPECTIVES
 At the end of each iteration, each Feature Team must conduct a retrospective meeting to inspect their
 processes, identify potential improvements, assess any limitations and challenges faced by the team, and
 determine solutions that potentially eliminate the challenges. During project planning, the project team
 should define how the Feature Teams will conduct retrospectives, when they should be conducted and
 any particular approaches that might be helpful (e.g. Kaizen, 5-Whys).
 2.2.1.7.2 SOLUTION RETROSPECTIVES
 In addition to the Feature Team retrospectives, the DHS Project Owner or Master Scrum Master (for very
 large projects) conducts a solution-level retrospective focused on improving operations and efficiencies
 at the overall project level. A key input to the solution-level retrospective is the set of challenges and
 ideas raised in the Feature Team retrospectives. The solution retrospective meetings are also conducted
 after the end of each Sprint and will be led by the Product Owner or designee and attended by the Area
 Product Owners, DHS management, and other interested stakeholders.
 During project planning, the project team must provide input and recommendations about how the
 solution retrospective meetings should be scheduled to enable input from the Feature Team
 retrospectives as well as more general recommendations about how DHS can implement enterprise
 improvement activities.

 2.2.1.8     Pre-Sprint Preparation
 As a regular, ongoing activity before each Sprint, in anticipation of the Sprint planning meetings, the
 DHS Project Owner assesses and updates the Product Backlog to reflect current priorities, recent
 legislative changes, or other external influences. These updates also include closing out requirements
 that were addressed in the previous Sprint and updating the Product Burn-down Chart.

 2.2.2       Graphical Depiction of Planned Sprint Cycle
 Within the Defined Project Strategy, project teams should create a graphical depiction of the Sprint
 cycle/timeline that will be used to meet DHS’ requirements. This graphic depiction should show how
 activities conducted within each Sprint are laid out; it is not considered to be a final, rigid plan for each
 Sprint, but rather a visual aid to help project team members to understand the timing of events within the
 Sprint.

 2.3 Sprint 1: Project Startup Activities
 As part of their project planning efforts, project teams should define the approach that will be used to
 complete the installation, configuration, and start-up activities that are required by the project, but not
 currently in place. These activities usually include:
     •       Establishment of the proposed technical environments
     •       Installation and base configuration of any necessary software packages
     •       Configuration of the development toolsets (e.g. IDE, testing tools; see section 2.4.4
             Continuous Integration)
     •       Establishment of the project performance metric utilities (see section 3.5 Performance Metrics)
     •       Executing Sprint Planning (parts 1 and 2) for Sprint 2 (see section 2.2 Scrum Management Approach)
     •       Establishing project facilities and workspaces (if required)
 The plan for the project startup activities within Sprint 1 should include a Sprint Charter (and Solution
 Charter for large projects) and a Sprint Backlog. These materials accelerate the execution of the first
 Sprint. The project team should also provide details about the tasks that must be executed, any known
 constraints, dependencies between tasks, and the level of resourcing required to complete the work.
 For all identified startup tasks, the project team should insert requirements into the Product Backlog at
 the end of the list and assign unique priorities that effectively convey that these tasks must be completed
 in the first Sprint. For example, the first Sprint may be used to elicit User Stories/Requirements from
 business users for a given set of functions; the elicitation of requirements in each area may be a suitable
 task for the Sprint backlog.
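The unique-priority convention can be sketched as follows; the backlog entries, task names, and priority numbering scheme are hypothetical:

```python
# Sketch: appending Sprint 1 startup tasks to the end of a Product Backlog
# while keeping every priority value unique, per the unique-priority convention.

def append_with_unique_priority(backlog, new_items):
    """Add items after the current lowest-priority entry, each with its own
    unique priority number (1 = highest priority)."""
    next_priority = max((priority for priority, _ in backlog), default=0) + 1
    for item in new_items:
        backlog.append((next_priority, item))
        next_priority += 1
    return backlog

# Hypothetical backlog entries for illustration.
product_backlog = [
    (1, "Elicit eligibility User Stories"),
    (2, "Elicit intake User Stories"),
]
startup_tasks = ["Establish technical environments",
                 "Install base software packages"]

append_with_unique_priority(product_backlog, startup_tasks)
print([priority for priority, _ in product_backlog])  # [1, 2, 3, 4]
```

Because every entry carries its own priority number, no two requirements can be "equally urgent", which is the point of the unique-priority rule.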

 2.4 Solution Engineering Approach
 In order to maximize efficiency, minimize risk, and implement the desired solution in the shortest possible
 time, DHS makes heavy use of Extreme Programming (XP) concepts. During project planning, project
 teams are encouraged to use additional or alternative techniques and methods that have proven effective
 for software engineering in the past.
 In Agile terms, the four week development iterations must produce a “potentially shippable product”. In
 general, “potentially shippable” is defined to mean a version of software products and any and all
 associated materials required to implement that software. It is important to note that the software may
 require additional components to function in the production environment (e.g. a user interface) but the unit
 component (e.g. database stored procedure) is of production quality.

 2.4.1       Collective Ownership of the Solution
 It is imperative that everyone on the project, from the most junior team member to the Project Owner and
 management, have a strong sense of collective ownership for the successful implementation of the
 desired solution within the defined schedule. This is reflected in the 'horizontal' nature of the organization
 chart shown in section 4.1 AGILE PROJECT ORGANIZATION. During project planning, the project team
 should decide how they will actively foster and maintain a culture of shared ownership across the team
 and with other project stakeholders.

 2.4.2       Sprint Execution User Validation
 DHS uses the concept of 'validation' to ensure that the system meets its intended needs, or 'goodness of
 fit'. Hence validation cannot be completed with automated testing, at least initially. Validation must
 involve subject matter experts (SMEs) who understand the nature of the intended solution. Through
 close collaboration, the Project Owner, Area Product Owners, SMEs, and other participants validate the
 evolving solution during each Sprint.
 Traditionally, validation is accomplished during User Acceptance Testing (UAT), which is not readily
 possible when using Agile concepts. As a result DHS has adopted a three pronged approach to
 addressing the need for formal acceptance of the developed solution:
     •       Sprint Execution User Validation – (this section). During the execution of the Sprint, the
             Project Owner and Area Product Owners regularly provide input, use and review the work of
             the Feature Teams.
     •       Sprint Review User Acceptance – see section 2.2.1.6 Sprint User Acceptance
     •       Release Sprint User Revalidation – see 2.5.6 Formal User Revalidation & Acceptance

 For an Agile approach to be successful, frequent and regular interaction must take place between the
 Feature Teams and Subject Matter Experts (SMEs). As a result of this interaction, the Feature Teams
 gain heightened insight into the required solution, which lowers the risk associated with delivering the
 right solution. Given the nature of the feedback and SME interactions, the Feature Teams use their
 judgment based on the scope of the feedback: for feedback resulting in small changes, the team
 assesses how to incorporate the change within the existing Sprint; for larger changes that cannot be
 accommodated in the Sprint, the Feature Team works with the Project Owner and Area Product Owner to
 communicate that an additional task (or tasks) needs to be added to the Product Backlog.
 During project planning, the project team should discuss how features will be demonstrated, how
 feedback will be collected, and how validation will be tracked to make sure that all new components
 receive user scrutiny.
 As a general guideline, the Project Owner and Area Product Owners regularly engage stakeholders
 external to the project to use features and components developed in the current and past Sprints, to
 validate that the solution fits its intended need.
 A best practice for user validation involves recording user interactions with the system in an automated
 manner that enables ‘playback’ of the user actions. During project planning, the project team should
 discuss how this information will be captured, managed throughout the project, and used (played back) to
 facilitate User Revalidation during Release Sprints (see 2.5.6 Formal User Revalidation & Acceptance for
 more details).

 2.4.3       Training Material Production
 Project teams should use engineering approaches that incorporate the development of training materials
 seamlessly into each Sprint. In other words, Feature Teams should be staffed with people who have
 skills necessary for the development of training materials. Project teams should also give thought to the
 type of training material(s) that will be developed and how those materials will be integrated into the
 software solution (e.g. on-screen help, training manuals, and video tutorials such as YouTube videos).

 2.4.4       Continuous Integration
 DHS mandates and verifies that all project teams make full use of Continuous Integration (CI) from the
 beginning of the project through the end of the project. During project planning, project teams should
 define an approach to implementing continuous integration practices on the project including the
 frequency and proposed times when integration cycles will be executed (e.g. 10 AM and 2 PM each
 business day; full regression each Saturday at 1 PM). The project team should also include a discussion
 of the tools (e.g. CI tool, build tool, version control and others) that are required to implement the
 proposed approach.
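As an illustration of such a schedule, the following sketch encodes the example cadence above (quick cycles at 10 AM and 2 PM on business days, a full regression on Saturday at 1 PM); the cycle names and times are examples from the text, not mandated values:

```python
# Sketch: choosing which continuous-integration cycle to run at a given time,
# mirroring the example schedule above (10 AM / 2 PM quick cycles on business
# days, full regression on Saturday at 1 PM).
from datetime import datetime

def ci_cycle_for(now):
    """Return the CI cycle to run at `now`, or None if nothing is scheduled."""
    if now.weekday() == 5 and now.hour == 13:          # Saturday 1 PM
        return "full-regression"
    if now.weekday() < 5 and now.hour in (10, 14):     # business days, 10 AM / 2 PM
        return "quick-integration"
    return None

print(ci_cycle_for(datetime(2011, 5, 2, 10, 0)))   # a Monday: quick-integration
print(ci_cycle_for(datetime(2011, 5, 7, 13, 0)))   # a Saturday: full-regression
```

In practice a CI server's own scheduler expresses this rule; the sketch only shows how the planned frequency can be stated precisely during project planning.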
 Where possible, DHS uses freely available, open-source tools for executing a continuous integration
 strategy. Whenever a project team identifies a software tool that is not licensed free of charge, the
 project team must clearly justify why the tool is needed and why a similarly featured, free-of-cost tool
 cannot be used, and must request the purchase of the tool through the DHS Office of Systems and
 Technology (OST).

 2.4.4.1     Software Builds: Code Compilation
 Ideally, system code should be compiled at least three times daily (twice during normal business hours
 and once overnight); as a general good practice, DHS encourages even more frequent compilations.
 Doing so enables the rapid detection of defects and issues in the code base, which can be resolved
 quickly because the events that triggered the corresponding error are still fresh in the project team's
 memory. During project planning, the project team should define their approach to code compilation,
 including any tools that will be used.



 2.4.4.1.1 SOFTWARE VERSION CONTROL
 During project planning, project teams should define the approach and toolset that will be used to
 maintain control of software versions. Because a single program/file may be modified by multiple
 people simultaneously (shared solution responsibility), it is important that the proposed tool be capable of
 reconciling multiple changes as the solution progresses. It is required that ALL code be checked in every
 day before the end of the day.
 2.4.4.1.2 DEDICATED BUILD MACHINE/SERVER & CONTINUOUS INTEGRATION
 SERVER
 During project planning, project teams should define how builds will be created, specifically focusing on
 hardware and network resources. As a general good practice, build cycles must be completed as rapidly
 as possible, and in most cases executed in a few minutes. Project teams should also determine the
 configuration of the build server and any specific requirements of such a machine that enable rapid build
 cycles.
 As a general guideline, a build server will, under normal circumstances, not require any unusual hardware
 or configurations; in most cases a developer workstation can satisfy the requirements of a build server.
 2.4.4.1.3 AUTOMATED BUILDS
 During project planning, project teams should establish their build cycle, the events that are triggered, the
 order of those events, dependencies, and exception handling.
 2.4.4.1.4 PRIVATE DEVELOPER BUILDS
 As a general guideline, the continuous integration process and, more specifically, the regular solution
 builds should be uneventful processes on the project. One effective means to accomplish this goal is to
 mandate that all developers conduct a local build with their modified code prior to checking their modified
 solutions into the common repository.
 During project planning, project teams define how their project approach addresses this requirement as
 well as what expectations apply to private developer builds.
 2.4.4.1.5 FAST/QUICK BUILD CYCLES
 Build cycles should execute rapidly in order to provide timely feedback. Project teams should describe
 the conventions that will be established on the project to balance the speed of the build cycle and
 associated feedback against the need to provide adequate coverage of tests during the cycle. Project
 teams should continually seek ways to refine the balance between useful inspection and rapid builds
 should the project team experience longer than acceptable build times.

 2.4.4.2     Continuous Database Integration
 As with the continuous integration of software code based products, DHS mandates that database
 technologies be continually integrated with software solutions. Continuous database integration enables
 project teams to establish a solid foundational domain model on which to build application layers. Doing
 so reduces the risk that defects spanning multiple layers of the system will take heightened effort to
 troubleshoot and resolve. Additionally, continuous database integration accelerates testing, as the
 application and associated data model are established in a well-defined and known state.
 2.4.4.2.1 VERSION CONTROLLED DATA DEFINITION LANGUAGE (DDL) SCRIPTS
 Any development resource should be able to create or modify database components within the system.
 The modification and definition of database components must be accomplished with DDL scripts that are
 included in the version control repository. Establishing this practice on the project enables the
 Continuous Integration process to wipe and re-establish the required database structures from scratch in
 a known stable state.
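The wipe-and-rebuild practice can be sketched with SQLite; the script contents and the file naming convention mentioned in the comments are hypothetical:

```python
# Sketch: rebuilding a database from scratch by replaying version-controlled
# DDL scripts in order, as a continuous-integration step might do.
import sqlite3

# In practice these statements would be read from files kept under version
# control, e.g. ddl/001_create_client.sql, ddl/002_create_case.sql.
DDL_SCRIPTS = [
    "CREATE TABLE client (client_id INTEGER PRIMARY KEY, name TEXT NOT NULL);",
    "CREATE TABLE case_record (case_id INTEGER PRIMARY KEY, "
    "client_id INTEGER REFERENCES client(client_id));",
]

def rebuild_database(connection, scripts):
    """Apply each DDL script in order against a fresh database."""
    for script in scripts:
        connection.executescript(script)
    connection.commit()

conn = sqlite3.connect(":memory:")  # a throwaway database in a known clean state
rebuild_database(conn, DDL_SCRIPTS)

tables = {row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")}
print(sorted(tables))  # ['case_record', 'client']
```

Because the scripts live in version control, any CI run can recreate the exact schema for a given revision of the solution.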
 2.4.4.2.2 AUTOMATED DATABASE INTEGRATION
 The Continuous Integration process used by the project team must have the capability to automatically
 create and refresh databases. Because database solutions are integral components of the overall
 solution, it is critical to establish practices that mitigate the risk of establishing a fragile database platform.
 2.4.4.2.3 DATABASE ADMINISTRATION AND TUNING
 DHS requires that DDL and DML scripts be created and managed along with the solution code. As a
 result, it is possible for DBA-type resources to periodically review and tune the database construction
 scripts, and this activity can be conducted with little or no impact on the Feature Teams who are
 executing work on the solution.
 During project planning, project teams should define how they will plan for and enable knowledgeable
 individuals with database tuning experience to tune the system to improve performance and the
 solution's capabilities.

 2.4.4.3      Test Driven Development & Continuous Testing
 The software engineering approach used by DHS is heavily dependent on continuous integration and test
 driven development as risk mitigation mechanisms. Project teams must define how they integrate test
 driven development concepts and how tests are conceptualized and associated with Sprint backlog items.
 The project teams should also determine what tools would help to automate the unit testing process and
 should also define the strategy/architecture of the unit testing approach.
 2.4.4.3.1 AUTOMATED UNIT TESTING
 Because the regular, multi-daily build-and-test cycle must execute quickly, the project team must
 establish sets of tests that vary in scope and complexity and that exercise the system differently
 depending on the available time. In other words, the automated tests run during business hours are
 relatively small in scope, while those run each night are more comprehensive in nature. Project teams
 should use an approach that enables unit tests of the system to be performed at various levels. During
 project planning, project teams should describe how such tiers of tests will be established, configured,
 and maintained over time as the System evolves.
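One simple way to establish such tiers is to tag each test with a suite name and filter at run time; the tier names, test case, and selection mechanism below are an illustrative sketch, not a mandated design:

```python
# Sketch: tagging unit tests into "quick" and "nightly" tiers so the daytime
# build runs a small suite while the overnight build runs everything.
import unittest

QUICK, NIGHTLY = "quick", "nightly"

def tier(name):
    """Decorator that records which tier a test belongs to."""
    def mark(func):
        func.tier = name
        return func
    return mark

class EligibilityTests(unittest.TestCase):
    @tier(QUICK)
    def test_minimum_age(self):
        self.assertTrue(18 >= 18)

    @tier(NIGHTLY)
    def test_full_case_history(self):
        self.assertEqual(sum(range(1000)), 499500)

def suite_for(tier_name):
    """Build a TestSuite containing only the tests tagged for a tier."""
    loader = unittest.TestLoader()
    names = [n for n in loader.getTestCaseNames(EligibilityTests)
             if getattr(getattr(EligibilityTests, n), "tier", None) == tier_name]
    return unittest.TestSuite(EligibilityTests(n) for n in names)

quick_suite = suite_for(QUICK)
print(quick_suite.countTestCases())  # 1
```

The daytime CI cycle would run only the quick suite, while the nightly cycle runs the quick and nightly suites together.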
 2.4.4.3.2 MAINTAINING ROBUST UNIT TESTS
 Systems often require thousands of unit tests to be run by the completion of the project. With this in
 mind, it is important to establish an architecture for the unit tests in aggregate that prevents them from
 becoming 'fragile' over time. For example, hard-coding a future date into a unit test makes the test
 sensitive to the current date: at some point in the future, the test results will change once the current
 date passes the hard-coded date. Project teams should give thought to how the unit testing approach
 will remain robust and avoid fragile tests over time. Project teams should also define how the unit tests
 will be combined in sequence to form testing scenarios.
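The hard-coded date example can be made concrete: the fragile pattern pins a literal calendar date, while the robust pattern derives its date from 'today' so the test behaves the same whenever it runs (the function and dates below are invented for illustration):

```python
# Sketch: avoiding the fragile hard-coded-date pattern described above.
from datetime import date, timedelta

def is_future(d, today=None):
    """Return True when `d` is after `today` (defaults to the current date)."""
    today = today or date.today()
    return d > today

# Fragile: this literal silently stops being "in the future" one day,
# and the test result changes even though the code under test did not.
FRAGILE_FUTURE = date(2020, 1, 1)

# Robust: the test date is always derived from the current date.
robust_future = date.today() + timedelta(days=30)

assert is_future(robust_future)          # holds no matter when the test runs
assert not is_future(date(1999, 1, 1))   # a past date is never "future"
```

The same relative-reference principle applies to other environment-sensitive inputs (sequence numbers, generated identifiers, locale settings).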
 2.4.4.3.3 CREATION AND MAINTENANCE OF TEST DATA
 As the system evolves, it becomes increasingly important to establish test data that maintains integrity
 across the application (e.g. establishment of client demographics associated with a client ID that is
 referenced throughout the system). One way to establish this data is the use of Structured Query
 Language (SQL) scripts to populate data directly into the database on the back end. During project
 planning, the project team defines how referentially consistent, high-integrity test data will be created and
 maintained in the application in a manner that enables quick, frequent, and preferably automated
 environmental refreshes.




 2.4.4.3.4 UNIT TESTING FOR IDENTIFIED DEFECTS
 It is inevitable that defects will be injected into the system as part of the engineering cycle. While the
 initial injection may be unavoidable, future occurrences of a defect can be prevented by establishing tests
 that verify the defect condition does not reoccur. As a general guideline, DHS requires that all detected
 defects be covered by unit tests that will detect any future instance of the defect.
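In practice, each fixed defect gets a permanently retained test that pins the corrected behavior; the defect ID, function, and figures below are invented for illustration:

```python
# Sketch: a regression unit test that pins the fix for a reported defect so it
# cannot silently reappear in a later Sprint.
import unittest

def monthly_benefit(annual_amount):
    """Hypothetical defect D-1042: integer division used to truncate cents;
    fixed to use true division and round to two decimal places."""
    return round(annual_amount / 12, 2)

class RegressionTests(unittest.TestCase):
    def test_defect_D_1042_no_truncation(self):
        # Before the fix, 1000 // 12 == 83 dropped the fractional dollars.
        self.assertEqual(monthly_benefit(1000), 83.33)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(RegressionTests))
print(result.wasSuccessful())  # True
```

Naming the test after the defect record keeps the link between the defect tracker and the test suite visible for the life of the system.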
 2.4.4.3.5 AUTOMATED DATA CONVERSION TESTING
 The conversion of data from existing applications into the system is included in the scope of Sprints as
 necessary, based on the selected Sprint scope. With this in mind, during project planning, project teams
 should define an approach to converting data from existing systems into the target/new system, what
 tools will be used to facilitate the conversion (e.g. an Extract-Transform-Load (ETL) tool), how errors and
 discrepancies will be resolved, and how the process will be integrated into the continuous integration
 cycle and environmental refreshes.
 DHS makes production data available to the team to test data conversion routines and processes on an
 as-needed basis. During project planning, project teams should define, in detail, how they will identify the
 need for production data, how they will work with other DHS staff to obtain the data, how the data will be
 stored to protect confidentiality, how the data will be used, and how often the data is expected to be
 refreshed (e.g. new data pulled at the end of each Sprint and made available before the beginning of the
 next). As a general practice, the project team should identify the specific individuals who are authorized
 to access production systems, thereby enabling them to extract the sensitive production data that is
 made available to teams to construct and test data conversion programs.
 DHS is interested in innovative approaches to the integration of data conversion activities into the
 software engineering approach. Project teams are encouraged to identify practices and
 recommendations that they believe are relevant and potentially beneficial for DHS to use. Additionally,
 due to the sensitive nature of production data, project teams must also identify how mandatory HIPAA
 requirements can be met and how the project team will help keep the data secure from unauthorized
 access.
 While structuring the team, project planners must not identify a single ‘Conversion Feature Team’ for the
 project. In the spirit of Agile software development practices, each Feature Team should have the skills
 and competencies available to address all required conversion activities within a Sprint.

 2.4.4.4     Code Inspection and Review
 Project teams regularly and aggressively review and inspect the code work products being created on the
 project to maintain a high degree of technical quality. DHS appreciates highly efficient Agile engineering
 teams and understands that this high performance results in a large volume of solution code. Therefore,
 it is necessary to establish automated and continuous practices to verify that established standards are
 being met, in order to avoid unnecessary technical complexity and reduced efficiency.
 2.4.4.4.1 AUTOMATED CODE VERIFICATION: CODING STANDARDS
 During project planning, the project team must identify the coding conventions that will be used on the
 project as well as the toolset that will be used to verify the code against established standards. If the
 project team needs to modify the system’s existing coding standards, the team must explain the rationale
 for the change and how the new coding standards will be enforced in tandem with pre-existing system
 standards. The team should also explain how deviations from the established standard will be resolved by
 project teams and how the standards are modified and maintained within the selected toolset.
 The process of automated code verification against the established standard must be fully integrated into
 the continuous integration cycle.



 2.4.4.4.2 MANAGING CODE COVERAGE
 During project planning, the project team defines ways to use automated unit testing to ensure adequate
 test coverage of the software solution. The project team should also describe the tools that will be used
 to monitor test coverage. It is important for the project team to provide a thorough description of the
 overall philosophy behind managing test code coverage, what percentage of code coverage is targeted,
 and how deviations from that target will be addressed.
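The deviation-handling policy can be expressed as a simple threshold check; the 80% target and component names below are assumed placeholders, not DHS-mandated figures:

```python
# Sketch: comparing measured statement coverage against a project target and
# flagging components that fall below it.

COVERAGE_TARGET = 0.80  # assumed placeholder; the real target is set in planning

def coverage_gaps(component_coverage, target=COVERAGE_TARGET):
    """Return the components whose measured coverage is below the target."""
    return sorted(name for name, covered in component_coverage.items()
                  if covered < target)

# Hypothetical per-component coverage reported by a coverage tool.
measured = {"eligibility": 0.91, "intake": 0.78, "reporting": 0.85}
print(coverage_gaps(measured))  # ['intake']
```

A CI cycle could run such a check after the coverage tool executes and fail the build (or open a backlog task) when the gap list is non-empty.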
 2.4.4.4.3 ELIMINATION OF DUPLICATE CODE
 During project planning, the project team should define how the code will be assessed and reviewed to
 identify duplicate or redundant code or code components. DHS understands that large-scale code sets
 inevitably contain duplication of code structures and that this duplication must be managed to preserve
 the overall solution quality. The project team should define what tools will be used to review code for
 duplication and how the review will be integrated into the regular engineering activities, including the CI
 cycle.
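One simple way to detect duplication, sketched below under stated assumptions, is to hash normalized windows of consecutive lines and report any window that appears in more than one place. Dedicated copy/paste detection tools are the expected choice in practice; the three-line window size here is arbitrary.

```python
# Illustrative sketch: flagging duplicated code by hashing normalized
# windows of consecutive lines. The window size is arbitrary; real projects
# would use a dedicated copy/paste detection tool integrated into CI.

import hashlib

WINDOW = 3  # assumed minimum run of duplicated lines worth reporting

def duplicate_windows(files: dict) -> dict:
    """Map each duplicated window's hash to the (file, line) places it occurs."""
    seen = {}
    for name, text in files.items():
        lines = [ln.strip() for ln in text.splitlines()]
        for i in range(len(lines) - WINDOW + 1):
            chunk = "\n".join(lines[i:i + WINDOW])
            digest = hashlib.sha1(chunk.encode()).hexdigest()
            seen.setdefault(digest, []).append((name, i + 1))
    # Keep only windows that occur in more than one location.
    return {h: locs for h, locs in seen.items() if len(locs) > 1}
```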

 2.4.4.5     Documentation Compilation
 DHS believes that a self-documenting solution is the best mechanism through which to create solution
 documentation.
 During project planning, the project team should define how solution documentation will be compiled,
 stored, and published within the continuous integration cycle. Specifically, the team should describe how
 often the documentation will be compiled and what specifically will be compiled. The team should define
 the toolset that will be used to create and publish the documentation.
 At a minimum, DHS requires the automated generation of:
    •       A system Application Programming Interface (API) reference
    •       A data dictionary that fully describes the data models implemented within the system
 2.4.4.5.1 AUTOMATED API COMPILATION: CODE SPECIFICATIONS
The Test Driven Development process is executed shortly before the set of activities dedicated to
specifying and coding the software. In other words, the specifications are developed at the same time as
the code that is written to satisfy the established automated unit tests.
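The rhythm described above can be illustrated with a minimal example in which the automated unit tests (the specification) are written first and the code is written only to satisfy them. The eligibility rule shown is hypothetical and not an actual DHS requirement.

```python
# Illustrative sketch of the Test Driven Development rhythm: the automated
# unit test (the specification) exists first, and the code is then written
# to make it pass. The eligibility rule is a hypothetical example.

import unittest

def is_eligible(age: int, household_size: int) -> bool:
    # Written *after* the tests below, only to satisfy them.
    return age >= 18 and household_size >= 1

class EligibilitySpec(unittest.TestCase):
    # Written first; these tests *are* the specification.
    def test_adult_with_household_is_eligible(self):
        self.assertTrue(is_eligible(age=30, household_size=2))

    def test_minor_is_not_eligible(self):
        self.assertFalse(is_eligible(age=16, household_size=2))
```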
 During project planning, the project teams define the proposed approach for automatically generating
 high quality API code specifications. The project teams should also discuss any toolsets that will be used
 to generate the specifications and how these specifications will be integrated into the system’s integrated
 library.
 2.4.4.5.2 AUTOMATIC CREATION OF A SOLUTION DATA DICTIONARY
 During project planning, the project team should define how a data dictionary will be automatically created
by the system. As a general practice, DHS prefers a dictionary that makes use of HTML technologies.
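As an illustration of the preferred HTML form, the sketch below renders a data dictionary from table metadata. In practice the metadata would be extracted automatically from the database catalog; the table and column names used in the example are hypothetical.

```python
# Illustrative sketch: generating an HTML data dictionary from table
# metadata. A real implementation would read the metadata from the
# database catalog; the sample table below is hypothetical.

def data_dictionary_html(tables: dict) -> str:
    """tables maps table name -> list of (column, type, description) tuples."""
    parts = ["<html><body><h1>Data Dictionary</h1>"]
    for table, columns in sorted(tables.items()):
        parts.append(f"<h2>{table}</h2><table>")
        parts.append("<tr><th>Column</th><th>Type</th><th>Description</th></tr>")
        for col, typ, desc in columns:
            parts.append(f"<tr><td>{col}</td><td>{typ}</td><td>{desc}</td></tr>")
        parts.append("</table>")
    parts.append("</body></html>")
    return "".join(parts)
```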
 2.4.4.5.3 PRODUCT DASHBOARD PUBLISHING
At the completion of each CI cycle, a dashboard summarizing the results of the various activities is
created. The dashboard should be visually appealing and easily navigable so that DHS executive
management can readily understand the technical status of the solution. During project planning, the project
 team should identify dashboards that will be used. In addition, the project team should indicate desired
 and recommended approaches for communicating the status of the technical solution to interested
 stakeholders.







 2.4.5       Periodic Automated Environmental Refresh
DHS continually seeks to establish development- and test-related environments that are regularly and
automatically reconstructed with fresh versions of the solution, configuration data, and test data. The
 intent of the regular refresh is to avoid unknown or undocumented, but required, configurations that may
lead to issues and slow development progress. For this reason, DHS does not use a backup or disaster
recovery mechanism to satisfy this requirement, as that approach still leads to unknown and
misunderstood configurations.
 Ideally, an environment refresh would involve the automated establishment of a fully functional version of
 the system from a clean, base system image. During project planning, project teams should describe
 how to accomplish a fully automatic refresh of all proposed environments from the base configuration to a
 specified version/baseline of the system. The automated refresh approach makes heavy use of the CI
 solution described in section 2.4.4 Continuous Integration. Project Teams should describe in particular
 detail:
     •       How the existing environment will be temporarily backed up and when the temporary backup
             is deleted
     •       How the base system installation is reestablished
     •       How the system version is migrated to the specified environment
     •       How any structural database modifications are applied
     •       How configuration data is applied/loaded into the base installation
     •       How test data is loaded
     •       How ‘shake-down’ testing is conducted on the newly established environment
Note that DHS uses scripts, executable both manually and programmatically, that accept parameters
(target environment, DHS system version, etc.) to trigger the refresh using the proposed Continuous
Integration tools.

 2.4.5.1     Automated Environmental Refresh Approach and Strategy
 During project planning, the project team should define in detail their approach to automatically refreshing
 each proposed environment to a base-state with a known version of software, data model, configuration
 data, and test data.
 The continuous integration toolset is used to accomplish the automated environment refreshes. For each
 environment, the project team should specify the frequency of the refresh (e.g. development refreshed
 nightly, system test weekly and UAT before each validation cycle).

 2.4.5.2     Temporary Environmental Archive
During project planning, the project team should explain how a temporary backup of a given environment
is created before the refresh begins. Because the refresh is fully automated, the project team
 should explain how success and failure of the backup impacts the refresh cycle. The project team should
 explain how the backup will be stored, accessed, and if necessary, restored. The project team should
 also provide a brief discussion of how long the backup will be maintained before being deleted.

 2.4.5.3     Reestablish Base system Configuration
 During project planning, the project team should explain how the environment will be reset to a base
version of the system once the environment is backed up. If the approach involves restoring server
 images, the project team should explain how the images will be established, refreshed, stored and
 maintained.

 2.4.5.4     Code Migration
 During project planning, the project team should briefly explain how the code for the system will be
 migrated to the specified environment after the establishment of the base system.



 2.4.5.5      Structural Database Modifications
 During project planning, the project team should explain how the database will be prepared with any
 modifications prior to data loading, once the correct version of the system is successfully migrated to the
 target environment.
 As a general guideline, modifications to the database should be kept to an absolute minimum to avoid
 complexities with future system patches and version updates. However, if additional tables, fields and
 other database elements (e.g. indexes, partitions, or views) are required, those modifications must be
 automatically applied.

 2.4.5.6      Configuration Data Load
 During project planning, the project team should explain how DHS’ specific configuration data will be
 loaded into the environment after the database has been prepared as needed. The project team should
 also describe how an initial baseline of configuration data changes will be established and maintained
 throughout the project.
 As a general guideline, all system configuration data, while initially applied through the system user
 interfaces, can be preserved via back-end database scripting to facilitate the automated loading of
 configuration data. If the system has utilities to facilitate this process that differ from the above described
 approach, the project team should provide a detailed description of how configuration data can be
 captured and applied, in an automatic manner, to a desired environment.

 2.4.5.7      Loading of Test Data
 During project planning, the project team should explain how the environment will be populated with test
 data.

 2.4.5.8      Shake-down Testing
 During project planning, the project team should explain how the environment will be verified once the
 automated refreshing of the environment code and data is completed. As a general guideline, a full
 regression test is not completed on the newly established environment. Rather, it is important to be
 concerned with how the environment will be deemed ready for use (which may include regression
 testing).
 As a general guideline, Shake-down activities should include verifying that an application server is
 connecting with a database, that data has been committed to tables as required, that user sessions can
 be established and that the correct code version is present.
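The verifications listed above can be aggregated into a single readiness decision, as in the following sketch. The individual probes are stubbed as boolean results; in a real implementation each would actually query the refreshed environment.

```python
# Illustrative sketch: a shake-down check aggregating the verifications
# named above into one readiness decision. Probe results are stubbed;
# each would actually query the refreshed environment.

def shake_down(checks: dict) -> tuple:
    """Return (ready, failed_check_names) for a refreshed environment."""
    failed = [name for name, passed in checks.items() if not passed]
    return (len(failed) == 0, failed)

# Example results as a hypothetical probe run might report them.
results = {
    "app server connects to database": True,
    "required rows committed to tables": True,
    "user session can be established": True,
    "expected code version present": True,
}
```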

 2.4.6        Integration Test and Regression Test
 While TDD and CI are useful for testing the evolving technical solution they are not sufficient by
 themselves to verify and validate that the Solution Software is evolving toward a solution that meets DHS’
 needs. Integration testing involves the testing of software to verify functional, performance, and reliability
 requirements placed on major design items operating in conjunction in an integrated manner. Regression
 testing is software testing that seeks to uncover defects in existing functionality after changes are applied
 (e.g. functional enhancements, patches or configuration data changes).
The products created at the end of each Sprint must be ready for user acceptance during the Sprint
Review (see section 2.2.1.6 Sprint User Acceptance), meaning that the relevant components have been
fully integration- and regression-tested (and validated by users; see section 2.4.2 Sprint Execution User
Validation) before the end of a Sprint.
 During project planning, project teams should describe their approach to ensuring that the Solution
Software remains stable as new components are added or existing components are modified. Project
teams should also describe how the suggested approach will enable DHS to formally accept the solution
 at the end of the Sprint.

 2.4.6.1     Verification
 During project planning, project teams should describe how to verify that each of the
 requirements/Product Backlog Items included in a Sprint has been fully addressed. The project team’s
 description should clearly show a direct correlation between verification of the requirement in the Product
Backlog and user validation activities (see section 2.4.2 Sprint Execution User Validation). Verification
 involves the Project Owner and Area Product Owner. Project teams should also describe how to
 incorporate the Project Owner and Area Product Owner in the verification activities.

 2.4.7       Pair Programming
It is not mandated that Feature Teams use Pair Programming; however, during project planning, the
project teams should describe how Pair Programming will be used and what guidelines or criteria will be
used to indicate when Pair Programming is a viable option. Additionally, the project teams should discuss
the efficacy of Pair Programming relative to the scope of work covered by their Software Development
Project.

 2.4.8       Design Approach
 Design is a means to produce an end result in the system and therefore is a necessary form of ‘waste’ in
 Lean and Agile terms. Therefore it is important to balance the ceremony and formality of the design
 competency and resulting assets against their ability to effectively and efficiently produce the required
 software products and/or mitigate project risk.
 During project planning, the project team should define the proposed convention for the conversion of
 Sprint Backlog Items into system software code. More specifically, the project team should provide details
 about the proposed design process paying particular attention to the assets produced, their value and
 formality.
 Project teams should also describe possible tailoring of the described convention that may be seen
 across the various Feature Teams as they design required components.
 DHS believes in Agile and Lean software development concepts, and as such wishes to minimize the
 amount of effort expended on the manual creation of solution documentation. For all proposed
 documentation that is to be manually created and maintained, project teams must provide a justification
 that demonstrates the need for the documentation.

 2.4.8.1     Configuration vs. Customization
 Where possible, DHS prefers systems that can be configured to address requirements as opposed to
 solutions that require customization. During project planning, project teams should clearly describe how
 they will approach identifying and balancing the need to configure versus customize code. Project teams
 should describe how configuration related changes are ‘designed’ and applied to the system. Project
teams should try to answer the question of whether configuration changes need to be documented and, if
so, how the changes are documented.
An important result of the configuration vs. customization juxtaposition is the need for Feature Teams to
balance whether the system is modified to meet DHS’ practices or whether DHS’ processes must be changed
 to match the system. Project teams should discuss how the Feature Teams are expected to maintain this
 balance during a Sprint with the involvement of the Area Product Owner and the Project Owner.

 2.4.9       Database Customization
DHS appreciates that a system complying with the stated requirements is an evolving product that
progresses to address industry demands beyond those of DHS. Because DHS vigorously maintains
 compatibility with the source system, database customization is viewed critically. When database
 changes are required to the system, the changes must be applied in an isolated manner that does not
impact the system’s baseline-versioned data model.
 During project planning, the project team should define how impacts to the system data model will be
 mitigated and isolated from the base system.
 An example of an isolation technique involves the creation of materialized views that join base system
 tables with new data structures required solely to meet DHS’ needs. DHS’ customized solution would
 then consume the materialized view as a data source leaving the base table unaffected and hence
 compatible with future versions of the system.
 Project teams’ discussions should include descriptions of naming conventions and schema topologies
 that enable DHS to readily identify and manage all customized database components.
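The materialized view technique and the naming convention can be sketched as follows. This is an illustration only: all table and column names are hypothetical, the "dhs_" prefix stands in for whatever convention the project adopts, and a real view would list columns explicitly rather than selecting every column from both tables.

```python
# Illustrative sketch: composing DDL for a materialized view that joins a
# base system table with a DHS-specific extension table, leaving the base
# table untouched. All names are hypothetical; the "dhs_" prefix
# illustrates a naming convention for customized database components.

def isolation_view_ddl(base_table: str, ext_table: str, key: str) -> str:
    """Build the CREATE MATERIALIZED VIEW statement for an isolation view."""
    return (
        f"CREATE MATERIALIZED VIEW dhs_{base_table}_v AS\n"
        f"SELECT b.*, e.*\n"          # a real view would list columns explicitly
        f"FROM {base_table} b\n"
        f"JOIN {ext_table} e ON e.{key} = b.{key}"
    )
```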

 2.4.10      Solution Refactoring
 Software development inevitably involves learning about complex problems, both business and technical,
as they are solved. While implementing a software solution for a particular problem, the team often arrives
at a better understanding after the initial solution is created; this heightened understanding often yields
better, more efficient technical solutions than the one first implemented. DHS manages systems
 over a long-term time horizon and, as a result, it is important to maintain the overall architecture of the
 implemented enhancements.
 During project planning, project teams should describe how and when developed solutions will be re-
 factored to create a more optimal system for DHS. Additionally, project teams should describe how
 refactoring activities are integrated into the work of the Feature Teams.

 2.4.10.1 Load and Performance Testing
 Because DHS expects thousands of staff members to use the system on a daily basis, it is critical to
 maintain the performance of the system over and above the expected normal use. Performance and Load
 testing are key triggers for refactoring activities as described above.
 During project planning, project teams should describe the approach and toolset that will be used to verify
 and monitor the performance of the system. Project teams should describe how the system performance
 requirements are established and managed in terms of concurrent CPU users and not concurrent users
 logged into the application.

 2.5 Executing a Production Release Sprint
 As a general guideline, the overall system may be implemented in several ‘stages’, each of which
 involves a production release of newly developed products. During project planning, the project team
 should provide information about the approach to coordinating the various dependent activities for a
 release, such as user revalidation (see section 2.5.6 Formal User Revalidation & Acceptance), end user
 training, and stakeholder communication (section 3.7 Stakeholder Involvement and Communications
 Management). Because of the complexity and level of coordination required during a production release,
 during project planning, the project team should provide recommendations, including sample templates,
 for assets and approaches that will be used to coordinate technical release activities.
During project planning, project teams should indicate the number of production releases that are
 expected and included in the project roadmap. Also, during project planning, project teams should
 provide a graphical timeline of the expectation for a ‘normal’ production release cycle based on the
 proposed approach. Project teams should note that the format of the release sprint timeline should
 closely align with the timeline for the Sprint provided in section 2.2.2 Graphical Depiction of Planned
 Sprint Cycle.





 2.5.1       Release Planning
 In addition to addressing the known implementation dates and constraints set by the various federal
 regulations, DHS conducts production releases whenever a release is warranted. An example of a
 circumstance when DHS may determine that a production release is needed is when a new feature is
 deemed beneficial to rollout to end users as soon as possible instead of waiting for the next planned
 release.
 During project planning, project teams must describe their approach to release planning, including when it
 is best to consider planning a release and how the release is planned and coordinated. Project Teams
 should also describe the approach that will be used to conduct production release iterations, including the
 expected duration of the release (e.g. four weeks), the types of activities that are conducted (code
 migrations, data conversion, configuration data migrations, environmental shakedown, and others) and
 the sequence and duration of each activity. Project teams should fully describe the expected activities,
 including, but not limited to:
     •       Identifying and obtaining the latest production data with which to validate data conversion
             performance
     •       Running necessary data conversions and recording/establishing expected results (rows
             affected)
     •       How errors are resolved during the release
    •       How “Go/No-Go” decisions are made and what criteria are used (e.g. User Revalidation fails)
Release Sprints make use of detailed release “playbooks” that provide detailed descriptions of tasks that
 are executed, the expected duration of the task, roll-back strategies, and other components of concern.
 During the initial stages of the release the playbook is completed with expected values and dependencies
 based on the release scope. As the release is tested and executed across the various environments the
 playbook is refined and leads to an accurate asset that fully describes all release activities. Project teams
 should develop a release playbook template and describe how the playbook concept can be implemented
 on the project.
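One possible shape for a playbook entry, refined as the release is rehearsed across environments, is sketched below. The field names and sample tasks are hypothetical and would be replaced by the project team's own template.

```python
# Illustrative sketch: one possible structure for a release playbook entry,
# capturing the task, expected duration, roll-back strategy, and
# dependencies. Field names and sample tasks are hypothetical.

from dataclasses import dataclass, field

@dataclass
class PlaybookTask:
    name: str
    expected_minutes: int
    rollback: str
    depends_on: list = field(default_factory=list)

# A hypothetical two-task fragment of a release playbook.
playbook = [
    PlaybookTask("migrate code to production", 30,
                 rollback="redeploy previous baseline"),
    PlaybookTask("run data conversion", 120,
                 rollback="restore pre-release backup",
                 depends_on=["migrate code to production"]),
]
```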

 2.5.1.1     Release Sprint Staffing
 A release iteration does not require more than one Feature Team to execute.
 During project planning, the project team describes their approach to staffing a release sprint; will a
 specific Feature Team be identified to execute the Release Sprint or is another approach proposed?
 Additionally, because the remaining Feature Teams will be executing normal Sprint work, project teams
 should explain how the release team will coordinate its activities with the other executing Feature Teams.
 For example, how will Feature Teams avoid code version conflicts, configuration data issues, and training
 material versions issues? Project teams should also discuss how bugs identified within a release sprint
 will be addressed, both when the Feature Team can handle the required changes and when the team
 does not have capacity to address the required changes.

 2.5.2       Planning End-User Training
 During a release Sprint, the project team executes planned training that prepares DHS’ Trainers to
 educate and prepare end-users to accept and use the new functionality. During project planning, the
 project team should define how training activities are planned and coordinated as part of a Release
 Sprint. The project team should specifically discuss the timing of training as it may take more than four
 weeks to plan and execute required training. The project team should understand how the
 interdependencies between training and technical releases are identified and managed to ensure
 successful results.







 2.5.3       Stakeholder Involvement and Communication
 A critical component of releasing functionality involves the coordination and involvement of many
 individuals who have a stake in the project but may not be directly associated with the regular work.
 During project planning, the project team should explain how they will identify, plan and engage DHS
 stakeholders during a Release Sprint to reduce risk and increase the effectiveness of the release.
 Additionally, the project team should understand the types of communication activities that will be used
 and when those activities should be undertaken in anticipation of the release.

 2.5.4       Technical Implementation
 During project planning, the project team should understand in detail how the technical environments,
 from Development through Production, will be prepared during a production release. This includes
 configuration of servers, installation of required dependent software, migration of code, configuration data
 and relevant database work (partitioning and creation of necessary database objects).

 2.5.5       Load and Performance Testing
 Before implementing any new functionality in a production environment, the release team must fully load
 and performance test the target solution. During project planning, the project team must describe in detail
 the approach that will be taken to complete performance and load testing, including any tools used and
 how performance related issues will be resolved before implementing the technical solution in the
 production environment.

 2.5.6       Formal User Revalidation & Acceptance
At periodic points, before releasing features to DHS’ end users, the system as a whole must be
revalidated by users.
 During project planning, the project team must describe in detail their approach to User Revalidation and
 Acceptance, including scheduling of activities (iteration planning), creation and execution of scripts (which
 should be done during each Sprint enabling “playback”, see section 2.4.2 Sprint Execution User
 Validation), preparation of the test environment, tracking of progress, resolution of identified defects and
 tracking acceptance or rejection of the solution.
 This section represents the third and final component of user acceptance testing on the project; project
 teams should not restate content in either of the two preceding sections:
             •        Sprint Execution User Validation – see section 2.4.2 Sprint Execution User Validation.
             •        Sprint Review User Acceptance – at the end of each Sprint, during the Sprint Review
                      Meeting, DHS reviews and accepts the developed components presented in the review.

 2.5.7       Execution of Data Conversion in Production
 Data conversion programs are fully designed and implemented during the regular Sprint cycles and
 included in regular Continuous Integration testing cycles. During a production release however, it is
 critical to review and thoroughly test the conversion routines on fresh data pulled from identified data
sources. This ensures that the routines adequately address production data and that expectations for
results of the conversion are fully known before the conversion is executed in Production. Doing so enables
 the release team to rapidly identify anomalous results of the conversion and resolve any issues that may
 be encountered.
 During project planning, the project team should understand, in detail, how the data conversion routines
 that will be developed during the engineering cycles are fully prepared for execution in production as part
 of the release activities.






 Section 3: Support Processes
 In addition to engineering and management processes, project teams should employ several other
 practices in support of the project. These activities, which run largely in parallel with the activities
 discussed above, are noted and described below.
 During project planning, project teams may expand upon each of these Support Process guidelines and
may also suggest additional processes that will help support the successful completion of the project.
 However, at a minimum, the following Support Process guidelines should be followed.

 3.1 Continuous Process Improvement
DHS fully embraces self-organizing Feature Teams with a shared commitment to DHS’ common
 goals and objectives. One important shared goal of all the stakeholders involved in software development
 projects is to continually refine and improve the processes and practices in use by the team.
 The retrospective meetings conducted at the end of a Sprint at the Feature Team and solution levels are
 a key source of improvement ideas. (See section 2.2.1.7 Retrospectives for guidelines about
 retrospective meetings).
During project planning, the project teams should consider how Feature Teams will be facilitated in both
 self-organization and continual improvement of the processes used. Specifically, project teams should be
 informed about how promising practices will be shared across teams, how common opportunities for
 improvement will be detected, and what related approaches could be used such as CMMI, PMBOK,
 Lean, Six Sigma and ITIL concepts.

 3.2 Project Configuration Management
Successfully completing a software development project that supports DHS and its various member
divisions requires close and deliberate attention to detail.
 Project Configuration Management is used to define, document and control the configuration of any
system project materials. The project team should consider which assets, such as documents, libraries,
and backups, will be used and how access to and changes to these assets will be controlled.
 Project configuration management is broader than software configuration management (SCM) and
 includes management of all worthy project assets such as design documents (white board pictures,
 scanned data models, etc.), the Product Backlog, and test scripts in addition to others.
Project teams should understand, in detail, the approach that they will use to manage project assets and
any software toolsets that they will use (e.g. SharePoint and Subversion), and be prepared to
 describe why the specified approach is best for DHS.

 3.3 Risk and Issues Management
 Risks are events that have not yet occurred, have a probability of occurring, and if they do occur, impact
 the project (in a negative way). Issues are events or occurrences that have occurred on a project that
impact the project in some way and must be addressed. In other words, issues are realized risks.
 At DHS, daily Scrum meetings are the primary vehicle through which to address issues and identified
 risks, though other mechanisms can be used as well.
 Project Teams should understand how risks and issues will be identified, managed, and communicated to
 DHS, as well as how responses will be planned, executed, and verified.
 During project planning, the project team should identify risks and issues that DHS may encounter with
 the system, the integration approach, and any other areas of the project. For each risk, the project team
 should include an area in which to group the risk (system, Agile Approach, New Federal Policy, etc.).




 The following table can be used by the project team to catalog identified Risks and how they will be
 managed.

 #  | Area | Risk | Mitigation Approach
 1  |      |      |
 2  |      |      |
 3  |      |      |
 <add more rows as needed>



 3.3.1        Contingency Planning
 DHS recognizes that federal timelines present formidable constraints for the project.
 During project planning, the project team should provide a list of critical, mandatory components that must
 be implemented, at a minimum, to comply with any federal regulations or timelines given the proposed
 system.
 Once these components are clearly identified, the project team should prepare a contingency plan that
 would be used to implement these critical components and establish basic compliance with associated
 federal regulations and timelines.
Project teams should also understand the characteristics of the worst-case scenarios that might trigger
execution of the contingency plan (e.g. missing a critical milestone date, insufficient development
progress, or external dependencies).

 3.4 Assumption and Constraint Management
Assumptions are defined as facts that are believed to be true for planning purposes. All assumptions
must be validated and either affirmed or rejected once the required information, which is missing during
planning, is received.
During project planning, project teams should record all assumptions and constraints in the following
table, which can be used to manage them throughout the life of the project.
 DHS defines constraints as facts or circumstances that impact the project in some way and are beyond
 the control of a given group to change or alter. Changing Federal timelines and regulations are good
 examples of constraints that we do not have the ability to change or alter. Documenting constraints
 enables DHS and the project team members to more readily understand the nature and nuances of
 established plans.








 #  | Area | Assumption / Constraint
 1  |      |
 2  |      |
 3  |      |
 <add more rows as needed>


 3.5 Performance Metrics
 DHS has identified 10 performance metrics that are required to track project performance. These metrics
 are described below. For each metric, project teams must use a Microsoft Excel spreadsheet or other
 readily accessible tool to calculate the metric. The spreadsheet or similar tool must also include sample
 data representing the performance that might be expected on the project, graphed in a readable format.
 Project teams are also encouraged to recommend performance metrics in addition to, or in substitution
 of, those included in these guidelines. DHS may adopt or reject any additional metrics. DHS may also add
 to or modify the established performance metrics used on the project at any time during the life of the
 project.

 3.5.1       Scope & Schedule Management
 DHS uses FP-01 and FP-02 to facilitate scope and schedule management.
 During project planning, project teams should describe how these metrics will be integrated into their
 project and used collaboratively to track scope and schedule progress.
 DHS uses a Product Burn-Down chart to track and communicate the progress of the evolving solution.
 During project planning, project teams should provide information and recommendations about how they
 will use the Product Backlog, together with information from iterations, to create and update the overall
 solution burn-down chart. The information should address whether the generic size estimate, the
 Function Point estimate, or some other means of tracking progress should be used.
 Project teams should also demonstrate, using sample burn-down charts, how their approach (size
 estimates, number of Feature Teams, and expected Feature Team velocity) will be used to track the
 implementation of the required solution in conformance with the DHS-specified timelines.




AR DHS Agile System Implementation Guidelines v0_13.docx                                            Page 27
 Department of Human Services
 Office of Systems and Technology
 Agile System Implementation Guidelines



 3.5.1.1      Function Point Earn Rate
           Metric ID:    FP-01

       Metric Name:      Function Point Earn Rate

     Commencement        After Iteration 3. Metric is effective for Iteration 4.
            Date:
              Metric     Tracks the rate at which Function Points are being earned toward completing the
         Description:    desired solution. Because the proposed Function Point count included in the
                         response is a contractual obligation, tracking the remaining number of Function
                         Points enables DHS to track the project progress and more readily assess likelihood
                         of on-time and on-budget completion.

   Category/Component:    Function Point

   Reporting/Frequency:   Each Sprint Iteration
         Verification    Project Owner
          Reviewer:
              Scope:     This metric applies at a project level, across all teams.

     Threshold Levels:    More than 5% under the required Function Point earn rate for on-time delivery.

       Data Sources:     Estimates of number of Function Points of solution features delivered at the end of
                         each Sprint.

             Formula:     To determine the required Function Point earn rate, divide the total proposed
                          Function Point estimate for the project by the duration of the project in weeks.
                          Multiplying by four weeks yields the linear average number of Function Points
                          that must be delivered in each Sprint. To determine the actual Function Point
                          velocity, divide the total number of Function Points delivered to date by the
                          number of elapsed weeks on the project. The deviation is then calculated as
                          [1 – (Actual Function Point Velocity/Required Function Point Velocity)] x 100
                          If the resulting value is larger than +5, the threshold is exceeded and
                          corrective action is required.

       Project Team      Estimate the number of Function Points delivered at the end of each Sprint.
      Responsibility:
               DHS       Estimate the number of Function Points delivered at the completion of each Sprint.
      Responsibility:
         Tool/Utility:   <Project team should use an Excel spreadsheet to record data and generate
                         proposed graph.>

  <During project planning, the project team should provide mockups/screen shots of the graphs that will be
  used to track this metric.>
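 The FP-01 formula can be prototyped before the Excel workbook is built. The following sketch is
 illustrative only (Python is not a required tool, and all names are hypothetical); it computes the
 per-Sprint target, the deviation, and whether the +5 threshold is exceeded:

```python
def fp_earn_rate_status(total_proposed_fps, project_weeks,
                        delivered_fps, elapsed_weeks):
    """FP-01: deviation of actual Function Point velocity from the
    required velocity, per the formula in this guideline."""
    required = total_proposed_fps / project_weeks    # required FPs per week
    per_sprint_target = required * 4                 # four-week Sprint target
    actual = delivered_fps / elapsed_weeks           # actual FPs per week
    deviation = (1 - actual / required) * 100        # [1 - (A/R)] x 100
    corrective_action = deviation > 5                # threshold exceeded?
    return per_sprint_target, deviation, corrective_action
```

 For example, a 1,200 FP project over 48 weeks requires 25 FPs per week (100 per Sprint); delivering only
 180 FPs in the first 8 weeks yields a deviation of 10, which exceeds the +5 threshold.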






 3.5.1.2      Hours per Function Point
           Metric ID:    FP-02

       Metric Name:      Hours per Function Point

     Commencement        After Iteration 3. Metric is effective for Iteration 4.
            Date:
              Metric     Tracks the average amount of effort across the entire group of Feature Teams
         Description:    required to produce a single function point.

   Category/Component:    Function Point

   Reporting/Frequency:   Each Sprint Iteration
         Verification    Project Owner
          Reviewer:
              Scope:     This metric applies at a project level, across all teams.

     Threshold Levels:    +5%. The threshold applies only to increases in hours per Function Point;
                          increasing the number of hours required to deliver a single Function Point
                          worth of solution is a reduction in efficiency on the project.

       Data Sources:     Estimates of number of Function Points of solution features delivered at the end of
                         each Sprint. Total hours contributed to the project as of the last iteration.

             Formula:     [(Total Current Sprint Hours / Total Current Sprint FPs) / (Previous Sprint
                          Hours per FP)]

       Project Team      Record the total hours, by Feature Team member, contributed to the Sprint.
      Responsibility:

               DHS       Estimate the number of Function Points delivered at the completion of each Sprint.
      Responsibility:    Gather the total effort allocated to the completed Sprint and enter the data into the
                         necessary tool.

         Tool/Utility:   <Project Team should use an Excel spreadsheet to record data and generate
                         proposed graph. >

  <During project planning, the project team should provide mockups/screen shots of the graphs that will be
  used to track this metric.>
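 The FP-02 formula is easiest to read with explicit grouping. A minimal sketch (hypothetical names,
 illustrative only; Excel remains the mandated tool):

```python
def hours_per_fp_ratio(current_hours, current_fps, previous_hours_per_fp):
    """FP-02: current Sprint hours-per-Function-Point relative to the
    previous Sprint's value. A ratio above 1.05 (a +5% increase)
    signals reduced efficiency and breaches the threshold."""
    current = current_hours / current_fps
    ratio = current / previous_hours_per_fp
    return current, ratio, ratio > 1.05
```

 For example, 840 hours for 40 FPs is 21 hours per FP; against a previous value of 20 hours per FP, the
 ratio is exactly 1.05, which sits at (but does not exceed) the threshold.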







 3.5.2        Team Performance Metrics
 Metrics SP-01, SP-02 and SP-03 are focused on monitoring the performance of the Feature Teams in
 completing the required work. During project planning, the project team should provide a description of
 how these metrics will be used and any recommendations relating to tracking performance with these or
 other metrics.

 3.5.2.1      Planned vs. Actual Feature Team Velocity by Iteration
               Metric ID:   SP-01

            Metric Name:    Planned vs. Actual Feature Team Velocity by Iteration

    Commencement Date:      Project Start

      Metric Description:   Measures the planned versus actual productivity/velocity of each Feature
                            Team for each Sprint.

    Category/Component:     Size/Ideal Day Performance

    Reporting/Frequency:    Each Sprint Iteration

   Verification Reviewer:   DHS Master Scrum Master

                  Scope:    This metric applies to all Feature Teams for each Sprint that they are active
                            on the project.

       Threshold Levels:    TBD (not possible to define until baseline data obtained)

   Performance Incentive:   Example: Each member of the team with the least variance between planned
                            and actual velocity receives a gift card and is publicly recognized during
                            the Sprint Review meeting. Note: quality metrics must be within threshold
                            for the incentive to apply.

    Performance Penalty:    Example: Each member of the team with the greatest variance between
                            planned and actual velocity receives a vendor-supplied gift card.

            Data Sources:   Planned velocity: Feature Team charter for each Sprint (size and expected
                            effort).
                            Actual velocity: Sprint Backlog for each Feature Team at the end of the
                            Sprint (size). Effort derived from time entry.

                Formula:    Planned velocity = selected Sprint size / total planned effort for the
                            Feature Team in the Sprint
                            Actual velocity = actual Sprint size completed / total hours recorded for
                            the Feature Team during the Sprint

           Project Team     Feature Teams commit to the scope of the Sprint during parts 1 and 2 of the
          Responsibility:   Sprint planning session. During part 2 the Feature Team confirms the
                            expected staff load over the next Sprint by accounting for vacations,
                            holidays and partial staffing arrangements; this is summed to determine the
                            total planned Sprint effort on the Feature Team Charter.
                            During the Sprint, Feature Team members record time worked each day. At the
                            end of the Sprint, a Feature Team member is selected to record the total
                            final scope, in ideal days, addressed in the Sprint.

     DHS Responsibility:    Collect planned data upon completion of the Sprint planning meetings and
                            record in the associated tool.
                            Gather actual scope and effort values for each Feature Team at the end of
                            each Sprint and enter into the tool.

             Tool/Utility:  <Project Team should use a spreadsheet (preferably MS Excel) to record data
                            and generate proposed graphs. Insert the name of the spreadsheet and tab
                            here. Because this metric applies to all Feature Teams, the tool provided
                            with the response must include tabs or sections for each of the proposed
                            Feature Teams.>



  <During project planning, the project team should provide mockups/screen shots of the graphs that will
  be used to track this metric.>
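 The SP-01 planned and actual velocity formulas can be sketched as follows (names are hypothetical and
 the code is illustrative only; the required deliverable remains the spreadsheet described above):

```python
def team_velocity(sprint_size, effort_hours):
    """SP-01: velocity = Sprint size (ideal days) / effort (hours)."""
    return sprint_size / effort_hours

def velocity_variance(planned_size, planned_hours, actual_size, actual_hours):
    """Planned-vs-actual velocity variance for one Feature Team in one
    Sprint; a negative variance means the team ran slower than planned."""
    planned = team_velocity(planned_size, planned_hours)
    actual = team_velocity(actual_size, actual_hours)
    return planned, actual, actual - planned
```

 For example, a team that planned 30 ideal days in 600 hours but completed 24 ideal days in those 600
 hours has a planned velocity of 0.05, an actual velocity of 0.04, and a variance of -0.01.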



 3.5.2.2      Actual Feature Team Member Velocity by Iteration
              Metric ID:   SP-02

           Metric Name:    Actual Feature Team Member Velocity by Iteration

   Commencement Date:      Project Start

      Metric Description:   Measures the actual contribution of members of the Feature Teams for each
                            Sprint based on the hours contributed to the Sprint. For resources shared
                            across Feature Teams the metric still applies, but the total size
                            contribution must be computed across Feature Teams.

   Category/Component:     Generic Size (Ideal Days) Performance

   Reporting/Frequency:    Each Sprint Iteration

  Verification Reviewer:   Project Owner

                 Scope:    This metric applies to all Feature Team members for each Sprint that they are
                           active on the project.

      Threshold Levels:    TBD (not possible to define until baseline data obtained)

  Performance Incentive    Example: The team member with the highest velocity for a given Sprint receives
                           a vendor supplied gift card and is recognized during the Sprint Review meeting.
                           Note: quality metrics must be measured within threshold for incentive to apply.

    Performance Penalty     Example: Team member velocities are published in the team area following
                            each Sprint, presenting an ordered list of all team members by velocity for
                            that Sprint.

           Data Sources:   Sprint Backlog tasks provide the actual size (in Ideal Days) of all tasks assumed
                           by individuals on the team for a given Sprint.
                           Time system provides the total hours billed by each member of the Feature
                           Team.

                Formula:    Velocity = (sum of the total final ideal days of work for each task assumed
                            by the team member) / hours recorded against the Sprint by that Feature
                            Team member.

          Project Team     At the end of each Sprint a member of the Feature Team compiles the total ideal
         Responsibility:   days completed for each team member. This information is provided to the
                           Project Owner.

    DHS Responsibility:    Gather ideal day contributions from each Feature Team and enter into the
                           specified tool. Determine the total hours entered by the resource for the Sprint.
                            Divide the total contribution (summed across Feature Teams for shared
                            resources) by the hours worked on the Sprint.

            Tool/Utility:   <Project Team should use a tool (Excel spreadsheet, Access database, etc) to
                            record data and generate the proposed graphs. Because this metric applies to all
                            members of the proposed Feature Teams the utility must provide a mechanism
                            with which to calculate productivity and provide a way to easily track this
                            productivity for all Feature Team members over time.>

  <During project planning, the project team should provide mockups/screen shots of the graphs that will
  be used to track this metric.>
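 For SP-02, a shared resource's size contributions must be summed across Feature Teams before dividing by
 hours. A sketch (hypothetical names; illustrative only, not a mandated implementation):

```python
def member_velocity(ideal_days_by_team, hours_recorded):
    """SP-02: ideal days completed by one member (one entry per Feature
    Team they served on, so shared resources sum across teams), divided
    by the hours that member recorded against the Sprint."""
    return sum(ideal_days_by_team) / hours_recorded
```

 For example, a member who completed 3.0 ideal days on one Feature Team and 2.0 on another, against 100
 recorded hours, has a velocity of 0.05 ideal days per hour.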

 3.5.2.3      Remaining Product Backlog
              Metric ID:    SP-03

           Metric Name:     Remaining Product Backlog

   Commencement Date:       Project Start

     Metric Description:    Determine the remaining size of the Product Backlog and plot on the Product
                            Burndown chart. Enables the project team to determine if the project is on-track,
                            ahead or behind.

   Category/Component:      Generic Size Performance

   Reporting/Frequency:     Each Sprint Iteration

  Verification Reviewer:    Project Owner

                 Scope:     This metric applies at a project level, across all teams.

      Threshold Levels:     N/A

           Data Sources:    Product Backlog for items completed to date on the project. For items assigned
                            to a Sprint that are not completed, additional tasks are expected to be added to
                            the Product Backlog with adjusted size entries to reflect the modification.

               Formula:     Sum of unfinished work, in proposed generic size units, in the Product Backlog.

          Project Team      At the end of each Sprint report the completed Product Backlog Items and
         Responsibility:    unfinished/split tasks. For unfinished/split tasks provide size estimates to be
                            included with new tasks as well as revisions to the original entry if necessary.
                            The Project Owner makes all changes to the Product Backlog.

     DHS Responsibility:     Gather information from Feature Teams during Sprint Review regarding
                             completed Product Backlog Items. Sum the remaining work after each Sprint
                             and plot it on the Product Burndown chart.

            Tool/Utility:   <Project Team should note the Excel spreadsheet that will be used to record
                            data and generate the proposed graph.>

  <During project planning, the project team should provide mockups/screen shots of the graphs that will
  be used to track this metric.>
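 The SP-03 sum over unfinished Product Backlog work can be sketched as follows (the item structure and
 names are hypothetical; illustrative only):

```python
def remaining_backlog_size(backlog_items):
    """SP-03: sum of unfinished work in the Product Backlog, in the
    proposed generic size units. Items are (size, is_done) pairs."""
    return sum(size for size, is_done in backlog_items if not is_done)
```

 Each Sprint's result supplies one point on the Product Burndown chart.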





 3.5.3        Quality Metrics
 DHS uses FP-03, FP-04, FP-05, BL-01 and BL-02 to track quality on the project. These metrics are high-
 level aggregate indicators of overall solution quality. DHS believes that good engineering practices
 provide adequate low-level coverage to foster quality; these metrics are therefore included only as
 indicators of symptomatic quality issues that may require additional investigation and corrective action.
 During project planning, project teams should describe how these metrics will be integrated into the
 project. Additionally, project teams may recommend additional quality metrics that DHS should consider
 adopting on the project.

 3.5.3.1      Iteration Unit Test Density
              Metric ID:    FP-03

           Metric Name:     Iteration Unit Test Density

   Commencement Date:       After Iteration 3. Metric is effective for Iteration 4.

      Metric Description:    Tracks the number of new Unit Tests written per Function Point of solution
                             produced. This provides DHS with an estimate of the unit test coverage of
                             the products delivered in the last Sprint, which is an indicator of the
                             number of defects that might be associated with the newly developed
                             products.

   Category/Component:      Function Point

   Reporting/Frequency:     Each Sprint Iteration

  Verification Reviewer:    Project Owner

                 Scope:     This metric applies at a project level, across all teams.

       Threshold Levels:     +5%. The threshold applies only to decreases in the number of Unit Tests
                             per Function Point; decreasing the number of unit tests for a single
                             Function Point worth of solution is a reduction in test coverage of the
                             solution.

           Data Sources:    Estimates of number of Function Points of solution features delivered at the end
                            of each Sprint. Total count of new unit tests added to the solution during the last
                            iteration.

               Formula:     (Total Count of Unit Tests Added in the Current Sprint /Total Current Sprint
                            FPs)

          Project Team      Supply the Project Owner with the Total Count of Unit Tests in the solution as of
         Responsibility:    the currently completed Sprint. Note that this responsibility presumes that the
                            baseline count of Unit Tests was known at the start of the project and tracked as
                            of each successive Sprint. Estimate the number of Function Points delivered at
                            the end of each Sprint.

    DHS Responsibility:     Estimate the number of Function Points delivered at the completion of each
                            Sprint. Accept the total count of Unit Tests as of the end of the Sprint.

            Tool/Utility:   <Project Team should use an Excel spreadsheet to record data and generate
                            proposed graphs. Note that this metric and metric FP-04 are expected to be
                            tracked using the same chart.>




  <During project planning, the project team should provide mockups/screen shots of the graphs that will
  be used to track this metric.>
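 The FP-03 density calculation, with the threshold check on decreases, can be sketched as follows
 (hypothetical names; illustrative only, not the required Excel tool):

```python
def iteration_unit_test_density(tests_added, sprint_fps, previous_density):
    """FP-03: new unit tests per Function Point delivered in the Sprint.
    A drop of more than 5% from the previous Sprint's density breaches
    the threshold (reduced test coverage)."""
    density = tests_added / sprint_fps
    breached = density < previous_density * 0.95   # >5% decrease
    return density, breached
```

 For example, 120 new tests over 40 FPs gives a density of 3.0; against a previous density of 3.2, that is
 a drop of more than 5%, so the threshold is breached.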



 3.5.3.2      Solution Unit Test Density
              Metric ID:    FP-04

           Metric Name:     Solution Unit Test Density

   Commencement Date:       After Iteration 3. Metric is effective for Iteration 4.

      Metric Description:    Tracks the total number of solution Unit Tests divided by the total
                             Function Points of the evolving solution. This provides DHS with a
                             historical estimate of the overall average unit test coverage of the
                             evolving solution, which is a predictor of solution defects.

   Category/Component:      Function Point

   Reporting/Frequency:     Each Sprint Iteration

  Verification Reviewer:    Project Owner

                 Scope:     This metric applies at a project level, across all teams.

       Threshold Levels:     +5%. The threshold applies only to decreases in the number of Unit Tests
                             per Function Point; decreasing the number of unit tests for a single
                             Function Point worth of solution is a reduction in test coverage of the
                             solution.

            Data Sources:    Estimates of the total number of Function Points delivered to date. Total
                             count of unit tests in the solution as of the last iteration.

                Formula:     (Total Count of Unit Tests in the Solution /Total FPs Delivered to Date)

          Project Team      Supply the Project Owner with the Total Count of Unit Tests in the solution as of
         Responsibility:    the currently completed Sprint. Note that this responsibility presumes that the
                            baseline count of Unit Tests was known at the start of the project and tracked as
                            of each successive Sprint. Support DHS in estimating the number of Function
                            Points delivered at the end of each Sprint.

    DHS Responsibility:     Estimate the number of Function Points delivered at the completion of each
                            Sprint. Accept the total count of Unit Tests as of the end of the Sprint.

            Tool/Utility:   <Project Team identifies the Excel spreadsheet to record data and generate
                            proposed graph. Insert name of spreadsheet and tab here. Note that this metric
                            and metric FP-03 are expected to be tracked using the same chart.>

  <During project planning, the project team should provide mockups/screen shots of the graphs that will
  be used to track this metric.>




 3.5.3.3      UAT Defect Density
              Metric ID:    FP-05

           Metric Name:     UAT Defect Density

   Commencement Date:       After Sprint #3. Metric is effective for Sprint #4.

     Metric Description:    Determines the average number of defects contained per Function Point in the
                            evolving solution. This provides an overall metric of ‘user’ quality of the
                            developed solution and solution components. A UAT defect is defined to be a
                            defect discovered during a Sprint Review meeting or during revalidation
                            activities during a release sprint.

   Category/Component:      Function Point

   Reporting/Frequency:     After the completion of a Sprint Review or following a Release Sprint.

  Verification Reviewer:    Project Owner


                 Scope:     This metric applies at a project level, across all teams.

      Threshold Levels:     TBD (not possible to define until baseline data obtained)

           Data Sources:    Estimates of number of Function Points of solution features delivered at the end
                            of each Sprint. Total count of defects identified in an associated Sprint.

               Formula:     (Total Count of Defects Identified in the Completed Sprint /Total FPs in the
                            release)

          Project Team      Supply the Project Owner with the Total Count of UAT defects detected in the
         Responsibility:    completed Sprint.

     DHS Responsibility:     Determine the total number of FPs included in the scope of the testing by
                             subtracting the total number of FPs as of the previous release from the
                             current total. Gather the defect data and calculate the metric.

            Tool/Utility:   <Project Team identifies Excel spreadsheet to record data and generate
                            proposed graph. >

  <During project planning, the project team should provide mockups/screen shots of the graphs that will
  be used to track this metric.>
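 FP-05 divides UAT defects by the Function Points in scope for the release, i.e. the current FP total
 minus the previous release's total. A sketch (hypothetical names; illustrative only):

```python
def uat_defect_density(defects_found, total_fps_now, total_fps_prev_release):
    """FP-05: defects per Function Point for the tested release scope."""
    release_fps = total_fps_now - total_fps_prev_release
    return defects_found / release_fps
```

 For example, 6 UAT defects found against a release that grew the solution from 440 to 500 FPs gives a
 density of 0.1 defects per Function Point.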




 3.5.3.4      Failed Builds
              Metric ID:    BL-01

           Metric Name:     Failed Builds






   Commencement Date:       Project Start


     Metric Description:    A count of the number of failed builds during the continuous integration cycles
                            during a Sprint. This is an indicator of compliance with Agile engineering
                            practices.

   Category/Component:      Build

   Reporting/Frequency:     Each Sprint Iteration

  Verification Reviewer:    Project Owner


                 Scope:     This metric applies at a project level, across all teams for a given Sprint.

       Threshold Levels:     >0. DHS expects that, once automation is in place, the build cycle will
                             regularly complete with no failures.

           Data Sources:    Continuous Integration logs

               Formula:     Count of failed builds within a Sprint

          Project Team      Facilitate data collection from the continuous integration toolset.
         Responsibility:
    DHS Responsibility:     Gather information from Continuous Integration toolset.

            Tool/Utility:   <Project Team uses an Excel spreadsheet to record data and generate proposed
                            graph. >

  <During project planning, the project team should provide mockups/screen shots of the graphs that will
  be used to track this metric.>
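 Counting failed builds from continuous integration results is straightforward. A sketch assuming the CI
 log can be reduced to per-build status strings (the status format is hypothetical and depends on the
 toolset actually used):

```python
def failed_builds(build_statuses):
    """BL-01: count of failed builds in a Sprint. Any count above zero
    breaches the threshold once build automation is in place."""
    failures = sum(1 for status in build_statuses if status != "success")
    return failures, failures > 0
```

 The same counting approach applies to BL-02 (Failed Environmental Refreshes) below, substituting refresh
 results for build results.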



 3.5.3.5      Failed Environmental Refreshes
              Metric ID:    BL-02

           Metric Name:     Failed Environmental Refreshes

   Commencement Date:       Project Start


     Metric Description:    A count of the number of failed environmental refreshes during the Sprint. This is
                            an indicator of compliance with Agile engineering practices.

   Category/Component:      Build

   Reporting/Frequency:     Each Sprint Iteration






  Verification Reviewer:    Project Owner


                 Scope:     This metric applies at a project level, across all teams for a given Sprint.

       Threshold Levels:     >0. DHS expects that, once automation is in place, the environmental
                             refresh cycle will regularly complete with no failures.

          Data Sources:     Continuous Integration logs

               Formula:     Count of failed refreshes within a Sprint

          Project Team      Facilitate data collection from the continuous integration toolset.
         Responsibility:
    DHS Responsibility:     Gather information from Continuous Integration toolset.


            Tool/Utility:   <Project Team uses an Excel spreadsheet to record data and generate proposed
                            graph>

  <During project planning, the project team should provide mockups/screen shots of the graphs that will
  be used to track this metric.>

 3.6 End User Training
 Before a release is considered implemented, end users must be trained and prepared to use the new
 functionality. DHS, and more specifically the Division of County Offices (DCO), has internal training
 capabilities for efficiently and effectively training staff who will use the system. In order to leverage these
 internal DHS capabilities, project teams must use a Train-the-Trainer approach in order to fully train DHS
 trainers in the new system.
 Note that the Feature Teams are responsible for developing training content within each Sprint; training
 materials should therefore be ready for implementation during the production release cycle. During
 project planning, project teams are required only to include system training in the scope of their work.

 3.7 Stakeholder Involvement and Communications Management
 Any part of the Software Development Project may impact thousands of State staff and other end users
 across the State. In order to effectively involve and communicate with these individuals, as well as other
 stakeholders of the project, it is necessary to have a strategy in place to proactively manage stakeholder
 involvement and communications. DHS is establishing a Stakeholder Involvement and Communications
 plan that will be ready for review during project startup. This plan fully describes the mechanisms by
 which, and the stakeholders with whom, information about the project will be shared.
 The project teams are expected to participate collaboratively with DHS in the establishment,
 management, and execution of the stakeholder involvement and communications plan. During project
 planning, the project team should define the stakeholder involvement that is required by the proposed
 project approach such as SME input to Feature Teams. The project team should describe the level of
 involvement that is expected as well as the types of skills required.








 Section 4: PERSONNEL
 In order to create a high-performing project, the people planning the project must establish a
 collaborative, collegial, and integrated project environment. DHS appreciates that every high-performing
 team is unique and highly dependent on the individual members who comprise it.
 During project planning, the project team should prepare an approach for ensuring that the project team
 functions in an integrated manner, performs at a high level, and maintains a collaborative and collegial
 culture.

 4.1 AGILE PROJECT ORGANIZATION
 During project planning, project teams must define a project organization structure and staffing. All staff
 noted in the structure should be named, along with a resume or description of their experience and
 background.
 In the event that a particular team member is not known by name at the time that project planning occurs,
 project teams should provide a representative resume (i.e., a description of the skills and experience
 needed for the role). As a general guideline, representative resumes should target specific skillsets that
 complement a given Feature Team.
 Project Teams are cautioned against proposing representative staff for an entire Feature Team or the
 majority of a Feature Team. This is because, in general, project staff are considered to be key personnel
 who may be used for other initiatives; if only representative resumes are provided, it is difficult to
 determine whether the proposed people have conflicting assignments on other projects.
 It is important to note that the interactions between the various roles and teams included in the
 organizational figure below are mainly illustrative in nature and provide a general guideline to project
 teams. The communication and information exchange within and between the Feature Teams, Scrum
 Masters and the DHS Master Scrum Master and DHS Area Project Owners is expected to ebb and flow
 based on the particular scope of a given Sprint. In other words, the communication lines representing
 these communications and interactions are not fixed and change as needed on the project. The figure
 does, however, indicate DHS’ anticipated ratios of Scrum Masters to Feature Teams (1 SM to 2
 Feature Teams) and Area Product Owners to Feature Teams (1 APO to 4 or fewer Feature Teams).
 Additionally, it is expected that team members across all groups will regularly interact with other State
 staff.
 DHS also believes that the project team should not have rigid hierarchy and reporting relationships.
 This is the main reason that DHS structured the organization in the figure horizontally; this
 visually indicates that the intended project structure should not be highly hierarchical, but rather structured
 in a way that facilitates rapid communication and information flow.
 During project planning, the project team should understand the ratio of Scrum Masters to Feature
 Teams. Additionally, the staffing ratios of the DHS Master Scrum Master, DHS Area Product Owners, and
 DHS Project Owner in relation to the project team should also be understood.
 As a general practice, DHS tries to balance the overall project team to avoid bottle-necks across role
 boundaries. As a result, project teams should understand how their organization structures will interact
 with the standard DHS roles.
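The staffing ratios described above can be sketched programmatically. The following is an illustrative sketch only (not part of the DHS guidelines); the constants encode the 1 SM : 2 Feature Teams and 1 APO : 4-or-fewer Feature Teams ratios from the figure, and the function name is assumed for illustration:

```python
import math

# Illustrative constants encoding the ratios described in the guidelines:
SM_RATIO = 2    # Feature Teams supported by each Scrum Master
APO_RATIO = 4   # maximum Feature Teams supported by each Area Product Owner

def staffing_estimate(feature_teams: int) -> dict:
    """Return the minimum supporting-role headcount for a given team count."""
    return {
        "scrum_masters": math.ceil(feature_teams / SM_RATIO),
        "area_product_owners": math.ceil(feature_teams / APO_RATIO),
        "project_owner": 1,        # always a single DHS Project Owner
        "master_scrum_master": 1,  # always a single DHS Master Scrum Master
    }

print(staffing_estimate(6))
# → {'scrum_masters': 3, 'area_product_owners': 2, 'project_owner': 1, 'master_scrum_master': 1}
```

A project planning six Feature Teams, for example, would budget three Scrum Masters and two Area Product Owners at minimum, avoiding the role bottle-necks noted above.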








 4.1.1       SUBJECT MATTER EXPERT INVOLVEMENT
 Agile development practices are well known for requiring heightened involvement from subject matter
 experts. DHS understands that the speed with which progress is made is largely dependent on the
 involvement of subject matter experts familiar with DHS’ programs. One way that DHS satisfies this
 critical need for business expertise is through the DHS Project Owner and DHS Area Product
 Owners. Specifically, the DHS Project Owner team serves as the initial subject matter expertise team for
 each of the Feature Teams.
 In the event that the DHS Project Owner team cannot address an information need, they are responsible
 for contacting other DHS SMEs to obtain the information required for the Feature Teams to continue
 making progress.
 During project planning, project teams should describe their approach to obtaining input from subject
 matter experts and the expected amount of time required from DHS SMEs, beyond the DHS Project
 Owner and Area Product Owners, as a percentage of their full-time jobs. As a general guideline, the
 Product Owner and DHS Area Product Owners are fully allocated to the Software Development project.

 4.1.2       PROJECT OWNER
 DHS recognizes the importance of having a single person serve as the Project Owner for the Software
 Development Project. DHS appoints the DHS Project Owner for the project to work with the project team
 to:
        Maintain a business focus for the project - make sure the solution is the right one
        Serve as the sole authority for requirement/project backlog item prioritization based on business
         needs
        Regularly adjust the solution in response to changing business conditions
        Determine release dates by gathering input from stakeholders
        Define and clarify the features that the solution must provide
        Change feature priority in the project backlog as needed before each iteration
        Facilitate external stakeholder validation of components developed during a Sprint
        Accept or reject results following an iteration
        Align expectations with Area Product Owners
        Participate in the Sprint Planning Part 1 Meeting
        Clarify understanding of the intended solution and provide Feature Teams with resolutions to
         escalated issues and questions
 During project planning, project teams should understand the above responsibilities and provide
 recommendations for modification of the above or addition of responsibilities.

 4.1.3       AREA PRODUCT OWNERS
 As a general practice for projects that have aggressive timelines or large scopes, DHS appoints several
 Area Product Owners to assist the single Project Owner in providing timely responses to Feature Teams
 and guide the solution.
 Each DHS Area Product Owner specializes in a given requirement area or set of requirement areas and
 has the same general responsibilities as the DHS Project Owner in relation to assisting the Feature
 Teams. By appointing DHS Area Product Owners to the project, DHS frees the single DHS Project Owner
 to focus on the big picture of the overall solution’s progress in meeting its intended needs. Specifically, the
 DHS Area Product Owners are responsible for:
        Working closely with the DHS Project Owner to understand the overall priorities of the project
        Maintaining a business focus for the product - making sure the solution is the right one
        Facilitating requirement/product backlog item prioritization based on business needs
        Assisting in adjusting the solution in response to changing business conditions
        Providing input regarding release dates
        Defining and clarifying the features that the solution must provide
        Facilitating changes to feature priority in the product backlog as needed before each iteration
        Coordinating external stakeholder validation of components developed during a Sprint
        Providing input for the acceptance or rejection of results following an iteration
        Participating in the Sprint Planning Part 1 Meeting
        Clarifying understanding of the intended solution and providing Feature Teams with resolutions to
         escalated issues and questions
 During project planning, project teams should understand the above responsibilities and provide
 recommendations for modification of the above or addition of responsibilities.







 4.1.4       MASTER SCRUM MASTER
 The DHS Master Scrum Master is responsible for the success of the Feature Teams, Project Owner, Area
 Product Owners, and all other stakeholders in their use of Scrum to deliver successful results. Specifically,
 the DHS Master Scrum Master is responsible for:
        Guiding Feature Teams, DHS Project Owner, DHS Area Product Owners, and other stakeholders
         in the proper use of Scrum
        Interfacing between Feature Teams and project management (DHS Project Owner, DHS Area
         Product Owners, DHS management)
        Maintaining a technical focus for the product - making sure the solution is the right one
        Facilitating requirement/product backlog item prioritization based on technical and business needs
        Regularly adjusting the solution in response to changing technical/business conditions
        Assisting in planning releases by gathering input from stakeholders
        Defining and clarifying the features that the solution must provide
        Identifying changes to feature priority in the product backlog as needed before each iteration
        Facilitating external stakeholder validation of components developed during a Sprint
        Aligning expectations with DHS Area Product Owners
        Participating in the Sprint Planning Part 1 and Part 2 Meetings
        Clarifying understanding of the intended solution and providing Feature Teams with resolutions to
         escalated issues and questions
        Facilitating Daily Scrum Meetings
        Empirically assessing individual and overall Feature Team velocity within the Sprint and providing
         assistance to correct performance-related issues
        Assisting the Scrum Masters in resolving issues and impediments
        Working with the team during Sprint planning
        Assisting in obtaining decisions required by Feature Teams
 During project planning, project teams should understand the above responsibilities and provide
 recommendations for modification of the above or addition of responsibilities.

 4.1.5       SCRUM MASTERS
 The Scrum Masters are responsible for the success of the Feature Teams, Project Owner, Area Product
 Owners, and all other stakeholders in their use of Scrum to deliver successful results. Specifically, Scrum
 Masters are responsible for:
        Working in lock step with the DHS Master Scrum Master
        Interfacing between Feature Teams and project management (DHS Project Owner, DHS Area
         Product Owners, DHS management)
        Guiding Feature Teams, DHS Project Owner, DHS Area Product Owners, and other stakeholders
         in the proper use of Scrum
        Facilitating Daily Scrum Meetings
        Empirically assessing individual and overall Feature Team velocity within the Sprint and providing
         assistance to correct performance-related issues
        Assisting the team in resolving issues and impediments
        Working with the team(s) during Sprint planning
        Assisting in obtaining decisions required by Feature Teams
 Scrum Masters are required to support multiple Feature Teams (in the figure above, DHS depicts a ratio
 of two Feature Teams to one Scrum Master).




 During project planning, project teams should understand how to align Scrum Masters with Feature
 Teams and understand the rationale for the alignment.
 Each Feature Team should be supported by Certified Scrum Masters (CSMs). If the project staff does not
 include the necessary complement of CSM members, the project team must describe a plan for obtaining
 the necessary credentials within three months of project initiation. For each certification, the project
 team must understand the details of the certification, the body that administers it, and the requirements
 that must be met in order to obtain it.

 4.2 FEATURE TEAMS
 Feature Teams are cross-functional, self-organizing teams of seven, plus or minus two, people who
 commit to turning selected Product Backlog items into working products. Each Feature Team is ‘long
 lived’, meaning the team members are associated with a single team for long durations of time (as
 opposed to forming new teams each iteration). The team is given full authority to do whatever is
 necessary, within DHS organizational policy, to achieve the goal at hand. Cross-functional, to DHS,
 means that the team includes ALL the skills necessary to evolve a selected Product Backlog item,
 through all software development competencies, into a production-ready product, including all associated
 training materials, documentation, and data conversions.
 The optimal Feature Team members are not highly specialized in only one particular skill set or
 competency, but rather have strong capabilities in most of the skills necessary with a particular specialty
 or niche.
 During project planning, project teams should identify any Feature Team members who are singly
 skilled and describe how those members’ skill sets will be broadened and expanded to more closely
 reflect DHS guidelines for an optimal Feature Team.
 All Feature Team members who will customize the system should either have a certification associated
 with the system or demonstrate five or more years of experience actively customizing the system. In the
 event that members of the teams who are customizing the system do not currently possess the necessary
 credentials, during project planning, the project team should provide a detailed plan that will be used to
 obtain the necessary certifications within three months of the individual joining the Feature Team.
 During project planning, project teams should also understand why each team is uniquely assembled to
 include the specified staff. Additionally, project teams must indicate on each team member’s resume or
 description which experiences contributed to a check in the corresponding row of the associated Feature
 Team table.
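As a rough illustration of the guidance above, the sketch below checks a proposed team against the seven-plus-or-minus-two size guideline and a cross-functional skill coverage rule. This is not DHS policy code, and the competency list is an assumption for illustration only:

```python
# Assumed competency set for illustration; DHS does not mandate this list.
REQUIRED_SKILLS = {
    "analysis", "development", "testing",
    "documentation", "training", "data_conversion",
}

def validate_feature_team(members: dict) -> list:
    """Return guideline violations; `members` maps names to skill sets."""
    problems = []
    if not 5 <= len(members) <= 9:  # seven, plus or minus two
        problems.append("team size %d outside 7 +/- 2" % len(members))
    covered = set().union(*members.values()) if members else set()
    missing = REQUIRED_SKILLS - covered
    if missing:
        problems.append("uncovered competencies: %s" % sorted(missing))
    return problems
```

A team of five members whose combined resumes cover every competency passes; a one-person "team" is flagged both for size and for missing competencies.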

 4.2.1       FEATURE TEAM RAMP-UP
 During project planning, project teams may suggest a project staffing approach that involves a gradual
 staffing ramp-up over the first few iterations. Doing so reduces risk and the transition time for new
 individuals joining the project.
 During project planning, project teams should explain a ramp-up strategy that specifically highlights how
 members of the initial Feature Teams will be seeded on subsequent newly formed Feature Teams.
 During project planning, project teams should also indicate when the ramp-up will be complete: the point
 at which the project team is at full strength.
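The seeding idea can be sketched as follows. This is a hypothetical illustration, not a DHS-prescribed algorithm; the `seeds_per_team` parameter and the staff names are assumptions:

```python
# Hypothetical sketch: each newly formed Feature Team starts with veterans
# drawn from the initial teams, and remaining seats are filled with new hires.
def plan_ramp_up(veterans, new_hires, teams_to_form, seeds_per_team=2):
    """Seed each new team with veterans, then fill round-robin with hires."""
    vet_iter = iter(veterans)
    teams = [[next(vet_iter) for _ in range(seeds_per_team)]
             for _ in range(teams_to_form)]
    # distribute everyone left (remaining veterans first, then hires) round-robin
    for i, person in enumerate(list(vet_iter) + list(new_hires)):
        teams[i % teams_to_form].append(person)
    return teams

print(plan_ramp_up(["vet1", "vet2", "vet3", "vet4"],
                   ["new1", "new2", "new3", "new4"], teams_to_form=2))
# → [['vet1', 'vet2', 'new1', 'new3'], ['vet3', 'vet4', 'new2', 'new4']]
```

Each new team ends up anchored by experienced members, which is the point of the seeding strategy.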








 Section 5: PROJECT FACILITIES AND
 RESOURCES
 Sometimes, for a Software Project, DHS does not have adequate space in which to co-locate the project
 team directly with subject matter experts and end users. During project planning, project teams should
 establish project facilities that adequately address the needs for the proposed team. During project
 planning, project teams should describe the proposed facilities plan for the project and include detailed
 explanations of how the facilities align or satisfy the following:
        As open as possible: Avoid cubicles and other dividing layouts such as wall-to-wall offices.
         Furniture on wheels enables Feature Teams to structure their areas to suit their needs in an open
         floor plan, and low movable partitions should be available to create pseudo-rooms and work
         areas.
        Ample Whiteboard Space: Optimally whiteboards from floor to ceiling on all non-windowed walls.
         Areas where pictures and other ‘wall art’ are hung should be eliminated or kept to a minimum.
        Small Private Meeting Room(s): Availability of several smaller meeting rooms that accommodate
         11 people. Preferably these rooms would be on interior walls to allocate as much window space
         to the open team room as possible.
        Room to Hang “Big Visible Charts”: The space should enable team members to tape various
         charts and paper-assets to vertical surfaces.
        Additional Feedback Devices: Agile teams often use flashing red lights or lava lamps to signify
         that a build has failed or production issues have been detected. The facilities should allow for
         such devices and team-determined configuration.
        Visible Sprint Backlogs: Allow Feature Teams to establish and manage physical backlogs for
         each Sprint. This also enables other stakeholders to browse project status by touring the various
         backlogs.
        Visible Product Backlogs: Provide a space, or spaces, where the product backlog and burn down
         chart can be hung and displayed.
        Food and Drinks - the Kitchen Space: Include an area where team members can eat and drink.
        Windows in the Open Work Area: Maximize the amount of ambient light that the teams’ working
         area receives each day. In an Agile context, windows are not reserved for management positions.
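The burn down chart referenced in the list above plots remaining work against an ideal linear trend across the Sprint. A minimal sketch, assuming remaining-hours snapshots are recorded at each Daily Scrum (the function and parameter names are illustrative):

```python
# Pair each day's actual remaining work with the ideal (linear) burn,
# producing the data points a team would plot on its burn down chart.
def burn_down(total_hours, daily_remaining, sprint_days):
    ideal_step = total_hours / sprint_days
    return [(remaining, total_hours - ideal_step * day)
            for day, remaining in enumerate(daily_remaining, start=1)]

print(burn_down(100, [90, 85, 70], sprint_days=10))
# → [(90, 90.0), (85, 80.0), (70, 70.0)]
```

Days where the actual value sits above the ideal value signal that the Sprint is behind its planned pace.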



