
Introduction to project management topics


THE PROJECT MANAGER

[Diagram: the Project Manager at the centre, surrounded by the roles to be played:]
• Persuasive Communicator
• Effective Planner
• Good Trainer
• Technically Competent
• Organisationally Powerful
• Sensitive to problems of change
• Good Controller
• Effective Manager

  DEVELOPMENT MODELS


• Waterfall
   – Clear deliverables
• Spiral Model
   – Risk assessment
• Evolutionary
   – Staged Delivery
• RAD
   – Fast development environment
• Package based
   – Using a foundation
Waterfall Model

[Diagram: stages cascading from top left to bottom right]
Feasibility Study → Define Requirements → System Design → Programming and Testing → Implementation

V Model

[Diagram: V-shaped life cycle running from Concept down through design and back up to the delivered System]

Incremental

[Diagram: Initial Requirements and Design feeding a series of increments, each delivered in turn]
• Increment 1: detailed design, programming and testing
• Increment 2: detailed design, programming and testing
• Increment 3: detailed design, programming and testing
Spiral Model

[Diagram: the spiral cycles repeatedly through four quadrants]
• Determine objectives, alternatives, constraints
• Evaluate alternatives, identify and resolve risks (risk analysis, prototype)
• Develop & verify (requirements / design / code)
• Plan for next phase (project plan, quality plan, test plan); review
      PRINCE -Overview


• A project management
  methodology
• Other similar ones: METHOD/1,
  SDM
• Public domain / standard
• Separates management from
  technical work
PRINCE - Organisation

[Diagram: organisation hierarchy]
• Project Board, supported by the Project Assurance Team
• Project Manager
• Stage Manager(s)
• Team(s) and Supplier(s)
PRINCE - Planning / Deliverables

[Diagram: project deliverables divided into three types]
• Managerial
• Technical
• Quality
PRINCE - Planning / Tolerances

[Diagram: the target plotted on cost and time axes, surrounded by a tolerance box; the project may deviate within the agreed +/- cost and +/- time limits]
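
Read as a calculation, the tolerance box is just a containment test: the project stays "in the box" while both cost and time remain within the agreed limits around the target. A minimal sketch in Python (not part of PRINCE itself; all names and figures below are illustrative):

```python
# Minimal sketch of a tolerance-box check; names and figures are illustrative.

def within_tolerance(actual, target, minus, plus):
    """True while an actual value stays inside [target - minus, target + plus]."""
    return target - minus <= actual <= target + plus

def project_in_tolerance(cost, weeks, plan):
    """The project is 'in the box' only if BOTH cost and time are."""
    return (within_tolerance(cost, plan["cost"], plan["cost_minus"], plan["cost_plus"])
            and within_tolerance(weeks, plan["weeks"], plan["weeks_minus"], plan["weeks_plus"]))

plan = {"cost": 100_000, "cost_minus": 5_000, "cost_plus": 10_000,
        "weeks": 26, "weeks_minus": 2, "weeks_plus": 4}

print(project_in_tolerance(cost=104_000, weeks=27, plan=plan))   # True: inside the box
print(project_in_tolerance(cost=104_000, weeks=31, plan=plan))   # False: time breach
```

A breach of either limit is what triggers escalation to the next level of management.
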
PRINCE - Control

[Diagram: stages broken down into phases, each delivering products]

• Management control - health of the project
• Technical control - health of the product
Rapid Application Development

• Martin - 1992
• Initially synonymous with
  'hacking'
• DSDM - mid-1990s
• Iterative approach
• Minimum documentation
• Scientific (objective)
  philosophy

Further reading / sources:
Stapleton, J. (1997) Dynamic Systems Development Method, Addison-Wesley.
Beynon-Davies et al. (1999) Rapid Application Development (RAD): an empirical review, European J. of IS, vol. 8, pp. 211-223.
      DSDM Principles (1)


• Active user involvement
• Team empowered to make
  decisions
• Frequent delivery of products
• Fitness for purpose is
  acceptance criterion
• Collaborative & co-operative
  approach
DSDM Principles (2)


• Iterative & incremental
  development
• All changes are reversible
• Requirements frozen at high
  level
• Testing integrated throughout
  life cycle
     Components of RAD


• Joint Application Design (JAD)
• Clean rooms
• Time boxing
• Rapid development tools
• Highly interactive, low
  complexity projects
• Intensive & phased projects
Time Boxing

[Diagram: a sequence of fixed-length time boxes, each ending in a user review]
Time box 1 → User Review → Time box 2 → User Review → Time box 3 → User Review
DSDM Life Cycle

[Diagram: the DSDM phases, with iteration in the two prototyping cycles]
• Feasibility
• Business Study
• Functional prototype iteration: identify functional prototype → agree schedule → create functional prototype → review prototype
• Design prototype iteration: identify design prototype → agree schedule → create design prototype → review design prototype
• Implementation: implement → train users → user approval & user guidelines → business review
Project planning topics
  REASONS FOR PLANNING


• Target Verification

• Resource Planning

• Commitment

• Basis for “What-if”

• Enforces Pre-thinking

• Step in Delegation

• Basis for Control
    ELEMENTS OF A PLAN
• Project Activities
   – Tasks / Deliverables
• Dependencies


• Resource Estimates
      £ - mandays - Mb
• Activity Schedules


• Resource Budgets
   – By Task / Person
• Risk Analysis
   – Risk - Reductions - Contingency
• Quality Plan
ACHIEVES THE GOAL

• Meets the project's objectives

[Diagram: the plan as a CONTRACT between the Project Manager, the Programmers and the Customer]

• Deliverables based
   – NOT 80% of program XYZ done
   – SHOULD BE program XYZ
     signed off by team leader as
     complete

• All activities catered for
   – omissions cause major cost
     overruns
   – use checklists / methodologies
     REALISTIC TARGETS


• Short Activities
   – eg 1 week
• Real Estimates
   – NOT “time sliced”
• Dependencies
   – all clearly defined
• Calculated Contingency
   – from risk analysis
   – per task not the whole
   – risk of wrong estimates
   – OSINTOTS
   – rework
PURPOSE OF CONTROLLING THE PLAN

[Diagram: control cycle - PLAN → OBTAIN AGREEMENT → PROCEED → REASSESS → back to PLAN]


• To monitor progress
• To provide motivation
• Input to formal estimating
  technique
• Input to staff performance reviews
• Basis of lessons learnt
• Mainly to trigger replanning
PROCESS OF DECOMPOSITION

[Diagram: Project WBS decomposed into Program Design, Program Coding and Program Testing, each broken down further for programs A, B and C]

 Guidelines:
    – Method of evaluating task completeness
    – Tasks clearly defined
    – Tasks should be assigned to very few people
    – Time duration short enough to monitor
    – Task cohesion (same type of work)
    – Minimise task coupling (interdependency)
DEPENDENCIES

Types (illustrated in the sketch below):
• Finish - Start: the successor starts once the predecessor finishes
• Start - Start: the successor may start once the predecessor has started
• Finish - Finish: the successor cannot finish until the predecessor finishes
• Partial Finish - Start: the successor starts once a defined part of the predecessor is complete

Predecessor: a task that another task depends on.
Successor: a task that depends on another task.
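
To make the four types concrete, here is a small illustrative sketch (task durations, day numbers and lags are hypothetical) of the earliest start each type allows a successor:

```python
# Illustrative sketch: earliest successor start under each dependency type.
# Day numbers and lags are hypothetical.

def earliest_successor_start(dep_type, pred_start, pred_finish, succ_duration, lag=0):
    if dep_type == "FS":    # Finish-Start: start only after the predecessor finishes
        return pred_finish + lag
    if dep_type == "SS":    # Start-Start: start once the predecessor has started
        return pred_start + lag
    if dep_type == "FF":    # Finish-Finish: successor may not finish before predecessor
        return pred_finish + lag - succ_duration
    if dep_type == "PFS":   # Partial Finish-Start: start once the first part is done
        return pred_start + lag        # lag = duration of the completed part
    raise ValueError(f"unknown dependency type: {dep_type}")

# Predecessor runs day 0 to day 10; the successor takes 4 days.
for dep, lag in [("FS", 0), ("SS", 2), ("FF", 0), ("PFS", 6)]:
    print(dep, earliest_successor_start(dep, 0, 10, 4, lag))   # 10, 2, 6, 6
```
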
            SCHEDULING


Gantt (bar) Chart
    – What to be done
    – Who will do it
    – When it will be done
Activity Network
    – Inter-dependencies
    – Estimated Effort
Resource Matrix
    – Skills/resources required
    – Under/over utilisation
    – Used to smooth team size
Loss Factors
    – Non-productive time
    – Absences
    – Increases with team size
    – 4 days a week productive
      NETWORK ANALYSIS


Steps:
  –   Establish sequence
  –   Timetable for each job
  –   Analysing spare time
  –   Identifying most critical jobs


Advantages:
  – Attention can be given to most critical
    jobs
  – Spare time can be utilised
  – Resources can be balanced
  – Project Completion dates advanced
  – Updating/Revision of plans easier


Diagrams:
  – Activity on the arrow
  – Activity on the node
NETWORK CONVENTIONS

• Start and Finish Events
   – a single start activity and a single completion activity
• Left-to-right dependencies
• No dangling events
   – every activity must connect through to the finish
• No looping
• Avoid redundancy

Example Node:
   EST = Earliest Start Time
   LST = Latest Start Time
   DUR = estimated DURation
   SLK = SLacK time (= LST - EST)
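
The node values can be computed mechanically: a forward pass through the network gives each activity's EST, a backward pass gives its LST, and SLK = LST - EST (activities with zero slack form the critical path). A minimal activity-on-node sketch, using a made-up four-task network:

```python
# Minimal activity-on-node network analysis (the network itself is made up).
# Forward pass -> EST, backward pass -> LST, slack = LST - EST.

durations = {"A": 3, "B": 5, "C": 2, "D": 4}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# Forward pass: EST = max(EST + DUR) over all predecessors.
est = {}
for task in ["A", "B", "C", "D"]:                    # topological order
    est[task] = max((est[p] + durations[p] for p in preds[task]), default=0)

project_end = max(est[t] + durations[t] for t in durations)

# Backward pass: latest finish = min(LST of successors); LST = latest finish - DUR.
succs = {t: [s for s in preds if t in preds[s]] for t in durations}
lst = {}
for task in ["D", "C", "B", "A"]:                    # reverse topological order
    latest_finish = min((lst[s] for s in succs[task]), default=project_end)
    lst[task] = latest_finish - durations[task]

for t in durations:
    slack = lst[t] - est[t]
    print(f"{t}: EST={est[t]} LST={lst[t]} SLK={slack}"
          + (" <- critical" if slack == 0 else ""))
# Critical path here: A -> B -> D (zero slack); C has 3 days of slack.
```
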
     WHY ESTIMATING IS
        IMPORTANT

• To determine feasibility

• Calculate when and at what cost
  a system can be delivered

• Deciding on staffing policies and
  how to carry out a project

• If our estimates are inaccurate
  then our ability to control and
  deliver the system on time and
  to budget is affected

• Affects all subsequent stages if
  we get it wrong
    HOW ESTIMATES FIT INTO
      OVERALL PROCESS

  It is one element of an iterative
     process which includes:
       – information gathering
       – breaking down the job
       – estimating of effort and skills
       – scheduling
       – monitoring and control
       – reporting



[Graph: estimating accuracy rises towards 100% as the information available increases]
WHY THERE ARE PROBLEMS


• Inappropriate expectations of accuracy
• Lack of experience and methods
• Different methods used by different
  people
• Poor record in the past
• Failure to take into account changes in
  development environment
• Lack of separation of estimating from
  scheduling
• Skill level assumed is not clear
• Elapsed time between original
  estimate and the event
• No estimate for dealing with problems
                 METHODS


[Diagram: known factors are fed into a MODEL to produce an estimate]



• Function Point Analysis
• Cocomo - lines of code based
• Pert - weighted average
        » (A+4B+C)/6 - see the sketch below
• Ratios
        » High Level (phases)
        » Low Level (tasks)
• Standard Estimates
        » Fixed
        » Multiplied by number of items
        » Based on complexity / size
• Multiple Estimate
        » Compare and contrast
• Evolution of a local model
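
As an example of one of these models, the PERT weighted average takes an optimistic estimate A, a most likely estimate B and a pessimistic estimate C (the figures below are made up):

```python
# PERT weighted average: (A + 4B + C) / 6 (figures are made up).

def pert_estimate(a, b, c):
    """a = optimistic, b = most likely, c = pessimistic."""
    return (a + 4 * b + c) / 6

print(pert_estimate(a=4, b=6, c=11))   # 6.5 man-days
```
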
 FUNCTION POINT ANALYSIS

Steps in performing FPA:
• Defining the system boundary
• Defining the parameters
    – external inputs
    – external outputs
    – internal logical data groups
    – external logical data groups
    – inquiries
• Identifying unique occurrences of
  parameters
• Complexity assessment
    – low, average, high
• General System Characteristics
    VAF = (TDI x 0.01) + 0.65

Final FP score = unadjusted score x VAF
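
A minimal sketch of the final scoring step above. The counts, weights and TDI below are purely illustrative; in practice the weights come from the standard complexity tables once each parameter is rated low, average or high:

```python
# Illustrative FPA scoring; counts, weights and TDI are made-up examples.

weights = {"external inputs": 4, "external outputs": 5, "inquiries": 4,
           "internal logical data groups": 10, "external logical data groups": 7}

counts = {"external inputs": 12, "external outputs": 8, "inquiries": 5,
          "internal logical data groups": 6, "external logical data groups": 2}

unadjusted = sum(weights[p] * counts[p] for p in weights)   # 182

tdi = 32                          # Total Degree of Influence from the GSCs
vaf = (tdi * 0.01) + 0.65         # Value Adjustment Factor, as above
final_fp = unadjusted * vaf

print(f"unadjusted={unadjusted}, VAF={vaf:.2f}, final FP={final_fp:.1f}")
```
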
COCOMO

• Basic level
   Effort = a × size^b   [size = KLOC]
   – Organic (small scale)
       Effort = 2.4 × size^1.05
   – Embedded (large, formal)
       Effort = 3.6 × size^1.2
   – Semi-detached (medium)
       Effort = 3.0 × size^1.12

• Intermediate
   – takes other factors into account

• Extended
   – breaks the project down into
     phases / tasks
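
The basic-level formulas translate directly into code; the 30 KLOC size below is just an example figure:

```python
# Basic COCOMO: Effort (person-months) = a * size^b, size in KLOC.
# The 30 KLOC size is an example figure.

MODES = {
    "organic":       (2.4, 1.05),   # small scale
    "semi-detached": (3.0, 1.12),   # medium
    "embedded":      (3.6, 1.20),   # large, formal
}

def cocomo_effort(kloc, mode):
    a, b = MODES[mode]
    return a * kloc ** b

for mode in MODES:
    print(f"{mode:13s} {cocomo_effort(30, mode):6.1f} person-months")
```
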
PROJECT CONTROL

Planning alone does not ensure
  the success of a project.
The project must be controlled:

   – monitoring progress
   – evaluating performance
   – making adjustments
   – reacting to changes

[Diagram: control loop - the PLAN informs the ACTIVITY, and feedback from the activity flows back into the plan]
CONTROL PROCESS

[Diagram: the control process as a flow]
• Input information: timesheets, change requests
• Questions: are we on schedule?
• Outputs: revised plan, progress report, new resource requirements
    SLIPPAGE / VARIANCE


• Common Causes
• Calculating Slippage / Variance
   – Earned Value (see the sketch below)
   – Use of network chart
• Actions
   – Extra Manpower ?
   – Active management
      » Recovery plans
   – Work longer / harder
   – Training
   – Examine all possible solutions
   – Hustle
   – Review critical path
   – Smooth the way ahead
   – Remove / Reduce tasks
      » phase system in?
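
A minimal earned-value sketch of the calculation flagged above, in one common formulation (all figures invented): comparing the budgeted cost of work performed with the plan and with actual spend separates schedule slippage from cost variance.

```python
# Minimal earned value calculation (all figures invented).

bcws = 50_000   # budgeted cost of work scheduled to date (planned value)
bcwp = 42_000   # budgeted cost of work performed (earned value)
acwp = 48_000   # actual cost of work performed

schedule_variance = bcwp - bcws   # -8,000: behind schedule
cost_variance     = bcwp - acwp   # -6,000: over budget

spi = bcwp / bcws                 # schedule performance index (0.84)
cpi = bcwp / acwp                 # cost performance index (0.88)

print(f"SV={schedule_variance}, CV={cost_variance}, SPI={spi:.2f}, CPI={cpi:.2f}")
```
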
Quality management topics
   DEFINITION OF QUALITY


Crosby defines quality as:
Conformance to requirements
   – not “good”ness or elegance

• Quality is implemented through
  prevention of defects
   – not through post-manufacturing
     inspection

• Performance standard must be
  zero-defects
   – not "that's near enough"

• The measurement of quality is
  the price of non-conformance
       COST OF QUALITY
• Prevention
   – Planning
   – Standards
   – Training
   – Improving the process
• Doing
   – Cost of Production
       » eg production of program design
• Appraisal
   – Reviews & Testing
• Failure
   – Reworked & Scrapped Work
   – Data Corrections
   – Retests
   – User Discussions

Cost of Q. = price of non-conformance
              CUSTOMERS

• Identify your customers
       Internal / External
       Senior / Junior
• Involve throughout the project




Mere Perception
“The individual perceives service in his or her
own way” Arch McGill (IBM VP)
Tom Peters adds “....in his or her own unique,
idiosyncratic, human, emotional, end-of-the-
day, irrational, erratic way.”
           QUALITY ETHOS


• Quality comes from people

• It happens because YOU want it
  to happen

• Quality staff = Quality work?

• Avoid the "we're no worse than
  anyone else" syndrome

• Ownership = motivation for
  quality

• Difficult to improve 1 thing by
  100%; it is easier to improve 100
  things by 1%
Book : A Passion for Excellence, Peters & Austin
Accumulating Errors
Ideas, wishes and requirements

[Diagram: how errors accumulate through the life cycle (source: Wallmuller, Fig 1.3). Each stage produces correct work plus its own errors, and inherits induced errors from the stages before it:]
• Requirements definition → correct requirements + faulty requirements
• System specification → correct specification + specification errors + errors induced from requirements
• Design → correct design + design errors + errors induced from requirements and specification
• Coding → correct programs + program errors + errors induced from requirements, specification and design
• Testing & integration → correct operation + corrected errors + known uncorrected errors + unknown errors

The result: software with known and unknown faults.
Percentages of fault costs compared with development costs

[Chart: life-cycle costs, adapted from Moller in Software Quality & Reliability, ed. Ince. Corrections account for 39% of total cost, management 9%, and quality assurance / configuration management 6%; the remainder is development work across requirements, system specification, detailed design, coding and integration & system test, with correction costs concentrated in the later stages.]
QUALITY PLANNING

[Diagram: the Quality Plan is drawn from the Quality Management System - methodology, documentation practices, standards, CASE tools, configuration management, review procedures and the development environment - and links to the Project Plan and the Test Plan]
    CONTENTS OF THE PLAN
•   Purpose
•   Reference Documents
•   Management
•   Documentation
•   Standards, Practices, Conventions, and
    metrics
•   Reviews and Audit
•   Test
•   Problem Reporting & Correction
•   Tools, techniques, and methodologies
•   Code Control
•   Media Control
•   Supplier Control
•   Record Keeping
•   Training
•   Risk Management
USE OF SOFTWARE METRICS


Types:
   – Effort Used
   – Productivity Rate
   – Lines of Code
   – Defect Data
   – Complexity
   – Mean Time To Failure
   – Change Requests

Uses:
  – Compare with Plan/Expectations
  – Identify Problem Areas
  – Identify Areas for Using Tools
  – Improved Information for Future
    Projects
  – Implementation Decision
               RISKS
• System will never be delivered
• System will be delivered late
• System will exceed budget
• Project will divert user resources
  to an unacceptable extent
• System will lack functionality
• System will contain errors
• System will present difficulties
  to users in using it
• System will be difficult / costly to
  support / enhance




       Any one of these risks
     could lead to project failure
     COUNTER MEASURES
Objective:
   – to reduce / eliminate the risk that
     something will cause the project to 'fail'
Examples:
• Extra tasks
   – eg develop prototype to check user
     interface is okay
• Contingency
   – eg extra time available
• Ensure controls in place
   – eg contracts signed off
• Avoid assumptions
   – eg excellent programmer from last
     project will join you on new project
• Have options available
   – eg if performance problems then have
     extra machine time available
• Avoid promises
   – eg use 'windows' for implementation
     dates
     QUESTION / METRIC
Where are the errors found?
      » Error location
What type of errors are they?
      » Error classification
What is our productivity Rate?
      » Effort
      » Deliverables Signed Off
      » Number of lines of code
How many more test runs do we
 need?
      » Number of outstanding errors
      » Error location
How successful are our
 inspections?
      » Effort
      » Number of defects found
How can we improve the
 development process or what
 training is required?
      » Origin of Error
Analysis of Defect Metrics

Defect Density

[Scatter plot: number of defects plotted against size of program, showing a rising trend with outliers]
ORIGIN OF DEFECTS

[Pie chart: defects by origin - functional requirements, system design, detailed design, coding, implementation]

[Bar chart: number of defects for each sub-system, A to E]
RELIABILITY

[Graphs: number of defects outstanding against time; cumulative defects and failure intensity against time]
TARGETING MAIN PROBLEMS

[Pareto chart: relative frequency of each error category - redundant records, missing data, discovered errors, customer complaints - with the cumulative frequency curve rising to 1]
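
A chart like the one above can be produced from raw defect counts: sort the categories by frequency and accumulate, then focus effort on the few categories carrying most of the defects. A small sketch with invented counts:

```python
# Pareto analysis of error categories (counts are invented).
from collections import Counter

defects = Counter({"Redundant records": 120, "Missing data": 45,
                   "Discovered errors": 25, "Customer complaints": 10})

total = sum(defects.values())
cumulative = 0.0
for category, count in defects.most_common():
    cumulative += count / total
    print(f"{category:20s} {count / total:5.0%}   cumulative {cumulative:5.0%}")
```
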
   PRINCIPLES FOR USING
         METRICS

• Pragmatism and Compromise

• Measuring People - Don't!

• Modelling = Simplification

• Ask not for whom the bell tolls -
  ask why?

• The sum of the whole is greater
  than the constituent parts

• Culture Shock!
 REVIEW OF DELIVERABLES


• Team Leader Review
    – Eg. Programming Team Leader
    – Check for conformity, consistency and
      completeness
• Peer Review
    – Eg. Another designer within team
    – As with team leader review
• Walkthrough
    – Eg. Designer => Programmer
    – To confirm understanding and allow
      'customer' to ask questions
• Inspections
    – Several people
    (Eg. Moderator, Author, Customer,
      Specialist, Peer, User)
    – To trap and correct defects
    – To review against requirements
INSPECTION PROCESS

[Diagram: the INSPECTION takes input documents, deliverables, checklists and exit criteria; it produces defect and process data, and correct material]
INSPECTION CYCLE

Produce deliverable
→ Moderator checks suitability
→ Choose inspection team
→ Overview meeting
→ Preparation
→ Inspection meeting
→ Rework (loop back: reinspection if >25% changed)
→ Moderator sign-off
Walkthrough Vs Inspection

                                 Walkthrough   Inspection
Precise entry / exit criteria    None          Compulsory
Checklists used                  Sometimes     Essential
Rework stage                     Implied       Explicit
Defects discussed / corrected
in meeting                       Usually       Never
Leader of meeting                Author        Independent
Defects formally recorded
and analysed                     No            Yes
Defects found per review         10            25
Defects found per man hour       7             1
  CLEANROOM APPROACH


• More effective than debugging

• 90% defects found before test

• Total defect count drops

• Forces problems out early

• Based on statistical quality
  control & mathematical
  verification

• Error reduction based on areas
  of most frequent usage
TEST STAGES

[Diagram: V model of test stages (source: Wallmuller), with coding at the base. Each specification level on the left is verified by the corresponding test stage on the right:]
• Requirements specification ↔ Acceptance testing (black box)
• Systems specification ↔ System testing (black box)
• (Sub)system design ↔ Integration testing (white box and black box)
• Module design ↔ Module testing (white box and black box)
TEST DOCUMENTATION

[Diagram: the Test Plan is drawn from the Quality Plan, the Project Plan and the System Documentation; it drives the Test Design Specification and the Test Cases & Procedures; test execution produces the Test Log and Test Incident Reports]
                TEST PLAN
• Overall strategy for testing system
   – shows how tests will “prove” the system,
     including such things as :
       » stress/performance testing
       » recovery
       » regression testing
• Objective for each kind of test
   – avoid overlaps
• Criteria for completion
   – how to decide when to stop
• Test schedule
   – tasks & dependencies
• Responsibilities
   – team organisation
• Resources
   – eg. automation tools
• Test procedures & documentation to be
  produced
   – test details
       » eg. conditions, data, expected results,
         instructions, tools
   – evidence
       » eg test logs, error log
    AUTOMATING TESTING

• Capture & Playback
        » data input
• Comparator
        » file/DB
        » program code
        » output vs expectation
• Input Generation
        » for stress testing
        » use of random numbers
• Verification (static analysis)
        » eg compilers, spell checkers, style
          checker
• Simulator
        » simulates real world
• Test Harness / Test Driver
        » enables tests to be run unattended
• Test Coverage Measurement
        » checks if all conditions/paths tested
• Debugging
        » observation of variable states
Software management topics
People Management (1)

[Diagram: a skilled & knowledgeable workforce leads to successful application of technology]

"In software development, talented
programmers are known to be ten
times more productive than the less
talented members of a team"
Cusumano (1997) How Microsoft makes large
teams work like small teams, Sloan
Management Review
LEADERSHIP STYLES

A continuum from Autocratic (boss-centred) to Democratic (subordinate-centred):

   – Manager makes decision and
     announces it
   – Manager sells decision
   – Manager presents ideas and invites
     questions
   – Manager presents tentative decision
     subject to change
   – Manager presents problem, gets
     suggestions, makes decision
   – Manager defines limits, asks group to
     make decision
   – Manager permits subordinates to
     function within limits defined by
     superior
        TEAM BUILDING


• Ensure commitment to goal
• Team loyalty to one another
• Obtain team agreement
• Encourage contributions and
  views
• Listen to views - take action!
• Organise team to give clear
  responsibilities and structure
• Optimise team size
      » 5 - 10 people per team
• Breakdown of team should
  minimise need for co-ordination
• Realistic but positive leadership
• Work closely with customers
• Manage conflict
          COMMUNICATION


Within Team
   – Team Meetings
   – Newsletters
   – Noticeboard / Wallchart
   – Top / Down & Bottom / Up
   – Peer to Peer
   – Formal System Information
      » eg change requests, new CASE tool


External to Team

[Diagram: the team at the centre, communicating with management, users, other teams and suppliers]
PRODUCTIVITY

[Graphs: months against people, for three kinds of task]
• Perfectly partitionable task (with no communication): time falls as people are added
• Unpartitionable task: time stays the same however many people are added
• Task with complex interrelationships: time eventually rises as people are added

No of communication links = n(n-1)/2
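
A quick check of the formula shows why co-ordination overhead grows so fast with team size:

```python
# Communication links grow quadratically with team size: n(n-1)/2.
for n in (2, 5, 10, 20):
    print(f"{n:2d} people -> {n * (n - 1) // 2:3d} links")
```
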
GOAL SETTING

SHORT TERM
Use of work plans, eg Work Plan for Ian:

Task                          Deliverable      Estimate   Target
Prepare for interview         Questionnaire    8 hours    23/11
Interview Marketing Manager   Completed
                              Questionnaire    4 hours    24/11
Write up interview            Minutes          6 hours    25/11
Update data model             Data Model       2 hours    25/11
Update DFDs                   DFD              8 hours    27/11

LONG TERM
Use of job reviews / appraisals
   – frequency?
   – major objectives
   – development of staff
   – review objectives
   – no shocks
 HOW DO WE MANAGE THE
      CUSTOMER?


• Know their business

• Set clear responsibilities

• Communicate frequently

• Involve them throughout

• Set low expectations

• Use contracts

• Plan for changeover
       COMMUNICATION


• Plan Involvement
• Users involved throughout
  project
• Inform of changes to plan or
  design
• Demonstrate system early
• Report progress
• Ensure users report problems
• Communicate information on
  new releases
• Use of contracts / service level
  agreements
• Measure satisfaction
• Get to know the customers
MANAGING EXPECTATIONS




The user’s views of what to expect develop
  throughout the project


20% of programming will give 80%
  of functionality - Code this first!

Some Guidelines:
   –   Tell them the worst
   –   Keep the good news until it is certain
   –   Prepare users for problems
   –   Be clear on essential vs. desirable
   –   Do not assume functions should be
       computerised
        MANAGING THE
       IMPLEMENTATION

• Preparation for 'live' running
   – user procedures
   – operation procedures
   – training
   – control procedures
   – conversion planning
   – regression plan
• Going 'Live'
   – Conversion of data, etc..
   – Technical environment
   – Software release
      » version control
   – Hardware installed
   – Customer Acceptance
      » sign off contract
 WHY DO IMPLEMENTATIONS
          FAIL?
• System does not meet requirements
• Lack of planning
• Lack of management action after
  implementation
• No success factors to judge by
• Lack of training
• Poor user morale / no desire
• Over expectations
• Not making full use of the system
• Benefits not achieved
• Resistance to change
• Fear
      » absenteeism
      » avoidance of system
      » redundancies


Result :
  – Effect on Business
  – Internal
  COST / BENEFIT ANALYSIS

• COSTS:
   – Installation
   – Running
• BENEFITS
   – Tangible
       » measurable now
   – Indeterminate
       » measurable afterwards
   – Intangible
       » not measurable


Simple Payback Equation

Payback period = Installation Costs / (Benefits p.a. - Running Costs p.a.)

eg. 100,000 / (30,000 - 5,000) = 4 years
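
The same calculation in code, using the example figures above:

```python
# Simple payback period, using the example figures above.

def payback_years(installation, benefits_pa, running_pa):
    return installation / (benefits_pa - running_pa)

print(payback_years(100_000, 30_000, 5_000))   # 4.0 years
```
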
PRESENTATION OF COST/BENEFIT ANALYSIS

• Simple Presentation
   – eg graphical

[Graph: cumulative cash flow (£) against time, falling during implementation and climbing through the break-even point as benefits accrue]

   – eg charts

              Year 1    Year 2    Year 3
Costs         10,000     5,000     3,000
Benefits           0    12,000    15,000
Cash flow    -10,000    +7,000   +12,000

• Put detail in back up documents
• Agree management requirements first
 ENSURING THE REWARDS
      ARE REAPED
• Maximising Benefits
  – Winning over the users
  – Training / Procedures
   – Plan to gain benefits (& rigorously
     manage accordingly)
       » eliminate functions
       » regroup staff
  – Contingency plans
  – Selling to multiple customers or other
    companies

• Minimising Costs
  –   Good planning and control
  –   Being effective in development
  –   Controlling suppliers
  –   Managing risks
  –   Reducing maintenance
       » minimising cost of making change
       » minimise number of changes
       Why the productivity
            paradox?

• Amount of money spent
• Benefits not matching investment
• IS over hyped?
• Competitive advantage - disaster
  dichotomy
• Difficulty in estimating costs and
  measuring benefits
• Timing of returns
     Types of justification


• Cost / benefit
   – Payback period
   – RoI
   – Cashflow
• Business Value
   – RoM
   – Information Economics
• Strategic Value
   – CSF
Changing nature of IS justification

[Graph: justification focus against time, moving from saving money to making money - from Efficiency, through Effectiveness and Differentiation, to Repositioning]
Source: Lincoln (1990)
Moving from a rational to an
   interpretive view

[Diagram: three zones running from an Objective / Rational viewpoint to a Subjective / Political one]
• Efficiency Zone
• Effectiveness Zone
• Understanding Zone
Transforming the IT function

[Diagram: seven transformations arranged around a shared PURPOSE]
• From Systems Analysts to Business Consultants
• From Monopoly Supplier to Mixed Sourcing
• From Craftsmen to Project Managers
• From Business to Industry Standards
• From Large Function to Lean Teams
• From Decentralized Bias to Centralized Topsight
• From System Provider to Infrastructure Planner

Transformation of the IT function at British Petroleum: Cross, Earl and Sampler, 1997
The Software Process

[Diagram: a requirement is transformed by activities into a product]
        The Problem ...


• “For every six new large-scale
  software systems that are put
  into operation, two others are
  cancelled. The average software
  development project overshoots
  its schedule by half; larger
  projects generally do worse”
    – W. Wayt Gibbs
         Software Process
           Improvement

• SEI believes “the quality of software is
  largely determined by the quality of the
  software development and
  maintenance processes used to build
  it”
• ESI “focuses on the organisational and
  management challenges of producing
  software, as it is increasingly
  recognised that purely technological
  solutions yield benefits that are difficult
  to sustain”
The Capability Maturity Model

[Diagram: the structure of the CMM. Maturity levels indicate process capability and contain key process areas; key process areas achieve goals and are organised by common features; common features address implementation or institutionalisation and contain key practices, which describe activities or infrastructure.]
The Capability Maturity Model

CMM Level        Focus
1 - Initial      Competent people & heroics
2 - Repeatable   Project management processes
3 - Defined      Engineering processes and organisational support
4 - Managed      Product and process quality
5 - Optimised    Continuous process improvement
Process Improvement Cycle

[Diagram: a continuous cycle linking five steps - build executive support; build improvement infrastructure; assess the organisation's software process; develop improvement action plan; implement action plan]
Process Improvement Infrastructure

[Diagram: the management line (Division Executive, Program Manager, Software Manager, Software Staff) works through a Steering Committee, a Software Engineering Process Group and Technical Working Groups]
         Software Process
    Improvement and Capability
          dEtermination
SPICE is an attempt to draw
  together the CMM, Bootstrap
  and other approaches to date.
The SPICE project has three goals:
   – to develop a working draft for a
     standard for software process
     assessment
   – to conduct industry trials of the
     emerging standard
   – to promote the technology
     transfer of software process
     assessment into the software
     industry world-wide
     Process Improvement
         - the Benefits

• According to a survey by SEI…
   – Productivity gains/year: 35%
   – Time to market (reduction/year):
     15-23%
   – Post-release defects (reduction/year):
     39%
   – Business value ratio 5:1
     (benefits:costs)
• Savings at Raytheon
   – Rework cost reduction $16m
   – Productivity gains 130% over 5
     years
The impact of maturing

[Chart: percentage of companies rated good / excellent on customer satisfaction, productivity and staff morale, rising with maturity level from Initial through Repeatable to Defined]
     Process Improvement
         - the Pitfalls

• Cost & time exceed expectations
• Most difficult aspect appears to
  be project planning
• Keeping the people on board
   – "I'm a programmer, I don't want
     all this!"
• Using the metrics to improve
• Organisational 'politics'
• Turf guarding
           Current Status


• SEI
   – 1991: 81% at level 1; none at 4 or 5
   – 1999: approximately 100 at level 4
     or 5

• ESI : benchmark of European
  companies
   – 66% have common coding standards
   – Only 56% track actual project costs
   – 75% had post implementation review

   – UK one of leading European
     countries in software management
     practice
    Some Key Aspects of
       Improving (1)

• Senior management need to
  actively monitor SPI progress
• Set clear SPI goals
• Staff time dedicated to
  improvement
• Clear, compensated assignment
  of responsibility
• SEPG staffed by respected
  people
• Technical staff involved in
  improvement
     Some Key Aspects of
        Improving (2)

• Team work
• Learn from own experience
• Starting from where you are not
  where you would like to be
• Introduce new ideas in order of
  'hit rate'
• Start to measure the processes
  as soon as possible
• Focus on early removal of
  defects
    A Typical Approach to SPI


•   Single champion
•   Motivated by current problems
•   Diffusion is ad-hoc
•   No corporate view
•   Will often fail
     – cost > expected
     – champion leaves
     – benefits take time
IDEAL model

[Diagram: the five-phase improvement cycle - Initiating, Diagnosing, Establishing, Acting, Leveraging]
 Product Management Issues


• Rapid changes in the market
  place
• First product often wins
  significant market share
• IS function needs to be seen to
  be profitable
• Applications are developed as
  one-off activities
• Reuse limited to 'cut and paste'
  approaches or low level
  components
    Software Maintenance


• Types of maintenance
    – corrective
    – adaptive
    – perfective
• Significant cost of maintenance
• Poor perception
• Lacks good staff & continuity
  from development team
• Patchwork changes not strategic
  re-engineering
  Moving from Backroom to
        Boardroom

• Build a responsive IT
  infrastructure to enable change
• Use 'business level' components
  as well as low level components
   – helps executives relate to reuse
• Components
   – clear interfaces and behaviour;
     reusable
• Manage the risk of poor
  components
     A Packaged Software
         Organisation

• Market leader in demographic
  data analysis & software
• Multiple products targeted at
  different markets
• Previous experience with OO
  and technical level components
• Key changes in the business
  leading to multiple products and
  high levels of reuse
Product Portfolio

[Diagram: the product portfolio arranged by level of sophistication - from Relationship Marketing, Market Analysis and Local Area Marketing, through New Prospect Finder, Market Profiling Packages and Vertical Market Products, up to bespoke solutions such as Customer Relationship Management, Data Mining and Sales Management]
Component Architecture

[Diagram: layered architecture. Packaged and bespoke applications are built on business components (multimedia, mapping, profiling, reporting), which in turn rest on technical components (internet browser, databases, statistical models, KBS models).]
Organisational Structure

[Diagram: ownership against source of software. The Executive Strategy Group owns the application portfolio, delivered by application teams; the Technical Steering Group owns the business and technical components, delivered by component teams and external sources.]
     A Flexible Approach


• Build for change
• Components reduce repetitious
  work
• Reduces lead time to market
• Avoids mass legacy systems
• Involve executives in
  maintenance decisions for
  added value of current products
              Conclusion


• We need to manage people,
  process and product
• We need to reconsider some
  traditional views of project
  management
• The future: “chaos not
  panaceas” (Baskerville and Smithson, 1995)

								