Proposals.xlsx - MSE Studio

					Related Artifacts   Evaluation Status

AUP                 Satisfactory



Meeting Management
Agenda template
Minutes template     Satisfactory


Decision making


Processes            Abandoned







Feature Document Acceptance test suite.   Finished






Coding Standard           Satisfactory

Issue Tracking Proposal   Satisfactory


Testing environment       Satisfactory
Demo scripts (demo)                 Satisfactory


Resource schedule spreadsheet

WBS list:
http://msesrv4e-   Finished
http://msesrv4e-         Finished


Tasks list:
Iteration list:



Task list:
Risk list
Imhotep SRE memo


Process wiki:
sitory/Business%20Definition/SOW.doc        Finished

Tasks list:
Iteration list:
Action Items                                  Satisfactory

Conflict resolution                           Satisfactory


Coding Standard -> section 'Committing Code'. Satisfactory


Product backlog
Resource schedule

SPID process        Finished

Tasks list
Iteration list




Owners             Active Date

Process engineer       9/23/2008

Project manager
Process engineer       9/23/2008

Process engineer
Tools specialist       9/23/2008

Process engineer      10/23/2008
Process engineer   9/23/2008

Process engineer   9/23/2008

Technical lead   6/26/2009


Requirements Engineer   1/12/2009


Developers;Clients.                                    9/1/2008

Requirements Manager, Project Manager, Test Manager   6/26/2009

Tech lead
Developers                                            6/26/2009
Tech lead
Developers        5/27/2009

Testing manager   6/26/2009

Tech lead         6/26/2009

Testing manager   6/26/2009
Tech lead          6/1/2009

Tech lead
Integrator        6/26/2009

Project manager   10/1/2008




Project manager   9/12/2008
Tech lead
Project manager   10/1/2008

Project Manager   10/1/2008

Project manager   10/1/2008



Process engineer   3/26/2009

Process engineer   9/23/2008

Process engineer   5/27/2009

Tech lead
Integrator         6/13/2009

Process engineer
Project manager    5/27/2009
Project manager
Process engineer   5/27/2009

Tech lead
Project Manager    6/20/2009

Project manager    6/19/2009

Tech lead          6/26/2009
Project manager
Process engineer                   7/6/2009

Project manager                   1/12/2009

Requirements manager, Tech lead   7/15/2009

Do our SOW deliverables and overall plan generally match the AUP phase milestones?

How often do people miss common time?

Were there any issues with storing and sharing documents using SharePoint (loss of information,
difficulty finding artifacts, etc.)?

How many proposals need improvement but are not being improved?
Is there any proposal that is not up to date and that people are not aware of?
Is everything that needs to be reflected on tracked and reflected on each iteration?
Are reflections done in a timely manner?
How often do such meetings take longer than planned? How often are agendas not followed? How often do people submit minutes or agendas late? How often does the Scribe fail to enter action items into the Action Item list?
Considering that we have only two client meetings and two status meetings, more than one occurrence of anything listed above will be considered a bad sign, meaning that people are not performing the process.

Backlog and burndown emphasize the tension between the items to be done and the amount of time available to produce them. By focusing discussions on the burndown, we hope to keep our time constraints highly visible to the client.

How many conflicts arise during an iteration? (More than two should be considered a bad trend.)
How long does it take on average to decide on something?

We use the process framework's processes as much as possible because they were designed to work together. However, frameworks are generally not complete, so we had to define some processes of our own.

ACDM provides more concrete steps and guidelines for architectural design, compared to other
process models.
It fits well in the AUP process because they are both iterative in nature.
Significant support is available through access to the author.
Summer semester:

Are you seeing architectural diagrams and documents with the changed architecture?
Are you seeing architectural metrics (% changed and % added/deleted)?

This step is taken from ACDM. Pangea reflects that it worked well for them. Other teams used as


Using GMF/GEF/EMF techniques is a constraint on the team. These techniques are mainly driven by the MDD approach.

This is an industry standard. Previous teams reported that it worked well for them.

Spring-Summer semesters:
Are we doing prototypes for new features where we don't have enough feedback from the clients? Does it help us save time and get good feedback from the clients? Does it meet clients'

The client is familiar with use case modeling and prototypes as a basis for requirements elicitation.
We know that this worked in the case of Pangea, and that the client was comfortable with the approach.
We wanted to continue feature identification for existing requests and bugs along with catering to new requirements. The team felt that this process was a reasonable one to start elicitation with. The team wanted a feature-centered approach for iterative elicitation and experimentation, with the feature as the fundamental unit of client and architectural priority.

The team realizes that not all requirements are of equal client and architectural importance.
Management should involve finding the critical features and iterating over elaboration and experimentation to develop, together with the test suite, an architecture incorporating the feature functionality. The team realizes that experimentation with POCs is critical for identifying the correctness of requirements. The feature test plans developed beforehand will help us identify interfaces for the architecture and can act as architecture drivers. Finally, development can run in an agile way, and the team can strategize the implementation.
Are recordings being USED now?
Are they being recorded and uploaded to SharePoint in a consistent manner?

Do people know how many requirements have been completed? Do you see the new requirements discussed in the client meetings reflected in the requirements list and eventually in the product backlog?

The basic rationale is the need to know the quality, stability, and completeness of requirements. To achieve this, the team needs to define some metrics for requirements. These will be factors when estimating the elaboration and implementation phases. At any given point in time the team will be able to know whether a given set of requirements is complete enough to act on. Requirement metrics are certainly one calibration aspect of the overall estimation process.

Many teams used it and found it helpful.
It provides concepts not available elsewhere, such as daily standup meetings.

Some key things we can make use of, considering our state so far: with the risk evaluation we can certainly look for a confident risk factor for our estimation. Probability/stability factors for each feature need to be thought about from requirements inception onward. We can use the Imhotep KT feature team's data for elaboration estimation factors. Can we look at feature points, use case points, adapted wideband Delphi, or Pangea's historical data for coming up with factors?

Summer semester:

Are people feeling that they are learning paired-programming and new technologies?
Shared understanding should be improving.

Use static analysis tools to look for nonconformities or just warnings? (discouraged use of APIs and so on)

How many nonconformities with the coding standard per 100 LOC? (code reviews) (if less than x, then OK)

How many nonconformities are reported to the SharePoint 'Issue list' per week? (if less than x, then OK)

Summer semester:

Are people following the guidelines? (data consistency)
Is Mantis convenient for tracking the status of bugs?

Are all TODOs addressed before shipping?
Is the code readable (maintainable)? (The definition of maintainable and readable code can be found in the

Are there tests for each functionality?
Are demos being held? Do clients feel satisfied with the functionality being delivered (developed)?

Summer semester:
Do we have a release ready for the demo? Did regression testing find any bugs before they went out to the field? Do we deal with bugs as prescribed by the process?

How accurate is the data? Does it help planning or just consume more time? Is it being updated or


Does it help to measure our progress?

Does it support our planning activities?
The group estimation approach allows us to get commitment from the members involved and to compensate for our lack of experience as a group by leveraging our individual experiences.


Do the hours tracked match the available numbers from the resource schedule?

How often are people updating the tasks?


Mentors' feedback.
How often is planning for a new iteration not accomplished in a timely manner (before the actual start of the iteration)? How much time do people spend trying to understand their tasks? (We need to track this somehow.) How often do we need to replan in between iterations?

Does it help the team to report status efficiently?

How many critical risks that had a serious impact on the project were missed? In a postmortem analysis, does the group feel that the risks that were identified were properly mitigated (considering the cost of mitigation vs. the cost of living with the problem)?
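The mitigation-versus-acceptance trade-off in the last question can be sketched numerically. The figures below are hypothetical, assuming the usual risk-exposure definition (probability times impact); this is an illustration, not the team's actual risk data.

```python
def risk_exposure(probability: float, impact_hours: float) -> float:
    """Expected cost, in effort hours, of living with a risk."""
    return probability * impact_hours

def should_mitigate(probability: float, impact_hours: float,
                    mitigation_cost_hours: float) -> bool:
    """Mitigate only when the expected exposure exceeds the mitigation cost."""
    return risk_exposure(probability, impact_hours) > mitigation_cost_hours

# Hypothetical risk: 40% chance of losing 50 hours; mitigation costs 12 hours.
print(should_mitigate(0.4, 50, 12))  # exposure is 20h > 12h, so mitigate
```

A postmortem can then ask whether the risks the team paid to mitigate were the ones whose exposure actually exceeded the mitigation cost.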

This is a maintenance project, but we don't want to spend all our time doing maintenance and not be credited for it, as would likely happen if we ran a project driven by the scope of the new evolution-style functionality.
We have both fixed time and fixed resources, so why not let the client handle the scope problem while the team focuses on giving them the best efficiency possible?

We needed to make it a scope driven project to satisfy the Studio stakeholders but we also needed
to limit the risk of falling into maintenance.
The burndown is still a good idea, especially in an AUP project like this.

Having an extra member sharing a task increases the likelihood that the task will get done.
How many action items does the team generate on average every week? How many action items slipped? Are they being reviewed during formal status meetings?

How many conflicts weren't resolved or became even worse? (More than two means it's something repeatable, a bad trend.)

Do team members have any misunderstandings regarding each other's roles' responsibilities? Are team members aware of the processes their role is responsible for?

Was there any development downtime (people not able to work because someone broke the code)? More than twice per iteration is not a good sign.

Does it help the team improve communication, avoid conflicts between team members, and make developers' tasks more transparent to each other?

How are we accomplishing the milestones in the SOW?

How many unplanned tasks are usually generated per iteration? How many tasks usually slip or are not accomplished per iteration?

Do the hours tracked match the available numbers in the resource schedule? Are people updating tasks on time, and if not, how often are they late?

Summer semester:
Are design artifacts being produced? Are architecturally critical (important) components being brainstormed and designed with the Tech lead before starting implementation?
Does everyone report status regularly? How long is the status report? (Longer than 15 minutes?)
- Are these meetings useful for the team? Do they help avoid development delays? - Is reserved time being used? Are meetings being scheduled?

How often does the team schedule such meetings?
Are they useful? Do they help identify bugs or expose requirements incompleteness to the whole team?

Use the Agile Unified Process.

Use an existing process so that we don't need to invent our own. We chose AUP because we have a team member who is experienced with UP. We chose AUP over other processes because it has long-term planning goals and short-term reflective cycles. Also, although AUP describes general (high-level) activities and goals, it does not limit us to a particular implementation approach, so as long as the goals and milestones are achieved we are free to adjust and experiment with it.

Summer semester:
Since the schedule has changed, common time has also become longer (12pm-7pm).
Fall-Spring semesters:

The team lead, in collaboration with the rest of the team, establishes specific time periods during the day when the whole team must work together in the same place.
Common Time is fixed in the team's calendar so that everyone on the team is aware of it.
Priority is working on the project, but if needed, we can work on something else.

Fall-Summer semesters:

We will use SharePoint as a document, task, and issue repository. It is a flexible and customizable solution that can support all the types of needs we have, from tasks to documents and the like.
Additionally, one of our resources is familiar with the tool and is skilled in configuration and

Summer semester:
Continue doing surveys.
The Process manager, based on the last reflections and current trends in the team, decides which proposals should make it into the survey.
The Process manager sets up with the team a follow-up reflections review meeting of about 30-60 minutes at the beginning of every new iteration.
Spring semester:
Same idea as in the Fall, but we use surveys instead. We have been doing this for two iterations and it seems to work.
Fall semester:
The team will hold reflection meetings between every iteration, before planning the next iteration. The reflection meetings will allow the team to share the good and bad points of all processes and activities. These reflections can then be used in subsequent planning. We decided to replace the above with surveys to reduce the time spent on reflection meetings.
Spring-Summer semesters:

Meetings will have a defined agenda.
Before the meeting, its initiator will assign three people to three different meeting roles:
Facilitator - person responsible for facilitating discussion during the meeting.
Scribe - person responsible for writing down the minutes (key notes, action items, and the like) and uploading the minutes to the team's shared space (collaboration environment).
Time keeper - person responsible for tracking the time spent on each agenda item, reminding everyone of the time passed, and collecting the time information and passing it to the Scribe so that it can be captured in the minutes document.
The agenda for a meeting should be sent to all participants 24 hours ahead.
The Scribe should upload the minutes to SharePoint and add new action items to the SharePoint action item list no later than 24 hours after the meeting. Minutes can be changed or commented on within 24 hours after they have been uploaded; after that they are considered accepted by all team members.

To be able to fit things into the allotted time, the team will use the 'backlog' and 'burndown' concepts to allocate tasks to the time available.
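As a sketch, the backlog/burndown bookkeeping amounts to summing the hours still to burn. The task records below are hypothetical, not the team's actual backlog.

```python
# Minimal backlog/burndown sketch with hypothetical task data.
backlog = [
    {"task": "Fix legacy bug", "estimate_h": 6,  "done_h": 6},
    {"task": "New feature A",  "estimate_h": 12, "done_h": 4},
    {"task": "Demo script",    "estimate_h": 3,  "done_h": 0},
]

def remaining_hours(backlog):
    """Hours still to burn down: estimate minus completed, never negative."""
    return sum(max(t["estimate_h"] - t["done_h"], 0) for t in backlog)

print(remaining_hours(backlog))  # 11 hours left against the allotted time
```

Plotting this remaining total at the end of each iteration gives the burndown chart that keeps the time constraint visible to the client.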

Summer semester:
The team follows a 'Consultative' decision-making process, with the final word going to the person responsible for the specific area of the project, which literally means 'I Decide with Input from You'.
Spring semester:
The team started to shift from 'Consensus' to a more 'Democratic' way of making decisions, with some elements of 'Consultative' decision making, but didn't realize it at the time.
Fall semester:
Since decisions were critical and everyone had to agree on the things we were going to do in our project, the team followed a 'Consensus' decision-making process ('We Decide').
Reference for definitions:

We use the processes and roles defined in our adopted process framework and extend them with customized ones.

Research how ACDM can be included in AUP.
Summer semester:

The new proposed scenario

The Architect schedules an architecture conformance meeting for the developers, if not already done
The developers create a notional design of the component (prep: 30 min)
The architect and the developer meet for the architectural conformance meeting (mostly 1 hour) and discuss every component and connector
Verify in the meeting that the design conforms to the architecture
The developer writes down the key decisions made
The developer verifies the Element Responsibility catalog
The architect updates the documentation
The architect updates the architectural conformance metrics
The developer reviews the changes


This is taken from ACDM. It was reported as successful by previous teams.

Investigate whether we need to use another method for the parts of our system which are not dependent on

Consider: fully capturing the detailed design for static aspects and selected important parts of the
behavioural ones.

The language used for development is Java with Eclipse technologies. This makes OO techniques the best to use. The industry standard to use with OO is OOAD, with UML as the representation.
What are the designs to be peer reviewed?
1. Based on the amount estimated for the artifact (e.g. if a design takes X hours then it must be peer reviewed)
2. Based on the importance of the quality attributes it supports
3. Peer review all module interfaces
4. ...

Spring-Summer semesters:
The team draws UI prototypes on paper or using tools like .NET Paint. The team presents prototypes to the clients during the client meetings or office hours. The team updates prototypes based on the feedback from clients only if necessary (for example, for follow-up reviews with clients or the team), not just for the sake of keeping prototypes up to date.

The approach we propose is a feature-centered requirements elicitation process which consists of the following:
All new functional requirements are gathered with use case modeling
Develop detailed basic and alternate flows
Develop the domain glossary, which forms the basis for the semantics of any new or old requirement
Develop the feature document
Develop the paper prototype
Define a prioritized paper stack

Summer semester:
Legacy part? (features and bugs)
New bug fixes?
New features?
What about use case flows?

Spring semester:
The team identifies a list of features and prioritizes them as must-have, nice-to-have, or enhancement.
The list is prioritized based on each feature's importance to the clients and its technology risk. For each feature, there will be elaboration activities as identified in AUP.
Experiments are kept with their associated data in SharePoint.
With the consent of the client, voice recording, on either the iPod recorder or a laptop, is enabled for every client meeting.
Audio recordings are maintained on the SharePoint portal.
All audio recordings should be uploaded to SharePoint within 24 hours after the requirements elicitation meeting.
Summer semester:
The Requirements manager maintains the traceability between the features and the associated implementation tasks
The Project manager ensures at all times that the list of tasks to be done contains the ones required for completion
The Project manager updates the backlog from the dates and the %complete status from the SharePoint tasks
The Test manager publishes the data for the defects found at the product review meetings and the rate at which the defects are getting fixed
The Requirements manager and the Test manager are responsible for producing the product completeness metrics and charts

Fall-Spring semesters:
Usage of SharePoint-based document library views to make traceability tables. The traceability document remains a live document in SharePoint.

Identify metrics like stability, correctness, completeness, priority, change impact, and risk associated per unit of requirement (feature level or lower), and manage the data lists/tables in SharePoint.

Scrum sprints + standup meetings + etc

Here cost = time.
Time available for elaboration = Total available time for all resources - (Operational costs + Architecture building cost)
Number of features (we can elaborate) = Time available for elaboration / (Risk factor for the feature + Probability/Stability factor for that feature * Architectural impact + Experimentation cost)
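The formula above can be checked with made-up numbers. Everything below is hypothetical: the hour figures, the factor values, and the names are illustration only, not calibrated data.

```python
def time_for_elaboration(total_h, operational_h, architecture_h):
    """Time available for elaboration = total - (operational + architecture)."""
    return total_h - (operational_h + architecture_h)

def features_we_can_elaborate(avail_h, risk, stability, arch_impact, experiment_h):
    """Feature count per the proposal's formula: available time over per-feature cost."""
    per_feature_cost = risk + stability * arch_impact + experiment_h
    return avail_h // per_feature_cost

# Hypothetical semester: 400h total, 80h operational, 120h architecture building.
avail = time_for_elaboration(400, 80, 120)          # 200 hours remain
print(features_we_can_elaborate(avail, 10, 2, 5, 20))  # 200 // 40 = 5 features
```

The division gives a rough upper bound on how many features fit into the elaboration budget; the risk and stability factors are exactly the calibration inputs discussed in the rationale above.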

Summer semester:
The Tech lead suggests possible areas for paired programming. People volunteer to try the parts they want to learn (technology learning). Two people work on one or two machines, actively collaborating and supporting each other to achieve one goal (complete the task, develop the required functionality). One shared task per pair is assigned by the technical and team leads.

This method is well known in industry to have worked for many teams, and previous MSE Studio teams reported it was useful for them.
The team defined a coding standard based on the Java industry standard.
The team decided to use the standard Eclipse 'Clean-up' feature before submitting code to SVN.
The team also reviews each other's code to see whether peers follow our coding standard.

Summer semester:

Use Mantis as a bug-tracking tool.
To report bugs, developers must follow the guidelines document defined and prepared by the Testing manager (see Artifacts section).
Mantis is provided by the client and was used by the previous team, so some bugs are already there that at some point we may want to fix. It makes it easy to have all bugs in one place rather than finding another tool.

When development tasks are 'completed' for the iteration, developers should inform the technical lead that their code (and which code) is ready for review. Depending on the amount of code produced, the tech lead assigns code review tasks to every developer in the team; usually this happens every iteration. If someone finds a bug or a nonconformity to the coding standard defined and documented by the tech lead, he/she should report it, again following the standard defined in the 'Coding standard' document in SharePoint (format: TODO [commenter's name] comment's text). TODOs should be cleaned up in that iteration if it is the last time this part of the code will be refactored before closing and delivering that functionality. Otherwise, if necessary (not encouraged), we can leave TODOs for the next iteration. Responsible for fixing a TODO is whoever is currently responsible for that code at that point in time; it doesn't have to be a specific owner. (We don't have specific owners, because we develop our features incrementally and may reach the final complete state only by the time of the Final Release.)
This method is well known in industry to have worked for many teams, and previous MSE Studio teams reported it was useful for them.

The team decided to use the Eclipse JUnit test framework. We test the functionality required by each development task. Each unit test must have test cases exercising the different equivalence classes of input. Testing must be done for each development task before the task is closed; that is, there is no separate task for testing, and testing time is included in the development task. The Testing manager will ensure that test cases are being developed. Tests should pass before a development task is closed.
The team will not have time to repeat tests manually. A well-known method in industry is unit testing, and the most famous implementation for Java is JUnit.
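The equivalence-class idea above can be illustrated with a small sketch (shown here in Python rather than JUnit; the validator function is hypothetical, invented only to demonstrate the technique):

```python
# Hypothetical function under test: accepts iteration lengths of 1-4 weeks.
def valid_iteration_length(weeks: int) -> bool:
    return 1 <= weeks <= 4

# One test case per equivalence class of input, plus the boundaries.
assert valid_iteration_length(2) is True    # class: value in range
assert valid_iteration_length(0) is False   # class: value below range
assert valid_iteration_length(7) is False   # class: value above range
assert valid_iteration_length(1) is True    # boundary: lower edge
assert valid_iteration_length(4) is True    # boundary: upper edge
```

In JUnit the same five cases would become assertions in a test method; the point is that each distinct class of input is exercised at least once rather than testing many values from the same class.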
-Demo presentations each week to ensure that the requirements being developed are of high priority for the clients and are really what they want developed first.

TBD. End of Iterations.

This will give the team feedback each time it is done, continuously minimizing the gap between what we are building and what the users really need.

Summer semester:
The tech lead stops check-ins and notifies people by email that the build and testing are starting. A day before the IR, the tech lead checks out all code and builds it, runs all unit tests (JUnit) written for each piece of functionality by each developer, and runs the scripts again to avoid unexpected problems during the demo. If something fails while performing integration tests or scripts, the problems should be identified, fixed, and committed back to SVN by the person assigned to fix them (assigned by the tech lead). If bugs are related to tasks (functionality) belonging to the current iteration, the time for fixing them should be counted towards those tasks; otherwise individual unplanned tasks should be created. When all tests pass, the build is posted on the team's update site (ftp://msesrv4e- ) and the tech lead notifies everyone in the team.

Fall-Summer semesters:
A spreadsheet with available effort hours per week, per iteration. Fixed items are debited. The spreadsheet is available from the task pane through a link, but actually resides in the artifacts section.

A spreadsheet gives us the flexibility of modifying it as we progress through the project.
It also provides a single point where this information is available.
This is used as an input in the planning process.

WBS list in SharePoint (helps to break the whole project into small tasks or deliverables). SharePoint task list items are directly connected to the WBS list items (in this way the 'strategic' aspects (WBS items) are separated from the 'tactical' aspects (task items)). Estimated and actual effort for each task item is captured in the task list, which is reflected in the WBS list.
Do Delphi estimation in a meeting using a spreadsheet for each round, then input the final results into the WBS.

A spreadsheet with the method and calculations is developed, comparing a feature to others.
After filling in the comparison, the spreadsheet calculates the estimated effort required.

It works by comparing work items already completed to others of similar characteristics not yet done. As such we will use it for feature development, since those work items are considered to be
One of the mentors (Eduardo Miranda) recommended this approach and provided the basic tools to
support it.
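A minimal sketch of the comparison idea (the feature names, sizes, and hours below are hypothetical, and this is a simplification of the mentor's actual tooling):

```python
# Estimate a new feature by analogy to completed features of known effort.
completed = {
    "small UI fix":   {"size": 1, "actual_h": 4},
    "medium feature": {"size": 3, "actual_h": 14},
    "large feature":  {"size": 5, "actual_h": 22},
}

def estimate_by_comparison(new_size: int) -> float:
    """Scale each completed item's hours by relative size, then average."""
    scaled = [f["actual_h"] / f["size"] * new_size for f in completed.values()]
    return sum(scaled) / len(scaled)

# A new feature judged twice the size of the smallest completed one.
print(round(estimate_by_comparison(2), 1))  # about 8.7 hours
```

The spreadsheet version does the same thing: completed work calibrates an hours-per-size rate, and the new item's relative size converts that rate into an estimate.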

Same as before, but tasks became bigger (~6-12 hours).
Fall - Spring:
Enhance the default Task list in SharePoint to track actual efforts and dates. Items in the plan are traced down to items in the WBS and to one of the iterations in the Iteration list (or the backlog). Iterations are planned according to the resources available and the milestones. Tasks are about 1-3 hours and are tracked per iteration.

Summer semester:
As usual, once per week; however, the format has changed a little (no risks report, only tasks and implementation progress). Also, individual progress is highlighted with numbers and the names of the people responsible for the tasks.
Spring semester:
Same as in the Fall semester, but twice as long, and the format of the status meeting became more formal (risks status, tasks for the next week, action items, and EVM).
Fall semester:
Set up a 30-minute meeting each week to share progress, status, and problems.
Summer semester:
Meetings are no longer in practice. Planning has changed to a more centralized style. Now the Team lead and Technical lead plan tasks for everyone based on input from the team and the clients' requirements and priorities.
The Team lead also enters all tasks into SharePoint at the beginning of each iteration.
The Team and Technical leads make sure that people understand their tasks by having short discussions individually with everyone who has a task.
Common tasks are managed by the Team lead, and time per task is tracked as one task per 5 team members.
This type of planning develops trust inside the team, builds shared vision, improves collaboration, improves accuracy of data, reduces the time spent managing tasks, and makes people more responsible for their tasks.
Spring semester:
Meetings were replaced with virtual (online) meetings for two iterations, but it didn't work.
Fall semester:
At the beginning of each iteration, we plan as a group the tasks for each member in a 30-minute meeting (right after reflection).

Fall-Summer semesters:

Use Earned Value Management.

Since we have a WBS, are estimating, and have only effort as a resource, we can track value against it.
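As a sketch with hypothetical WBS numbers (using the standard EVM relations: earned value = % complete times budgeted hours, and SPI = EV / PV):

```python
# Earned Value sketch over a few WBS tasks; all figures hypothetical.
tasks = [
    # (budgeted_hours, planned_fraction_done_by_now, actual_fraction_done)
    (10, 1.0, 1.0),
    (20, 0.5, 0.25),
    (8,  0.0, 0.0),
]

planned_value = sum(b * p for b, p, _ in tasks)   # PV = 20 hours
earned_value  = sum(b * a for b, _, a in tasks)   # EV = 15 hours
spi = earned_value / planned_value                # schedule performance index
print(spi)  # 0.75: earning value slower than planned
```

An SPI below 1.0 flags the schedule slip early, which is exactly the signal the status meetings report on.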

Add an additional checkbox to SharePoint's 'Tasks' list. If a task is unplanned, tick the 'Unplanned' checkbox; otherwise leave it empty.

If we mark them, we can later analyze the data.
The rest of the attributes are the same, since they are also tasks, just usually more "reported" than planned and executed.
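With the checkbox in place, the later analysis can be as simple as the sketch below (the task records are hypothetical, standing in for an export of the SharePoint list):

```python
# Fraction of unplanned tasks per iteration, from hypothetical task records.
tasks = [
    {"iteration": 1, "unplanned": False},
    {"iteration": 1, "unplanned": True},
    {"iteration": 1, "unplanned": False},
    {"iteration": 2, "unplanned": True},
]

def unplanned_ratio(tasks, iteration):
    """Share of an iteration's tasks that were marked 'Unplanned'."""
    subset = [t for t in tasks if t["iteration"] == iteration]
    return sum(t["unplanned"] for t in subset) / len(subset)

print(unplanned_ratio(tasks, 1))  # 1 of 3 tasks in iteration 1 was unplanned
```

A rising ratio across iterations would answer the evaluation question above about how many unplanned tasks are usually generated per iteration.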
Summer semester:
The Team lead refines, analyzes, and re-prioritizes the list of risks and assigns people responsible for mitigating them.
Every iteration the team has a risk review meeting to discuss the analysis done on existing risks, identify other risks which haven't been considered by the Team lead, and share risk mitigation progress.
Spring semester:
The initial list of risks should be identified during the Small Risk Evaluation (SRE) workshop conducted with an expert in risk management. All risks are tracked in SharePoint (see related artifacts below), and the person responsible for this is the Project Manager (PM), or the PM can initiate a Risk Manager role and delegate this to him. Risks are identified at any time and brought in during regular meetings such as status or reflection meetings; however, a special Mini SRE session might also be scheduled during the semester. Risk assessment and prioritization should be done as a team. The Project manager or Risk manager should come up with a way to conduct these activities effectively (it can be done in a survey or a meeting). The Project manager is free to consolidate the team's assessment and prioritization results and update the information about risks in SharePoint; however, in order to do that, the team's results should not be split significantly. To avoid this, the PM or RM can call an additional round of

Define our processes around the "burndown" approach, with the client prioritizing the tasks they want to see done first.
Items are estimated and attacked on a priority basis.
An "architectural priority" measures how important a specific task is to the underlying architecture of the project; shifting such tasks will alter the estimations (since other tasks depend on the infrastructure they develop).

We will drive a scope project around the evolution-style functionality, but we will also cap the amount of hours we can spend doing legacy fixes, features, or additional tasks.
The scope will be determined and renegotiated along with the effort limits for those other work items.
We will still try to use the "burndown" approach during the summer, prioritizing features according to the client's preferences and architectural preferences.

Each task should have a reviewer who shares responsibility with the original owner. His job will be to remind the task's owner and help him finish the task.
Unplanned tasks with a due date not within the current iteration should go in as planned tasks for the next iteration. Action items bigger than 10-15 minutes should be linked with unplanned tasks. Action items should not be closed until the task is finished (to track completion of the task).
It is better to keep a list of issues that need to be acted upon in the middle of an iteration in a separate list rather than mixing them with iteration tasks; this will better help track action items.

Fall-Summer semesters:

The team tries to resolve problems together; however, if a problem becomes too complicated, people go to the mentors and try to resolve it with them.

Summer semester:
Role definitions are established at the beginning of each semester and captured in the form of a document or wiki page in SharePoint. Every proposal will have a column or text field showing the list of roles responsible for it (ensuring that people are following the process and keeping the Process Engineer up to date about changes).

Run the Eclipse Format: Source -> Format (not for generated code).
Run the Eclipse Clean Up: Source -> Clean Up (not for generated code).
Review all changes for correctness.
Synchronize to the latest source.
Resolve any/all conflicts from the sync.
Clean the project (removing compiled sources).
Build.
Run unit tests; they must pass.
Commit with a clear check-in comment.

Every day, at the beginning of common time, the team has a lunch meeting of about 30-45 minutes, where people eat tasty food and talk about anything they want, including of course their development tasks.
Summer semester:

The priority-driven approach is closer to the time-and-materials approach, because we are limited by the
program duration and because a scope-driven approach does not match the research nature of our
project; however, our main focus is planning our work based on the clients' priorities.

Summer semester: Product backlog
A list of planned tasks for the whole project, which is not fixed but can be expanded based on the
latest clients' feedback and the team's resource availability.
Every task in the product backlog has several major attributes: priority (must-have, nice-to-have,
enhancement), owner, size (large, medium, small, tiny) and estimated size uncertainty (extreme,
high, medium, low). Priorities for the tasks are traced back to the feature requirements list,
which can be changed between iterations if necessary. The owner is assigned by the Technical lead
based on the team's feedback (people might want to learn different technologies). The size and
complexity of the tasks are initially defined by the Technical lead based on the team's feedback,
but the assigned team member can request that those values be revised based on his in-depth
understanding of the issues involved. The product backlog also contains estimated and scheduled
values, which are initially generated using task size, the owner's velocity and the SPID scheduling
mechanism, but can be adjusted based on the implementer's feedback.
Scheduling using SPID
Every task should have estimated worst, likely, most-likely (50%) and best-case values for the
effort required to accomplish it. The amount of work is planned using worst-case values. Tasks in
the packages are scheduled using most-likely case values, so that the delta time generated by the
difference between

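The buffer arithmetic implied above (plan with worst-case effort, schedule with most-likely effort) can be sketched as follows. The internal SPID mechanics are not specified in this document, so only the "delta time" difference is illustrated; the task names and hour figures are hypothetical.

```python
# Hedged sketch of the planning/scheduling split described above:
# planning uses worst-case effort, scheduling uses most-likely (50%)
# effort, and the difference acts as per-task buffer ("delta time").

def plan_task(worst, most_likely):
    """Return (planned_hours, scheduled_hours, buffer_hours)."""
    return worst, most_likely, worst - most_likely

# Hypothetical backlog entries: (name, worst case, most-likely case), in hours.
backlog = [("export wizard", 12, 8), ("search fix", 6, 4)]
for name, worst, likely in backlog:
    planned, scheduled, buffer = plan_task(worst, likely)
    print(name, planned, scheduled, buffer)
```

The point of the split is that the iteration is committed against the conservative (worst-case) totals, while the schedule itself packs tasks at their most-likely durations, leaving the accumulated buffer to absorb overruns.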
Same as before, but tasks became bigger (~6-12 hours).
Tasks now also have a 'discipline' attribute (Implementation, Studio, Modeling, etc.), which helps
track effort distribution across all major disciplines.
As before, the Task list in SharePoint tracks actual efforts and dates.
The WBS is still used, but more for tracking purposes.
Items in the plan are traced down to items in the WBS and to one of the iterations in the Iteration
list (or the backlog).
Iterations are planned according to the resources available and the milestones. Urgent tasks, which crop up
during the iterations, are tracked as unplanned tasks and are distinguished from other tasks by a
ticked 'unplanned' checkbox.
SharePoint lists help track effort more efficiently and transparently for everyone in the team. The WBS
encourages people to think about the work to be done and to track effort through logically
separated work packages, such that 'strategic' aspects (WBS items) are separated from
'tactical' ones (tasks). Tracking unplanned tasks enables later analysis.

Summer semester:
No specific method is required. The Tech lead and developers meet and brainstorm the design. The design
should be documented and communicated to the architect in any form appropriate for architecture and
high-level design conformance checking.
Every day, after or before the 'team lunch' (depending on team availability), the team has a formal stand-up
meeting in the cubicle work area. The meeting is limited to 15 minutes, during which everyone
should answer the following three questions: what have they accomplished since the last
meeting; what problems did they face or are they facing now; and what are they working on or planning to
do until the next status meeting.
These meetings require standing up, which discourages long and over-detailed reports.
The team and the clients agree on times during working hours when the team can
schedule additional meetings with the clients.

The team schedules product review meetings, which everyone in the team must attend.
The Requirements manager assigns the roles for the meeting: 1) Requirements reader, 2) Reporter, 3)
Review leader and 4) Observer.
The Requirements reader reads aloud, one by one, all the features in the features list, so that
everyone can hear them and the Review leader can reproduce them on his laptop and project them for
everyone. The bug and task reporters are responsible for writing down all newly identified tasks or
bugs, and the observers are supposed to carefully analyze them and report any bugs or requirements
nonconformities they might notice.
Category     Content Type

Operations   Item

Operations   Item

Operations   Item

Operations   Item
Operations   Item

Operations   Item

Operations   Item

Operations   Item

Design       Item
Design   Item

Design   Item

Design   Item

Design   Item

Design   Item
Design         Item

Design         Item

Requirements   Item

Requirements   Item
Requirements     Item

Requirements     Item

Requirements     Item

Implementation   Item

Requirements     Item

Implementation   Item
Implementation   Item

Implementation   Item

Implementation   Item

Implementation   Item
Implementation          Item

Implementation          Item

Planning and Tracking   Item

Planning and Tracking   Item
Planning and Tracking   Item

Planning and Tracking   Item

Planning and Tracking   Item

Planning and Tracking   Item
Planning and Tracking   Item

Planning and Tracking   Item

Planning and Tracking   Item
Planning and Tracking   Item

Planning and Tracking   Item

Planning and Tracking   Item

Planning and Tracking   Item
Planning and Tracking   Item

Operations              Item

Operations              Item

Implementation          Item

Operations              Item
Planning and Tracking   Item

Planning and Tracking   Item

Planning and Tracking   Item

Design                  Item
Operations       Item

Implementation   Item

Implementation   Item

The team needs a main process to drive our project by providing clear short- and long-term goals
and prescribing the flow of general activities to achieve those goals.

The team needs to be able to collaborate, share ideas and experience, and develop team skills.
Provide visibility onto each other's tasks.
Provide support for other members and make sure tasks are getting done.
Need to improve team bonding.

The team needs to be able to share artifacts and information in a centralized location.
We also need a place to capture and track overall project progress and team tasks in general.

The team is rapidly trying out new processes, roles and techniques. The team wants them to be
effective and does not want to keep ones that aren't used. We need some way to evaluate what we
are doing.
The team needs a mechanism to control the flow and content of client meetings, status meetings
(with mentors) and other types of meetings which involve the participation of off-site parties and
which are usually time-critical for the team's ability to continue working efficiently. Such
meetings should be intended to produce or demonstrate artifacts, or even to resolve issues.

Time is one of the primary constraints to the project. There is not enough time to do everything the
team wants to do.

The team needs to be able to make decisions efficiently as a group.

The team needs to be able to understand what is expected of themselves and their peers and how to
do their work.

The team needs a process to develop the architecture of our AE tool.
Summer semester:

Although this is a conformance activity, it is not a continuous one. We intend to review
all the design and code for architectural conformance once. We are already at the stage where
most of the key decisions have been made. Holding a single architectural conformance session will help the
team maintain the architectural goals, and the goals of the studio and the client, while keeping this
process light. If, however, the design continues to change for a particular component which is
architecturally significant, it is the responsibility of the developer to convey the decisions to the

The team needs to identify the main quality attributes that will drive the architecture. Since one of
the main concerns of the users is to build a tool that has several extension points for future
development, it is very important to have the clients participate in identifying the quality attributes
in order to get the architecture right.

Experiment with new architecture solutions and techniques.

Identify and assess risks with different architecture solutions.
Give the team a chance to get hands-on coding experience to make the summer session as smooth
as possible.

The team needs a method to guide the detailed design.

The team needs a method to represent structural as well as behavioral aspects of the detailed
design. The team needs a graphical representation language to capture those aspects.
A team member needs assistance in making sure his design is as correct as possible and conforms to
the team standards.

It's important to discover flaws in the system as early as possible. Peer design review helps
achieve this and helps leverage the team's understanding of the design as a whole.

Spring-Summer semesters:

The team needs to review UI artifacts with users before implementing them.

It's cheaper to make changes on paper prototypes before actually implementing them. This will also
help the team understand several parts of the system early enough.

The team needs a process to guide the requirements elicitation prescribed by the AUP for the
inception and elaboration phases. Along with the new functionality identified for Imhotep, the team has
to support Pangea's AETool version, fixing defects and catering to feature requests.

The team needs a process to manage the legacy bugs and features left by the previous team, new
requirements, feature requests and new bug fixes.
Clarity of the requirements is critical for the success of the project.

The team feels it is critical to revisit the identified requirements to confirm they match the client's
expectations. Sometimes the client discusses new concepts in the meetings, which the team often
has to revisit to gain context.

Summer semester:

This is a cross-discipline activity: Requirements Manager + Project Manager + Test Manager.
The team needs to:
know how many requirements are getting done, and at what quality;
get a sense of product completeness;
get the requirements status over time (requirements volatility);
ensure that every implementation task is traced back to specific clients' requirements.

Fall-Spring semesters:
The team feels the need to maintain the traceability of new requirements to old ones, of new requirements
to their impact on the existing architecture, etc.

The team needs to know the stability of the requirements at any given time, and the associated risks at
any given point in time. These are critical inputs for estimation.

The team needs a process to drive the implementation effort.

The team feels the need to address requirements metrics and reason about: 1. the cost of
elaborating a feature; 2. the cost of implementing a feature; 3. whether, given the state of the
requirements, the budgeted time fits all requirements.

Summer semester:
The team wants to try paired programming (learning opportunities). By doing paired
programming the team also hopes to improve shared understanding and reduce development time.
The team needs to make sure that code is readable and easy to follow by all team members.

This is an industry standard; the alternative is that everyone makes his own standard, which
makes the code less readable for the other members.

Summer semester:

The team needs a tool for tracking bugs and their status.

The team needs a way to find easy bugs and to make sure every code artifact is of acceptable
quality and maintainable (readable).

The team needs automatic testing of important units of code (usually functions) to support
integration testing.
The team needs to make sure the tool being built is what the users need.

Summer semester:

The team needs a mechanism to ensure that the latest feature enhancements haven't broken the tool,
and to be prepared for the internal releases (IRs), the so-called demo presentations, to our clients.

This works due to the iterative nature of our process. Integrations are expected to occur at the
end of each week, and an automatic method to test that the system works will save many hours
of manual integration testing.

Have an account of our available resources through time.

The team needs to measure its progress against some work items. We also want to ensure that we have
an overall plan to accomplish the goals of the project.
We need to estimate work items for which we have little previous experience.

Need to estimate the effort required to complete our work items.

Plan our activities and track the effort invested in them.

The team needs formal status meetings with the mentors to share progress.
To get a plan for the iteration and make sure that people understand all their tasks.

The team needs a way to report our progress in Studio to the mentors and the MSE staff.

The team needs to keep an eye on how much effort is being spent on tasks which crop up during an
iteration and were not originally planned.
The team needs to understand what threats exist that could have an adverse impact on the team's
project. Our team would like to be able to deal with these issues in a proactive manner to reduce
their impact. Other teams have indicated that they would have benefited from such a process.

We have several 'unscopable' items such as:
+ Legacy defects
+ Legacy new features
+ Additional tasks such as preparing presentations, etc.

We need an approach for the project, since the "time and materials" approach didn't work.

A member of the team may need someone else to help him, or remind him, to finish a task.
The team needs to track action items that appear during the iteration and meetings.

The team needs to efficiently resolve conflicts inside the team.

The team needs to capture and track the team's understanding of the key role
responsibilities defined outside the AUP framework, which are not prescribed or defined by AUP. We
also need to be sure that the team understands which key processes their roles are
responsible for.

To prevent clashes and conflicts in code written by different team members, the team has to
have a defined process for committing source code to the repository (SVN, CVS or something

The team also needs to maintain bonding and ensure that everyone is on the same
page about the project.
The team needs an approach for the project, since the "time and materials" and "scope-driven"
approaches didn't work.

The team needs a way to plan and schedule our activities efficiently.

Plan our activities and track the effort invested in them.
Measure work progress against some working items.

Summer semester:

The Tech lead and developers need some common time to discuss and design architecturally
critical components.
Because of radical changes in the schedule (12->48 hours), the team needs to track and share
progress on a daily basis.
The team needs more time to avoid implementation delays in case of unexpected questions
regarding either the requirements or problems during the phase.

The team needs a way to review the product, verify the completeness of the requirements and
identify any bugs.
The team also needs to always be on the same page about the completeness of the requirements.
Created         Created By

 11/2/2008 13:48 Andrew O Mellinger

 11/2/2008 14:04 Andrew O Mellinger

 11/2/2008 14:09 Andrew O Mellinger

 11/2/2008 14:13 Andrew O Mellinger
11/2/2008 14:35 Andrew O Mellinger

11/2/2008 14:43 Andrew O Mellinger

11/2/2008 14:47 Andrew O Mellinger

11/2/2008 14:51 Andrew O Mellinger

12/1/2008 21:04 Majid Alfifi
12/1/2008 21:08 Majid Alfifi

12/1/2008 21:12 Majid Alfifi

12/1/2008 21:16 Majid Alfifi

12/1/2008 21:31 Majid Alfifi

12/1/2008 21:40 Majid Alfifi
12/1/2008 21:47 Majid Alfifi

12/1/2008 21:49 Majid Alfifi

 12/2/2008 0:53 Mohit Bhonde

 12/2/2008 1:10 Mohit Bhonde
 12/2/2008 1:22 Mohit Bhonde

 12/2/2008 1:37 Mohit Bhonde

 12/2/2008 1:42 Mohit Bhonde

12/2/2008 13:06 Majid Alfifi

12/2/2008 17:01 Mohit Bhonde

12/3/2008 13:44 Majid Alfifi
12/3/2008 13:52 Majid Alfifi

12/3/2008 13:55 Majid Alfifi

12/3/2008 13:58 Majid Alfifi

12/3/2008 14:02 Majid Alfifi
12/3/2008 14:06 Majid Alfifi

12/3/2008 14:08 Majid Alfifi

12/9/2008 17:47 Raúl Véjar

12/9/2008 17:53 Raúl Véjar
12/9/2008 17:58 Raúl Véjar

12/9/2008 18:03 Raúl Véjar

12/9/2008 18:12 Raúl Véjar

12/9/2008 18:22 Raúl Véjar
12/9/2008 18:34 Raúl Véjar

12/9/2008 18:39 Raúl Véjar

12/9/2008 18:41 Raúl Véjar
12/9/2008 18:45 Raúl Véjar

12/9/2008 18:55 Raúl Véjar

12/9/2008 18:59 Raúl Véjar

 3/2/2009 23:32 Majid Alfifi
 3/26/2009 1:06 Majid Alfifi

 6/13/2009 1:16 Adlan Israilov

 6/13/2009 3:52 Adlan Israilov

6/13/2009 19:08 Adlan Israilov

6/13/2009 23:38 Adlan Israilov
 6/14/2009 0:09 Adlan Israilov

 6/15/2009 3:55 Adlan Israilov

6/19/2009 23:03 Adlan Israilov

 6/26/2009 2:29 Adlan Israilov
 7/11/2009 2:31 Adlan Israilov

7/25/2009 17:23 Adlan Israilov

7/25/2009 17:37 Adlan Israilov
Final Reflections                                                                                      ID

Iteration 16: People don't feel guided by AUP as a process.
The team is not sure if the AUP construction milestones match our plan.                                     1


Iteration 16: No specific comments this time. The team agrees that it's the best choice for now.            3

People want to see improvement after reflections are done. (Process manager's response: "this is
not a proposal for doing improvements; this is to track the effectiveness of one proposal or another.
We should evaluate the effectiveness of 'Reflections' gathering.")                                          4
Iteration 16: Two general comments since the last reflections survey: Which sorts of meetings does
it apply to? (Process Manager's response: resolved, see the Approach section.) Are minutes useful?
(Process Manager's response: they definitely are, we just need to upload them on time and use the
templates more consistently.)


We don't have a clear strategy here. Although it's satisfactory, some people don't understand how we
are making decisions. It might become a serious problem later. (Process manager's


We used some parts of ACDM, but not everything, and we can't say that we were following ACDM in
AUP to guide our design activities.                                                                   11

Worked fine for us; however, we didn't find any new unexpected requirements. Also, the number of
stakeholders in our project is small, which reduces the effectiveness of QAW as a technique in general.
We finished it in the Spring semester.                                                                13

Worked fine for us, could be better.                                                                  14


Still needs to be discussed with everyone before we start to use this.                                16

Has been successfully tried and finished. It should and could have been used more intensively and
effectively in the Spring and during the Implementation phase, but due to time constraints the team
opted for demos to ensure that the UI is good for the clients.                                           18

Finished, because we are no longer doing heavy requirements elicitation. We have lists of must-haves,
nice-to-haves and enhancements. Some new requirements pop up during the IR demos, which the
Requirements manager tracks and the Team lead and Technical lead add to the product
backlog based on the clients' priorities.

Some people feel that it's not helping in development.
We are not sure if it helps communication between the clients and the team, and also among the
We don't have alternate flows as mentioned in the process.
Is it organized, repeatable, continuously improving and essential?

We are no longer doing requirements management just to make sure that we are not losing
anything. Requirements management during the implementation phase is more important in terms
of completeness, and this is expressed and described better in Management and Traceability

How are we keeping and managing our requirements on site, and how are we eliciting them now?
This should be clarified with the Requirements Manager.                                                  20
The team does not see the value in continuing voice recordings. Our primary tool for capturing new
requirements requests should be the minutes.                                                     21


We failed to organize and carry out this proposal because of problems with the shifting domain in
the requirements.                                                                                23


We never tried to implement it. When we started implementation, we simply decided to abandon
it.                                                                                              26

Hasn't been scheduled properly; only occasional paired bug-fixing sessions were held.            27


Has been abandoned, because we did not plan or budget for it (the team failed to
organize it).                                                                                     30



People didn't have a clear understanding of whether it had been updated or not. (Process manager:
"because they didn't check it in SharePoint")                                                    36

Not everyone has a clear picture of how it's being done this semester.

We tried that, but it turned out to be unreasonably time-consuming for us.                    39

Merged with WBS and Tracking unplanned tasks, because they are strongly dependent on each other
and the intent of all of them is the same: to track effort.                                   40


According to our last discussion and the latest definition of this proposal, we need EVM graphs only
for EOSP or MOSP presentations, because they have apparently become a kind of MSE standard. However,
EVM doesn't help us improve our planning and scheduling, so the team decided not to use it and only
to prepare graphs for EOSPs; therefore we will no longer track it. Although we were able to organize
it well and generate graphs every iteration, it is abandoned because it was not
really useful for us.                                                                                   45
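For reference, the EVM graphs mentioned above rest on the standard earned-value quantities: planned value (PV), earned value (EV) and actual cost (AC). A minimal sketch of the usual derived metrics follows; the numbers are illustrative, not the team's data.

```python
# Standard EVM metrics from planned value (PV), earned value (EV)
# and actual cost (AC). Inputs here are illustrative examples only.

def evm(pv, ev, ac):
    """Return (schedule variance, cost variance, SPI, CPI)."""
    return ev - pv, ev - ac, ev / pv, ev / ac

# Example: behind schedule (EV < PV) and over cost (AC > EV).
print(evm(pv=100.0, ev=80.0, ac=90.0))
```

SPI and CPI below 1.0 indicate being behind schedule and over budget respectively, which is what an iteration-by-iteration graph of these values would show.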

Merged with WBS and Tracking unplanned tasks, because they are strongly dependent on each other
and the intent of all of them is the same: to track effort.


Some people thought that we were not doing that. (Process Manager: "a consequence of my late start;
we will work to improve the team's understanding regarding the process".)                              46
Some people are skeptical because we haven't seen results yet. In general, people seem to like how
Raul started it.



It was actually finished, not abandoned.                                                             50
Some people doubt following the process, because we don't have that many action items. (Process
manager's response: "we are starting to keep track of it, with metrics and analysis")                    51

At some point the team had some problems, but we managed to find a solution ourselves. The process
seems to be a good fit for our team.                                                                      52

Iteration 18: Overall, people seem to be happy about the work everyone is doing in their areas of
responsibility. People show interest and take initiative.

Iteration 16: Shall we evaluate roles on a weekly basis? Is a role useful, and is it being performed by the role
People are not clear about the processes their role is responsible for.


Iteration 18: Everyone believes that it helps the team improve visibility and communication in the
team, but since status is mixed with the lunch, some people felt that it is sometimes too long
or should be more structured. We decided to split our status and lunch meetings: we will try stand-up
status meetings (around 5-15 minutes) at the beginning of our common time, and then we will have the
team lunch, because we found it useful for things like improving communication and team

Iteration 16: People want to see the status before starting to eat (sounds like a good idea).
People don't like the idea of doing it standing up.                                                       55
Goes well; everyone is happy, including the clients.                                                  56

We tried it only for some iterations. Although in general it was okay and promising, eventually we
ran out of product backlog and had to start polishing (finding and fixing bugs).                      58

Iteration 16: Since planning became more centralized, not everyone has a clear picture of how tracking
in general, and the WBS, is being done in the Summer.                                                 59

Do people show up at the meetings?
Does it help us share progress and problems in an effective and timely manner?
Does the team follow the process approach?                                       61


Modified          Modified By     Obsolete Date   Replaced by

  7/11/2009 3:19 Adlan Israilov

  7/11/2009 3:17 Adlan Israilov

  7/11/2009 3:18 Adlan Israilov

  7/22/2009 0:36 Adlan Israilov
7/11/2009 3:20 Adlan Israilov

6/14/2009 1:46 Adlan Israilov   1/15/2009

7/11/2009 3:20 Adlan Israilov

6/13/2009 3:54 Adlan Israilov   3/20/2009

6/14/2009 1:50 Adlan Israilov   2/11/2009
7/25/2009 16:50 Adlan Israilov

 6/14/2009 1:52 Adlan Israilov    5/8/2009

 6/14/2009 1:55 Adlan Israilov    5/8/2009

 6/14/2009 1:42 Adlan Israilov   6/14/2009

 6/26/2009 0:15 Adlan Israilov   6/26/2009
 6/26/2009 2:05 Adlan Israilov

7/30/2009 19:36 Adlan Israilov   7/25/2009

 6/26/2009 0:09 Adlan Israilov   6/26/2009

 6/26/2009 0:18 Adlan Israilov   6/26/2009
 7/11/2009 2:47 Adlan Israilov

 8/4/2009 13:05 Adlan Israilov

 6/26/2009 0:03 Adlan Israilov   6/25/2009

 6/14/2009 2:36 Adlan Israilov   6/14/2009

 6/26/2009 0:05 Adlan Israilov   6/26/2009

7/25/2009 16:39 Adlan Israilov   7/25/2009
 7/11/2009 3:16 Adlan Israilov

7/25/2009 16:41 Adlan Israilov

7/25/2009 16:47 Adlan Israilov   7/25/2009

 8/4/2009 13:05 Adlan Israilov
7/25/2009 16:43 Adlan Israilov

 8/4/2009 13:05 Adlan Israilov

 7/11/2009 3:22 Adlan Israilov

6/19/2009 23:17 Adlan Israilov   6/19/2009 Tasks and WBS lists in Sharepoint;#59
 6/14/2009 3:12 Adlan Israilov    5/8/2009

6/19/2009 17:23 Adlan Israilov

6/19/2009 23:08 Adlan Israilov   6/19/2009 Tasks and WBS lists in Sharepoint;#59

 7/11/2009 3:23 Adlan Israilov
 7/11/2009 3:09 Adlan Israilov

7/31/2009 17:38 Adlan Israilov   7/25/2009

6/19/2009 23:10 Adlan Israilov   6/19/2009 Tasks and WBS lists in Sharepoint;#59
7/31/2009 15:43 Adlan Israilov

 6/12/2009 2:29 Adlan Israilov   12/1/2008

 6/12/2009 2:31 Adlan Israilov

 6/14/2009 3:19 Adlan Israilov   5/27/2009
 7/11/2009 3:24 Adlan Israilov

 7/11/2009 3:21 Adlan Israilov

7/25/2009 15:03 Adlan Israilov

 7/22/2009 0:43 Adlan Israilov

 7/11/2009 3:21 Adlan Israilov
 7/11/2009 3:22 Adlan Israilov

7/25/2009 15:49 Adlan Israilov

 7/11/2009 3:22 Adlan Israilov

 8/4/2009 13:04 Adlan Israilov
7/25/2009 16:58 Adlan Israilov

7/30/2009 19:17 Adlan Israilov

7/30/2009 17:29 Adlan Israilov
Title                       Item Type

Process Framework           Item

Common Time                 Item

Collaboration Environment   Item

Reflections                 Item
Regular Structured Meetings     Item

Utilize Backlog and Burndown    Item

Decision Making                 Item

Roles and Processes             Item

ACDM in AUP Elaboration Phase   Item
Architecture Conformance Meeting       Item

Quality Attribute Workshop             Item

Experiments with Architecture          Item

MDD for UI Detailed Design             Item

OOAD with UML2.0 for Detailed Design   Item
Design review                                         Item

Paper Prototypes for UI                               Item

Feature Centered - Requirements Elicitation Process   Item

Feature Centered - Requirements Management Process    Item
Voice Recordings                                               Item

Requirements Management & Completeness                         Item

Requirements Management - Metrics for requirements             Item

Scrum with AUP                                                 Item

Requirements Management - Requirements analysis + Estimation   Item

Paired Programming                                             Item
Coding Standard    Item

Defects tracking   Item

Code Review        Item

Unit Testing       Item
Demos                          Item

Build and Regression testing   Item

Resource Schedule              Item

WBS                            Item
Delphi estimation          Item

Estimation by comparison   Item

Tasks                      Item

Weekly status meeting      Item
Iteration planning              Item

Earned Value Management (EVM)   Item

Tracking unplanned tasks        Item
Risk management                                Item

Time and materials approach for the project    Item

Scope-driven approach for the project          Item

Task Reviewer                                  Item
Action Items                 Item

Conflict Resolution          Item

Roles and responsibilities   Item

Code Committing Rules        Item

Team lunches                 Item
Priority-driven approach for the project   Item

Product backlog and SPID                   Item

Tasks and WBS lists in Sharepoint          Item

Design brainstorm sessions                 Item
Stand-up status meetings   Item

Client office hours        Item

Product review meetings    Item











































