CS 3724: Introduction to HCI
					  Usability Evaluation

or, “I can’t figure this out...do I
      still get the donuts?”
     Purposes of Evaluation
• Which design is better?
• Are there any problems with the design
  as it now stands?
• Does the design meet usability targets?
     When do you evaluate?
• Formative evaluation (e.g. scenarios)
  – Helps with form of the solution
  – Deciding between competing designs
  – At early and intermediate design stages
• Summative evaluation
  – Provides summary of usability
  – Often compares new design with existing or
    alternative solutions
  – When design is complete
     What do you evaluate?
• Prototypes
  – During design iterations
  – Can be very low-fidelity
  – “Wizard of Oz” technique
• Working systems
  – System can be prototype
  – Can also evaluate at end of design cycle
       Types of Evaluation
• Depends on the formality of the study and
  the completeness of the system:
  – Informal user studies
  – Usability studies
  – Formal experiments
              User Studies
•   Early stages of design
•   Usually only a few users
•   Non-structured tasks
•   Collect comments, observations,
    suggestions, preferences, ...
          Usability Studies
• Usually more complete prototype
• Structured tasks
• Collect timings, errors, verbal protocol,
  ...
• Issues:
  – Finding users
  – Testing environment
  – Compensation
       Formal Experiments
• Working system or piece of system
• Number of users determined by the smallest
  difference you want to detect (see the
  power-analysis sketch below)
• Tightly controlled tasks
• Precise measurements
• Statistical analysis of hypotheses
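A minimal sketch of both steps, assuming a two-group comparison of
task times in Python (statsmodels and SciPy; the effect size, alpha,
and power values are illustrative assumptions, not prescriptions):

# Hypothetical power analysis and hypothesis test for a two-group
# usability experiment (e.g., task time with design A vs. design B).
from statsmodels.stats.power import TTestIndPower
from scipy import stats

# 1. How many users per group? Solve for n given the smallest
#    standardized difference (Cohen's d) worth detecting.
n_per_group = TTestIndPower().solve_power(
    effect_size=0.8,   # assumed "large" effect of interest
    alpha=0.05,        # conventional significance level
    power=0.80,        # 80% chance of detecting the effect
)
print(f"Need about {n_per_group:.0f} users per group")

# 2. After the study: compare the two groups' task times (seconds).
times_a = [41.2, 38.5, 44.1, 39.9, 42.3, 40.8]  # made-up data
times_b = [35.1, 33.8, 36.9, 34.2, 37.0, 33.5]  # made-up data
t_stat, p_value = stats.ttest_ind(times_a, times_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")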
                  Ethics
• Testing can be a distressing experience
  – pressure to perform, errors inevitable
  – feelings of inadequacy
  – competition with other subjects
• Golden rule
  – subjects should always be treated with
    respect
      Managing Subjects Ethically
• Before the test
   – Don’t waste the user’s time
      • Use pilot tests to debug experiments, questionnaires, etc.
      • Have everything ready before the user shows up
   – Make users feel comfortable
      • Emphasize that it is the system that is being tested, not the user
      • Acknowledge that the software may have problems
      • Let users know they can stop at any time
   – Maintain privacy
      • Tell the user that individual test results will be kept
        completely confidential
   – Inform the user
      • Explain any monitoring that is being used
      • Answer all the user’s questions (but avoid bias)
   – Only use volunteers
      • The user must sign an informed consent form
   Managing Subjects Ethically
• During the test
  – Don’t waste the user’s time
     • Never have the user perform unnecessary tasks
  – Make users comfortable
     •   Try to give user an early success experience
     •   Keep a relaxed atmosphere in the room
     •   Coffee, breaks, etc
     •   Hand out test tasks one at a time
     •   Never indicate displeasure with the user’s performance
     •   Avoid disruptions
     •   Stop the test if it becomes too unpleasant
  – Maintain privacy
     • Do not allow the user’s management to observe the test
  Managing Subjects Ethically
• After the test
   – Make the users feel comfortable
      • State that the user has helped you find areas of
        improvement
   – Inform the user
      • Answer any questions about the experiment that were
        held back earlier because they could have biased the results
   – Maintain privacy
      • Never report results in a way that individual users can be
        identified
      • Only show videotapes outside the research group with
        the user’s permission
       Components of a Study
•   Informed consent
•   User familiarization
•   User questionnaire
•   Background testing
•   System exploration
•   Specific tasks
•   Post-testing
•   Debriefing
        Informed Consent
• Form advising users of their rights
• Tell them you are studying the system,
  not them!
• Tell them if there are any known risks
• Tell them they can stop at any time
• Get signature, give them a copy
       User Familiarization
• Make the user comfortable
• Tell them what you’re doing without
  giving everything away
• Show them the facilities and equipment
• Give them written instructions
• Tell them approximately how much time
  the evaluation will take
• Ask if they have any questions
       User Questionnaire
• Obtain demographic information
  – Age
  – Gender
  – Occupation or major
  – Computer experience
  – Domain experience
  – Eyesight
  – Handedness
         Background Testing
• You may want to correlate your findings with a
  standardized test score (see the sketch below)
• Examples
   – Spatial ability tests
   – Memorization tests
   – Domain knowledge tests
• Example result: “Users with high spatial
  ability preferred interface X, while others
  preferred interface Y”
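As a sketch of such a correlation in Python, assuming each user’s
spatial-ability score and task time were collected as paired lists
(all numbers here are made up):

# Hypothetical check of whether a standardized test score (e.g., a
# spatial-ability test) correlates with observed task performance.
from scipy import stats

spatial_scores = [12, 18, 9, 22, 15, 20, 11, 17]     # made-up scores
task_times_sec = [95, 70, 110, 62, 80, 66, 102, 75]  # made-up times

r, p_value = stats.pearsonr(spatial_scores, task_times_sec)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
# A strong negative r would suggest that high-spatial-ability users
# complete the tasks faster.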
      Pre- and Post-Testing
• Some systems have a specific learning
  goal (e.g. education or training)
• Giving users the same test or type of
  test both before and after usage of the
  system is a way to measure learning
  (a paired comparison is sketched below)
• Only used with very well-developed
  prototypes
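A minimal sketch of that pre/post comparison, assuming the same test
is scored for each user before and after system use (made-up scores):

# Hypothetical pre/post comparison to measure learning: each user
# takes the same test before and after using the system.
from scipy import stats

pre_scores  = [52, 61, 48, 70, 55, 63, 58, 66]  # made-up pre-test scores
post_scores = [68, 74, 60, 82, 71, 75, 69, 80]  # made-up post-test scores

# Paired t-test: the same users are measured twice.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
print(f"Mean gain = {gain:.1f} points, t = {t_stat:.2f}, p = {p_value:.4f}")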
          System Exploration
• User gets “free play” time with the system
• Observe what they seem to understand
  easily and what is troublesome
• See if they “find” all the features or parts
  of the system
• Works well with a think-aloud (verbal) protocol
           Specific Tasks
• Give the user a specific goal
• Ex: Buy a burger with extra pickles and
  no onions using this interface
• Observe problems
• Record performance, errors
        Post-Questionnaire
• Get user’s reaction to the system
• Subjective levels of satisfaction,
  perceived ease of use, usefulness, etc.
• Open-ended comments, thoughts, and
  questions
             Debriefing
• Talk to the users about the session
• Assure them they did well
• Give more details about what you’re
  doing if they are interested
• Thank them
• Give them donuts 
      Other Evaluation Issues
•   Recruiting users
•   Testing facilities
•   Measurements and observations
•   Compensation
             Recruiting Users
• Attempt to match the proposed user
  population
• Perhaps divide into several user groups
• Techniques
  –   Posted advertisements
  –   Internet/email/newsgroup
  –   Colleagues/friends/classmates
  –   People already using existing system
• Make sure your users are not overly
  knowledgeable!
           Testing Facilities
•   For informal studies, a simple setup
•   In all cases, privacy is important
•   Replicate the usage environment?
•   Evaluator present or not?
•   More formal studies may use a special
    usability lab (e.g. McBryde 102 A or C)
Measurements & Observations
•   User comments
•   General observations
•   Specific “critical incidents”
•   Task timing
•   User errors (a simple logging sketch follows)
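One illustrative way to capture these measurements is a small session
logger; everything here (the file name, event names, and the
SessionLog class) is a hypothetical sketch, not a standard tool:

# Hypothetical session logger: records task timings, errors, and
# critical incidents to a CSV file for later analysis.
import csv
import time

class SessionLog:
    def __init__(self, path, user_id):
        self.user_id = user_id
        self.file = open(path, "w", newline="")
        self.writer = csv.writer(self.file)
        self.writer.writerow(["user", "elapsed_sec", "event", "detail"])
        self.t0 = time.perf_counter()

    def event(self, kind, detail=""):
        elapsed = time.perf_counter() - self.t0
        self.writer.writerow([self.user_id, f"{elapsed:.1f}", kind, detail])
        self.file.flush()  # don't lose data if the session is interrupted

# Example use during a test session:
log = SessionLog("session_u01.csv", user_id="u01")
log.event("task_start", "Task 1: buy a burger with extra pickles")
log.event("error", "clicked Checkout before choosing toppings")
log.event("critical_incident", "could not find the onion option")
log.event("task_end", "Task 1 completed")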
      Session Management
• May need multiple evaluators
• Use checksheets or pre-printed tables
  for filling in results
• Video/audio as backup if something is
  missed
• Don’t ask the user to stop while you
  catch up!
               Compensation
• Most studies give the user something for their
  time and effort
• Doesn’t have to be monetary – can also be:
  –   Food
  –   Extra credit in a class
  –   Special discounts on the company’s products
  –   Tour of your facility
  –   …

				