Designing Web Sites for Usability
         Bebo White
 Stanford Linear Accelerator Center
• Interface Design, HCI, Usability, etc. is a
  well-established research discipline
  – remember “user-friendly” systems?
• Web systems (pages, sites, etc.) represent
  the largest application of user interface
  design
  – content providers and authors also become
    interface designers
  – Web usability has become almost hopelessly
    intertwined with other technologies and issues
    (e.g., accessibility, internationalization, etc.)
        What is Usability?

“The usability of an interface is a measure of the
effectiveness, efficiency and satisfaction with which
specified users can achieve specified goals in a
particular environment with that interface.”

(International Organization for Standardization, ISO)
    Encountering Usability
• When selecting a computer system
  (hardware + software), the accepted
  process is to assess:
  – Functionality – will the system do what is
    needed?
  – Usability – will the users be successful in their
    use of the system?
  – Likeability – will the users feel the system is
    suitable?
• These three elements together constitute
  usability
  Components of Usability
• Learnability - the ease with which
  new users can begin effective
  interaction and achieve satisfactory
  performance
• Flexibility - the number of ways the
  user and the page/site can exchange
  information
• Robustness - the level of support
  provided to the user in determining
  successful achievement and
  assessment of goals
     What is Usability? (2)
• We cannot say that if a system obeys
  a set of formal principles then it will
  be usable
• Some formal principles are
  necessary for usability; any system
  which breaks them is bound to have
  problems
• Formal principles form a “safety net”
  to prevent some of the worst
  mistakes but do not ensure a good
  design
    Usability as One Component of Acceptability
•   Appropriate affect
•   Match with the medium
•   Consistent graphic vocabulary
•   Visual order and user focus
•   Illusion of manipulable objects
•   Revealed structure
    Usability as a Tradeoff
• Users differ in priorities and
  preferences
         What is Design?
• Design is conscious
• Design keeps human concerns in the
  center (e.g., usability, affordances)
• Design is a dialog with materials
  (technologies, prototyping)
• Design is communication (e.g., conceptual
  models, metaphors, genre)
• Design has social consequences (e.g.,
  work structure, social responses)
      What is Design? (2)
• Design is a social activity (e.g.,
  participatory design, creative
  collaboration)
• Design is a tradeoff among many
  factors
• Design relates to other engineering
  activities and disciplines
• Designing spaces for living –
  similarities between software design
  and architecture
     Usability and Design
Designing for maximum usability is the
 goal of interactive systems design;
The challenges to the designer of an
 interactive system are:
  – how can the system be developed to
    ensure its usability?
  – how can the usability of an interactive
    system be demonstrated or measured?
Web Design Process
        Design Specialties
[Diagram: user interface design overlapping information design,
 navigation design, graphic design, and evaluation]
• Information Architecture
  – encompasses information &
    navigation design
• User Interface Design
  – also includes testing and
    evaluation
Web Site Design Process

Design Exploration

Design Refinement


… followed by implementation & maintenance
                     Design Process
                        Assess needs
                          – understand the users
                          – determine scope of project
                          – characteristics of the audience
                          – evaluate existing site and/or competitors
                     Design Process
                     Design Exploration
                        Make multiple designs
                          – visualize solutions to
                            discovered issues
                          – information and
                            navigation design
                          – early graphic design
                          – select one design
                            for development
                     Design Process
                     Design Refinement
                        Develop the design
                          – increasing level of detail
                          – heavy emphasis on
                            graphic design
                          – iterate on design
                     Design Process
                        Production
                        • Prepare design for production
                          – create final
                            deliverables
                          – specifications, guidelines,
                            and documentation
                          – as much detail as
                            possible
     Listen to Users –
What Do You Hate About the Web?
• Slow downloads
• Can’t find what I want, or what I find is
  not what I need
• Poor graphic design and layout
• Hard to navigate
• Gratuitous use of “bells and whistles”
• Inappropriate tone
• Designer-centeredness
• Lack of attention to detail
     Listen to Users –
What Do You Like About the Web?
•   Aesthetics
•   Big ideas
•   Utility
•   “Findability”
•   Personalization
Top Ten Issues in Web User Interface Design
  10. Overly Long Download Times
• 10 second rule
  – amount of wait time before users lose
    interest
    • traditional human factors studies back this
      up
• 15 seconds may be acceptable on web
  – people are getting trained to endure
    longer waits
  – but only for a few key pages
• True even for business sites
  – busy during day & surf at home for work
  Download and Response Times
• Response times:
  – 0.1 second: interaction appears
    instantaneous; no special feedback
    needed except to display result
  – 1.0 second: the limit for the user’s flow
    of thought to stay uninterrupted, even
    though the user will notice the delay. No
    special feedback needed.
  – 10 seconds – the limit for keeping the
    user’s attention focused on the dialog;
    feedback needed
  Impact of Page Size and
Bandwidth on Download and
     Response Times

         1-second            10-second
         response time       response time
 Modem   2 KB                34 KB
 ISDN    8 KB                150 KB
 T-1     100 KB              2 MB
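Figures like those above fall out of a simple latency-plus-transfer model. The sketch below is illustrative only: the bandwidth values and the 0.5 s fixed-latency term are assumptions for the example, not values stated in these slides.

```python
# Rough model: time = fixed latency + (page size in bits) / (link bandwidth in bits/s)

def download_time(size_bytes: float, bandwidth_bps: float, latency_s: float = 0.5) -> float:
    """Estimated seconds to fetch a page of size_bytes over the given link."""
    return latency_s + (size_bytes * 8) / bandwidth_bps

LINKS = {               # assumed effective throughputs (illustrative)
    "modem": 34_400,    # ~34.4 kbps
    "ISDN": 128_000,    # 128 kbps
    "T-1": 1_544_000,   # 1.544 Mbps
}

for name, bps in LINKS.items():
    # how long a 34 KB page takes on each link
    print(f"{name}: {download_time(34 * 1024, bps):.1f} s")
```

With these assumptions a 2 KB page over a modem comes in just under one second, consistent with the 1-second column.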
   9. Outdated Information
• Most people would rather create
  content than do maintenance
• Cheap way of enhancing content
  – link to new pages if still relevant
  – otherwise remove them
• Outdated content also leads to a lack
  of trust (important for e-commerce)
8. Non-standard Link Colors
• Links to
  – pages that haven’t been seen are blue
  – previously seen pages are purple / red
• Don't mess with these colors
  – one of the few standard navigational aids
  – consistency is important for learning
    • don’t underline other objects with blue/red!
  – this is a “Web Design Pattern”
• What is unfortunate about this color choice?
  7. Lack of Navigation Support
• Users don’t know much about a site
  – they always have difficulty finding information
  – give a strong sense of structure and place
• Communicate site structure
  – provide a site map
     • so users know where they are & where they can go
  – provide a good search feature
     • the best navigation support will never be enough
• People now expect these
  – site logo in upper left linked to home page
  – outline structure showing where you currently are
What Might be Wrong Here?
    6. Long Scrolling Pages
• Many users do not scroll beyond visible
  section when page completes
• All critical content & navigation should be on
  the top part of the page
• Leaf nodes can be longer
  – people who have that interest will be reading it
  – still good to be brief
• Becoming less of an issue
  – top items will STILL dominate
  – should be careful not to go past 3 screens max.
         5. Orphan Pages
• All pages should have a clear
  indication of what Web site they
  belong to
  – users may not come in through the home
    page
• Every page should have
  – a link up to a home page
  – some indication of where they fit within
    the structure of your information space
What Might be Wrong Here?
          4. Complex URLs
• Shouldn’t have exposed machine address
• Users try to decode URLs of pages
  – to infer the structure of web sites
     • lack of support for navigation & sense of location
• URL should be human-readable
  – names should reflect nature of the info. space
  – users sometimes need to type in a URL -> minimize
    its length
     • use lower-case, short names with no special chars
        – many people don't know how to type a ~
• Long URLs are hard to email properly
  – wrapping, etc. *** biggest issue today ***
What Might be Wrong Here?
     3. Constantly Running Animations
• Don’t have elements that move
  – moving images have an overpowering
    effect on the human peripheral vision
    • no animations, scrolling text, marquees
• Users tune them out
  – so do not put anything important there!
• Give your user some peace and quiet
  to actually read the text!
What Might be Wrong Here?
2. Gratuitous use of “Bleeding
      Edge” Technology
 • Don’t try to attract people using it
   – you’ll get the nerd crowd, but
     mainstream users care about content
     and service
 • If their system crashes
   – they will never come back
 • Caveat: appropriate if you are selling
   those technologies
What Might be Wrong Here?
      1. Using Frames
• Confusing for users
  – breaks the user model of the web page
    • sequence of actions rather than single act
    • unit of navigation no longer equal to unit of
      view
• Lose predictability of user actions
  – what information appears where when
    you click?
    • Sometimes can’t bookmark the current page
      & return to it
    • URLs stop working
    • can’t share with others (lose social filtering)
             Frames (2)
• Search engines have problems with frames
  – what part of the frameset do you include in
    search results?
• Early surveys found most users
  preferred frame-less sites
  – recent surveys back this up ~70-90%
• Caveat: experienced designers can
  sometimes use frames to good
  effect; frames are not evil, bad use
  of frames is
        Usability Guidelines
•   Simple and natural dialogue
•   Speak the users’ language
•   Minimize the users’ memory load
•   Consistency
    – Visual look
    – Command semantics
    – Conceptual model
    Usability Guidelines (cont)
•   Feedback
•   Clearly marked exits
•   Shortcuts
•   Good error messages
•   Prevent errors
•   Help and documentation
      In Summary –
The Ten Usability Principles
1. Motivate
  – Design site to meet specific user needs
    and goals
  – Use motivators to attract different user
    “personae” in specific parts of the site
2. User Taskflow
  – Who are the users?
  – What are their tasks and online goals?
  – For a site to be usable, page flow must
    match workflow
          Perceiving the Audience/User
• Is it a captive audience? (e.g., an
  intranet) - then it’s easy… otherwise:
• User surveys/user feedback
• Analysis of legacy audience
  – public affairs office
  – customer support, technical support, etc.
        Perceiving the
     Audience/User (cont)
• Lessons learned from non-Web
  methods - e.g., brochures, catalogs, etc.
• Demographic information - e.g.,
  geography, online experience, education, etc.
• Sampling - population samples,
  “beta” testers
• Experience
3. Architecture
  – 80% of usability
  – Build an efficient navigational structure
  – “If they can’t find it in three clicks,
    they’re gone”
4. Affordance Means Obvious
  – Make controls understandable
  – Avoid confusion between logos,
    banners, and buttons
5. Replicate
  – What works - “Don’t re-invent the wheel”
  – Use well-designed templates for the most
    common page types (e.g., personal home
    pages)
  – Look at other sites
     • Competitors
     • Classics
6. Usability Test Along the Way
  – Test users with low-fi prototypes early
    in the design process
  – Don’t wait until too close to site launch
7. Know the Technology Limitations
  – Identify and optimize for target
    browsers and user hardware
  – Test HTML, JavaScript, etc. for
    compatibility
8. Know User Tolerances
  – Users are impatient
  – Design for a 2-10 second maximum
    download time
  – Reuse as many graphics as possible so
    that they reload from cache
  – Avoid excessive scrolling
9. Multimedia
  – Be discriminating
  – Good animation attracts attention to
    specific information, then stops
  – Too much movement distracts reading
    and slows comprehension
10. Site Statistics
  –   Monitor site traffic
  –   Which pages peak user interest?
  –   Which pages make users leave?
  –   Adjust according to analysis results
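The traffic-monitoring step above can be sketched in a few lines: find the pages users leave from. The `(session_id, page)` record format and the sample data here are hypothetical; a real analysis would parse a server access log.

```python
# Count "exit pages": the last page viewed in each session.
from collections import Counter
from itertools import groupby

hits = [  # ordered page views, grouped by session (hypothetical data)
    ("s1", "/home"), ("s1", "/products"), ("s1", "/checkout"),
    ("s2", "/home"), ("s2", "/search"),
    ("s3", "/home"), ("s3", "/products"),
]

exit_pages = Counter(
    list(views)[-1][1]                       # last page seen in the session
    for _, views in groupby(hits, key=lambda h: h[0])
)
print(exit_pages.most_common())
```

Pages with unexpectedly high exit counts are candidates for the "which pages make users leave?" question on the slide.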
Web Site Usability
 Example – Evaluating Web Sites
• What is the relative importance (from
  a user perspective) of:
  –   Document structure          ____%
  –   Content organization        ____%
  –   Navigational tools          ____%
  –   Interface design             ____%
  –   Use of the medium           ____%
    Example – “High Level”
          Questions
• Do the screen designs look attractive?
• Is the use of the screen space efficient?
• Do the icons show creativity?
• Can you understand the level of the
  dialogue?
• Could an African pygmy use the
  system successfully?
   Example – “High Level”
      Questions (cont)
• Would it work just as well if the user
  normally wrote from top to bottom?
• How well have all the design
  decisions been explained?
     Taxonomy of Evaluation Techniques
• Predictive Models and Techniques
  – HCI-based Heuristics
  – Theory-based Models
• Experimental Techniques
  – Mockups (e.g., Wizard of Oz)
  – Prototypes

(adapted from Joelle Coutaz)
     Usability Evaluation
Independent variables:
• User characteristics – knowledge, discretion, motivation
• System functions – task match, ease of use, ease of learning
• Task characteristics
The user reaction is an implicit cost/benefit analysis, leading to:
• Positive outcome – good task-system match, continued user learning
• Negative outcome – restricted use; non-use, partial use, distant use
These outcomes are the dependent variables.
                  Usability Evaluation

Task completion       Number of tasks correctly completed.
                      Number of tasks completed in given time.
                      Time taken per task.
Action usage          Frequency of use of different commands.
                      Use of command sequences.
                      Use of special commands (e.g. ‘help’).
Shortcuts             Use of keyboard equivalents.
Display perusal       Time spent looking at display.
                      Comparative data for different screen designs.
User errors           Classification of error types.
                      Frequency of error types.
                      Time spent in error situations.
                      Time taken to correct errors.
Input devices         Comparative time taken to execute tasks.
       Why do User Testing?
• Can’t tell how good a UI is until
  people use it
• Other methods are based on evaluators
  – who may know too much
  – who may not know enough (about
    tasks, etc.)
  – e.g., UI experts
• Hard to predict what real users
  will do
Evaluation in a Usability Lab
     Choosing Participants
• Representative of target users
  – job-specific vocabulary / knowledge
  – tasks
• Approximate if needed
  – system intended for physicists
     • get physics students
  – system intended for engineers
     • get engineering students
• Use incentives to get participants
      Ethical Considerations
• Sometimes tests can be distressing
  – users have left in tears
• You have a responsibility to alleviate
  –   make voluntary with informed consent
  –   avoid pressure to participate
  –   let them know they can stop at any time
  –   stress that you are testing the system, not them
  –   make collected data as anonymous as possible
• Often must get human subjects approval
        User Test Proposal
• A report that contains
  –   objective
  –   description of system being tested
  –   task environment and materials
  –   participants
  –   methodology
  –   tasks
  –   test measures
• Get approved and then reuse for final report
                  Informed Consent Form for Usability Participants
                              SLAC Public Web Site

Purpose of this study
The purpose of this study is to understand how employees and visitors want to use the
SLAC public Web site. Your participation in this study will help us to adapt a new
service to the needs and desires of its users and visitors.

Information we will collect
We will ask you to try out a public Web site. We will observe how you interact with it,
and will also interview you briefly. The information from your visit will be used, along
with that from other similar visits, to improve the site you will see today. Summary data
may be used in publication for scientific purposes.

Digital video permission
We will take handwritten notes and videotape the session. By signing this consent
form, you are giving us consent to use your verbal statements and still images, but not
your name, for the purposes of demonstration and evaluation only. This is in no way a
product endorsement.

We may discuss ideas with you or show you Web designs, photo imaging technology,
hardware, or software which are not yet announced products. We are doing this so we
can get your feedback only. By signing this form, you agree not to tell anyone, including
family members, detailed information about this visit and about any new hardware,
software or interface designs you observe during this interview. What you can say is
that you participated in a study to help improve a web product.

Freedom to withdraw
You are free to refuse to participate, take a break, or withdraw from this study at any
time. Please let us know when you need a break.

If you have questions, please ask them now or during the study.

You will receive $75 in cash as incentive for participating in this study.

After reading this form, if you agree with these terms, please show your acceptance by
signing below.

Date                   Participant Signature

                       Participant Name (Printed)
       Types of Testing
• Concept Testing
  – Present the Web site to the user
  – See if they “get it”
  – Do they understand the purpose of the site?
  – Do they understand the value proposition?
  – Do they understand the site organization?
  – Do they have a feel for the site as a whole?
      Types of Testing (cont)
• Key Task Testing
  –   Present the Web site to the user
  –   Ask them to perform a task
  –   Observe how they perform the task
  –   Personalize the task
      • Contrived tasks have no “emotional buy-in”
      • Users should make best use of their personal
        experience
  Deciding on Data to Collect
• Two types of data
  – Process data
    • Observations of what users are doing and thinking
  – “Bottom-line data”
    • Summary of what happened (time, errors, success)
    • i.e., the dependent variables
    Which Type of Data to Collect?
• Focus on process data first
  – Gives good overview of where problems
    are
• Bottom-line data doesn’t tell you where to
  fix things
  – Just says: “too slow”, “too many errors”, etc.
• Hard to get reliable bottom-line results
  – Need many users for statistical
    significance
       The “Thinking Aloud” Method
• Need to know what users are thinking, not
  just what they are doing
• Ask users to talk while performing tasks
  – Tell us what they are thinking
  – Tell us what they are trying to do
  – Tell us questions that arise as they work
  – Tell us things they read
• Make a recording or take good notes
  – Make sure you can tell what they were doing
    Thinking Aloud (cont)
• Prompt the user to keep talking
  – “Tell me what you are thinking”
• Only help on things you have pre-
  determined
  – Keep track of any help you do give
• Recording
  – Use a digital watch/clock
  – Take notes, plus if possible
    • record audio and video (or even event logs)
• Multiple displays such as the
  previous can be accomplished using
  – A cheap webcam (e.g., Logitech)
  – Application sharing in NetMeeting
        Using the Results
• Update task analysis and rethink the
  design
  – Rate severity and ease of fixing critical
    problems
  – Fix both severe problems and make the
    easy fixes
• Will thinking aloud give the right
  answers?
  – Not always
  – If you ask a question, people will always
    give an answer, well founded or not
Measuring Bottom-Line Usability
• Situations in which numbers are useful
  – Time requirements for task completion
  – Successful task completion
  – Compare two designs on speed or # of errors
• Do not combine with thinking-aloud. Why?
  – Talking can affect speed and accuracy
• Ease of measurement
  – Time is easy to record
  – Error or successful completion is harder
     • Define in advance what these mean
      Analyzing the Numbers
• Example: trying to get task time <= 3
  –   Test gives: 2, 1.5, 4, 9, 1, .5
  –   Mean (average) = 3
  –   Median (middle) = 1.75
  –   How does it look?
• Factors contributing to our uncertainty
  – small number of test users (n = 6)
  – results are very variable (standard
    deviation = 3.2)
Analyzing the Numbers (cont)
• This is what statistics is for
• Crank through the procedures and you
  can say:
  – 95% certain that typical value is between .5
    & 5.5
• Usability test data is quite variable
  – Need lots to get good estimates of typical
    values
  – 4 times as many tests will only narrow
    range by 2x
     • Breadth of range depends on sqrt of # of test
       users
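These numbers can be reproduced with Python's statistics module. The 1.96 factor is the normal-approximation 95% interval the slide appears to use; with only n = 6 a t-based interval would be somewhat wider.

```python
# Reproduce the example: mean, median, standard deviation, and a
# normal-approximation 95% interval for the typical task time.
from statistics import mean, median, stdev
from math import sqrt

times = [2, 1.5, 4, 9, 1, 0.5]

m = mean(times)      # 3.0
md = median(times)   # 1.75
sd = stdev(times)    # ~3.18 (the slide rounds to 3.2)
half = 1.96 * sd / sqrt(len(times))   # half-width of the interval

print(f"mean={m}, median={md}, sd={sd:.2f}")
print(f"95% interval: {m - half:.1f} .. {m + half:.1f}")   # ~0.5 .. 5.5
```

Since the half-width shrinks as 1/sqrt(n), quadrupling the number of test users only halves it, which is the sqrt relationship the slide mentions.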
 Measuring User Preference
• How much users like or dislike the system
  – Can ask them to rate on a scale of 1 to 10
  – Or have them choose among statements
     • “Best usability I’ve ever…”, “better than
  – Hard to be sure what data will mean
     • Novelty of UI, feelings, not realistic setting
• If many give you low ratings -> trouble
• Can get some useful data by asking
  – What they liked, disliked, where they had
    trouble, best part, worst part, etc. (redundant
    questions are OK)
Discount Usability Engineering
• Reaction to excuses for not doing user testing
  – “Too expensive”, “takes too long”, …
• Cheap
  – No special labs or equipment needed
  – The more careful you are, the better it gets
• Fast
  – On order of 1 day to apply
  – Standard usability testing may take a week or more
• Easy to use
  – Some techniques can be taught in 2-4 hours
     Examples of Discount Usability Engineering
• Walkthroughs
  – Put yourself in the shoes of a user
  – Like a code walkthrough
• Low-fi prototyping
• On-line, remote usability tests
• Heuristic evaluation
     Fidelity in Prototyping
• Fidelity refers to the
  level of detail
• High fidelity
  – Prototypes look like the
    final product
• Low fidelity
  – Artists renditions with
    many details missing
     Low-fi Sketches and Storyboards
• Where do storyboards come from?
  – Film & animation
• Give you a “script” of important events
  – Leave out the details
  – Concentrate on the important interactions
          Why Use Low-fi Prototypes?
• Traditional methods take too long
  – Sketches -> prototype -> evaluate -> iterate
• Can instead simulate the prototype
  – Sketches -> evaluate -> iterate
  – Sketches act as prototypes
     • Designer “plays computer”
     • Other design team members observe & record
• Kindergarten implementation skills
  – Allows non-programmers to participate
       Hi-fi Prototypes Warp
• Perceptions of the tester/reviewer
  – Representation communicates “finished”
    • Comments focus on color, fonts, and alignment
• Time
  – Encourages precision
    • Specifying details takes more time
• Creativity
  – Lose track of the big picture
     Online, Remote Usability Testing
• Use Web to carry out usability evaluations
• Two main approaches
  – Agent-based evaluation (e.g., WebCriteria)
     •   Model automatically evaluates UI (web site)
     •   Very inexpensive (no participants needed)
     •   Fast turnaround (hours)
     •   Can run on many sites & compare -> benchmarks
  – Remote usability testing (e.g., NetRaker)
     •   Combines usability testing + market research techniques
     •   Large sample sizes -> reliable bottom-line data
     •   Automatic logging & some analysis of usage
     •   Fast turnaround (1 day)
     •   Templates allow less experienced testers to use them
  Remote Usability Testing
• Move usability testing online
  –   Research participants access via Web
  –   Answer questions & complete tasks in “survey”
  –   System records actions or screens for playback
  –   Can test many users & tasks -> good coverage
• Analyze data in aggregate or individually
  – Find general problem areas
       • Use average task times or completion rates
  – Playback individual sessions
  – Focus on problems with traditional usability testing
      Heuristic Evaluation
• Usability Heuristic Evaluation
  – Developed by Rolf Molich and Jakob Nielsen
  – Helps find usability problems in a UI design
  – Small set (3-5) of evaluators examine UI
     • Independently check for compliance with usability
       principles (“heuristics”)
     • Different evaluators will find different problems
     • Evaluators only communicate afterwards
        – Findings are then aggregated
  – Can perform on working UI or on sketches
 Heuristic Evaluation (cont)
  – HE for Web sites
    • 5-8 evaluators can identify 75%-85% of site
      usability problems
    • 10-12 evaluators can identify 95%
• Navigation Heuristic Evaluation
  Why Multiple Evaluators?
• Every evaluator
  doesn’t find
  every problem
• Good evaluators
  find both easy
  and hard ones
       Heuristic Evaluation
• Evaluators go through UI several times
  – Inspect various dialogue elements
  – Compare with list of usability principles
  – Consider other principles/results that come to
    mind
• Usability principles
  – Nielsen’s “heuristics”
  – Supplementary list of category-specific
    heuristics
     • Competitive analysis & user testing of existing
       products
       Heuristics (original)
• H1-1: Simple & natural dialog
• H1-2: Speak the users’ language
• H1-3: Minimize users’ memory load
• H1-4: Consistency
• H1-5: Feedback
• H1-6: Clearly marked exits
• H1-7: Shortcuts
• H1-8: Precise & constructive error messages
• H1-9: Prevent errors
• H1-10: Help and documentation
 Heuristics (revised set)

• H2-1: Visibility of system status
  – Keep users informed about what is
    going on
  – Example: pay attention to response time
    • 0.1 sec: no special indicators needed, why?
    • 1.0 sec: user tends to lose track of data
    • 10 sec: max. duration if user is to stay
      focused on action
  Heuristics (cont)
• H2-2: Match between system
  and real world
  – Speak the users’ language
  – Follow real world conventions
   Heuristics (cont)

• H2-3: User control & freedom
  – “Exits” for mistaken choices,
    undo, redo
  – don’t force down fixed paths
           Heuristics (cont)

• H2-4: Consistency & standards
  – Navigation
  – Authentication
• H2-5: Error prevention
• H2-6: Recognition rather than recall
  – Make objects, actions, options, and
    directions visible or easily retrievable
          Heuristics (cont)

• H2-7: Flexibility and efficiency of use
  – Accelerators for experts (e.g., keyboard shortcuts)
  – “Smart” menus
  – Personalization
      Heuristics (cont)

• H2-8: Aesthetic and minimalist design
  – No irrelevant information in dialogues
        Heuristics (cont)

• H2-9: Help users recognize,
  diagnose, and recover from errors
  – Error messages in plain language
  – Precisely indicate the problem
  – Constructively suggest a solution
           Heuristics (cont)
• H2-10: Help and documentation
  –   Easy to search
  –   Focused on the user’s task
  –   List concrete steps to carry out
  –   Not too large
Phases of Heuristic Evaluation

 1) Pre-evaluation training
   – Give evaluators needed domain knowledge and
     information on the scenario
 2) Evaluation
   – Individuals evaluate and then aggregate results
 3) Severity rating
   – Determine how severe each problem is (priority)
      • Can do this first individually and then as a group
 4) Debriefing
   – Discuss the outcome with design team
 How to Perform Evaluation
• At least two passes for each evaluator
  – First to get feel for flow and scope of system
  – Second to focus on specific elements
• If system is walk-up-and-use or
  evaluators are domain experts, no
  assistance needed
  – Otherwise might supply evaluators with
    typical usage scenarios
• Each evaluator produces list of problems
  – Explain why with reference to heuristic or
    other information
  – Be specific and list each problem separately
   How to Perform Heuristic Evaluation (cont)
• Why separate listings for each violation?
  – Risk of repeating problematic aspect
  – May not be possible to fix all problems
• Where problems may be found
  – Single location in UI
  – Two or more locations that need to be
    compared
  – Problem with overall structure of UI
  – Something that is missing
     • Hard to notice with paper prototypes, so work
       extra hard on these
     • Note: sometimes features are implied by design docs
       and just haven’t been “implemented” – relax on those
          Severity Rating
• Used to allocate resources to fix problems
• Estimates the need for more usability
  efforts
• Combination of
  – Frequency
  – Impact
  – Persistence (one time or repeating)
• Should be calculated after all evaluations
  are in
• Should be done independently by all
  evaluators
   Severity Ratings (cont.)

0 - Don’t agree that this is a usability
    problem
1 - Cosmetic problem
2 - Minor usability problem
3 - Major usability problem; important to
    fix
4 - Usability catastrophe; imperative to
    fix
            Debriefing
• Conduct with evaluators, observers, and
  development team members
• Discuss general characteristics of UI
• Suggest potential improvements to address
  major usability problems
• Development team rates how hard things
  are to fix
• Make it a brainstorming session
  – Little criticism until end of session
    Severity Ratings Example

1. [H1-4 Consistency] [Severity 3][Fix 0]

The Web site used the string “Next” on the first page for
filling in a multi-page form, but used the string “Continue” on
the second page. Users may be confused by this different
terminology for the same function.
   Heuristic Evaluation vs.
        User Testing
• HE is much faster
  – 1-2 hours per evaluator vs. days-weeks
• HE doesn’t require interpreting users’
  actions
• User testing is far more accurate (by
  definition)
  – Takes into account actual users and tasks
  – HE may miss problems and find “false
    positives”
• Good to alternate between HE and user
  testing
      Results of Using HE
• Single evaluator achieves poor results
  – Only finds ~35% of usability problems
• 5 evaluators find ~75% of usability
  problems
  – expected problems found = N(1-(1-λ)^n), where
    N is the total number of problems, λ the fraction
    one evaluator finds, and n the number of evaluators
• Why not more evaluators? 10? 20?
  – Adding evaluators costs more
  – Many evaluators won’t find many more
    problems
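The diminishing-returns curve can be evaluated directly from the formula. Here λ = 0.31 (roughly the average fraction of problems one evaluator finds, per Nielsen's published data) is used as an illustrative assumption; the exact value varies by study.

```python
# Expected fraction of all N problems found by n independent
# evaluators, each finding a fraction lam on their own.
def fraction_found(n: int, lam: float = 0.31) -> float:
    return 1 - (1 - lam) ** n

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} evaluators -> {fraction_found(n):.0%} of problems")
```

Going from 5 to 15 evaluators adds only a few percentage points, which is why the slide recommends small evaluator groups.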
      Decreasing Returns
[Graphs: problems found and benefits/cost ratio vs. number of evaluators]
• Caveat: graphs are for a specific example
    Thank You