Usability Evaluation


A report prepared on behalf of BlueStream:
Media Tools for Knowledge Makers

March 23, 2006

Prepared by:
David Hsiao
Zhengfei Liu
Nika Smith
Maurice Solomon
Executive Summary

Our team conducted a usability evaluation of BlueStream to assess its overall ease of
use, functionality, and visual design. This evaluation is the sixth in a series of usability
activities being conducted on behalf of the BlueStream development team. The findings
from this study will hopefully assist in creating an even more appealing and effective
system.

The usability team conducted one-on-one interviews with five current non-users of
BlueStream from the University of Michigan who work with digital media. Each interview
lasted one hour and was conducted in person in either the GROCS lab in the
Duderstadt Center or in the participantʼs own work environment. Participants were given
six tasks to complete while the test administrator and note taker(s) observed their
interaction with the system. Throughout the tasks, the test administrator remained as
unobtrusive as possible, allowing the participants to work on their own.

The team recorded both the participantsʼ audio commentary and digital video captures
of the screen during the sessions. Further, the team took notes on participantsʼ
interactions with the web site and obtained measurements of time on task, task success
rates, and subjective ratings of the site in post-test questioning. Collectively, these data
were used to compile findings from the evaluation.

The evaluation pinpointed several major areas in which the system can be improved to
better support the needs of non-expert users, who will comprise a majority of the user
base while BlueStream remains a relatively new tool:

   1. There are critical issues with user understanding and motivation to use
      metadata, such as understanding of terms, best use of the categories presented,
      and defined vocabularies.
   2. Terminology often does not correspond to usersʼ own vocabulary, creating issues
      with understanding metadata, functionality, and asset details.
   3. Ambiguity in the search interface options limits usersʼ ability to understand how to
      use it properly to find assets within a reasonable time frame.
   4. The system does not store or provide information about file type, limiting usersʼ
      ability to search or review results by this information.
   5. The menu structure and layout at the top of the screen caused users to miss the
      “add asset” button as well as the asset-level “actions” pulldown.
   6. Knowledge of why to use the system, and examples of how it could be used, is
      needed to catch usersʼ interest and encourage them to use the system often.

Introduction

This report presents a usability evaluation of the BlueStream digital asset management
system. The main purpose of this usability evaluation is to identify strengths and
weaknesses in usability, functionality, and visual design through real user interactions,
informing both the BlueStream development team and future usability evaluation work
with representative users.

The goals of completing this evaluation are to:
   • Understand how potential end-users would approach BlueStream for the first
      time and use it for managing assets
   • Pinpoint areas in which BlueStream succeeds in providing a valuable service to
      its users
   • Pinpoint specific barriers to first-time use of the system that can be remedied by
      improvements in the interface or functionality

Description of BlueStream
BlueStream, formerly known as DAMS, is an asset management system developed,
supported, and funded by the University of Michigan, in partnership with IBM, Stellent,
Virage, and Telestream. BlueStream currently enables academic departments and
researchers to create, manage, store, and share large amounts of digital objects, such
as images, video, and audio.

Target Audience
BlueStream primarily serves faculty, staff, researchers, and students at the University of
Michigan who need to use, store, and manage video, audio, images, and other large
digital media files. These users embody a diverse set of technical skills, functionality
needs, and interaction preferences that BlueStream must support. For instance,
members of this primary audience may be either novice or expert media asset users,
and may prefer to search or browse for assets in vastly different ways.

Currently, the BlueStream user base comprises 115 individual users from 3
academic departments and 11 research projects; however, our team has expanded the
target audience to include potential users, such as UM faculty, staff, researchers, and
students who:
    • Are on the BlueStream mailing list after learning about BlueStream from
       colleagues or through attending a BlueStream presentation
    • Are affiliated with a department that is currently considering adopting BlueStream
    • Use digital assets but manage them with other online tools, such as Mfile or
       Ctools

Further, IT support staff within academic departments are considered a secondary
audience that must work with faculty, staff, researchers, and students to support their
media asset needs for courses and research. While IT support staff are consistent in
their technical expertise, their functionality needs and interaction preferences are as
diverse as the primary audience members they support.

The BlueStream development team, led by Louis King, is currently most interested in
obtaining usability data regarding potential (non-current) users; thus, this report focuses
on understanding the needs and preferences of this portion of the target audience.

BlueStream can be used to hold a wide variety of media assets. Its depth of functionality means that it
must be carefully presented to new users, who may otherwise find it intimidating.

Overview of Evaluation Methods
To complete the evaluation, the team conducted five one-hour usability tests with
prospective BlueStream users during the week of March 13, 2006. Each session was
conducted one-on-one with a test administrator and occurred in either the participantʼs
work location or in the GROCS lab in the Duderstadt Center. The GROCS lab was
chosen as an alternate location due to its accessibility for participants on North Campus
and the usability testing team. Participants used a laptop supplied by the team, with an
external keyboard and mouse attached for comfort. Interactions with the system were
captured using Snap Z Pro screen capture and audio recording software, with audio fed
from an iSight digital camera. In most cases, at participantsʼ request, the digital camera
was not used to record their facial expressions.

Participants were given a series of tasks to complete that the client identified as critical
actions for using BlueStream effectively. The usability team observed participantsʼ
interactions with the system, asking questions only as necessary to facilitate discussion.
The team captured data on time to complete tasks, success rate of task completion, and
subjective opinions via a post-test questionnaire. A note-taker transcribed sessions to
capture participantsʼ comments.

Task Development
The majority of findings from the evaluation came from task completion with the system.
The usability team identified six major task goals that new users should be able to
complete, based on findings from the heuristic evaluation and survey, as well as based
on conversations with the BlueStream development team regarding their most pressing
concerns. The goals are:
   1. Locate an existing asset
   2. Download an existing asset to a personal computer
   3. Add a new asset
   4. Convert an existing asset from one file format to another
   5. Edit an existing assetʼs metadata
   6. Change an existing assetʼs security level

Participants attempted most of these tasks, although in some cases the session time
ran too short to account for all of them. The tasks gave participants a broad view of the
capabilities of the system, as well as exposed them to a diverse set of navigation
options, vocabulary, and design choices inherent in the system. All participants
completed tasks in the same order, as tasks built upon prior knowledge with increasing
complexity.

The complete set of tasks is available in the Test Administratorʼs Guide in Appendix B.

Participant Recruitment
Participants were recruited using personal and professional contacts within the
University of Michigan community, as well as through faculty and staff mailing lists that
the team identified as likely to contain readers who use digital media with some
frequency. Interested individuals were walked through a screening questionnaire (see
Appendix C) to gauge whether they were appropriate participants for the study. The
main qualifications for the study included:

   1. Student, faculty, or staff member of the University
   2. Current use of digital media in some role at the University
   3. Does not use BlueStream to manage digital media

Demographic Data
To maximize the time available for task completion during sessions, the screening
questionnaire included questions that collected demographic data, such as technology
use. These data describe the diverse audience base that BlueStream serves. A summary of
demographic data collected from the screening questionnaire is available in Appendix
D.

Session Completion
Sessions were completed one-on-one in participantsʼ work locations or in the GROCS
lab of the Duderstadt Center. Task completion occurred on a laptop provided by the
usability team, equipped with Mac OSX, the Firefox web browser, and a wireless Internet
connection.

Prior to beginning the sessions, the test administrator cleared the browserʼs cache and
Internet history, and logged into the system using her BlueStream account; this latter
step was necessary given that participants are not current users and thus would not
already have their own accounts to use.

The test administrator began the session by explaining the rights and expectations
during the session and asking the participant to read and sign an Informed Consent
Form (see Appendix A). Next, she explained the order of activities. Participants then
completed tasks, with the test administrator stopping the participants as necessary to
move to other tasks during the remaining time. After the second session it became
evident that the video file referenced in tasks 2a, 4a, 4b, and 4c was no longer
available. For the remaining three sessions, participants were asked to locate a
different video file that was available and that would require the same level of difficulty
in finding as the previous video.

Finally, sessions ended with the participants filling out a post-test questionnaire (see
Appendix E for the questionnaire and Appendix F for a summary of responses) and
verbally responding to some debriefing questions. Sessions lasted one hour and
participants received baked goods from Zingermanʼs Deli as compensation for their time
and effort.
Team Member Roles
All usability team members were present at most sessions. Each member was assigned
a role: test administrator, note-taker, and task timer. When all four team members were
present, two would serve as note-takers to ensure that all members had a role. The test
administrator set up the equipment, provided copies of testing materials and
compensation, greeted the participant, and administered the session. Further, the test
administrator followed the Test Administratorʼs Guide as a script for sessions,
intervening with ad-hoc commentary only when necessary, such as when the participant
required assistance to understand a task. The note-taker(s) transcribed the sessions,
noted participantsʼ reactions, and looked for usability issues that contributed to usersʼ
difficulties in using the system. Note-takers also asked questions of the participants
when necessary to further understand their thoughts or concerns.

Issues identified through task completion, post-test questionnaire results, and verbal
responses to debriefing questions are rated on a three-point severity scale, as
described below:
   • High severity: Issue prevents user from completing the task, or causes
       significant difficulty in completing the task.
   • Medium severity: Issue causes a moderate level of difficulty in completing the
       task; however, user is not completely prevented from task completion.
   • Low severity: Issue creates an annoyance but does not hinder task completion.

This severity scale corresponds to that used in the evaluation of findings from the
previously completed cognitive walkthrough.
Findings and Recommendations

This section provides a list of major usability issues that require immediate attention
from both developers and the usability team. The following table, ranked by severity,
summarizes the issues that are covered in this report:

Number  Issue                                                              Severity  Section
2       Description of some Asset attributes is ambiguous                  High      Vocabulary
5       Users confused by “full text search” versus “search for text”     High      Search
8       Too many metadata categories; unclear which fields are required   High      Metadata
9       Standards / rules for metadata fields                              High      Metadata
10      Top of the screen menu structure leads to confusion                High      System Features
3       Some terminology is too advanced / difficult                       Medium    Vocabulary
4       Names used for actions are confusing                               Medium    Vocabulary
6       The “Item Type” pull-down does not match userʼs expectations      Medium    Search
7       No indication of file type is provided in asset details            Medium    Search
1       Users are not immediately clear on what they can do with the site  Low       Initial Impressions
11      Ingest / Task process is confusing to users                        Low       System Features
1. Task Completion

All participants experienced some difficulty in completing tasks, with a majority of issues
occurring in locating a way to download an asset, understanding how to specify where a
new asset gets saved, and finding the metadata editing form. As discussed in the
remaining sections of this report, success or failure was dependent on the web siteʼs
presentation of information and functionality, especially with regard to understandability
of terminology, ease of locating desired functionality, and ability to predict how the web
site will expect users to get things done.
For instance, participants were generally unable to download an existing asset to their
personal computers or edit its metadata because they did not know that they would
have to click on the asset to view its details and subsequently locate these functions.
Instead, participants expected to find the functionality directly from the search results
page that displayed the asset. With regard to adding assets to the correct collection,
users generally started by clicking the “Add Asset” button, possibly expecting to specify
a destination after the fact.

Table 1 below summarizes task completion rates. Note that Task 3 has been split into
two sub-parts, as it was important to make the distinction between the ability to add an
asset (which most participants were able to do) and add it to the proper location (which
caused significantly more trouble).

                          Table 1: Task Success Rates

Participant   2a: locate  2b: download  3a: add  3b: add asset to    4a:           4b: change   4c: change
              asset **    asset         asset    correct collection  transcode **  metadata **  security **
1             Success     Fail          Success  Fail                Success       Success      Success
2             Success *   Fail          Success  Fail                N/A           Fail         N/A
3             Success     Success       Success  Success             Success       N/A          Success
4             Fail        N/A           Success  Fail                Success       Success      N/A
5             Fail        N/A           Fail     Fail                Success       Fail         Success
Success Rate  60%         33%           80%      20%                 100%          50%          100%

* P2 task 2a: The video to find for this task was no longer available, but the
participant took the right path to find it and comprehended the results appropriately.

** After P2, the team modified these tasks to locate a different video with a
comparable level of difficulty. From P3 onward, the task successes reflect the use of
the new video.

N/A indicates that the participant was not assigned the task.
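The success rates in Table 1 are computed over attempted tasks only, with N/A entries
excluded. The following is a minimal sketch of that calculation, using the outcomes
copied from Table 1; it is an illustration of the arithmetic, not the teamʼs actual tooling.

```python
# Sketch: deriving Table 1 success rates. N/A entries (tasks a participant
# was not assigned) are excluded from the denominator.

def success_rate(outcomes):
    """Percent of Success among attempted tasks; N/A entries are excluded."""
    attempted = [o for o in outcomes if o != "N/A"]
    successes = sum(1 for o in attempted if o.startswith("Success"))
    return round(100 * successes / len(attempted))

# Outcomes per participant for task 2b (download asset), from Table 1:
print(success_rate(["Fail", "Fail", "Success", "N/A", "N/A"]))  # → 33
```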
Further, participants required a significant amount of time as they attempted to complete
most tasks. As a comparison point, one team member who had roughly two monthsʼ
worth of intermittent use of BlueStream completed the tasks to measure semi-
experienced user time on task. These times were compared against the participantsʼ
actual time on task to show how much longer it takes new users to interact with the
system. Based on findings discussed in the remaining sections of this report, it is
expected that time on task was lengthened due to the significant issues participants had
with understanding the web siteʼs terminology, the location of desired functionality, and
interpretation of the web siteʼs response to their interactions.

Table 2 below displays a summary of time to complete tasks:

                           Table 2: Time to Complete Tasks

Participant                          2a: locate  2b: download  3a: add  4a:        4b: change  4c: change
                                     asset       asset         asset    transcode  metadata    security
1                                    3:32        3:00          14:06    0:40       4:25        1:38
2                                    9:25        2:00          7:46     N/A        6:55        N/A
3                                    3:21        0:52          5:29     3:57       N/A         1:43
4                                    4:00        N/A           4:30     4:23       0:45        N/A
5                                    10:15       N/A           5:15     2:51       3:21        1:40
Mean                                 6:06        1:57          7:25     2:57       3:51        1:40
Semi-experienced user time on task   0:14        0:08          1:00     0:26       0:16        0:27
Difference                           5:52        1:49          6:25     2:31       3:35        1:13
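The Mean and Difference rows in Table 2 follow directly from the per-participant times,
again with N/A entries excluded. A small sketch of the arithmetic, with times copied from
Table 2 (illustrative only, not the teamʼs actual tooling):

```python
# Sketch: reproducing Table 2 mean and difference values from mm:ss strings.

def to_sec(t):
    """Convert an mm:ss string to a total number of seconds."""
    m, s = t.split(":")
    return int(m) * 60 + int(s)

def fmt(sec):
    """Format a number of seconds back to mm:ss."""
    return f"{sec // 60}:{sec % 60:02d}"

def mean_time(times):
    """Mean of the attempted times; N/A entries are excluded."""
    secs = [to_sec(t) for t in times if t != "N/A"]
    return fmt(sum(secs) // len(secs))

# Task 2a (locate asset) times for participants 1-5, from Table 2:
print(mean_time(["3:32", "9:25", "3:21", "4:00", "10:15"]))  # → 6:06
# Difference from the semi-experienced user's 0:14:
print(fmt(to_sec("6:06") - to_sec("0:14")))                  # → 5:52
```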

2. Initial Impressions

   • The web site uses a consistent visual design with other University of Michigan
      web applications, especially through the use of the blue-centric color scheme
   • Participants immediately noticed and appreciated the ability to search for assets,
      commenting that this would be an important advantage of using BlueStream
      instead of Ctools, mFile, or other competing products.
   • Participants immediately noticed and became intrigued by the Workbench
      functions, seeing possibility for advanced interaction with digital media provided
      through these buttons.
Usability Issues

Issue #       Issue Description                                            Severity
1             Users are not immediately clear on what they can do with     Low
              the site.

Upon first visiting the site, participants were immediately drawn to the welcome text in
the central content area of the home page. Most read this text entirely and commented
that it did not explicitly tell them what they wanted to know about the system, such as
what functions it provides and why users should use it over competing products.


Because users do pay attention to this content area, it may be better served by
providing a brief bullet-point summary of the systemʼs features and advantages. Links to
more details can also be provided to ensure that users are not overwhelmed with large
chunks of information at once on the home page. Participant 3 suggested that links to
examples of things that can be done with the system would be helpful.
3. Vocabulary

   • Many terms used by the system leverage usersʼ existing computer / file system
      experience well, to maximize familiarity.
   • Terms are used consistently throughout BlueStream, allowing users to become
      familiar with what is meant by a term through gradual association.

Issue #        Issue Description                                               Severity
2             Description of some Asset attributes is ambiguous                High

When faced with providing asset metadata, all the participants complained that there
were too many options to fill out, and they were not clear about the differences between
the vocabulary used to label some asset attributes. The following asset attribute labels
created the most confusion:

“Creator”: none of the participants was sure whether “Creator” refers to the user
uploading the file or to the person who created it. Several participants mentioned that if
it refers to the uploader, the system should be able to auto-fill this field.

“UM Identifier”: participants could not understand what “UM identifier” was. They
assumed it was a unique ID for each asset, but they did not know whether they should
fill it out by themselves, or whether the system would generate the unique ID
automatically after they added the new asset to the system. In addition, there is another
piece of metadata called “Identifier”, with no hint as to the difference between this and
“UM Identifier.”

Finally, none of the participants understood “UM steward”, “coverage” and “asset class”.
Instead of attempting to fill them out, they skipped the fields entirely, assuming they
were not important if they did not understand them.

Study usersʼ vocabulary to determine better terms for those mentioned above that
create confusion. Also consider adding a brief annotation or an example beside those
attributes, so users can understand what each attribute means.
See section 5, Metadata, for further discussion.
Issue #       Issue Description                                            Severity
3             Some terminology is too advanced / difficult                  Medium

In general, participants did not recognize many of the specialized words used by the
system, such as asset, ACL, and UM identifier. These words are not in participantsʼ
vocabulary, and the unfamiliar terminology left participants uncertain about the
system when completing tasks. For participants who are not familiar with file
management systems, these terms create significant confusion during first-time use.

For example, the word “asset” is understandable, but participants mentioned their
preference to use other more straightforward words, such as “file”.
     “I can understand the word (asset), but I am not sure what other users might
    think about it.”

                            -Participant who frequently works with media files

The phrase “Time metadata channel” is also not intuitive. Most participants did not
understand the purpose of this search item even though they checked the dropdown
list; as a result, they chose to ignore this search item when they tried to finish the
asset-locating task.

Few participants understood what “ACL” meant or what it was short for. Searching by
ACL is one of the search items, but there is no annotation for ACL, nor a dropdown list
for this item.

ACL in “Edit Metadata” panel

ACL in search panel: no annotation, no dropdown list

One possible solution is to replace the terminology with commonly used words. For
instance, instead of using the abbreviation ACL, the system could use “File access
control level”, and then provide a pulldown menu for this search item.

If the current terminology is necessary, another solution to make it understandable is to
provide annotations. The help could include a separate vocabulary for the terminology,
with a link from the initial page; also, as most other applications do, BlueStream could
display a short highlighted annotation on the screen when the userʼs mouse or cursor
hovers over a term for a short period of time.
Issue #        Issue Description                                               Severity
4             Names used for actions are confusing                             Medium

When participants tried to locate targeted files by browsing, they were confused about
the difference between the “Browse” and “Explore” tabs in the left panel. The categories
under Browse are organized by subject, while the categories under the Explore tab
map to the departments and schools of the University of Michigan; however, participants
were not able to distinguish the two tabs merely by looking at the titles.

Further, neither tab helped participants locate the targeted file. During the
usability test, when users tried to find the media file with the title of “oral exam”, some of
them chose to locate the file by browsing, but found that most of the folders supplied
under Browse were empty. This hindered task completion significantly.

“Transcode” is another action term that was not familiar to the participants. When
participants were asked to convert the media file they had just uploaded from
QuickTime to Windows Media format, they could find the transcode option in the Actions
dropdown list, but most of them hesitated to select it because they were not sure that
transcode was the command to convert the format until they had reviewed the rest of
the options.

In addition, participants were uncertain about the meaning of the action label “ingest”.
All five participants said they had never seen the word in other systems or software
before. Half of the participants assumed this term meant that the file they had just
worked with would not be ready immediately because the system was too busy to
process the work right away.

    “It is sad, the system doesnʼt think my file is important. I am going to find a
    way that would make system to process my task right away!”

                          -Participant dissatisfied with the concept of “ingest”

Locating asset by Browse

Locating asset by Explore

Task ingest
To address the confusion about the terms “Browse” and “Explore”, consider modifying
collection categories to correspond with UM academic units, so users are clear on where
to find the target file without looking through whole categories. One participant suggested
BlueStream use the Yahoo! browse categories as a reference.

There are thousands of files in the system, but few of them are classified under a
particular subject or a UM department folder. Another possible solution is to create
a required attribute for adding a new asset, ensuring that every newly added asset is
classified into a given subject or a UM department folder.

Consider using an alternate action term to replace “ingest” to make the status of their
assets clearer to users, such as “processing of (filename).”

4. Search

   • Search panel is always visible on the left side of the screen. Users quickly learned
      its location, and understood to use it when asked to locate a specific asset
   • Participants liked having multiple options for locating assets available in this
      panel
Usability Issues

Issue #       Issue Description                                              Severity
5             Users confused by “full text search” versus “search for text” High

Four participants out of five commented that they did not understand the difference
between the “full text search” and “search for text” options in the Search panel. There is
no instruction on the search panel explaining the difference between these options. Only
Participant 4 interpreted “full text search” as searching within the text of a document,
whereas “search for text” looks for search terms in the metadata.


Informing users of the difference between these two types of search fields can help them
form better search queries. The system can provide a brief description of where
“full text search” searches and where “search for text” searches. Allowing users to
understand how these searches work will enable them to use search more efficiently.
Additionally, reducing the uncertainty and guessing that users encounter will increase
overall satisfaction with the system.

Issue #       Issue Description                                               Severity
6             The “Item Type” pull-down does not match userʼs expectations    Medium

Users seemed to confuse “item type” with “file type,” or to use the two phrases
interchangeably. From the user tests we found that the two item types offered are not
useful to participants when conducting a search. The participants did not know exactly
what these two choices meant, and they did not know what to expect under each choice.
Without prior knowledge about the choices, this pull-down menu is not helpful in
narrowing down searches.

When performing the task of searching for a video, participants actually expected to see
“file type” when they clicked on “item type.” They were looking for a choice that could
narrow down the search to only videos, but there was no such choice, and “item type”
seemed to be the closest alternative.
          “When I pull down the item type, I was expecting to find the specific file type,
          something like document, audio, video, or powerpoint, I donʼt know what the
          options in the item type dropdown are, but I guess they are about topics.”

          “Itʼs surprising to see ʻUM Generalʼ and ʻPerformance Artsʼ under here.”


To make the search more useful, the system should provide a way for users to search
by file type. Within this field users should be able to select the different file types that the
system supports, such as audio, video, PowerPoint, or image. Providing these options
in a listbox will allow users to select multiple criteria for their search.
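As a sketch of how the recommended filter could behave, the snippet below narrows a
result set to the file types a user selects in a multi-select listbox. The asset fields and
type names here are hypothetical illustrations, not BlueStreamʼs actual data model.

```python
# Sketch of the recommended file-type filtering. The "title" and
# "file_type" fields are hypothetical; BlueStream's data model may differ.

def filter_by_type(assets, selected_types):
    """Keep only assets whose file_type is among the user's selections."""
    return [a for a in assets if a["file_type"] in selected_types]

assets = [
    {"title": "oral exam", "file_type": "video"},
    {"title": "lecture notes", "file_type": "document"},
    {"title": "campus photo", "file_type": "image"},
]
# A user selecting both video and image in the listbox:
print([a["title"] for a in filter_by_type(assets, {"video", "image"})])
# → ['oral exam', 'campus photo']
```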

Further, the term “Item Type” should be replaced with more specific labeling that refers
to what is meant by the two options provided below.

Finally, the system should also provide a brief explanation to inform users what “UM
General” and “Performance Arts” mean. Without knowledge of the differences
between them, the “Item Type” criterion is useless, and users will always choose “All
Item Types.”

Issue #          Issue Description                                                     Severity
7               No indication of file type is provided in asset details                 Medium


           “Does it say that this is actually a video? It looks like it is a video but it
          doesnʼt say so.”

                                                                                 -Participant 1

When looking at an assetʼs metadata in the search results, there is no clear description
or indication of file type on the search results page. Some assets have icons indicating
their file type, but others do not (see example 1 below). Further, some files have a fully
descriptive file name with a file extension that indicates the file type, but not all assets
include this information (see example 2).

This inconsistency is especially confusing when differentiating between images and
videos. The system presents a thumbnail for images and a snapshot for videos, and
the two look the same. Users have no clear way to distinguish one from the other,
except that some videos include text transcripts of their speech in the search results
(example 3).


Example 1: Icons show file types.

Example 2: The UM Identifier includes the file extension.

Example 3: The top asset is a video; the bottom one is a picture.
Providing specific information about the file type of each asset in the search results
would solve this problem. Even though the file type can sometimes be inferred from
other metadata, users prefer to grasp it at a glance. For instance, the search results
could list the file type or file extension, and use easily differentiated icons to indicate
video, audio, image, and other file types.
5. Metadata


       •    Users understood some of the basic metadata categories.
       •    All participants saw the need for metadata, and the idea that it makes later
            search much easier and more robust.

Usability Issues

Issue #         Issue Description                                                Severity
8              Too many metadata categories; unclear which fields are required  High

When adding a new asset to the system, users are presented with a long list of
metadata fields to fill out. The fields are neither organized alphabetically nor grouped
by category. Further, every item on the list is spaced identically, regardless of its
semantic similarity to neighboring items.

For example, in the “Performing Arts” metadata, the spacing between Ensemble 1
through 4 is the same as the spacing between Assistant Engineer 2 and Ensemble 1.
Grouping similar items would give users a quick and easy way to skip an entire group
of fields they do not need. With grouped textboxes, users can locate useful information
in “chunks” rather than having to scan all the textboxes one by one in a serial process.

The order of the textboxes is also confusing. The textboxes at the top appear to be the
more important fields, such as “Title,” “Subject,” and “Creator.” Then, without any
notice or dividing sign, the list switches to alphabetical order. Ordering unrelated items
together merely because they are alphabetically adjacent is unintuitive and confusing:
the current arrangement requires users to jump between different groups of information
as they fill out the form, for example from “Ensembles” to “Image Filename” and then
back to “Instrument.”

Finally, the instructions at the top of the form indicate that required fields are annotated
with an asterisk (ʻ*ʼ), but no such annotation appears next to any field. In fact, no field
is required at all, though users expect some fields, such as Title, to be required.


       Screenshot annotations: the top fields are ordered by importance, the rest by
       alphabetical order. The instruction indicates that * denotes a required attribute,
       but no field is starred.

Ideally, similar items should be grouped together for easy recognition, especially in a
list as long as BlueStreamʼs metadata entry form. The form for adding a new asset has
more than 30 fields, ordered in a non-intuitive manner with no indication of which, if
any, are actually required.

The most important fields, such as the control to attach the file, should be towards the
top of the screen. Similar items should be grouped together, with space separating
unrelated items; for example, the “Ensemble” fields could be placed close to one
another and further from other textboxes. The ordering should also be consistent: one
suggested scheme is to put all the performer fields together (“Ensemble,” “Instrument,”
“Soloist,” etc.) and all the system-oriented fields together (“Filename,” “Status,”
“Approved for internet,” etc.).
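
The grouping scheme above could be expressed as a small rendering configuration.
The group names and field membership here are our assumptions, based only on the
fields observed during testing:

```javascript
// One possible grouping of the 30+ metadata fields into labeled sections, so
// the form renders related textboxes together. Group names and membership
// are illustrative assumptions, not a definitive taxonomy.
const FIELD_GROUPS = [
  { heading: "Core", fields: ["Title", "Subject", "Creator"] },
  { heading: "Performers", fields: ["Ensemble", "Instrument", "Soloist"] },
  { heading: "System", fields: ["Filename", "Status", "Approved for internet"] },
];

function groupOf(fieldName) {
  // Find the first group that lists this field; unknown fields fall through.
  const group = FIELD_GROUPS.find((g) => g.fields.includes(fieldName));
  return group ? group.heading : "Other";
}
```

A form renderer walking FIELD_GROUPS in order would naturally place the core fields
first and separate performer fields from system fields, as recommended.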

In addition to grouping, the system could list only one field per metadata type (i.e.,
only “Instrument 1”) and let users add more as needed. This way the user is neither
limited to a total of three instruments nor obligated to fill them all, and is not inundated
with multiples of these fields when they are not needed.
Issue #        Issue Description                                                 Severity
9              Standards / rules for metadata fields                              High

The concept of metadata is not yet firmly established in the general user base, and
there is no consensus or standard for entering it. Alarmingly, many assets in the
system presently have little or no metadata, such as a title, creator, or description. Two
observations from user testing likely explain this: users skip items they donʼt
understand, and users assume none of the metadata fields are actually required. The
lack of meaningful metadata significantly undermines its intended value and makes it
harder to locate assets efficiently.

One method of improving the value of metadata is to empower users with the
knowledge of what each type of metadata represents. Both users searching for and
users adding assets should agree on the standard meaning of the metadata provided in
the system. Without this agreement, metadata becomes less useful and usersʼ ability to
use the system is hindered.

One participant commented that he did not know whether the “Title” in the search pane
referred to the title of the presentation, document, or the file name. When filling out the
metadata information during the process of adding a new asset, all participants also
commented on not knowing what each metadata field meant. The presence of vaguely
defined metadata increases the likelihood that users may interpret the terms differently
and thus enter information differently.

In both the search pane and the add-asset page, the system could provide a brief
definition for each field, or at least have an explanation ready (i.e., context-sensitive
help) for when users are confused. Some metadata fields should be restricted to
controlled vocabularies or other closed lists of terms. Without consistent definitions
and standards, searching for assets becomes a guessing game based on luck. The
system could use JavaScript to present the definition of each field when users mouse
over its label.
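
That mouse-over help could be sketched as follows. The glossary entries and the
event wiring shown in comments are illustrative assumptions; the real wording would
come from the BlueStream team:

```javascript
// Sketch of the suggested mouse-over help: a small glossary of field
// definitions, looked up when the user hovers a field label. The definition
// text below is placeholder wording, not BlueStream's official vocabulary.
const FIELD_DEFINITIONS = {
  "Title": "The human-readable name of the work, not the file name.",
  "Creator": "The person or group primarily responsible for making the asset.",
};

function definitionFor(fieldLabel) {
  return FIELD_DEFINITIONS[fieldLabel] ||
    "No definition available for this field.";
}

// In the page itself, this could be wired up roughly as:
//   label.addEventListener("mouseover", () =>
//     tooltip.show(definitionFor(label.textContent)));
```

Because both the search pane and the add-asset form would read from the same
glossary, searchers and contributors would share one definition of each term.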

6. System Features

      • The systemʼs functionality seems easily learnable. Participants were able to
          build on earlier tasks to complete later tasks. For example, once participants
          figured out how to edit metadata, they were able to easily find where to alter
          access controls.
      • Users were able to quickly determine what each menu item or area of the site
          did, based on its layout.

Usability Issues

Issue #       Issue Description                                            Severity
10            Top of the screen menu structure leads to confusion           High
Even during their short time with the system, participants came to understand the
second row of menu options – the darker, brighter blue row labeled Workbench – as
containing the most frequently used options. This caused problems when attempting to
add a new asset, because users thought of adding an asset as one of these frequently
used options, yet the function was not available in this area.

For example, Participant 4 looked in the Workbench for the Add Asset command,
reasoning that it was a frequent task. After noticing that the function actually was
located in the global bar at the top of the page, he commented that Add Asset didnʼt
group logically with the other functions there, such as Help or Log Out. Further, both
Participant 2 and Participant 5 tried using “Submit to Workflow” to add an asset, with P5
going all the way to the help system, without noticing the Add Asset button.

After the test, participants cited the lack of contrast at the top of the screen as the
main reason the options there didnʼt catch their eyes. This top bar is the lightest part of
the screen, and the blue-on-blue color scheme caused most participants trouble. In
fact, it is fairly common for users to ignore the very top of web pages, in part because
of the way advertising is usually presented on the web. In the literature this is referred
to as banner blindness, and it is known to affect even sites without ads.

Additionally, users went to the menu area at the top of the screen to perform asset-
specific actions, instead of using the “Actions” pull-down menu embedded in the asset
details content area. For example, Participant 5 started with the top menu bar when
asked to transcode an asset. Participant 3 started with the dark blue bar, indicating its
centrality, and after searching a bit, moved to the light blue top bar. She did not think
to look next to the asset for the “Actions” pull-down. Additionally, for some assets with
a larger preview image, this pull-down falls off the screen, exacerbating the userʼs
difficulty in locating it.

       Screenshot annotations: global functions appear in the top bar, where users did
       not expect to find New Asset. Participants focused on the Workbench for most
       tasks, but did not understand that these are low-level functions available only
       once an asset or collection has been selected.
Improving the color palette of the global navigation bar would help the important icons
at the top of the screen ʻpop outʼ, making them noticeable to users and overcoming
potential banner blindness. In highlighting the icons, Tasks should be differentiated
from Prefs, Help, and Logout, based on their difference in frequency of use and
importance to the user. Add Asset should also be differentiated in both design and
spacing; placing it further to the left, away from the other options, may make it stand
out more.

There should be more visual coherence between the Workbench and the Asset Details
content area of the page. Participants thought the Workbench items were always
available for use because they were always displayed. This could be improved by
displaying the functions only in the proper mode, such as when viewing an asset or a
collection. Further, the Actions dropdown should be included in the Workbench, since
it is also asset-related. This would centralize almost all actions the user takes along
the top, much as search is centralized along the left side, and contribute to coherence
for the user.

Issue #       Issue Description                                                  Severity
11            Ingest / Task process is confusing to users                        Low

While the “Task” functionality in BlueStream is robust and allows the system to handle
batch tasks and requests from users in parallel, the novice participants experienced
some frustration with it. On a positive note, users quickly realized that the Task dialog
does not refer to actions to be performed on assets, but instead tells “you what you
have already done so far,” in the words of Participant 3.

However, in addition to the issues raised in the vocabulary section, users were confused
by and frustrated with the completion times associated with system tasks, such as the
time it would take for an asset to be ingested. Some participants interpreted the next
day scheduled completions as a flaw with the system, or evidence that they had done
something wrong. Participant 2 commented that she would want to know that the
system would notify her when her task was done.

      “It seems like I have to wait until tonight to find out if my file is available.”

        -Participant frustrated with the ingest time provided in the Tasks menu

Participant 1, a sophisticated user of other content management systems, had no
problem understanding the ingest and task process. This suggests the task
functionality should be preserved while giving novice users a little extra guidance to
get comfortable. The Tasks menu should provide information in plain English,
specifying that the request has been received and will be completed at a certain time,
to assure users that they have not done anything wrong. Further, users should be told
that they can check back in the Tasks menu to review the status later, so that they
know what to do while they wait.
The following major usability issues were uncovered through this evaluation:
   1. There are critical issues with user understanding and motivation to use
       metadata, such as understanding of terms, best use of the categories presented
       and defined vocabularies
   2. Terminology often does not correspond to usersʼ own vocabulary, creating issues
       with understanding metadata, functionality, and asset details.
   3. Ambiguity in the search interface options limits usersʼ ability to understand how to
       use it properly to find assets within a reasonable time frame
   4. The system does not store or provide information about file type, limiting usersʼ
       ability to search or review results by this information
   5. The menu structure and layout at the top of the screen caused users to miss the
       “add asset” button as well as the asset-level “actions” pulldown.
   6. Knowledge of why to use the system and examples of how it could be used is
       needed to catch usersʼ interest and encourage them to use the system often

By addressing the areas of concern presented, BlueStream will become easier and
more understandable for both novice and expert users, speeding its adoption by those
who need it most.
Appendix A: Informed Consent Form

This usability evaluation is being performed in order to understand how easy to use and
helpful BlueStream is to users. The BlueStream usability team would like to observe you
as you work with BlueStream to determine how it can be improved.

Informed Consent

I freely and voluntarily consent to participate in this usability evaluation under the
direction of the BlueStream usability team: David Hsiao, Zhengfei Liu, Nika Smith, and
Maurice Solomon.

I understand that my participation is completely voluntary and that I may withdraw my
consent and discontinue my participation at any time without penalty or prejudice to my
business organization or me. I further understand that I will be compensated only upon
completing the session.

I have been given the right to ask questions concerning the procedures to be employed
during this study and to have these procedures explained to my satisfaction.

Audio Recording Release

Audio recordings made during this study will be used for research purposes. I have
been informed that my work during the evaluation will be recorded and may be viewed
only by the BlueStream usability team. I give my consent to use my recorded voice for
this purpose, with the provision that my name will not be associated with the recording.

I have read and understood the foregoing and understand that I will receive a copy of
this form on the day of the study upon request.



Appendix B: Test Administrator’s Guide

Pre-Test Procedure

* Thank you for participating in this study. Today, we will be looking at BlueStream,
which is a University-developed tool to help users manage their digital media. Itʼs on
the web, and we will be accessing it through a standard web browser.

* Weʼve brought our own laptop, but we have plugged in a full size mouse and full size
keyboard to recreate a more desktop feel. We will be capturing the screen during the
test, because it is often difficult to remember everything that occurs during the session
afterwards. We will not be logging keystrokes.

* We want to emphasize that we are not testing you or your abilities – we are testing
the web site. We donʼt expect you to be familiar with it; we are here to learn from you.

* We are interested in all of your feedback on the web site, positive and negative. We
would really like you to think out loud constantly as you go through the system,
commenting on the content, navigation, terminology, and colors. Tell us what you are
looking at, what you are thinking, why you are doing something, etc. This helps us
understand your thought process as you go through the site.

* You may ask questions clarifying the task, but I may not answer questions about the
system itself because we want to replicate, as closely as possible, how you would use
the site if I were not here.

* The evaluation is broken up into 3 or 4 subtasks, and shouldnʼt take more than 30
minutes.

Do you have any questions for us before we begin?

Overview of Session

First Iʼd like to give you an overview of what weʼll be doing today:

Iʼll show you the system and ask for your initial impressions.
Then, youʼll complete a series of tasks. If we are running out of time, I might stop you
and have you move on to the next task. At the end Iʼll give you a questionnaire to fill out
and ask you some debriefing questions. Then, weʼre done.
Task 1: Initial Impressions

Open a browser window and open BlueStream. So, this is the BlueStream system. I
want you to take a minute or two and just look around this opening screen, moving the
mouse over things as well, and tell me your thoughts.

   •     What do you think you can do with this system?

   •     How do you feel about the overall design? Color scheme? Size and type of
         the font used?

   •     What kind of information or functionality do you expect to find under each of
         the buttons and tabs?
Task 2: Locating an Asset

2a Goal: locate a video for dental students to learn about oral exams.
One of the things BlueStream does is store files for later use. I want you to locate a
particular file, a video made for dental students. Itʼs a video that discusses oral exams.

Intended path: Search Tab, Text Search, type in “oral exam”; the video comes up on
the first page of results.

2b Goal: Download the asset to the computer

Now I want you to get this video onto our computer here.

Intended path: Actions button -> Download
Task 3: Add New Asset

3a Goal: Now, I want you to take the video file on your desktop, and add it to the
system. Pretend the asset is a video recording of this test.

Intended path: New Asset button, new page with long attribute form, fill out some of the
form, find attach file box towards the bottom, select file, save form. User is dropped at a
“scheduled for ingest” screen

Are you satisfied that you have added an asset to the system?

3b Goal: I want you to confirm that your asset is in the system

Intended path: Tasks button, and then can see task is complete.
Alternate Path: Search for a word in the assetʼs title.
Task 4: Edit Existing Asset

For the next set of tasks, we will work on editing an existing file in the system, with the
following UM identifier (hand paper to participant with the UM identifier printed on it).

First, locate this video (assist the user with finding this video if he/she has trouble; we
are not testing their ability to find assets anymore).

4a Goal: Transcode Video

This video is in QuickTime format. I want you to convert it to Windows Media format.

Intended path: Actions tab -> Transcode; Factory tab -> Windows Media medium.
Click on the task number to see the task. Click on the Tasks button to make sure itʼs
complete.

4b Goal: Change Metadata

We want to update some of the description of this video file. I want you to change the
name to “Bridge Video”, and make the file read-only.

Intended path: Action tab - > Edit Metadata.

4c Goal: Change access control

Now make the file read-only

Intended path: Type in name, change ACL tab to PublicReadACL

Debriefing Questions

How did you find the system overall?

What did you think of the functionality provided?

Now that you have seen it a bit, what other functions do you expect in such a system?

Were there parts you found confusing?

Did you feel like you knew where to go at all times?

What would have made your experience better or easier?

How did you feel about the help system? (if they used it)
Appendix C: Screening Questionnaire

                           BlueStream Web site
                         Screener for Participants
                         March 13 – March 17, 2006
                           5 participants needed
                   Location: Participantʼs office or TBD
            Duration: 1 hour Compensation: Zingermans treats


Academic Department:       (recruit a mix)

Session Location:      

Session Day/ Time:      

1. What is your role at the university?      

2. Do you work with any of the following digital media file types in your department?

      Digital audio recordings (examples: .mp3, RealAudio, Windows Media)
      Digital video recordings (examples: .avi, RealVideo, Windows Media)
      Digital images (examples: .jpg, .tiff, .gif)
      None (Skip to Termination section)

3. How often do you use digital media in your work at the University?      

4. In what ways do you incorporate these digital media files into your work?

      For research purposes
      To present during lectures
      As a supplement to other teaching materials given during lectures
      For students to use in completing assignments and/or exams

              Please describe       (if not relevant to BlueStream, skip to Termination
              section)
5. What do you currently use to make these digital files available to yourself, and
optionally to others you wish to have access to them, such as colleagues or your
students?
      Personal computer filespace
      Networked filespace
      Content management system
      Which one?      
      BlueStream/DAMS (Skip to Termination section)
      Please describe      

6. How many hours per week do you spend on the computer?      

7. It looks like you qualify as the type of participant we need for this study. The study
session will be videotaped. Only the team working on this project will use the tape and
your name will not be associated with the tape or other data in any way. You will be
asked to sign an informed consent form. Would you be willing to be videotaped?


      Yes
      No (Skip to Termination section)

8. May we conduct the study in your office? We will provide a computer for you to use.

      Yes
      No (we will get back to you about an alternate location)

Screening complete. Refer to schedule to find a timeslot for the participant.


Termination

Unfortunately, we will not be able to use you as a participant for this study. Thank you
for your time.
Appendix D: Responses to Screening Questionnaire

Screener questions: 1: Role at the university; 2: Digital media types used; 3: Frequency
of digital media use; 4: How media files are incorporated into work; 5: Current tools;
6: Hours of computer use per week.

Participant 1: Digital Media Commons
   1. Role: Staff; student focused, encouraging more discussion around literacy in
      rich media
   2. Media types: PowerPoint, audio, video, images
   3. Frequency: Every day
   4. Incorporation: Other: used as tools of communication; provides a framework
      to share file types
   5. Current tools: Ctools, Mfile, personal computer filespace, networked
      filespace; other: Flickr, mBlog, Sitemaker
   6. Computer use: No response

Participant 2: School of Information
   1. Role: Faculty member; studies use of media, searching, and classification of
      information
   2. Media types: PowerPoint, audio, video, images
   3. Frequency: Every day
   4. Incorporation: Incorporates into classes
   5. Current tools: Other: CDs, because the total disk space required for all the
      files exceeds the amount the department is willing to provide
   6. Computer use: No response

Participant 3: School of Information
   1. Role: Student; taking courses on digital media and working with professors
      who conduct research with digital media
   2. Media types: PowerPoint, audio, video, images
   3. Frequency: A few times per week
   4. Incorporation: For personal use and for class assignments
   5. Current tools: Ctools, networked filespace; other: CDs and mini DVs
   6. Computer use: 5-6 hours/day

Participant 4: School of Information
   1. Role: Student and graphic designer
   2. Media types: PowerPoint, audio, video, images
   3. Frequency: Every day
   4. Incorporation: Other: creating media and distributing it to colleagues
   5. Current tools: Personal computer filespace, CMS (Exact Target)
   6. Computer use: More than 40 hours/week

Participant 5: College of Literature, Science and the Arts
   1. Role: Faculty member; records everything encountered in everyday life using
      video and audio, relates these media to poetry, and creates books of works
      from this
   2. Media types: Audio, video
   3. Frequency: Every day
   4. Incorporation: For research purposes, for courses, and for personal pleasure
   5. Current tools: Personal computer filespace, iPhoto, 1-terabyte external hard
      drive
   6. Computer use: More than 40 hours/week
Appendix E: Post Test Questionnaire

Circle the number that seems most appropriate for each question below:

1. Overall, how satisfied were you with the Website?
Very dissatisfied   1   2   3   4   5   6   7   Very satisfied

2. Overall, how easy was the Website to use?
Very difficult   1   2   3   4   5   6   7   Very easy

3. Overall, how easy was it to navigate, find the right buttons to push, and get to the
various parts of the Website to work with assets?
Very difficult   1   2   3   4   5   6   7   Very easy

4. Overall, how clear was the language on the Website?
Very confusing   1   2   3   4   5   6   7   Very clear

5. Overall, how clear were the labels on the buttons on the Website?
Very confusing   1   2   3   4   5   6   7   Very clear

6. How easy was it to add assets to the Website?
Very difficult   1   2   3   4   5   6   7   Very easy

7. How easy was it to modify assets, such as saving to a different file format and
changing the file name?
Very difficult   1   2   3   4   5   6   7   Very easy

8. How interested are you in using this site to manage digital assets in the future?
Very uninterested   1   2   3   4   5   6   7   Very interested

9. What aspects of the Website did you like the most?

10. What aspects of the Website did you dislike the most?

11. How can the Website be improved?
Appendix F: Responses to Post Test Questionnaire

Responses are given on a 7-point Likert scale:
1 = negative response, 7 = positive response.

Questions: 1: Overall satisfaction; 2: Overall ease of use; 3: Overall navigation;
4: Clarity of language; 5: Clarity of labels on buttons; 6: Ease of adding assets;
7: Ease of modifying assets; 8: Interest in using in the future; 9: Aspects most liked;
10: Aspects least liked; 11: Improvements.

Participant 1
   Ratings (questions 1-8): 6, 6, 6, 4, 7, 7, 7, 7
   Most liked: Potential for sharing with other departments
   Least liked: Special formatting shorthand, metadata shorthand
   Improvements: There needs to be a balance of how many paths one can take to
   complete a task. There should be some flexibility, but not too many different
   ways to do the same thing or it gets confusing.

                                                                                                                                                                                                                                                                              Search. Difference between
                                                                                                                                                                                                                                                                              full text search and search
                                                                                                                                                                                                                                                                              for text. There's only 164
                                                                                                                                                                                                                                                                              documents in the system.
                                                                                                                                                                                                                                                                              How do you use Explore?
                                                                                                                                                                                                                                                                              There's nothing in Explore     Make it obvious what mode
                                                                                                                                                                                                                                                                              folders. Metadata suck. It's   I'm in, whether I'm
                                                                                                                                                                                                                                                                              difficult to understand how    searching/browsing or
                                                                                                                                                                                                                                                                              to enter a document. The       entering documents. I
                                                                                                                                                                                                                                                                              workbench doesn't make         assume that people who
                                                                                                                                                                                                                                                                              sense. After using the         search won't also be
                                                                                                                                                                                                                                               I can't recall one thing I     system for 4-5 minutes I       contributing at the same
   2    1                            1                        1                       1                        1                                 2                          N/A                           1                                    liked at all                   wanted to leave.               time.

                                                                                                                                                                                                                                                                              Search options (or options
                                                                                                                                                                                                                                                                              in general) are too many.
                                                                                                                                                                                                                                                                              Some are difficult to
                                                                                                                                                                                                                                                                              understand without
                                                                                                                                                                                                                                                                              explanations. Tasks are not
                                                                                                                                                                                                                                                                              clearly organized. For
                                                                                                                                                                                                                                                                              example, the top "Tasks"
                                                                                                                                                                                                                                                                              option and the lower
                                                                                                                                                                                                                                               Working with a variety of      "Actions" part are kind of
   3    5                            5                        5                       5                        5                                 5                          5                             5                                    media files                    competing for attention     Search

                                                                                                                                                                                                                                                                                                             Presenting information that
                                                                                                                                                                                                                                               Changing file formats is       Uncertainty of why to use      is more contextual (eg video
                                                                                                                                                                                                                                               extremely useful, and          the system and how I fit       files usually have this type
                                                                                                                                                                                                                                               second navigation was pretty   with everyone else using the   of metadata) or based on
   4    5                            5                        4                       4                        5                                 5                          6                             5                                    clear                          system                         basic, intermediate, novice

                                                                                                                                                                                                                                                                                                          Use some simple language
                                                                                                                                                                                                                                                                                                          for actions likely to be
                                                                                                                                                                                                                                                                                                          performed by novices to file
                                                                                                                                                                                                                                                                                                          management applications
                                                                                                                                                                                                                                               Possible (simple) sharing of                               with possibility of application
                                                                                                                                                                                                                                               large complex digital files                                responding to expertise
                                                                                                                                                                                                                                               across university and in      In some cases, no obvious gained by user-- so app will
                                                                                                                                                                                                                                               larger connected community. language for simple, basic     respond to user's increased
                                                                                                                                                                                                                                               Conversion of files from      actions likely to be         competence (learn and
                                                                                                                                                                                                                                               format to format, platform to performed by novices to file adapt to user's experise
   5    3.5                          3.5                      3.5                     3                        6                                 4                          4                             7                                    platform, etc.                management applications      level)
Average 4.1                          4.1                      3.9                     3.4                      4.8                               4.6                        5.5                           5
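The averages in the table are plain means over the participants who answered; the N/A response is excluded from both the sum and the count, which is why question 7 averages four ratings (22 / 4 = 5.5) rather than five. A minimal sketch of that computation, with the ratings transcribed from the table above and None standing in for N/A:

```python
# Ratings per question, in participant order P1..P5; None marks an N/A response.
ratings = {
    "1: Overall satisfaction":         [6, 1, 5, 5, 3.5],
    "2: Overall ease of use":          [6, 1, 5, 5, 3.5],
    "3: Overall navigation":           [6, 1, 5, 4, 3.5],
    "4: Clarity of language":          [4, 1, 5, 4, 3],
    "5: Clarity of labels on buttons": [7, 1, 5, 5, 6],
    "6: Ease of adding assets":        [7, 2, 5, 5, 4],
    "7: Ease of modifying assets":     [7, None, 5, 6, 4],
    "8":                               [7, 1, 5, 5, 7],
}

def average(values):
    """Mean over the non-missing (non-None) responses only."""
    answered = [v for v in values if v is not None]
    return sum(answered) / len(answered)

for question, values in ratings.items():
    print(f"{question}: {average(values):.1f}")
```

Running the loop reproduces the Average column of the table, including the 5.5 for question 7 despite its missing response.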