User Interface Evaluation

From MPDLMediaWiki
Revision as of 08:21, 25 June 2009

Interfaces built at the MPDL are subject to evaluation. The main focus is on all interfaces of the current PubMan solution. The User Interface Engineering (UIE) team takes three approaches to evaluation: usability interviews and tests, GUI workshops, and software-based evaluation.

Usability Interviews (Thinking Aloud)

Interface releases are tested together with potential users from institutes of the Max Planck Society. They perform tasks covering important application functionality. An interviewer records feedback and observes the user's interactions; all issues are noted down. The test results are accumulated once at least 8-11 interviews have been performed. A summary given at the end of the interview series leads to measures and changes in the interface.

The following table shows what is needed to conduct an interview at an institute.

Setup

Participants: 1
Supervisor: 1, plus 1 visitor (optional, usually developers)
Equipment:
  1. PC
  2. Internet connection
  3. Tasklist (provided by UIE)
  4. Test Form (provided by UIE)
  5. Recording Device (optional)
Duration: 1h
Results: Document with notes on user actions for each action/step.

After 6-8 interviews the results are analysed into a statistic listing GUI issues, ordered by functional area and task.

Steps
  1. Short introduction
  2. Participant solves tasks independently
  3. Open questions/discussion

Usability interviews are conducted to discover, document and classify usability issues for

  • a specific release (e.g. PubMan 2.0.0.1)
  • a functional prototype
  • a prototype draft

Up to 8 interviews are usually sufficient to discover the main flaws of a GUI. The interviewer only provides the tasks; the participant performs all tasks independently and should be encouraged to comment on their actions. The interviewer should not interfere in any form. Only if the participant gets stuck and important steps remain do they get a hint on how to continue.

Interviews can optionally be recorded if participants agree. If a participant is not able to solve a task or step, it is noted as a fatal usability issue. For later interviews with a more standardized set of tasks, issues are classified as follows:

  1. Fatal (Task could not be finished successfully or in a proper way)
  2. Serious (Task could not be finished on the first attempt or user performance is bad)
  3. Minor (The user hesitates, is not sure or does not feel comfortable with the flow)
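
This classification lends itself to a simple tally over interview observations. The sketch below is purely illustrative and not part of the UIE tooling; the function name, severity labels and sample data are assumptions based on the list above.

```python
from collections import Counter

# Severity levels as defined in the classification above.
SEVERITIES = ("Fatal", "Serious", "Minor")

def tally_issues(observations):
    """Count observed issues per (functional area, severity).

    `observations` is a list of (functional_area, severity) tuples,
    e.g. transcribed from the interview forms.
    """
    counts = Counter()
    for area, severity in observations:
        if severity not in SEVERITIES:
            raise ValueError(f"unknown severity: {severity}")
        counts[(area, severity)] += 1
    return counts

# Invented observations, loosely modelled on the example form.
notes = [
    ("Submission", "Serious"),
    ("Submission", "Serious"),
    ("Submission", "Minor"),
    ("Workspace", "Serious"),
]
stats = tally_issues(notes)
```

Sorting the resulting counts by functional area and severity yields the kind of issue statistic described above.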

For each interview the UIE team provides an interview outcome: a document containing all tasks, observation notes and participant comments. The interviews also collect demographic characteristics to prepare the data needed for personas.

Interview Form Example (Pullout only)

Submission (for each of Participant 01-03, the columns Status, Issue, Comment (Observation) and Type are repeated)
Nr. Task / Nr. Step / Status / Issue / Comment (Observation) / Type
1 Submit a publication item (detailed)
1 clicks submission in the Main menu OK OK No; chose it from the start page content OK
2 clicks on full submission OK Automatically chosen OK Easy Submission OK
3 clicks a collection (optional) OK Automatically chosen OK
4 identifies group basic OK
5 selects genre = journal article OK OK
6 fills title OK OK
7 identifies group person & org OK OK
8 enters author name, role OK OK
9 clicks on select or types manually Serious Didn't use Affiliation input (mixed up with source) Procedure OK After automatic detection all authors must be entered
10 clicks an organization in tree (optional) Serious Used "Help" to get information (not helpful). Procedure
11 identifies group details OK OK
12 fills a date OK OK
13 identifies group source Serious Mixed up source data and publ. data Procedure OK
14 fills genre and title Minor Was looking for "Sternderl" Position OK
15 clicks on the button 'submit' Validation cycle! (Aff. missing) OK
End OK OK
2 Upload 2 different files
1 identify group ‘file’ OK OK
2 click on browse OK OK
3 click on upload OK OK
4 enter file properties OK OK
5 operate ‘add’ OK OK
6 repeat actions two times OK OK
End OK OK
3 Save Locator/File
1 Type URL in input field OK OK
2 Click upload link OK OK
3 Choose content category OK OK
End OK OK
2 Use item as template
1 go to workspace OK Hard to find from Submission back OK
2 click on title to enter item version view Serious Did not realize the linked title Procedure OK
3 click on ‘use as template’ OK
End OK OK
4 Add 2 organizations for an author
1 OK OK
2 OK OK
3 OK OK
End OK OK

GUI Workshops

[Diagram: UIE Process]

For GUI changes, enhancements or new GUIs, interface drafts can be discussed and shaped together with potential users. The most problematic part is finding at least 6 people who are interested in and skilled enough for taking part in interface shaping directly.

Participants should know about basic interface controls and how they operate in the frontend. At the same time they need a clear understanding of the available functionality.

UIE Workshops can help clarify

  • how users approach their tasks
  • how they expect the interface to behave
  • which metaphors they employ

Two kinds of workshops are applied if necessary:

  1. Card Sorting [1]
  2. Wireframes
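
Card sorting results are commonly analysed via a co-occurrence matrix: how often two cards were placed in the same group across participants. A minimal sketch under that assumption follows; the card labels are invented and the function is not part of the UIE tooling.

```python
from collections import Counter
from itertools import combinations

def co_occurrence(sortings):
    """Given one grouping per participant (a list of groups, each a set
    of card names), count how often each pair of cards landed in the
    same group. Pairs are stored in sorted order for stable keys."""
    pairs = Counter()
    for groups in sortings:
        for group in groups:
            for a, b in combinations(sorted(group), 2):
                pairs[(a, b)] += 1
    return pairs

# Two hypothetical participants sorting four menu labels.
sortings = [
    [{"Submission", "Import"}, {"Workspace", "Search"}],
    [{"Submission", "Import", "Search"}, {"Workspace"}],
]
matrix = co_occurrence(sortings)
```

Card pairs with high counts are candidates for living under the same menu or page; "Import" and "Submission" were grouped together by both participants here.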

Setup

Participants: 5-8
Supervisor: 1-2
Equipment: provided by the UIE Team
Duration: 4h-6h
Result: A rough draft of a graphical user interface part.

Steps
  1. Introduction to a GUI task
  2. Participants work on paper prototypes
  3. Questions and answers
  4. Comparison of results
  5. Clustering
  6. Rework if necessary
  7. Preparation plus dissemination of results

For each workshop the UIE team prepares a topic.

References

  1. FH JOANNEUM Gesellschaft mbH on test methods: http://dmt.fh-joanneum.at/projects/ebus2/seite.php?name=Card_Sorting

Software Based Evaluation

An open source heat map application was installed to track user interactions along with their mouse actions. The results did not deliver the required quality of feedback, and the solution was not extensible towards more dynamic interfaces. Software-based evaluation has been on hold since then.
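
The idea behind such a heat map can be sketched in a few lines: logged click coordinates are binned into a coarse grid and counted per cell. This is a minimal illustration, not the application that was evaluated; all coordinates are invented.

```python
from collections import Counter

def bin_clicks(clicks, cell=100):
    """Map (x, y) pixel coordinates to grid cells of `cell` pixels
    and count clicks per cell; dense cells are the 'hot' regions."""
    grid = Counter()
    for x, y in clicks:
        grid[(x // cell, y // cell)] += 1
    return grid

# Four invented clicks; two land near the origin, two around x=250.
clicks = [(12, 34), (80, 90), (250, 40), (260, 55)]
heat = bin_clicks(clicks)
```

Rendering the counts as colours over a page screenshot gives the familiar heat-map view; the dynamic-interface problem noted above arises because coordinates stop being meaningful once the page content moves.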