User Interface Evaluation

From MPDLMediaWiki
Revision as of 12:51, 9 January 2008 by Rkiefl (talk | contribs) (edited by Rkiefl via TableEdit)

Interviews
PubMan R2 Interviews
PubMan R3 Interviews

Expert Interviews
Overview

Workshops
Overview
Berlin, 2008: Lists of Authors
Nijmegen, 2008: Easy Submission, ...
Munich, 2007: Short Item View



Interfaces built at the MPDL are subject to evaluation. The main focus is on all interfaces of the current PubMan solution. The evaluation by the User Interface Engineering team takes different approaches: usability interviews and tests, GUI workshops, and software-based evaluation.

Usability Interviews

Setting:

Participants: 5-8
Supervisor: 1-2
Equipment: provided by the UIE Team
Duration: 4-6 h
Result: a rough draft of a graphical user interface part

Steps:
  1. Introduction to a GUI task
  2. Participants work on paper prototypes
  3. Questions and answers
  4. Comparison of results
  5. Clustering
  6. Rework if necessary
  7. Preparation and dissemination of results

Participants: 1
Supervisor: 1
Duration: ~1 h
Equipment: (not yet specified)

Workshops

Initially, the workshop for the short list view was not meant to deliver many valid results, but when I took the pages down from the whiteboard, some things were striking. So I counted them:

  1. Creators are mostly in the first position (18/23 participants).
  2. The title is mostly used as the second field (15/23 participants).
  3. File/fulltext seems to have an outstanding position: it was placed differently, in exposed positions or apart from the list (bottom, right, top).
  4. Most participants prefer a vertical alignment of labels and fields (16/23).

Some participants tried a more sophisticated layout, which might mean that they do not like monotonous listings (just a guess).

Software-based Evaluation

TBD