
User Interface Evaluation

Interviews
* PubMan R2 Interviews
* PubMan R3 Interviews

Expert Interviews
* Overview

Workshops
* Overview
* Berlin, 2008: Lists of Authors
* Nijmegen, 2008: Easy Submission, ...
* Munich, 2007: Short Item View

Interfaces built at the MPDL are subject to evaluation. The main focus is on all interfaces of the current PubMan solution. The User Interface Engineering team uses several evaluation approaches: usability interviews and tests, GUI workshops, and software-based evaluation.

Usability Interviews

Setting:

Participants: 1
Supervisor: 1
Duration: ~1h
Equipment:

Participants: 5 - 8
Supervisor: 1 - 2
Equipment: provided by the UIE Team
Duration: 4h - 6h
Result: a rough draft of a graphical user interface part
Steps:
  1. Introduction to a GUI task
  2. Participants work on paper prototypes
  3. Questions and answers
  4. Comparison of results
  5. Clustering
  6. Rework if necessary
  7. Preparation and dissemination of results

Workshops

Initially, the workshop for the short list view was not meant to deliver many valid results, but when I put up the pages from the whiteboard, some things were striking. So I counted them for you:

  1. Creators are mostly placed in the first position (18/23 participants).
  2. Title is mostly used as the second field (15/23 participants).
  3. File/Fulltext seems to have a special position: it was placed differently, either in exposed positions or apart from the list (bottom, right, top).
  4. Most participants prefer a vertical alignment of labels and fields (16/23).

Some participants tried a more sophisticated layout, which might mean that they do not like monotonous listings (just a guess).
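As a side note, tallies like the ones above can be reproduced mechanically once the whiteboard sheets are transcribed. Below is a minimal sketch in Python, assuming each participant sheet is recorded as an ordered list of field names; the field names and example data are purely illustrative, not the actual workshop data.

<pre>
from collections import Counter

# Hypothetical transcription of the workshop sheets: one list per participant,
# containing the fields in the order they were placed on the paper prototype.
sheets = [
    ["Creators", "Title", "Date", "Fulltext"],
    ["Creators", "Title", "Fulltext", "Date"],
    ["Title", "Creators", "Date", "Fulltext"],
    # ... 23 lists in total for the short list view workshop
]

# Tally which field ends up in the first and in the second position.
first_position = Counter(sheet[0] for sheet in sheets)
second_position = Counter(sheet[1] for sheet in sheets if len(sheet) > 1)

total = len(sheets)
print(f"Creators first: {first_position['Creators']}/{total}")
print(f"Title second:   {second_position['Title']}/{total}")
</pre>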

Software-based Evaluation

TBD