User Interface Evaluation
Latest revision as of 14:58, 17 May 2011
Detailed evaluation of the human machine interface is an effective method for designing a better user experience. Modern web applications are expected to deliver a high degree of usability. User participation within the development process is essential to meet those expectations. To balance the developer’s perspective, proven methods such as workshops and usability tests are applied to ensure the necessary degree of user participation in interface development.
Interfaces built at the MPDL are subject to evaluation. The main focus is on all interfaces of the current PubMan solution. The User Interface Engineering team takes different approaches to evaluation: usability interviews and tests, GUI workshops, and software-based evaluation.
Usability Interviews (Thinking Aloud)[1]
Interface releases are tested together with potential users from institutes of the Max Planck Society. They perform tasks covering important application functionality. An interviewer tracks feedback and observes user interactions; all issues are noted down. Results are aggregated once at least 8-11 interviews have been performed. A summary concludes the interview series and leads to measures and changes in the interface.
The following table shows what is needed to conduct an interview at an institute.
Setup

Participants | 1
Supervisor | 1 plus 1 visitor (optional, usually developers)
Equipment |
Duration | 45 min
Results | Document with notes on user actions for each action/step. The results are analysed after 8-11 interviews to produce a statistic of GUI issues, ordered by functional area and task.
Steps |
Usability interviews are conducted to discover, document and classify usability issues for
- a specific release (e.g. PubMan 2.0.0.1)
- a functional prototype[2]
- a prototype draft
Up to 8 interviews are usually sufficient to discover the main flaws of a GUI. The interviewer only provides the tasks; the participant works through them alone and should be encouraged to comment on his or her actions. The interviewer should not interfere in any form. Only if the participant gets stuck and important steps remain does the interviewer give a hint on how to continue.
Interviews can be recorded optionally if participants agree. If a participant is not able to solve a task or step it will be noted as a fatal usability issue. For later interviews with a more standardized set of tasks the issues are to be classified in the following way:
- Fatal (Task could not be finished successfully or in a proper way)
- Serious (Task could not be finished on the first attempt or user performance is bad)
- Minor (The user hesitates, is not sure or does not feel comfortable with the flow)
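The classification scheme above can be expressed as a small data model. The following Python sketch is illustrative only; the `classify` helper and its parameters are assumptions for demonstration, not part of any UIE tooling:

```python
from enum import Enum
from typing import Optional

class Severity(Enum):
    FATAL = "Fatal"      # task could not be finished successfully or in a proper way
    SERIOUS = "Serious"  # task not finished on the first attempt, or poor user performance
    MINOR = "Minor"      # the user hesitates, is unsure, or feels uncomfortable with the flow

def classify(finished: bool, attempts: int, hesitated: bool) -> Optional[Severity]:
    """Map a single observation to the severity scheme above (hypothetical helper)."""
    if not finished:
        return Severity.FATAL
    if attempts > 1:
        return Severity.SERIOUS
    if hesitated:
        return Severity.MINOR
    return None  # no usability issue observed for this step
```

Encoding the scheme this way makes later interviews with a standardized task set directly comparable.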
For each interview the UIE team provides an interview outcome. The document contains all tasks, observation notes and comments of the participant. The interviews also collect demographic characteristics to prepare the data needed for personas.
Interview Form Example (excerpt)
Submission | Participant 01 | Participant 02 | Participant 03
Nr. | Task / Step | Status | Issue Comment (Observation) | Type | Status | Issue Comment (Observation) | Type | Status | Issue Comment (Observation) | Type
1 | Submit a publication item (detailed)
1 | clicks Submission in the main menu | OK | OK | No; chose it from the start page content | OK
2 | clicks on Full Submission | OK | Automatically chosen | OK | Easy Submission | OK
3 | clicks a collection (optional) | OK | Automatically chosen | OK
4 | identifies group basic | OK
5 | selects genre = journal article | OK | OK
6 | fills title | OK | OK
7 | identifies group person & org | OK | OK
8 | enters author name, role | OK | OK
9 | clicks on select or types manually | Serious | Didn't use affiliation input (mixed up with source) | Procedure | OK | After automatic detection all authors must be entered
10 | clicks an organization in tree (optional) | Serious | Used "Help" to get information (not helpful) | Procedure
11 | identifies group details | OK | OK
12 | fills a date | OK | OK
13 | identifies group source | Serious | Mixed up source data and publication data | Procedure | OK
14 | fills genre and title | Minor | Was looking for the "Sternderl" (little star) | Position | OK
15 | clicks on the button 'Submit' | Validation cycle! (affiliation missing) | OK
End | OK | OK
2 | Upload 2 different files
1 | identify group 'File' | OK | OK
2 | click on Browse | OK | OK
3 | click on Upload | OK | OK
4 | enter file properties | OK | OK
5 | operate 'Add' | OK | OK
6 | repeat actions two times | OK | OK
End | OK | OK
3 | Save locator/file
1 | type URL in input field | OK | OK
2 | click upload link | OK | OK
3 | choose content category | OK | OK
End | OK | OK
4 | Use item as template
1 | go to workspace | OK | Hard to find the way back from Submission | OK
2 | click on title to enter item version view | Serious | Did not realize the title is linked | Procedure | OK
3 | click on 'use as template' | OK
End | OK | OK
5 | Add 2 organizations for an author
1 | OK | OK
2 | OK | OK
3 | OK | OK
End | OK | OK
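Form data like the excerpt above lends itself to simple tallying into the statistic mentioned earlier. A minimal Python sketch, using a few observations taken from the table (the tuple layout is an assumption, not the actual form format):

```python
from collections import Counter

# (task, step, participant, severity, issue type) -- layout is illustrative
observations = [
    ("Submit a publication item", 9,  "P01", "Serious", "Procedure"),
    ("Submit a publication item", 10, "P01", "Serious", "Procedure"),
    ("Submit a publication item", 13, "P01", "Serious", "Procedure"),
    ("Submit a publication item", 14, "P01", "Minor",   "Position"),
    ("Use item as template",       2, "P01", "Serious", "Procedure"),
]

# Per-severity and per-task counts form the statistic produced after 8-11 interviews.
by_severity = Counter(o[3] for o in observations)
by_task = Counter(o[0] for o in observations)
```

Ordering `by_task` by count then surfaces the functional areas with the most usability issues first.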
GUI Workshops
For GUI changes, enhancements or new GUIs, interface drafts can be discussed and shaped together with potential users. The most difficult part is finding at least 6 people who are interested and skilled enough to take part in interface shaping directly.
Participants should know the basic interface controls and how they operate in the frontend. At the same time they need a clear understanding of the available functionality.
UIE Workshops can help clarify
- how users approach their tasks
- how they expect the interface to behave
- which metaphors they employ
Two kinds of workshops are applied if necessary:
Setup

Participants | 5 - 8
Supervisor | 1 - 2
Equipment | Equipment provided by UIE team
Duration | 4 h - 6 h
Result | A rough draft of a graphical user interface part.
Steps |
For each workshop the UIE team prepares a topic.
Software Based Evaluation
An open source heat map application was installed to track user interactions along with their mouse actions. The results did not deliver the required quality of feedback, and the solution could not be extended to more dynamic interfaces. Software-based evaluation has been on hold since then.
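The core of such a heat map is binning click coordinates into grid cells. A minimal Python sketch of that aggregation step (the 20 px cell size and the sample coordinates are illustrative, not from the actual tool):

```python
from collections import Counter

CELL = 20  # heat-map cell size in pixels (assumption)

def bin_clicks(clicks):
    """Aggregate (x, y) click coordinates into grid cells for a heat map."""
    return Counter((x // CELL, y // CELL) for x, y in clicks)

heat = bin_clicks([(12, 8), (15, 9), (305, 410)])
# clicks at (12, 8) and (15, 9) fall into cell (0, 0); (305, 410) into cell (15, 20)
```

Cell counts can then be rendered as color intensities over a screenshot of the page.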
Cognitive Walkthroughs
Cognitive walkthroughs are performed in parallel on all interface drafts and prototypes.
References
- ↑ Wikipedia - Comparison of Usability Methods
- ↑ A functional prototype (Model) (also called a working prototype) will, to the greatest extent practical, attempt to simulate the final design, aesthetics, materials and functionality of the intended design. The functional prototype may be reduced in size (scaled down) in order to reduce costs. The construction of a fully working full-scale prototype and the ultimate test of concept, is the engineers' final check for design flaws and allows last-minute improvements to be made before larger production runs are ordered. Definition retrieved from Wikipedia article on Prototypes
- ↑ FH JOANNEUM Gesellschaft mbH on test procedures (Testverfahren)
- ↑ Wikipedia on wireframes, The Wireframes Magazine