Book contents
- Frontmatter
- Contents
- Preface: HCI'94 – You Probably Haven't Seen It All Before
- Part I Invited Papers
- Part II Methodology of Interactive Systems Development
- 3 Transferring HCI Modelling and Design Techniques to Practitioners: A Framework and Empirical Work
- 4 The Use of Visual Indexing as an Interview Support Technique
- 5 A Domain Analysis of Air Traffic Management Work can be Used to Rationalise Interface Design Issues
- 6 Manuals as Structured Programs
- 7 Improving Education through Computer-Based Alternative Assessment Methods
- 8 Visual Programming in a Visual Domain: A Case Study of Cognitive Dimensions
- 9 Evaluating Evaluation Methods
- Part III Crafting Interaction: Styles, Metaphors, Modalities and Agents
- Part IV Modelling Humans, Computers and their Interaction
- Part V Notations and Tools for Design
- Part VI Computer-Supported Cooperative Work
- Author Index
- Keyword Index
9 - Evaluating Evaluation Methods
Published online by Cambridge University Press: 04 August 2010
Summary
In HCI the aim of evaluation is to gather information about the usability or potential usability of a system. This paper is principally concerned with evaluating the effectiveness of two discount usability inspection methods in identifying usability problems in a commercial recruitment database system with a complex interface and system functionality. The two inspection methods investigated are heuristic evaluation and cognitive walkthrough. Several comparisons are made between the number, nature and severity of the usability problems highlighted, the time needed to employ the methods, and the ability of each method to generate requirements for re-design. The results indicate that the methods are best considered as complementary and that both should be employed in, but perhaps at different stages of, the design process.
Keywords: evaluation, usability inspection methods.
Introduction
The development of a successful interactive system depends on a formula of iterative design and early and continuous evaluation. However, industry's response to conducting evaluations has been patchy (Johnson & Johnson, 1989; Rosson, Maass & Kellogg, 1988). Many industrialists remark that the reasons for this are the cost of employing evaluation methods and the expertise they require. Another reason is the cumbersome and complex nature of evaluation approaches, especially task analytic approaches such as TAG (Payne & Green, 1986), TAL (Reisner, 1981) and GOMS (Card, Moran & Newell, 1983). Additionally, evaluations are seen as providing information about what is unsatisfactory, but as less useful in generating information that can be used to produce more usable designs and fewer re-designs. Researchers must therefore assess the effect of using current evaluation methods within the industrial development process, and develop future methodologies and tools that require only a limited training period and can be far more easily accommodated within the development process.
- Type: Chapter
- Information: People and Computers, pp. 109–122
- Publisher: Cambridge University Press
- Print publication year: 1994