By: François Aubin, Jean-Marc Robert, and Daniel Engelberg
1. Overview and Purpose
The paper proposes a structured, systematic method for transforming task analysis directly into user interface design, thereby reducing reliance on intuition and individual judgment in the design process.
By integrating human factors guidelines into Hierarchical Task Analysis (HTA), the authors aim to create a consistent, traceable process that can be partially automated, accelerating interface development.
2. Foundations
Task Analysis and Usability
Traditional methods such as GOMS (Goals, Operators, Methods, Selection rules) are useful for modeling user goals and predicting performance metrics (e.g., task execution time), but insufficient for specifying interfaces.
The authors therefore extend HTA by systematically linking task operators (the cognitive actions of users) to human factors principles, interface objects, and interaction techniques.
Objective
The goal is to ensure that every cognitive task (e.g., compare, recognize, decide) can be consistently mapped to an interface behavior and object, grounded in ergonomic and cognitive design principles.
3. Task Operators and Interface Mapping
The method introduces a taxonomy of task operators, each connected to empirical interface object guidelines derived from engineering psychology literature (Wickens, 1992).
Each operator represents a cognitive goal, independent of specific tools (e.g., “select” instead of “click mouse”).
Examples (from Table 1)
Task Operator | Definition | Example Human Factors / Interface Object Guideline
---|---|---
Compare (approximate) | Examine two or more objects to find rough similarities or differences. | For quantitative values, best results are obtained with two linear scales aligned on the same baseline. |
Compare (exact) | Examine two or more objects to find precise differences. | For quantitative values, text is superior to graphics. |
Discriminate | Examine objects to discover differences. | For graphic objects, graphics are superior to text. |
Recall | Bring an object from long-term memory back to awareness. | For all types of objects, automate the task if possible. |
Recognize | Relate a perceived object to a stored memory of that object or class. | For abstract objects, text is superior to graphics. |
Scan | Survey a complex object by glancing over its elements. | For quantitative values, orient items in a list or table. |
Detect | Discover the presence of an object or a property (signal detection). | For graphic objects, automate detection if the signal-to-noise ratio is high; otherwise, assist the human with highlighting or visual cues. |
Calculate | Perform a mathematical operation on two or more objects. | For quantitative values, automate the task. |
Select | Choose from among several objects. | For all types of objects, use a written menu with shortcuts. |
Enter (text) | Enter text into the system. | For text, use the keyboard; minimize unnecessary data entry. |
Integrate | Combine several objects into a coherent whole. | For graphic objects, display and allow direct manipulation. |
Correlate | Evaluate similarities in trends among multiple objects. | For quantitative values, automate the correlation. |
Judge | Form an opinion by weighing evidence. | Allocate to humans; avoid premature automation. |
Decide | Resolve uncertainty and determine an outcome. | Present cues simultaneously to avoid bias; if uncertainty persists, allocate to human judgment. |
These mappings make cognitive operations directly actionable in the interface design process.
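One way to see how such a mapping becomes "actionable" is to encode it as a lookup table. The sketch below is a hypothetical Python encoding of a few Table 1 entries, not the paper's own notation; the function name `guideline_for` and the fallback to an `"any"` object type are illustrative assumptions.

```python
# Hypothetical encoding of part of the operator-to-guideline mapping (Table 1).
# Keys are (task operator, object type); values are guideline text.
GUIDELINES = {
    ("compare_approximate", "quantitative"):
        "Use two linear scales aligned on the same baseline.",
    ("compare_exact", "quantitative"): "Prefer text over graphics.",
    ("recognize", "abstract"): "Prefer text over graphics.",
    ("calculate", "quantitative"): "Automate the task.",
    ("select", "any"): "Use a written menu with shortcuts.",
}

def guideline_for(operator: str, object_type: str) -> str:
    """Look up a guideline, falling back to an 'any' object-type entry."""
    return GUIDELINES.get(
        (operator, object_type),
        GUIDELINES.get((operator, "any"), "No guideline recorded."),
    )

print(guideline_for("select", "text"))        # resolves via the 'any' entry
print(guideline_for("calculate", "quantitative"))
```

A designer (or a tool) querying this table for each operator in the task model gets a consistent, traceable guideline for every cognitive step.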
4. Design Process Phases
(a) Task Optimization and Allocation
Before interface mapping, tasks are optimized (remove redundancies, add functions) and allocated between human and machine according to capability and reliability.
(b) High-Level Design
Defines the main screens, storyboards, and interaction styles.
Three empirical principles guide this phase:
- Each screen must contain all the information needed for its task, and only that information.
- The organization of screens must mirror the task structure.
- Screens are organized by priority and frequency; secondary tasks go to secondary windows.
This phase supports early empirical usability testing.
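The principle that screen organization mirrors task structure can be sketched concretely. In the sketch below, every detail (the `Task` class, the one-screen-per-top-level-task rule, the example task names) is an illustrative assumption; the paper states the principle, not this algorithm.

```python
# Illustrative sketch: derive a screen outline from a hierarchical task tree.
# Assumption for demonstration: each top-level task becomes a primary screen,
# and its sub-tasks become that screen's sections.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    subtasks: list["Task"] = field(default_factory=list)

def screen_outline(root: Task) -> dict[str, list[str]]:
    """Map each top-level task to a screen listing its sub-task sections."""
    return {t.name: [s.name for s in t.subtasks] for t in root.subtasks}

serve_customer = Task("Serve customer", [
    Task("Check identity", [Task("Recognize name"), Task("Compare address")]),
    Task("Check payments", [Task("Scan payment history")]),
])

print(screen_outline(serve_customer))
```

Because the screen structure is derived from the task tree rather than invented screen by screen, the second principle (organization mirrors task structure) holds by construction.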
(c) Detailed Design
Specifies spatial layouts, messages, and interaction techniques through a direct mapping among:
- Task operators
- Objects
- Human factors guidelines
- Dialogue specifications
- Interaction methods
Rules:
- Interface behavior follows from task operators and domain standards.
- Implementation (widgets, layouts) follows from interface behavior and system environment.
- High-level design consistency is maintained.
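The first two rules describe a two-stage mapping: task operator and object type determine an interface behavior, and that behavior plus the system environment determine the concrete widget. The sketch below illustrates this separation; all table contents, including the behavior and widget names, are hypothetical examples rather than values from the paper.

```python
# Sketch of the two-stage rules in detailed design:
# (1) task operator + object type -> interface behavior
# (2) interface behavior + system environment -> concrete widget
# All entries are hypothetical, for illustration only.
BEHAVIOR = {
    ("select", "any"): "menu_choice",
    ("enter", "text"): "text_entry",
    ("compare_exact", "quantitative"): "side_by_side_text",
}

WIDGET = {
    ("menu_choice", "desktop"): "dropdown with keyboard shortcuts",
    ("menu_choice", "terminal"): "numbered menu",
    ("text_entry", "desktop"): "single-line text field",
}

def widget_for(operator: str, object_type: str, environment: str) -> str:
    """Resolve operator -> behavior -> widget, with an 'any' object fallback."""
    behavior = (BEHAVIOR.get((operator, object_type))
                or BEHAVIOR.get((operator, "any")))
    if behavior is None:
        raise KeyError(f"No behavior rule for {operator}/{object_type}")
    return WIDGET[(behavior, environment)]

print(widget_for("select", "text", "desktop"))
```

Keeping the two stages separate is what lets the behavior layer stay stable while the widget layer varies with the implementation environment, as the second rule requires.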
5. Example: Customer Service Application
In a utility company’s customer service system:
- A representative performs tasks such as “Check identity”, “Check payments”, “Check consumption”, and “Compare last year”.
- The high-level design defines screens aligned to these tasks.
- The detailed design maps each operator (e.g., discriminate, recognize, select) to interface elements such as text fields, icons, and buttons.
This ensures logical and ergonomic consistency across the workflow.
6. Discussion and Implications
The proposed framework:
- Provides a formal and practical link between human factors guidelines and task modeling.
- Has been validated in eight commercial projects (banking, finance, customer service).
- Can potentially standardize and automate parts of UI design.
Future research aims to:
- Extend the taxonomy of task operators to new domains.
- Classify and test additional operators.
- Refine mappings with updated cognitive and ergonomic data.
7. References
- Aubin, F., Robert, J.-M., & Engelberg, D. (1994). From Task Analysis to User Interface Design. Proceedings of the 12th Triennial Congress of the International Ergonomics Association, Toronto.
8. Key Takeaways
- The paper bridges the gap between task analysis and interface design through a cognitive-ergonomic mapping model.
- It formalizes what was previously intuitive, creating a traceable, semi-automatable pipeline from user tasks to interface specifications.
- This framework remains foundational in cognitive engineering, usability design, and model-based UI generation.