Abstract:
There is a long history of human factors evaluation in support of complex, safety-related instrumentation, including the use of mock-ups, lighting rigs and simulators. There is an established literature on evaluation as part of the development life-cycle, the criteria to use, and how best to plan and document an evaluation. However, apart from long-standing and widely ignored advice on the dangers of laboratory experiments, there is still very little advice on how to achieve results that are scientific, credible and useful, although reliability and validity now receive some attention in a practical context. This paper presents a practical approach to producing supportable results. As such, it is not particularly concerned with specific measures of usability. Rather, it describes quality assurance activities that support the conduct of an evaluation.
Keywords:
computerised instrumentation; human factors; software performance evaluation; software quality; user interfaces; complex man-machine interfaces; development life-cycle; documentation; human factors evaluation; interface evaluation; laboratory experiments; quality assurance; reliability; safety-related instrumentation; usability