Interface Design Evaluation & Assessment
Building an Heuristic Evaluation Team in the CSU

Initiatives 1, 2, and 3

(Version 2)


Rachel S. Smith, Senior Interface Designer
CSU Center for Distributed Learning | | voice: 707.793.9336

Jay C. Rees, Web Systems Coordinator
Cal State San Marcos | | voice: 760.750.4774

The team would like to thank Cher Travis Ellis for her work on the proposal.


Lou Zweier, Acting Director
CSU Center for Distributed Learning | | voice: 707.664.4337
for Rachel S. Smith

Norm Nicolson, Dean of Instructional and Information Technology Services
Cal State San Marcos | | voice: 760.750.4775
for Jay Rees



    This proposal requests funding for the creation of a multimedia peer review group (the Interface Design Evaluation & Assessment Group [IDEA Group]) to be trained in heuristic design evaluation. The original group will develop and deliver workshops at regional CATS meetings, encourage and enable others to join, and create a trainer guide and supporting materials. The guide and associated materials will be posted to the CATS-MERLOT website for further dissemination to all academic technology staff.

    The project team will work toward this goal by first becoming trained in conventional heuristic review practices and then expanding that knowledge to incorporate new or changing techniques and technologies. Our next step will be to create a workshop to present at regional CATS gatherings. During these workshops, we will encourage attendees to help form a larger pool of experts who can perform heuristic evaluations for each other and for other designers within the CSU. We will also provide interested attendees with materials so that they may present this training at their own campuses and at other regional gatherings, allowing us to reach as wide an audience as possible.

    Ideally, the pool will grow large enough that no single member would have to spend excessive time evaluating designs. An individual evaluator would be able to perform some, but not all, of the necessary heuristic reviews without adversely impacting their own schedule. (Please see "Heuristic Evaluation," below, for more information on the ideal number of evaluators per review.) Evaluators would perform the heuristic reviews on a volunteer basis; we hope to encourage a spirit of reciprocity among the group.

    This proposal addresses all three of the 2000/2001 TIGERS initiatives. First, it will result in the documentation of a process (performing heuristic evaluations) that may be used by others. Second, the IDEA Group will further their professional development by receiving training in heuristic evaluation and by researching current methods. Third, the IDEA Group will prepare and present a training workshop at one or more regional CATS gatherings, in addition to preparing trainer materials so that interested attendees may present the workshop at other locations.


Heuristic Evaluation

   An heuristic evaluation is a formal review of an interface (or interface design document) conducted by a group of experts trained in the field of usability engineering. Each expert reviews the interface separately, prepares a report, and provides that report to the designer. The reports compare the design against a set of known design principles, or heuristics. Taken together, the reports provide valuable information about usability problems in the design. Heuristic evaluations can be performed on web site designs, software interface designs, print publications, and any other product designed to be used by a wide audience.

   This type of evaluation is most effective if it is conducted by a group of trained experts. Nielsen (1993) reports that a single evaluator will find only 35% of the usability problems in any given interface. Five evaluators, however, will find 75% of the issues. Twelve or more evaluators will find about 80-90% of the problems. This suggests that the ideal group size for one evaluation is between five and twelve people. Since heuristic evaluation depends upon the evaluator's knowing what the design principles are, as well as how to tell if they have been applied, it is best if this type of evaluation is done by someone who has training and/or experience.
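Nielsen's figures reflect a well-known diminishing-returns curve. As an illustration, Nielsen and Landauer model the proportion of usability problems found by n independent evaluators as 1 - (1 - L)^n, where L is the average probability that a single evaluator finds any given problem. The sketch below assumes L = 0.31, a commonly cited average; actual discovery rates vary by interface and evaluator skill, so the model's percentages differ somewhat from the specific figures reported above.

```python
# Nielsen-Landauer model of usability-problem discovery:
#     found(n) = 1 - (1 - L)**n
# where L is the probability that one evaluator finds any given problem.
# L = 0.31 is an assumed, commonly cited average -- illustrative only.

def proportion_found(n, single_rate=0.31):
    """Estimated fraction of usability problems found by n evaluators."""
    return 1 - (1 - single_rate) ** n

# The curve flattens quickly: most of the benefit comes from the
# first handful of evaluators.
for n in (1, 5, 12):
    print(f"{n} evaluator(s): {proportion_found(n):.0%} of problems found")
```

The rapid flattening of this curve is why a 5-12 person pool is the practical sweet spot: each additional evaluator beyond that uncovers very few new problems.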

   This proposal seeks to create a pool of trained evaluators. While the Center for Usability in Design and Assessment (CUDA) does provide heuristic evaluation services, CUDA does not have the bandwidth to provide enough independent evaluators to reach the 75% efficacy mark for many projects. We would like to augment CUDA's services by increasing the number of individuals who can perform this particular type of evaluation. In addition, heuristic evaluations are beneficial at several stages of a project. Even if CUDA could provide enough evaluators for a single project, the cost of repeated evaluations would be prohibitive. CUDA's services are invaluable as part of a complete usability testing program; this proposal seeks to add to, not replace, those services. A group of volunteers performing these evaluations for each other can only increase the amount of beneficial usability testing that occurs on any given project.


   The project team has been in touch with CUDA regarding this proposal. CUDA has agreed to allow us to review the training materials they have already developed and use them as a starting point for our materials. In addition, CUDA will provide help on an advisory basis as we progress. Further, CUDA is planning to rework their training materials in the near future, and CUDA staff are open to using the materials that ultimately result from this project to save time and effort.

   Originally, the project team planned to go through CUDA's walkthrough training in heuristic evaluation. In an effort to reduce the budget for this proposal, we proposed to CUDA that we schedule a half-day meeting in lieu of formal training. The shorter format of the meeting eliminates the need for us to stay overnight, and CUDA has agreed to work with us on a quid-pro-quo basis rather than charging the usual walkthrough training fee. The agenda for the meeting has yet to be determined, but will include a discussion of how we can help to improve the materials CUDA has already created. We will provide all deliverables (see below) to CUDA for their use.

   Prior to meeting with CUDA, the project team will study the process of heuristic evaluation as documented by Nielsen (1994).


   The original IDEA group will develop materials that will aid the evaluators and designers for future projects. These materials will represent the primary set of deliverables for this project and will include the following items:

  • Trainer Guide: The IDEA Group will develop a guide for conducting the training workshop at regional gatherings so that interested attendees may reproduce the training on their home campuses. This guide, and all materials listed below, may be made available on the CATS-MERLOT website and provided to CUDA for their use.

  • Workshop Handouts: The IDEA Group will develop detailed handouts for workshop attendees.

  • Design Submission Form: The IDEA Group will develop a form to accompany a design which is submitted for heuristic evaluation. This form will allow the designer to acquaint the evaluators with the nature of the design being evaluated, including its intended audience, task profiles, and other pertinent information.

  • Evaluation Form: The IDEA Group will develop a form to guide the evaluators as they perform the heuristic evaluation. The purpose of this form is to ensure consistency across evaluations.

  • Report Form: The IDEA Group will develop standards for reporting the heuristic findings or results to the designer.
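As an illustration of how the Evaluation Form might keep reviews consistent, the sketch below structures one evaluator's findings around Nielsen's (1994) ten usability heuristics. The heuristic list is Nielsen's; the form fields, function names, and sample finding are hypothetical, and the actual principle set and format are for the IDEA Group to decide.

```python
# Hypothetical sketch of a per-evaluator Evaluation Form as data.
# Assumes the group adopts Nielsen's (1994) ten usability heuristics
# as its principle set; all names here are illustrative.

NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

def new_evaluation_form(design_name, evaluator):
    """Blank per-evaluator form: one findings list per heuristic."""
    return {
        "design": design_name,
        "evaluator": evaluator,
        "findings": {h: [] for h in NIELSEN_HEURISTICS},
    }

# An evaluator records each problem under the heuristic it violates.
form = new_evaluation_form("sample course web site", "R. Smith")
form["findings"]["Error prevention"].append(
    "Severity 3: no confirmation before discarding an unsaved quiz")
```

Because every evaluator fills in the same heuristic-keyed structure, the separate reports can later be merged mechanically when preparing the combined results for the designer.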

IDEA Group Training
  Registration for UIE Workshop Due* ........ June 25, 2001
  Meeting at CUDA ........................... June 2001 (date TBA)
  UIE Workshop* ............................. July 11-12, 2001

Development of Materials
  Trainer Guide Completed ................... July 27, 2001
  Workshop Handouts Completed ............... July 27, 2001
  Other Forms Completed ..................... August 3, 2001
  Email List Available ...................... after August 6, 2001
  IDEA Group Available to Present at
    CATS Regional Workshops ................. after August 6, 2001

*Starred items are relevant only if the proposal is approved for Budget One: UIE Workshop. All unstarred dates are flexible.

Works Cited
Nielsen, J. (1993). Usability Engineering. San Diego, CA: Academic Press.
Nielsen, J., & Mack, R. L. (1994). Usability Inspection Methods. New York, NY: John Wiley & Sons.

    We have developed two budgets for this proposal. The first includes training by User Interface Engineering (UIE), a well-known usability group. We believe the training will be of great benefit to our research and will enrich the final materials. The second budget is provided in case funding is too limited to include the UIE training.
Budget One: UIE Workshop

User Interface Engineering Workshop: Product Usability: Survival Techniques (Burlingame, CA)

Participant      Lodging    Per Diem   Registration   Transportation   Total
Rachel (car)     $0.00      $75.00     $635.00        $74.00           $784.00
Jay (air)        $398.00    $92.00     $635.00        $152.50          $1,277.50
                                                      Total:           $2,061.50

Budget notations: Rachel will drive and stay with friends. Registration by 6/25 includes second-day sessions free. Daily per diem approximated at $46.
Meeting with CUDA in Long Beach, CA

Participant      Lodging    Per Diem   Transportation   Total
Rachel (air)     $0.00      $29.00     $250.00          $279.00
Jay (air)        $0.00      $29.00     $74.00           $103.00
                                       Total:           $382.00

Budget notations: Daily per diem approximated at $46; Rachel flies SFO to Orange County.
Stipend/Release Time

                         Total      Release Time          Stipend
                         Time*      Rachel**   Jay        Rachel      Total
Trainer Guide            45 hrs     $0.00      $593.55    $244.35
Handouts & Materials     60 hrs     $0.00      $791.40    $325.80
Misc. Communication      15 hrs     $0.00      $197.85    $81.45
Totals                   120 hrs    $0.00      $1,582.80  $651.60
Total Time Commitment               40 hrs     60 hrs     20 hrs      $2,354.40

Budget Notations
*Jay and Rachel are each assigned half of the time. Jay is requesting release time; Rachel is requesting part release time and part stipend.
**Rachel's release time is paid by the CDL and does not appear as part of this budget.

Budget Two: Without UIE Training

Meeting with CUDA in Long Beach, CA

Participant      Lodging    Per Diem   Transportation   Total
Rachel (air)     $0.00      $29.00     $250.00          $279.00
Jay (air)        $0.00      $29.00     $74.00           $103.00
                                       Total:           $382.00

Budget notations: Daily per diem approximated at $46; Rachel flies SFO to Orange County.

Stipend/Release Time

                         Total      Release Time          Stipend
                         Time*      Rachel**   Jay        Rachel      Total
Trainer Guide            45 hrs     $0.00      $593.55    $244.35
Handouts & Materials     60 hrs     $0.00      $791.40    $325.80
Misc. Communication      15 hrs     $0.00      $197.85    $81.45
Totals                   120 hrs    $0.00      $1,582.80  $651.60
Total Time Commitment               40 hrs     60 hrs     20 hrs      $2,354.40

Budget Notations
*Jay and Rachel are each assigned half of the time. Jay is requesting release time; Rachel is requesting part release time and part stipend.
**Rachel's release time is paid by the CDL and does not appear as part of this budget.

    These budgets do not take room tax, car rental tax, or other taxes and fees into account. Per diem is estimated based on earliest departure and latest arrival times. Airfare estimates are subject to change.


   Rachel has been in the field of HCI for five years. In that time, she has developed interfaces for several successful educational software products, including Biology Labs On-Line and MERLOT. She has been trained in usability engineering by prominent experts in the field, including Jakob Nielsen, Deborah J. Mayhew, Ben Shneiderman, Mary Beth Rosson, and John M. Carroll. Recently she presented a three-part workshop, entitled Interface Design for Educational Multimedia, at the CATS 2001 conference in Long Beach, CA.

   Rachel has extensive experience in developing curriculum materials and in presenting successful workshops. She holds a Master's degree in education (art) from Stanford University. She has been developing and presenting workshops on design and multimedia topics for seven years.


   Jay Rees works for CSUSM's Academic Computing Services as the Web Systems Coordinator, leading a team of ACS web developers and programmers. He is responsible for leading department website projects, managing web servers, developing and supporting web systems for campus departments, writing and implementing department and campus policies and procedures, coordinating the student assistant program (approx. 45 students), and researching, implementing, and training staff on new applications for the campus. He regularly trains other technical staff on new methods and products; has taught at conferences (CATS, all years) and through Extended Studies, where he developed and taught the Web Design Certificate Program; and was the primary developer for a TIGERS grant project (STAR).

   Jay has extensive experience contracting and leading web projects: gathering initial data; writing cost and time estimates, requirements documents, and flowcharts or storyboards; and designing, testing, training, and maintaining and debugging systems.