Institutional Research

Outcomes and Measurement Strategies for Academic Programs

Recognizing the difficulty of creating outcomes and their corresponding measures, IR&E offers the following suggested outcomes and measurement strategies. These are broad, general outcomes that we believe apply to all departments. The list was compiled from current AIER improvement plans and from items currently represented in the AIER process under Quality, Demand, and Cost.

By explicitly stating them as “outcomes,” we hope to accomplish several things: to improve the flow of the report, to help departments link assessment results in these areas to the planning process, and to allow some flexibility in measurement strategies. We feel that measures should be program-specific, and we hope to decrease our reliance on national, university-wide surveys at the department level. To that end, we have provided some alternate ideas for measurement strategies.

We will continue the practice of using at least two measures for every outcome, but those measures do not need to be, and in most cases should not be, collected every year. This is particularly true of surveys, unless you are targeting a specific group each year. In general, surveying graduating seniors each year is reasonable, but surveying all students each year is not.

Each measure should include a benchmark value used to interpret its results. A benchmark describes what is acceptable or desired, not some unattainable future ideal. For example, “The average departmental score for the course evaluation question ‘Professor showed respect for students’ will be at least 5.25, with no course score in the department below 3.0.” Note that the targets are specific, constant values, not a mandated increase each year. In this case, they are below the EMU average. If comparisons to another group are used, that group should be very similar to the department or its students. For example, it might be appropriate to compare course evaluation scores within clusters but not across clusters (if we had clusters!).

Outcomes, measures, and the benchmark values will typically remain unchanged for many years. Each time data are collected for a particular measure, a quick comparison to the benchmark can be made and noted. Every few years, a more complete analysis of the results should look for trends over time. The report should include a note about when the next full analysis is planned. For the example above, data can actually be recorded every semester. An analysis might only be performed every 3 or 4 years as long as new values are above the benchmark targets and not substantially below the previous values.
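The semester-by-semester benchmark check described above is simple enough to automate. The sketch below illustrates it using the course-evaluation example (departmental average of at least 5.25, no course below 3.0); the scores shown are hypothetical illustration data, not real results.

```python
# Benchmark check for one semester of course evaluation scores.
# Targets are taken from the example in the text; the scores are invented.

DEPT_AVERAGE_TARGET = 5.25   # benchmark: minimum departmental average
COURSE_FLOOR = 3.0           # benchmark: no single course below this

def meets_benchmark(course_scores):
    """Return (average, passed) for one semester's course scores."""
    average = sum(course_scores) / len(course_scores)
    passed = (average >= DEPT_AVERAGE_TARGET
              and min(course_scores) >= COURSE_FLOOR)
    return average, passed

# One semester of hypothetical departmental scores.
fall_scores = [5.6, 5.1, 5.9, 4.8]
avg, ok = meets_benchmark(fall_scores)
print(f"average = {avg:.2f}, meets benchmark: {ok}")
```

Recording results this way each semester makes the periodic full analysis easy: the noted averages form the trend line, and any semester flagged as failing the benchmark signals that the analysis should happen sooner.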

Learning Objectives

Knowledge, Skills, and Perspective

For example: theories, vocabulary, methodologies, significant events, studio/lab techniques, problem solving, critical reflection, professional ethics, modes of inquiry.

These vary by discipline and are generally well represented in the current reports. Keep up the good work.

Operational Objectives

Caring for Students

For example: academic advising, career/graduate advising, student satisfaction, self-confidence & self-efficacy

Outcome: Advising (academic, career, etc.) will be available and effective.

Measurement Strategies:

  • Number/length/time of meetings between students and advisors
  • Percentage of students graduating on schedule
  • Frequency of directed study and transfer credit required because of schedule planning errors
  • Number of course loading changes due to incorrect advising, etc.
  • Set specific goals for department activities (career seminar each year, etc.)
  • Alumni survey responses (satisfaction with advising)
  • Percentage of students in graduate work, rate of job search success, etc.

Outcome: Students and department faculty will interact in ways to promote learning and personal growth.

Measurement Strategies:

  • Course evaluations (first and third questions)
  • Student interview and department survey responses (including perceptions of the learning environment)
  • Faculty evaluation of students (confidence, frequency of interaction, etc.)

Caring for Faculty/Staff

For example: inter- and intra-office communication/cooperation, faculty development, workload, research

Outcome: Faculty, staff, and student workers will support each other in their work.

Measurement Strategies:

  • New course preparations, advising, etc. are equitably assigned
  • Student worker training and support system in place and effective
  • Mentoring and support for new faculty and staff available
  • Research and faculty development resources are equitably assigned
  • Faculty-staff surveys show positive communication/interactions with peers
  • Number and types of interactions with other departments and offices

Telling the EMU Story

For example: Website, public relations, recruitment, fundraising

Outcome: The department will promote its programs and EMU as a whole in its interaction with the public.

Measurement Strategies:

  • Website is updated regularly
  • Department newsletters and other publications
  • Seminars or partnerships with external organizations
  • Department and faculty activities are publicized/marketing dept. is notified
  • Faculty members meet with prospective students as requested by admissions
  • Growth of departmental endowment (if any)


Stewardship

For example: managing space, staff, and equipment resources; curriculum planning and accreditation, etc.

Outcome: The department exercises good stewardship in its use of resources.

Measurement Strategies:

  • Environmentally sustainable purchasing
  • Economically sustainable purchasing
  • Ensure equitable and efficient use of departmental spaces
  • Work with physical plant to ensure proper maintenance of facilities
  • Appropriate use of work study, staff, and other personnel resources
  • Maintenance of equipment and technology at appropriate levels

Outcome: The department will offer quality programs and services and be responsive to external factors.

Measurement Strategies:

  • Review curriculum every X years
  • Ensure curriculum meets needs of students, industry, and accrediting agencies
  • Ensure extra-curricular programs and activities meet needs of constituents
  • Participate in campus-wide initiatives as needed