
Institutional Research

Glossary of Common Assessment Terms


Assessment

The systematic process of gathering, analyzing, and using information about student learning and/or administrative outcomes to make decisions about program effectiveness and improvements.

Authentic assessment

An assessment process that is similar to, or embedded in, real-world activities.


Benchmark

Performance data that are used as a standard for comparison of program outcomes. A program can use its own data as a baseline benchmark against which to compare future performance. It can also use data from another program as a benchmark. In the latter case, the other program is often chosen because it is exemplary, and its data are used as a target to strive for rather than as a baseline. For example: "Our goal is to maintain faculty salaries at the median of CCCU faculty salaries."

Closing the loop

The process of interpreting assessment results, determining the implications for change and implementing the necessary changes.

Course-embedded assessment

Using direct measures of student learning in a particular course, such as evaluation of a performance, paper, art project, or test results, to assess overall program outcomes.


Criterion-referenced interpretation

An interpretation of scores on a measure that focuses on how the student or program compares to a set of criteria: "80% of the students passed the Standards of Learning for English; the University is accredited by the Southern Association of Colleges and Schools."

Direct Measures

Direct measures for learning outcomes

Measures of student learning that require students to demonstrate their knowledge, values, and skills. Objective tests, essays, presentations, and classroom assignments all meet this criterion.

Direct measures for administrative outcomes

Measures of performance taken from department operations: number of students enrolled, percent of student financial need met, percent of tasks completed within the allotted time, roster size for athletic teams, number of museum visitors or counseling clients.

(Developmental) Portfolio

A portfolio designed to show student progress by comparing products from early and late stages of the student’s academic career.

Formative assessment

An assessment used for ongoing improvement (at the individual or program level) rather than for final decisions or accountability.

Indirect Measures

Indirect measures for learning outcomes

Methods, such as surveys and interviews, that ask students to reflect on their learning rather than demonstrate it directly.

Indirect measures for administrative outcomes

Measures of performance that ask faculty, staff, students, or other clients to reflect on the services provided (e.g., the Student Satisfaction Inventory, Alumni Survey, Faculty Staff Survey, user surveys, focus groups, interviews).


Inputs

Resources a program uses to achieve program outcomes. Examples include staff, facilities, equipment, curricula, and supplies.


Longitudinal data/study

Data collected on the same individuals or programs over a period of time; a study investigating development, learning, or other types of change in individuals or programs over time.


Norm-referenced interpretation

An interpretation of scores on a measure that focuses on how the student or program compares to others: "the student's performance put him at the 75th percentile; the University ranked second in U.S. News & World Report's annual list of Best Colleges."


Outcomes

The specific knowledge, skills, developmental attributes (i.e., attitudes or values), or behaviors that students develop as a result of their university experience; administrative outcomes specify the intended benefit or impact of the program.


Outputs

The direct products of program activities. Examples include number of classes taught, number of counseling sessions conducted, number of hours of services provided, and number of participants.


Percentile rank

The percentage of examinees in the norm group who scored at or below the raw score for which the percentile rank was calculated.
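As a concrete illustration of this definition, the calculation can be sketched in a few lines of Python; the function name and the sample norm-group scores below are illustrative only, not part of the glossary.

```python
def percentile_rank(norm_group_scores, raw_score):
    """Percentage of norm-group scores at or below the given raw score."""
    at_or_below = sum(1 for s in norm_group_scores if s <= raw_score)
    return 100 * at_or_below / len(norm_group_scores)

# In a norm group of 8 scores, a raw score of 70 equals or exceeds
# 6 of them, so its percentile rank is 100 * 6 / 8 = 75.
scores = [55, 60, 62, 65, 68, 70, 74, 80]
print(percentile_rank(scores, 70))  # 75.0
```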

Performance-based assessment

An assessment technique involving the gathering of data through systematic observation of a behavior or process and evaluating those data using a clearly articulated set of performance criteria as the basis for evaluative judgments.

Processes or Activities

What a program does with its resources—the services it provides to fulfill its mission. Examples: counseling, advising, teaching, reporting, coaching, recruiting, workshops, processing financial awards, cleaning & maintaining facilities, etc.

Qualitative data

The values of a variable differing in kind (quality) rather than in amount.

Quantitative data

The values of a variable differing in amount rather than in kind.

Quality Enhancement Plan (QEP)

A “carefully designed and focused course of action that addresses a well-defined issue or issues directly related to improving student learning” (SACS, 2004, p. 9). The QEP is required for continuing accreditation status with the Southern Association of Colleges and Schools Commission.


Rubric

A scoring tool that lists the criteria for a piece of work, or “what counts” (for example, purpose, organization, and mechanics are often what count in a piece of writing); it also articulates gradations of quality for each criterion, from excellent to poor.
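A rubric of this kind can be pictured as a simple data structure: each criterion maps to its gradations of quality, and a rater's score is the sum of the levels assigned. The criteria, levels, and descriptors below are illustrative only, not a sanctioned rubric.

```python
# Illustrative rubric: criterion -> {level: descriptor}, best to worst.
writing_rubric = {
    "purpose": {
        3: "Clear, compelling purpose sustained throughout",
        2: "Purpose stated but unevenly developed",
        1: "No discernible purpose",
    },
    "organization": {
        3: "Logical structure with smooth transitions",
        2: "Mostly ordered, with some abrupt shifts",
        1: "Ideas presented without apparent order",
    },
    "mechanics": {
        3: "Virtually free of errors",
        2: "Occasional errors that do not impede reading",
        1: "Frequent errors that obscure meaning",
    },
}

def score_essay(rubric, ratings):
    """Total the level a rater assigned to each criterion in the rubric."""
    return sum(ratings[criterion] for criterion in rubric)

ratings = {"purpose": 3, "organization": 2, "mechanics": 3}
print(score_essay(writing_rubric, ratings))  # 8
```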

Summative assessment

A sum total or final product measure of achievement at the end of a course of study or a period of time.

Trend data

A comparison of data for a specific measure over multiple years, in order to see whether change has occurred.

Triangulation of data

The use of multiple sources of information or ideas to support a central finding or theme.

Value-added assessment

The impact of participating in higher education on student learning and development beyond what would have occurred through natural maturation, usually measured as longitudinal change or as the difference between pretest and posttest; a comparison of the knowledge, skills, and developmental traits that students bring to the educational process with those they demonstrate upon completing it.
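The pretest/posttest version of this comparison reduces to simple arithmetic on matched scores, which a brief Python sketch can make concrete; the function name and sample scores are hypothetical, and real value-added studies would also account for maturation and attrition.

```python
def value_added(pretest, posttest):
    """Mean gain from pretest to posttest for the same matched students."""
    gains = [post - pre for pre, post in zip(pretest, posttest)]
    return sum(gains) / len(gains)

# Four students' scores at entry and at completion (illustrative data).
pre = [52, 61, 58, 70]
post = [68, 75, 66, 82]
print(value_added(pre, post))  # 12.5 (mean gain in points)
```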