Connecting Assessment to University and Division Goals

Example: Aligning Objectives and Outcomes to University Priorities

University Priority: The University commits itself to increasing students’ retention and graduation rates and decreasing their time to degree.


Departmental Goal: Help students clarify and implement individual educational plans which are consistent with their skills, interests, and values.


Departmental Program Objective: By summer 2008, at least 90% of all first-time freshmen (FTF) will have participated in the three-phase advising program, which offers sessions at critical junctures in the academic year.

Connecting Assessment to Budgeting

Departmental assessments can be used to demonstrate—in data-driven ways—the value and/or cost-effectiveness of programs, services, and staff. Such evidence is especially important in times of declining budgets.

Types of data that can influence budgeting decisions include:

  • Participation or utilization data
  • Workload estimators and productivity data
  • Satisfaction data
  • Student learning data
  • Graduation and retention data

Participation and Utilization Data

It is critical (and often the “norm”) for university departments to know how many students they serve. However, participation and utilization data for some co-curricular activities may not be tracked on some campuses—or, if they are tracked, not in a methodical, consistent way.

What budget-related questions can participation and utilization data answer?

Participation and utilization data show how many students or clients are served by a particular service in a given period of time. Questions these data can help answer include:

  • How many “free” events does the University Union offer each semester? How many students attend those events?
  • How many students utilize the Student Health Center? What percentage of the student population is this?
  • How many students are served by New Student Orientation?
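Questions like these reduce to a simple calculation: headcount served divided by total enrollment. The sketch below shows that arithmetic; the figures are hypothetical, not actual campus data.

```python
# Sketch: computing a utilization rate from headcounts.
# The numbers below are illustrative assumptions, not real data.

def utilization_rate(students_served: int, enrollment: int) -> float:
    """Percentage of the student population served by a program."""
    return 100 * students_served / enrollment

# e.g., a service reaching 7,500 students on a campus of 30,000
rate = utilization_rate(7_500, 30_000)
print(f"{rate:.1f}% of students used the service")  # prints "25.0% ..."
```

Reporting both the raw headcount and the percentage lets reviewers compare programs of different sizes on the same footing.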

Example Participation and Utilization Data: Counseling & Psychological Services (CAPS)

Research Questions:

  • What percentage of Sac State students utilize CAPS?
  • How does this compare to the rest of the CSU?
  • How does this compare to the rest of the nation?
  • How will a proposed fee increase affect the percentage of students who use CAPS?


Example Participation and Utilization Data: The WELL (Sacramento State’s new fitness and wellness facility)

 Research Questions:

  • How many students use The WELL?
  • How many students use The WELL per day, per week, or per month?
  • How regularly do registered users access The WELL?

Example Data: The WELL

  • Weekly average check-ins: 10,074
  • Weekly high check-ins: 15,931
  • Daily average check-ins: 1,352
  • Daily high check-ins: 2,809

[Chart: The WELL member visits]


Workload Estimators

How might a workload estimator be used in a budget exercise?

Workload estimators can help make the case that a certain number of staff or a certain budget is necessary for operation. They show how much time critical processes or appointments take and how many staff are required to meet department needs in a timely manner, e.g.:

  • Financial aid packaging, transcript evaluation, etc.
  • Academic advising appointments, Health Center patient visits or counseling appointments, etc.
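A workload estimator is, at bottom, arithmetic: minutes per transaction times annual volume, converted to staff hours and divided by the productive hours one staff member can supply. The sketch below illustrates the calculation; the process names, times, volumes, and the 1,500-hour productivity figure are all illustrative assumptions.

```python
# Sketch of a simple workload estimator: given how long each critical
# process takes and its expected annual volume, estimate the staff
# (FTE) needed. All figures here are hypothetical.

PROCESSES = {
    # process: (minutes per transaction, transactions per year)
    "advising appointment": (45, 6_000),
    "transcript evaluation": (30, 2_500),
    "graduation check": (20, 1_800),
}

PRODUCTIVE_HOURS_PER_FTE = 1_500  # assumed work hours per staff member per year

total_hours = sum(minutes * volume / 60 for minutes, volume in PROCESSES.values())
fte_needed = total_hours / PRODUCTIVE_HOURS_PER_FTE

print(f"Estimated workload: {total_hours:,.0f} staff hours/year")
print(f"Staff needed: {fte_needed:.1f} FTE")
```

A table built this way makes it easy to show how service levels change if staffing is cut: remove an FTE and the model shows which processes can no longer be covered in a timely manner.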

Example Workload Estimator: Academic Advising Center

Research Questions:

  • What are the essential functions of the Academic Advising Center (AAC)?
  • How many staff hours are needed for each of these critical functions?
  • Does the AAC have enough staff to perform its essential functions?
  • How will services be affected if the AAC reduces (or increases) its staff?

The data in the following chart help answer some of these questions.

Example Workload Estimator: Enrollment Management

Research Questions:

  • Which units constitute the Enrollment Management area?
  • What are the essential functions of those units?
  • How many staff hours are needed to perform those functions?
  • Do these units have enough staff to perform their essential functions?

The following chart answers some of those questions.


Satisfaction Data

How might satisfaction data augment a budget request?

Administrators are more inclined to fund programs whose participants report high levels (rather than low levels) of “user satisfaction.”

Example Satisfaction Data: Office of Global Education Satisfaction Survey

Program Objective

On a short online survey administered in late spring, 90% of the Office of Global Education’s (OGE) clients will agree or strongly agree that their experience with OGE was positive.

Methods and Measures

In May/June 2011, OGE surveyed 1,340 students, faculty, staff, and alumni. OGE received 196 responses: 67% from students/alumni, 32% from faculty/scholars, and 1% from staff.


  • 91% agreed or strongly agreed OGE provides excellent customer service.
  • 90% agreed or strongly agreed the advice they received was helpful.
  • 90% agreed or strongly agreed the OGE staff is knowledgeable.
  • 90% agreed or strongly agreed the OGE staff is responsive to their needs.

Student Learning Outcome (SLO) Data

How might student learning data be used in a budget exercise?

With increasing calls for accountability and transparency, administrators are expected to show the “bang for the buck.” In this environment, funds are likely to go to programs that demonstrably augment or facilitate student learning, rather than to “feel good” programs that cannot show added value.

Example Student Learning Outcome Data: Office of the University Registrar

Student Learning Outcome

Students who complete the Registrar’s “How to Graduate Soon” workshop will demonstrate that they understand how to apply to graduate, and what they need to complete (and by when) in terms of GE or major requirements.

Methods and Measures

Staff evaluators from the Registrar’s Office offer eighteen graduation workshops throughout the academic year. Upper-division students (80+ units) are encouraged to attend three graduation workshops and complete a pre- and post-test assessment.


The learning outcome was met. In 2010–2011, students demonstrated an understanding of graduation application processes and graduation requirements by raising their average score 12 percentage points, from 74% on the pre-test to 86% on the post-test.
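When reporting a pre/post gain, it helps to distinguish percentage points from percent change: a move from 74% to 86% is a 12-point gain but only about a 16% relative increase. A minimal sketch, using the scores from this example:

```python
# Sketch: expressing a pre/post-test gain two ways.
# Scores are from the Registrar's workshop example above.

pre, post = 74.0, 86.0

gain_points = post - pre                   # absolute gain, in percentage points
relative_gain = 100 * (post - pre) / pre   # relative (percent) increase

print(f"Gain: {gain_points:.0f} percentage points "
      f"({relative_gain:.0f}% relative increase)")
```

Stating which measure is being used prevents a 12-point gain from being misread as a 12% improvement (or vice versa) in a budget document.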

Example Retention Data: Housing & Residential Life

Program Objective

First-time freshmen who live on-campus will be more likely to persist to their second year of college than first-time freshmen who live off-campus.

Note: we are not necessarily claiming a cause and effect relationship here.

Methods and Measures

Review first-year retention data collected by the Office of Institutional Research (OIR) to see if there are differences between first-year residential and first-year commuter students.



Example Retention Data: Academic Advising

Departmental Program Objective

Students on academic probation who participate in a new second-year retention program will achieve a 10% higher retention rate than students who do not receive a similar advising intervention.


Methods and Measures

Review the historical data that OIR has collected for earlier cohorts of this population (students on academic probation) for whom the intervention was not available. Compare the retention rates of those who did not go through the program (earlier cohorts) with those of students who completed the intervention.

Academic Standing Data for those who did not experience the intervention.


Academic Standing Data for those who did experience an intervention.
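The comparison described above comes down to computing a retention rate for each cohort and taking the difference. The sketch below shows that calculation; the cohort sizes and counts are hypothetical, not actual OIR data.

```python
# Sketch: comparing retention rates for an intervention cohort
# against a historical comparison cohort. All counts are
# illustrative assumptions, not real data.

def retention_rate(returned: int, cohort_size: int) -> float:
    """Percentage of a cohort that returned the following year."""
    return 100 * returned / cohort_size

historical = retention_rate(110, 200)    # earlier cohorts, no intervention
intervention = retention_rate(130, 200)  # completed the advising program

difference = intervention - historical
print(f"Historical: {historical:.0f}%, intervention: {intervention:.0f}%")
print(f"Difference: {difference:+.0f} percentage points")
```

As the document notes for the housing example, a difference like this does not by itself establish cause and effect; students who complete an intervention may differ from earlier cohorts in other ways.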


Example Retention Data: Academic Advising

 Program Objective

The first-year retention rate for freshmen who complete all three FYE advising sessions will be considerably higher than the rate for those who do not complete all three.


Methods and Measures

Review the first-year retention data collected by OIR and compare the rates of “completers” with those of “non-completers.”

First-to-second year retention rate for “completers.”


First-to-second year retention rate for “non-completers.”