DISCLAIMER: The data in this section are fictitious and do not, in any way, represent any of the programs at Gallaudet University. This information is intended only as an example.

Using Assessment Results

This step involves using your analysis of the data to make recommendations for program changes that will improve student learning.


Recommendations: actions taken, or to be taken, to improve student learning that are clearly supported by the data — what will be done, who will do it, how it will be assessed, and when.

Once the data are analyzed, the unit should be able to see whether it has achieved its intended outcome.

  • Where the criterion is met or surpassed, the unit may rightly conclude that no change is needed and report, “No action required.” If the same outcome is assessed the next year and the results are repeated, the staff can be confident the criterion is being met consistently. The unit should then consider assessing a different outcome in the following cycle.
  • In the case where the results indicate the criterion level was not met, the unit needs to evaluate its results further to determine what needs to be done to improve the likelihood of achieving the outcome.
  • The unit makes an action plan that includes what will be done, who will do it, and by when. This step is what makes the difference between “assessment” and “busy work.”

AREAS to look at when assessment results are disappointing…

Goals
  • Are they inappropriate or overly ambitious?
  • Do they need to be clarified?
  • Do you have too many?

Curriculum
  • Does the curriculum adequately address each SLO?

Teaching methods
  • Are you teaching in the way students learn best?

Assessment measures
  • Are they poorly written or easily misinterpreted?
  • Do they match your SLOs?
  • Are they too difficult for most responsible students?

Students
  • Is poor performance really the students’ fault?


Develop recommendations to improve student learning outcomes based on your data analysis, which identifies your program/unit’s strengths and weaknesses. Create a plan to address the weaknesses and build on the strengths. (Remember to build periodic re-assessment of your strengths into the plan to make sure you’re not slipping.)


***Results of the pre-test have documented conclusively that students entering the class are far from “knowing it all” – in fact, the scores are typically below 50 percent accurate. These pre-test data document the great need for the Library 101 course, despite some students’ claims, and form the foundation for subsequent student learning throughout a student’s academic career.

Even though the final exam shows a dramatic increase in student learning, several items still require improvement:

  • Item 2, with an achievement rate of just 32.8 percent, is not adequate. Course administrators will investigate why more students are not learning or retaining this specific item.
  • Item 3 shows a large positive jump in the final-exam percentage correct, but a success rate of only 56.1 percent is still not adequate. Course administrators will address this item in an effort to increase the overall percentage of student learning and retention on this item.
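The item-level review above is a simple gap analysis: compare each item’s pre-test and final-exam percent correct, then flag items whose final rate misses the criterion. A minimal sketch of that logic is below — the 70 percent criterion and the pre-test figures are illustrative assumptions; only the 32.8 and 56.1 percent final rates for Items 2 and 3 come from the example.

```python
# Hypothetical per-item gap analysis. Each tuple is
# (item, pre-test % correct, final-exam % correct).
# Pre-test values and Item 1 are invented for illustration;
# final rates for Items 2 and 3 follow the example above.
CRITERION = 70.0  # assumed criterion: 70 percent correct

items = [
    ("Item 1", 41.0, 88.5),
    ("Item 2", 20.4, 32.8),
    ("Item 3", 18.9, 56.1),
]

def flag_items(results, criterion):
    """Return (item, gain, final) for items whose final rate misses the criterion."""
    flagged = []
    for name, pre, final in results:
        gain = round(final - pre, 1)  # learning gain from pre-test to final
        if final < criterion:
            flagged.append((name, gain, final))
    return flagged

for name, gain, final in flag_items(items, CRITERION):
    print(f"{name}: gained {gain} points, but {final}% is below the {CRITERION}% criterion")
```

Even a table this small makes the action plan concrete: Items 2 and 3 are flagged for follow-up despite showing gains, while Item 1 clears the criterion and needs no action.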

*Assessing the Effectiveness of Non-Instructional Support Offices
**Adapted from: Suskie, L. (2008, December). Understanding and Using Assessment Results. Paper presented at the 2008 Middle States Commission on Higher Education Annual Conference.
***Example adapted from: Iowa State University. Library 160: Measurement of Outcomes and Results.

