Western University of Health Sciences
Institutional Research and Effectiveness (IRE)

Assessment Plan

At Western University, we have created an assessment process that is not only effective but also sustainable. Drawing largely on lessons learned from the WASC Assessment Leadership Academy and from WASC conferences, we designed an assessment plan with three basic attributes.

Each year in March, the Director of IRE and the Senior Assessment Analyst meet with program representatives to discuss that year's assessment plan and answer any questions. Program representatives may come from curriculum committees, assessment committees, or an ad hoc group chosen to lead the process. From that point, programs have approximately four months to complete their assessment reports.

Reports are submitted to the Director of IRE in July, who then distributes them among members of the Assessment/Program Review Committee for review. Two committee members review each report, and no one evaluates a report from their own program. To guide feedback, committee members use a feedback form and an assessment evaluation rubric that describes expectations for each section of the assessment report.

Once the feedback process is completed, the Senior Assessment Analyst reviews each feedback form and assembles individual feedback reports for all programs. As a supplement, the Senior Assessment Analyst also creates a meta-report, which is shared with executives such as the Provost and the college Deans.

Under the four-year plan, programs assess two Institutional Learning Outcomes per year, so that all eight Institutional Learning Outcomes have been assessed by the end of the fourth year. The outcomes are then reassessed over the following four years, allowing for evaluative and meaningful comparisons over time. This cycle gives programs information about the utility of their data collection procedures and the structure of their curriculum, and supports ongoing monitoring of student achievement.

Summary of Previous Assessment

Western University has just completed the pilot phase of the assessment plan, which was designed to determine the feasibility of the process, and is now in year one of the plan. The pilot phase provided a number of important insights. For instance, using scores from Western University's assessment report rubric, we generated data to evaluate overall performance for the assessment of Interpersonal Communication and Evidence-Based Practice (see Figures 1 and 2).

Overall, results for the two outcome domains were similar. The strongest area in both the Interpersonal Communication (mean = 2.5) and Evidence-Based Practice (mean = 2.9) reports was Organization. These figures indicate that, on average, the organization of reports was 'Developed' for Evidence-Based Practice and nearing 'Developed' for Interpersonal Communication. Other aspects of the assessment reports that scored above average (as indicated by the red line in each chart) were Assessable Learning Outcomes and Assessment Participation. These findings suggest that the assessment process within each program was sufficiently organized and that the program learning outcomes aligned well with the institutional learning outcomes.

On the other hand, results also reveal areas that were relatively weak on average across Western University assessment reports. On both outcomes, Assessment Goals and Methods for Data Collection scored well below average. This points to a few distinct shortcomings. To receive a score of 4 (Well Developed) in Methods for Data Collection, programs must demonstrate a systematic data collection process, which many did not. Assessment Goals were also relatively weak: many programs did not clearly articulate the purpose of, or expectations for, their assessments.

Findings also show that the Implications sections for both Interpersonal Communication (mean = 2.0) and Evidence-Based Practice (mean = 1.8) were weak. In these reports, recommendations or plans of action were minimal or missing entirely, or it was unclear how assessment results would be used.


Figure 1. Western University Interpersonal Communication Skills Assessment Rubric Scores


Figure 2. Western University Evidence-Based Practice Assessment Rubric Scores


To aid the pursuit of reliable and valid assessment data with meaningful results, Western University is developing a plan to improve assessment contributions. Faculty workshops, although still in the early stages of development, are being created in conjunction with the Center for Academic and Professional Enhancement to support faculty with data collection methodologies, assessment rubric development, data analysis, and curricular mapping strategies. To further strengthen WASC assessment knowledge, IRE will begin recruiting personnel from Western University programs who are interested in participating in upcoming WASC workshops. As a final step toward improving Western University's assessment strategies, the Assessment and Program Review Committee is currently modifying the assessment report template, taking into consideration suggestions for improvement from program faculty and staff.