Academic Program Assessment Review Template & Rubric



Institution/Organization Name: New Mexico State University (NM)

Tool URL: http://assessment.nmsu.edu/committees/university-outcomes-assessment-council-uoa

Institution URL: http://www.nmsu.edu

Instructions for Accessing the Site:

Scroll down on the page at the link above, then click the links under Forms for Assessment.


SETTING:

The reporting templates and rubrics guide Academic Program Assessment activities, including planning, implementation, and resulting actions. The rubric is used to provide feedback to academic departments on their annual reports.


Appropriate for two-year: Institutions, Systems, State Agencies

Two-Year Institution Size: Any

Appropriate for four-year: Institutions, Systems, State Agencies

Four-Year Institution Size: Any

PURPOSE

ISSUE:

These tools promote increased intentionality, forethought about, reflection on, and consideration of findings related to Academic Program Assessment. The use of analytical rubrics to provide formative feedback models good assessment practice and both encourages and informs improvement.


GOALS / EXPECTATIONS:
    • The tools themselves will be instructive and will clarify expectations for Academic Program Assessment
    • Rubrics will provide valuable feedback on planning, implementation, analysis, and the use of results to inform decision-making in academic programs. Specifically, rubrics will increase the efficiency of providing meaningful feedback to departments and programs regarding assessment practices.
    • The tools will serve as models for learning assessment: they will guide inclusion of key elements of quality assessment and demonstrate effective use of an analytical rubric to provide both instruction and feedback
    • The tools will foster broad faculty engagement in intentional, systematic, and meaningful investigation of student learning, and will promote best practices in assessment that improve both learning and assessment reporting

DESCRIPTION

SUMMARY:

These tools were initially conceived to foster increased knowledge of, and engagement with, meaningful assessment of student learning. The first iterations were developed by a faculty committee that reviewed Academic Program Assessment reports, and subsequent refinements over recent years have produced instruments that model strong practices for assessing direct evidence of learning. The performance-level descriptions in the rubric were revised over time to speak directly to areas that are cornerstones of effective assessment but that are commonly neglected, inadequately addressed, or prone to confusion and misinterpretation of the assessment process and its purpose.

As they exist today, the tools (and the rubrics specifically) provide formative feedback, instruction for improvement, and summative evaluation of Academic Program Assessment practices. They promote consistency of feedback to and across departments and deliver detailed feedback in an effective and efficient manner.

In addition, the tools give the institution a means to aggregate strengths and weaknesses in assessment activities and reporting at levels beyond the department/program completing the report, including the college and institution levels. Aggregate results can be used to determine appropriate departmental/program, college, or institutional interventions and responses, and are reported to our state Legislative Finance Committee.

MAJOR CHARACTERISTICS:

Departmental/program reports on Academic Program Assessment are submitted annually to the Office of Assessment. Each report is reviewed using the rubric, and a response is prepared for every report submitted. The primary feedback consists of the completed rubric, occasionally supplemented with additional comments as appropriate for a given report. College-level summaries are then compiled from the summative findings identified through application of the rubric (the last criterion on the rubric).

These are simple tools that can be used in a number of venues and for a number of purposes related to learning assessment. The descriptive format of the rubrics is intended to be 'instructional': in addition to evaluative criteria, they provide information that can be used to improve assessment practices and reporting. The rubrics also acknowledge both strengths and weaknesses in departmental/program assessment activities and reports.


FEATURES:

 NA


PERFORMANCE MEASURES:
    • By Summer 2014, 100% of departments/programs will have a documented plan for assessing student learning
    • By 2014-15, the Office of Assessment will develop workshops/training based on areas of greatest need as determined by aggregate rubric data
    • By 2015-16, all departments/programs will document faculty-inclusive decision-making that is informed by student learning assessment
    • By 2016-17, 100% of departments/programs will achieve a summative evaluation of "Report provides convincing evidence that the department/program is 'doing assessment well'"

ACHIEVED OUTCOMES:

The 2013-14 academic year is the first year the tools are being used in their current form, including the summative criteria at the end of the rubrics. Likewise, the goal of 100% of departments/programs reporting was set in Fall 2013; past goals were 75% of departments in any given academic year, and those goals were achieved. Over the iterations of the current forms, we have greatly increased faculty involvement in departmental assessment processes and activities, and we have made significant strides in shifting from a compliance-driven reporting process to assessment that provides meaningful information about student learning and informs decision-making.


IMPACTS:

The tools have not had a dramatic impact; rather, they have had an ongoing, steady influence on the perceived nature and purpose of assessment as a tool to improve student learning.


RESOURCES AND LESSONS LEARNED

LESSONS LEARNED:
    • Aligning the rubric and its categories to the reporting template maximizes effectiveness and efficiency
    • Faculty/Department Heads who use the rubric when preparing the report indicate greater satisfaction with the process than those who do not use the rubric as a guide
    • If engagement of faculty and/or students within the department is not explicitly included as an expectation, it is less likely to take place
    • Although the rubric has increased the efficiency of providing meaningful feedback, reviewing reports still takes a significant amount of time

RESOURCES AND COSTS NEEDED:

There are no special or significant resources needed to use these tools.

COSTS

  • Copies of the tool, if hard copies are used (as opposed to electronic documents)
  • Time for reviewers to apply the tool to assessment reports - can be done individually, by paired reviewers, or in groups

FUTURE PLANS AND OTHER INFORMATION

FUTURE PLANS:

The tools have undergone a series of improvements over several iterations. We will move to online reporting in the 2014-15 academic year but will continue to follow the same general format and will still use the rubrics to provide feedback.

 


LINKS:

NA



CONTACT INFORMATION:

Shelly Stovall 
Director of Assessment 
New Mexico State University - Office of Assessment 
MSC 3CEL 
PO Box 30001 
Las Cruces, New Mexico 88003 
Phone: 575-646-7621 
sstovall@nmsu.edu


SUBMITTED BY:

Shelly Stovall 
Director of Assessment 
New Mexico State University - Office of Assessment 
(Contact information above.)



DISCLAIMER: THE MATERIALS IN THIS TOOLKIT ARE PROVIDED BY ACADEMIC LEADERS FOR USE BY THEIR COLLEAGUES. THEY CAN BE ADOPTED OR ADAPTED AS NEEDED. INCLUSION IN THE TOOLKIT DOES NOT IMPLY ENDORSEMENT BY THE WESTERN INTERSTATE COMMISSION FOR HIGHER EDUCATION (WICHE), THE WESTERN ACADEMIC LEADERSHIP FORUM (The Forum), OR THE WESTERN ALLIANCE OF COMMUNITY COLLEGE ACADEMIC LEADERS (The Alliance).