Quality Assurance

Data Dashboards for Faculty Collaboration, Professional Learning, and Student Success

Pierce College
https://www.pierce.ctc.edu/
Faculty and administrators have a long list of data needs. In the past, we often presented data only at the institutional level. This produced a general belief in the findings, accompanied by the simultaneous reaction of either "that's not happening in my classes" or "I don't see myself/my classes in these data." How, then, can we expect faculty to help increase student success at the course level without giving them the opportunity to dig deep into their own course and departmental data?

We are all likely familiar with the concepts of data, data analysis, and, to some degree, dashboards. What is unique about this innovation is the intersection of the tool, the depth, the training, and the culture.

The tool we are using is Tableau, which we have access to through a state-wide license. It creates both accessible, understandable visualizations of data and statistical details for those who wish to dig deeper. The depth is that we have *released the data to everyone* who is trained, so that faculty can see both their own data and the data of their colleagues. This provides context for understanding student experiences across courses, rather than simply within one's own. The training is designed to provide the technical, ethical, and emotional support and guidance needed to understand how the tool works, how to use it effectively and in ways that support growth, and how to prepare for what one will see in one's own data and how to discuss it in a broader context with faculty peers.
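
To illustrate the kind of course-level drill-down these dashboards make possible, here is a minimal sketch in Python with pandas. The column names, courses, instructors, and the "C or better" success measure are hypothetical illustrations, not Pierce College's actual schema or dashboard logic.

```python
import pandas as pd

# Hypothetical course-outcome records; the real dashboards draw on
# institutional data, not this illustrative schema.
records = pd.DataFrame({
    "course":     ["MATH 107", "MATH 107", "ENGL 101", "ENGL 101"],
    "instructor": ["Rivera",   "Chen",     "Rivera",   "Okafor"],
    "enrolled":   [32, 28, 30, 27],
    "completed_c_or_better": [24, 25, 21, 24],
})

# Success rate per section, mirroring the drill-down from
# institution-wide figures to one's own classes and colleagues' classes.
records["success_rate"] = records["completed_c_or_better"] / records["enrolled"]

# Course-level rollup provides the departmental context for comparison.
by_course = records.groupby("course")["success_rate"].mean()

print(records[["course", "instructor", "success_rate"]])
print(by_course)
```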

The tool was introduced in what we framed as a "non-punitive" environment. That is, the data, and the discoveries they prompt, are used to identify gaps, prompt conversations among faculty about student success, and help identify professional learning opportunities that may be undertaken individually or by teams.

We introduced the tool (and broad access to the data) first to a small group of faculty who asked to see "all the data." They, in turn, became "evangelists" for data access, and soon other faculty were asking for similar access. Having already built a culture of evidence, wherein data are used to provide understanding and drive decision-making, we were well positioned to elevate that to a culture of discovery, wherein individuals are inspired to explore, identify areas of focus, and engage themselves and their colleagues in finding approaches to learning that advance student success. The tool was initially introduced and used in institutes that focused on faculty learning and doing research, and it eventually expanded to become part of the discipline and program review process.

Carly Haddon

chaddon@pierce.ctc.edu


Collaborative and Responsive Model of Course Development

Laramie County Community College
http://lccc.wy.edu

Traditionally, online course offerings were produced as needed rather than programmatically, with minimal consistency in the delivery or development of course materials.

The Center for Learning Technologies compiled research on best practices for consistent, high-quality online course delivery, which led to the formation of this project. Instructional designers and subject matter experts collaborated to design online courses that ensure comparability and consistency in meeting course competencies and learning objectives, mirroring LCCC’s face-to-face courses.

Les Balsiger

lbalsiger@lccc.wy.edu


Academic Program Assessment Review Template & Rubric

New Mexico State University
http://www.nmsu.edu

These tools promote increased intentionality, forethought about, reflection on, and consideration of findings related to Academic Program Assessment. The use of analytical rubrics to provide formative feedback models good assessment practices and both encourages and informs improvement.

These tools were initially conceived to foster increased knowledge of and engagement with meaningful assessment of student learning. The first iterations were developed through a faculty committee that reviewed Academic Program Assessment reports. Subsequent refinements have occurred over recent years to provide instruments that model strong practices for assessing direct evidence of learning. Descriptions of performance levels included in the rubric were revised over time to speak directly to areas that are cornerstones of effective assessment but that are commonly neglected, inadequately addressed, or hotbeds for confusion and/or misinterpretation of the assessment process and/or its purpose.

As they exist today, the tools (and the rubrics specifically) provide formative feedback, instruction for improvement, and summative evaluation of Academic Program Assessment practices. They facilitate consistency of feedback to and across departments and provide, to a significant extent, detailed feedback in an effective and efficient manner. In addition, the tools provide the institution a means to aggregate areas of strength and weakness in assessment activities and reporting at various levels beyond the department/program completing the report, including the college and institution levels. Aggregate results can be used to determine appropriate departmental/program, college, or institutional interventions and/or responses. Aggregate results are reported to our state Legislative Finance Committee.
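
As a rough sketch of how rubric ratings can be rolled up beyond an individual program, the following Python example aggregates performance-level scores to the college and institution levels. The criteria names, the 1-to-4 scale, and the colleges and programs are hypothetical illustrations, not NMSU's actual rubric or data.

```python
import pandas as pd

# Hypothetical rubric ratings (1 = beginning ... 4 = exemplary) for two
# criteria; NMSU's actual rubric criteria and scale may differ.
ratings = pd.DataFrame({
    "college": ["Arts", "Arts", "Engineering", "Engineering"],
    "program": ["History", "Music", "Civil", "Mechanical"],
    "measures_direct_evidence":      [3, 2, 4, 3],
    "uses_findings_for_improvement": [2, 2, 3, 4],
})

criteria = ["measures_direct_evidence", "uses_findings_for_improvement"]

# College-level aggregation highlights shared strengths and weaknesses.
college_means = ratings.groupby("college")[criteria].mean()

# Institution-level aggregation supports external reporting,
# e.g. to a state oversight body.
institution_means = ratings[criteria].mean()

print(college_means)
print(institution_means)
```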

Shelly Stovall

sstovall@nmsu.edu


Quality Assurance Program

Colorado Community Colleges Online
http://www.ccconline.org/

The CCCOnline Quality Assurance Program strives to ensure ongoing quality in online courses by assessing instructor participation within courses.

CCCOnline practices an institutional approach to quality; its Academic Deans, Chairs, Academic Technology, Instructional Design, Student Services, Training, and Quality Assurance Teams work together as courses are built, updated, and taught. The Quality Assurance process focuses specifically on each course’s community of learning and evaluates courses regularly using a set of discussion-based criteria.
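
One plausible way to operationalize discussion-based participation criteria is sketched below in Python. The post-log fields and the two example criteria (instructor presence in every weekly thread, instructor share of posts) are hypothetical illustrations, not CCCOnline's actual rubric or data.

```python
from datetime import datetime

# Hypothetical discussion-post log: (author_role, thread_id, posted_at).
posts = [
    ("instructor", "week1", datetime(2024, 1, 8)),
    ("student",    "week1", datetime(2024, 1, 8)),
    ("student",    "week1", datetime(2024, 1, 9)),
    ("instructor", "week2", datetime(2024, 1, 15)),
    ("student",    "week2", datetime(2024, 1, 16)),
]

# Illustrative criterion 1: the instructor posts in every weekly thread.
threads = {thread for _, thread, _ in posts}
instructor_threads = {thread for role, thread, _ in posts if role == "instructor"}
meets_presence_criterion = instructor_threads == threads

# Illustrative criterion 2: instructor posts make up a meaningful share
# of overall discussion activity.
instructor_share = sum(role == "instructor" for role, _, _ in posts) / len(posts)

print(meets_presence_criterion, round(instructor_share, 2))
```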

Elizabeth Dzabic

elizabeth.dzabic@cccs.edu
