Simplifying Assessment in the Digital Era
As institutions strive to improve student learning, assessment is a persistent concern for administrators and academics alike. The usefulness of assessment itself has been called into question, and there's certainly little doubt that many colleges struggle to get it right. However, some argue that the real problem lies in the methodology used to gather, contextualize, and analyze data. The good news is that, in the digital era, there are tools that enable administrators to streamline assessment and make it simpler and more effective than ever, adding value to both their work and that of faculty.
It’s Still About People
As with any shift to a digital platform, implementing assessment software requires careful oversight. After all, software can only be useful if users adopt it. For this reason, faculty needs must be a key consideration for any project manager because winning their support is vital to successful adoption. While administrators may oversee the assessment process, it’s the faculty who decide what to assess, explains Catherine Wehlburg, Associate Provost for Institutional Effectiveness at Texas Christian University (TCU). “It’s their curriculum, it’s their students, it’s their work,” she says. “I’m there to support them and back them up and answer questions and to ask questions. But the work that they do is most certainly theirs, and they have every right to change it at any point in time.”
Wehlburg introduced assessment software at TCU more than ten years ago. Before that, department heads would send her their data in multiple MS Word documents, which was a nightmare to manage. She began to look for a more organized, coherent process. “It seemed like there had to be a better way.” She created a committee to review the available options – a step she recommends, as it allows colleges to include faculty in decision-making. Giving them a say helps earn their buy-in. (Learn More: How Dr. Wehlburg organizes internal and external program reviews and how to overcome obstacles in the process.)
Creating a Community of Learning
Across all courses, there are some basic similarities: all professors are looking at student performance, just in different formats or on different schedules. But the criteria by which dance teachers assess their students differ significantly from how chemistry professors conduct their reviews.
Digital platforms must be flexible in order to accommodate this diversity while enabling assessment professionals to use processes that suit their individual institution’s needs.
At TCU, some programs already had to comply with specialized accreditation requirements, and Wehlburg wanted to make sure they did not have to duplicate their efforts in two systems. She chose a single platform that allows faculty to put whatever they want into its text boxes — reflecting variations in thinking, measurements, and language.
Elsewhere, the use of a single platform brings consistency to a college's assessment. Old Dominion University in Norfolk, Virginia, offers nearly 100 programs and 50 certificates. To comply with regional accreditation requirements, the university had to regularly present data in a manner that would be easily accessible to the Southern Association of Colleges and Schools (SACS).
“Before implementing this program [Weave], we had several versions of Word documents and not all the relevant questions related to student learning were addressed,” says Tisha Paredes, Assistant Vice President for Assessment at Old Dominion.
“This system helped us with that, getting everything related to student learning outcome assessment annually reported and in the same format.” (Learn More: Dr. Tisha M. Paredes of Old Dominion University discusses how to engage faculty in the QEP during the planning stages, as well as how to sustain faculty enthusiasm beyond the first year.)
The most effective use of these tools allows assessment data to inform other processes and decisions within the institution. College leaders at the Southwest Indian Polytechnic Institute in Albuquerque created an electronic repository for assessment data in 2012, which tied specific dollar amounts to assessment objectives and outcomes. The data fed into institutional documents and allowed administrators to create “outstanding reports, which can be used for institutional budget planning,” according to Edward Hummingbird, Director of Institutional Research, Effectiveness, and Planning.
In addition, a central role of the platform was to facilitate knowledge-sharing across the institution, forming a collaborative workspace in which methods, processes, and best practices could be shared. “We wanted a system that allowed all users to see not only their own assessment plans, data and reports, but allowed them to view others’,” Hummingbird says. “In doing so, we were hoping to create an internal community of learning.” (Learn More: Dr. Hummingbird and other panel members share tips, tricks, and best practices for assessment and accreditation.)
As demonstrated in the previous examples, a software platform alone is not enough. The most effective implementations have involved both software that streamlines processes and participation in communities wherein professionals share best practices, compare notes, and learn from peers working in assessment and accreditation.
The Road to Successful Faculty Adoption
Faculty need help to figure out how to navigate the software and accurately input their data. At Old Dominion University, this meant offering workshops and one-to-one support and giving faculty additional assistance around the time when assessments are due. At TCU, when their system was recently upgraded, Wehlburg managed the implementation carefully, estimating that it would take a year to fully integrate the upgrade. Faculty needed time to be informed of the changes, discuss them, and develop their own response. “Transitions to new systems can be hard,” Wehlburg says. “I think that you have to build in enough time to do this.”
Training enables faculty to avoid problems that could otherwise crop up. Staff might need to be reminded, for instance, that they should preserve data history rather than delete it so that an accurate record of assessment remains in place. Important topics like data protection should also be considered. "Managing data in a system like this can be daunting," Hummingbird says. "But the quality of data in the system ensures the relevance of findings and resource recommendations coming out of the system."
Focus on Improvement, Not Accountability
Assessment software solutions are much more than simple storage tools. When deployed effectively, they can produce key insights that provide timely, relevant data that informs decisions and supports effective planning. Assessment professionals are excited about the potential these tools offer to enable innovation within their institutions. For example, these platforms could be used to document student activities that take place outside the classroom. Or administrators could examine data to identify areas where faculty need help and use that information to develop workshops and other types of support.
But the software's most important function may simply be to allow those teaching and working at colleges and universities to focus on their primary jobs. Software should never be a barrier to assessment. "We want to focus on the improvement and not the accountability piece," Wehlburg says. "The real focus needs to be on what are students learning, how much are they learning, and how can we improve it."
First published in The Chronicle of Higher Education, Copyright © 2018