
It can be hard to get excited about Institutional Effectiveness, both as an individual and as a campus. In a recent webinar, Mount Saint Mary’s University shared its assessment and program review transformation: Dr. Michele Starkey, Dr. Laurie Wright Garry, and Dr. Elizabeth Sturgeon offered great stories and strategies for fostering broad stakeholder involvement and creating a team-centered environment. We’ve included a list of strategies at the end that you can implement on your own campus!

About MSMU

Mount Saint Mary’s University is a Catholic liberal arts institution in Los Angeles, CA, primarily serving women, founded in 1925 by the Sisters of Saint Joseph of Carondelet. It has two campuses, around 3,000 total students, 120 full-time faculty, and 300+ adjuncts.

Assessment Process: Then and Now

If there’s one common feeling around assessment, it’s being overwhelmed. Between institutional learning or general education outcomes, program outcomes, student learning outcomes, and accreditation requirements, it’s no wonder some of us struggle to know where to even begin.

General Education: MSMU decided starting large would work best. They initially had 18 general education outcomes to assess, and tackling 3-4 per year proved a manageable pace. The General Education Committee used an AAC&U rubric in the beginning and eventually worked with faculty to develop their own. Part of the process includes evaluating the rubric itself and adjusting it as needed.

Once the rubric was determined, the chairs began designing Rubric Scoring Parties, a process they have since fine-tuned and still use today (currently virtual):

  • Determine which courses align with which outcomes
  • Randomly collect artifacts (usually around 80), de-identified and coded (see the sketch after this list)
  • Recruit volunteers (MSMU gives each a $50 Amazon gift card)
  • Hold a norming session: use a few examples to ensure everyone applies the rubric in the same way
  • Hold the scoring session
  • Have a data analysis meeting (what was surprising, where students did or didn’t do well, trends, actions to take)
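
For readers who want to script the sampling step, here’s a minimal sketch (our illustration, not MSMU’s actual process) of drawing a random, de-identified, coded artifact sample. The folder layout, sample size, and coding scheme are all assumptions:

```python
import csv
import random
import shutil
from pathlib import Path

# Assumed layout (hypothetical): submitted artifacts live in
# artifacts/<course>/, and scorers receive coded copies in scoring_sample/.
ARTIFACTS_DIR = Path("artifacts")
SAMPLE_DIR = Path("scoring_sample")
SAMPLE_SIZE = 80  # MSMU reports sampling roughly 80 artifacts

def draw_sample(seed: int = 2024) -> None:
    random.seed(seed)  # fixed seed so the draw is reproducible
    files = [p for p in ARTIFACTS_DIR.rglob("*") if p.is_file()]
    sample = random.sample(files, min(SAMPLE_SIZE, len(files)))

    SAMPLE_DIR.mkdir(exist_ok=True)
    # A private key file maps codes back to originals; scorers never see it.
    with open("sample_key.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["code", "original_path"])
        for i, src in enumerate(sample, start=1):
            code = f"A{i:03d}"  # de-identified code, e.g. A001
            writer.writerow([code, str(src)])
            shutil.copy(src, SAMPLE_DIR / f"{code}{src.suffix}")

if __name__ == "__main__":
    draw_sample()
```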

The rubric scoring results are combined with program results and the graduating senior survey into a report shared via an infographic, the website, and a newsletter.

In addition to treating the process as iterative and modifying it as they went, MSMU rolled it out gradually.

Program Assessment: Building on what they learned from assessing gen ed, MSMU next introduced department assessment (prior to program assessment). One representative from each department participated in weekly working meetings that provided professional development in assessment (writing mission statements, outcomes, measures, targets, results, and closing the loop). Facilitators encouraged representatives to research learning outcomes in their disciplines as a starting point.

After a year, the departments moved to replicating the process for each of their programs, which forms the basis of the process they use today:

  • The Academic Assessment Committee (co-chaired by an administrator and a faculty member; includes 4 elected members as part of faculty governance and some other appointed members) meets once a month to oversee outcomes assessment and gen ed.
  • Program Assessment Liaisons (PALs) and Co-curricular Assessment Liaisons (CALs) are designated for each area. Their role is not to do all the work, but to lead, coordinate, and remind others in their area. They are responsible for submitting documents.
    • They receive 1.5-3 units of release time or overtime, and the work counts toward service hours. This is funded by each department.
    • Onboarding process: one of the Co-Chairs of the Academic Assessment Committee will
      • Email the PAL responsibility document
      • Meet one-on-one with the new PAL to go over the required assessment documents and process, and to set up an assessment calendar
      • Direct the new PAL to the Assessment myMSMU page for resources
      • Suggest professional development and software training
  • Annual documents submitted (scored with a rubric, then shared with the faculty assembly and in the newsletter):
    • assessment plan
    • curriculum map
    • summary of data results with trends over time per outcome (see the sketch below)
    • annual outcomes report (with a focus on closing the loop from previous years)
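
To make “trends over time per outcome” concrete, here’s a small hypothetical sketch; the outcomes, years, and scores are invented for illustration and are not MSMU’s data:

```python
import pandas as pd

# Hypothetical rubric results: one row per scored artifact.
scores = pd.DataFrame({
    "year":    [2022, 2022, 2023, 2023, 2024, 2024],
    "outcome": ["Written Communication", "Critical Thinking"] * 3,
    "score":   [2.8, 2.5, 3.0, 2.7, 3.2, 3.0],  # 4-point rubric
})

# Mean score per outcome per year; outcomes as rows, years as columns,
# so each outcome's trend reads left to right.
trend = scores.pivot_table(index="outcome", columns="year",
                           values="score", aggfunc="mean")
print(trend.round(2))
```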

It’s worth noting that Dr. Laurie Wright Garry, a faculty member, has personally undergone an important shift in how she experiences assessment at MSMU. She describes it as a conversion story, from reluctance to understanding. Initially the emphasis was on accountability and compliance, “The accreditor is coming!”, which caused a lot of anxiety and therefore negative feelings for all involved. There was also a lack of comprehension; she shares, “I felt assessment was something the sciences did, not humanities.” Perhaps one of the biggest challenges was a feeling of judgment: that if assessment results were “not good,” the faculty or program must also be bad. None of this was very motivating.

Since the changes above were implemented over the past few years, Dr. Garry has had a completely different (and positive) experience. After she was unable to find a volunteer in her department to be a PAL, she decided to try it herself. “And if I was going to do it, I was going to be dedicated to doing it right.” The fact that something might be hard to assess wasn’t a reason not to do it. She had two areas of focus: first, reuniting her fragmented department behind a mission statement they crafted together; second, centering assessment on what faculty were passionate about students learning, and using the process as a tool to ensure they were, in fact, teaching it. She felt they had a really good department, so assessment was not about numbers but about making it even better.

The new system supported this through regular, productive communication with the Academic Assessment Committee chairs, an iterative, reflective process, and celebrating accolades and areas of growth. “I was stuck in my cave and now I am in the sunlight of assessment.”

Program Review: Then and Now

Initially program review was part of the curriculum committee at MSMU, and each program was supposed to report every six months. Departments had to rely on themselves for data, and there was no budget for external review. Because there was also no follow-up and no tie to budget, many did not complete it on schedule. When there was a change in department or program leadership, inevitably that person had to start completely from scratch, compounding the problem.

Recently MSMU completely revamped their process:

  • Created a separate program review committee with six elected faculty members and two more from Institutional Planning and Research, as well as the Associate Provost. This committee created a web presence with resources to guide programs through the entire process with templates and examples.
  • Revised the schedule to every two years and, as with program assessment, grew it slowly by starting with a few programs and testing the most critical parts of the process first (self-study, faculty surveys, student surveys), saving external review for a later group.
  • Established a better timeline, with the self-study due in October and the committee report due in December; once the process expanded, spring was dedicated to external review.
  • Arranged for IR to provide programs with their data for the self-study.
  • Revised the self-study template based on their accreditor’s requirements – it went from 20 annotated pages to fewer than 3!
  • Prepared for external reviews by creating a no-fee contract issued directly from the Provost’s office.
  • Created a template for evaluator feedback and reporting focused on actionable items: general comments, themes from previous reports, interviews with faculty, evaluations from students, and 3-5 accolades and recommendations.
  • Instituted an Annual Progress Report due one year later to share what the program has accomplished in relation to the recommendations.
  • Secured the Provost’s support when the committee asked that budget increases be tied to program review completion.
  • Moved the entire process into Weave, customized to MSMU’s fields (which also keeps it standardized across programs).

Dr. Laurie Wright Garry was also part of the first group to pilot the new process. She shares that the chairs were willing to actively participate, even sitting with her at her computer to work on the report. The focus was “What are you proud of? Let’s work from there.” When it came time for her to work with her colleagues, they broke the report into pieces, volunteers took ownership of each piece, and the entire group worked through it until it was finished. She said the feeling of being on the same team and cheering each other on made it a positive experience.

What’s Next for MSMU?

After all that great work, they aren’t done yet! For assessment, they plan to tackle curriculum map revision, align CLOs with PLOs, and apply for the Excellence in Assessment Designation from NILOA. Program review will continue to grow through the addition of the external review component, continued revision of the process, and an eventual link to budget.

Institutional Assessment Strategies for Any School

Every institution is unique, but many of the tactics that transformed MSMU can be used on any campus. Sometimes it’s a shift in perspective; other times it’s trying a new approach.

  • It takes a village: involve people at all levels so everyone has ownership
  • Positive common ground: Compliance is not usually positive or relevant to everyone – but everyone can get behind students and teamwork
  • Start with professional resources (from an accreditor, professional organization) and then test and revise them to create your own
  • Establish common terminology and shared vision: Avoid confusion by outlining language and acronyms early, as well as the higher vision (to serve students)
  • Work macro to micro: Though it might not sound intuitive, starting with a larger area (gen ed or a department) and then moving to smaller ones (programs) is more manageable and tests the process with smaller groups of people
  • Fine-tune, then replicate, remembering to involve stakeholders and to keep adjusting when necessary
  • Recognize the work: Counting IE work as part of service hours, giving awards for quality work, funding professional development, compensating faculty/staff with money or load reductions, and even gift cards can go a long way toward changing perception and participation
  • Give feedback and share: Building in valuable feedback and sharing end reports validates the time, efforts, and findings of the work
  • Utilize technology: Consider investing in a platform to facilitate and streamline the experience for everyone – this saves time and money and reduces frustration and missing items.
