Improving Institutional Effectiveness and Assessment Practices

Key Takeaways

  • In preparing for an accreditation substantive change visit, Broward College found disparate manual processes across campus and, in some cases, no planning or assessment at all.
  • Faced with a tight schedule and an overloaded IT department, Broward looked above campus for a solution to help improve its institutional effectiveness and implement an all-campus assessment process in time for the accreditation visit.
  • A three-month pilot of the Accountability Management System by TaskStream, a software-as-a-service solution, helped the college move quickly toward full compliance with institutional effectiveness requirements for accreditation.

In the summer of 2008, Broward College prepared for an on-site substantive change visit from the Southern Association of Colleges and Schools (SACS) Commission on Colleges. We took the opportunity to review our institutional effectiveness structure and assessment practices. Our internal review revealed disparate manual processes and, in some cases, no planning or assessment at all. For accreditation, we needed to prove institutional effectiveness.

"Although there were assessment activities across the college, nothing was organized; we didn't have proof that we were 'doing Institutional Effectiveness.'"
— Wendy Clink

Essentially, we needed a way to organize and demonstrate the assessment already occurring on campus and to improve assessment college-wide. We needed more of the college's programs to:

  • Evaluate their activities to see whether they aligned with Broward's strategic plan
  • Think about how to improve the assessment they were doing

And we needed a way to do all of this quickly, before the SACS visit.

Establishing a standard process for assessment college-wide was no small feat. Organizationally, Broward has three campuses and four learning centers serving three Florida counties and approximately 65,000 students. We also have a growing number of online programs and overseas affiliate programs. Broward has a one-college philosophy and a decentralized reality: although administrative functions such as student, business, and academic affairs and IT report to central district offices, each campus has its own president and retains autonomy.

Facilitating the Work

Given Broward's size and complexity, we understood the importance of automating the manual, disparate assessment processes. With only six months before the on-site visit from SACS, we engaged in a review of technologies to help us do so.

That tight timeline meant we needed a solution we could get up and running quickly, with minimal or no customization. We could not ask our already overloaded IT staff to provide technology support in any capacity, so a hosted solution immediately rose to the top of our list of options. Fortunately, our decision to outsource aligned with the college's strategic plan, which mandates reviewing existing solutions to see if they meet our needs before we consider building anything.

We formed a committee of faculty, staff, and administrators to select a tool that would simplify the planning, assessment, and review process. The committee identified five outcomes to focus on during our evaluation of the tools. They specified that participating evaluators would be able to describe how:

  1. Contemporary learning outcomes tools can be used to show the relationship between outcomes at the course, program, and degree level.
  2. These tools can be used to provide evidence of learning at the course, program, and degree level.
  3. These tools are populated with state and national outcome measures (for example, teacher education).
  4. College faculty and staff can collaborate using these tools to document learning outcomes and evidence of learning.
  5. Institutions are currently using these tools to provide evidence of learning at the program and degree level.

We needed to make sure that our provider of choice met all of our needs and had a successful track record with organizations similar in size and complexity to Broward. After a preliminary review of existing technologies, we invited several companies to campus to demonstrate their products. Participants rated these presentations using a matrix we developed; Figure 1 shows the form we asked participants to complete.

Figure 1. Evaluation Matrix

Ultimately, we chose the Accountability Management System (AMS) by TaskStream because it did not require further development and was the most user-friendly, comprehensive, and scalable system of those we evaluated.

Establishing a Common Assessment Framework

The end goal was a single assessment approach for the entire college. We started by establishing a new Office of Institutional Research, Planning, and Effectiveness (OIRPE) that brought the former offices of institutional effectiveness, institutional research, and business intelligence under a single umbrella. We felt that this consolidation would facilitate turning data into knowledge and put the needed resources behind institutional effectiveness.

With the OIRPE staff in place, we introduced the AMS technology and positioned it to support the work ahead. We developed a new institutional effectiveness framework, adopted by college leadership, that included the following tenets:

  • Leverage approval processes and workflows already in place for budget and program managers at both the district and campus levels
  • Eliminate paper from the assessment process
  • Identify unit coordinators for every degree program, educational support unit, and administrative support unit
  • Use a single template approach and standard terminology for all units, both academic and nonacademic, to keep assessment simple and easy to understand
  • Assign reviewers to provide feedback

To get buy-in across the college for the new framework and technology, we launched a proof-of-concept pilot program that allowed us to start small and grow our assessment initiatives incrementally. We began with a subset of program areas, hand-picking 10 programs based on findings from our assessment interviews. We wanted programs already doing assessment that could serve as role models for other units and satisfy accreditation standards.

The pilot started in December 2008 and incorporated iterative feedback to improve the framework as we progressed. We established an assessment calendar (Figure 2) with hard deadlines for when assessment plans had to be submitted for review in TaskStream and when reviewers had to submit the plans to the OIRPE. We coordinated the calendar with the existing budget calendar so that plans could be submitted alongside budget requests.

Figure 2. 2011 Example Assessment Calendar

We also invested heavily in ongoing professional development through workshops, lunch-and-learn sessions, and group faculty meetings. The training focused more on assessment than on the technology because everyone found the technology easy to understand and navigate. The units went back through their 2008–2009 assessment plans, entered them into AMS, and then created new 2009–2010 assessment plans to close the loop and meet compliance requirements for our upcoming accreditation visit.

Figure 3 provides some examples of how Broward is using AMS to support the assessment process.

Figure 3. Presentation of Key Capabilities in AMS for Broward

Scaling Up in Response to Recommendations from SACS

The entire pilot took three months from start to finish. It proved that the technology and the institutional effectiveness framework could be scaled up to an institution-wide approach. We used the reporting features in AMS to generate reports showcasing our institutional effectiveness framework for the visiting SACS team in March 2009. The visiting team commended us for the framework and advised us to scale up our efforts to achieve compliance.

We had about six months to respond to the SACS visiting team recommendations. From April through mid-August we worked with every unit of the college identified in our framework, individually meeting with a total of 451 college employees one or more times, as well as providing feedback via e-mail and phone. Throughout this process, we refined our framework; TaskStream's Mentoring Services responded quickly with any changes we requested to our setup in AMS.

Our response to SACS was due on September 15, 2009. We again used the data contained in reports generated straight out of TaskStream to provide evidence to SACS of our institutional effectiveness plans. In December 2009 we learned that SACS had accepted our responses to their recommendations regarding those plans.

Lessons Learned

Without TaskStream's AMS we would not have been ready in time to respond to SACS. It was critical to quickly and easily see progress in our programs and know where to invest our time. In the process, we learned the importance of the following:

  • Visibility: Improvement happens when people can see, and are motivated by, the work of others. The entire institution uses TaskStream to showcase its work, so every unit can see what other units are doing. This promotes transparency, accountability, and the sharing of best practices throughout the college.
  • Flexibility: Assessment requires iteration after iteration. It helps to keep in mind that everybody is learning and the process is evolving. Having a flexible technology to support this evolution has been critical for us.
  • Support: Finding the right technology does not come from just checking off a list of feature requirements; it is about finding a viable provider willing to work with you over time. We could not have gotten as far in our efforts as we did, as quickly, without TaskStream's support.

Perhaps most importantly, the technology has helped us reinvigorate units across the college with a culture of evidence and assessment. By adopting an existing solution, we had the time we needed to help everyone understand planning and outcomes development and measurement. We have moved from a culture where assessment was not discussed to a culture where it is part of what we do and talk about every day. Our outcomes assessment already demonstrates the success of our evolving approach (statistics generated using AMS):

  • The number of certificate, technical, and baccalaureate student learning outcomes available for assessment increased by 24 percent from 2009 to 2011.
  • The number of outcomes included in an assessment plan (with at least one measure) increased by 40 percent from 2009 to 2011.
  • The number of findings from assessments increased by 16 percent from 2009 to 2010.
  • The number of direct assessments increased 38 percent from 2009 to 2011.

Next Steps

We developed a rubric for evaluating the assessment work based on examples provided by other schools. This year we will start using that rubric to conduct a more formal evaluation of how units are doing with their plans and to provide them with feedback, all within TaskStream. A short excerpt from the academic plan assessment rubric we use at Broward follows:

Mission Statement
  • Emerging: Unit mission statement needs to more clearly link to the Broward College mission statement and/or states what the unit seeks to accomplish in non-specific terms.
  • Established: Unit mission statement links to the Broward College mission statement and states what the unit needs to accomplish.
  • Exemplary: Unit mission statement clearly links to the Broward College mission statement and explicitly states what the unit seeks to accomplish.

Providing professional development to help college employees understand the rubric supports their growth and ensures that our assessment work continues to improve. We want to move from asking "Yes or no, did the unit do the assessment?" to asking "How do you improve the assessment?" We want to continue to develop and document a continuous improvement process by assessing our assessments!

Acknowledgments

Many thanks to the following individuals at Broward: Dr. Dawn Broschard, Director of Institutional Effectiveness; Dr. Barbara Bryan, Provost of North Campus; Joyce Walsh-Portillo, Director of Assessment; Michael Fenick, Professor of Business and Office Careers; Kathleen Brown, Associate Dean of Health Sciences; and Russ Adkins, Associate Vice President of Instructional Technology.