EMERGING DIALOGUES IN ASSESSMENT

Perspectives on Assessment at Community Colleges

April 10, 2019

Elizabeth Carney and Kathleen Gorski

This article offers a dialogue between two assessment practitioners who both recently moved from four-year institutions to community colleges. We explore recent initiatives at our respective colleges and reflect on what has worked well and what has been challenging. In the conclusion, we suggest a list of shared experiences that seem particular to community colleges compared to four-year schools. Our intent is to reflect on practices we believe could be helpful specifically for assessment professionals at other community colleges and, more broadly, the assessment profession as a whole. We hope to spark further conversation to broaden and deepen the contribution of community colleges to our assessment professional communities.

First, a quick overview of the community college mission: Access is the predominant mission at most community colleges, providing choices for the diverse populations they serve. According to Baime and Baum (2016), "Unifying community colleges is their common goal of providing broad access to post-secondary education. Virtually all have open admission policies that allow students to enroll regardless of their academic preparation and achievement…" (p. 2). The White House Summit on Community Colleges report (2011) states that "community colleges are a vital part of our higher education system, enrolling 43% of all undergraduates and a disproportionate percentage of minority, non-traditional, older adult, low-income, working, parent, and first-generation students" (p. 17).

Students come from all stages of life, ranging from recent high school graduates to retirees. They attend for many reasons: to transfer to a four-year institution, to complete a two-year degree in career and technical education, to develop specific skills for career advancement, to take a few classes part-time while attending another institution full-time, or to enjoy lifelong learning one or two classes at a time.


Dr. Elizabeth Carney, Clackamas Community College, Oregon City, Oregon

Where We Started

After five years of full-time work supporting program assessment at a four-year institution, I began my job at Clackamas Community College in August of 2016, just after the college received a set of Recommendations from its regional accreditor in response to its Year Seven Self-Study (the culminating report in the seven-year accreditation cycle of the Northwest Commission on Colleges and Universities). This prompted a year of mandatory assessment meetings unprecedented at the college, along with new reporting requirements, initiated by college administrative leadership with agreement from faculty senate leadership. The task was, essentially, to build a college-wide system of assessment. Previously, an assessment committee of interested and committed faculty had worked to provide education and support, but there were no systemic, institution-wide expectations or involvement, and assessment was happening primarily at the course level. Data from general education courses had been gathered in a home-grown tool, but there was no mechanism for accessing or using the data, and the tool had actually begun to build a negative culture around assessment. My job, as a faculty member in our teaching and learning center who specializes in assessment, was and is to advise administration on policies and expectations that support good assessment, to chair the faculty-led Assessment Committee, to provide consultation and professional development in various forms, and to guide three faculty assessment coaches.


What We Did

In 2016, we created teams of faculty representing degrees, certificates, and pre-college and academic support areas, as well as each of our state-mandated general education areas for transfer degrees. Each team had, and still has, a faculty team lead. Because we were working under the pressure of accreditation, the timeline for the work was compressed over that first year, with the aim of showing progress in our interim accreditation report. This compressed timeline affected how we approached the work and where we were able to focus. There was indeed a lot to report after that first year: for example, nearly all program teams created curriculum maps for the first time, and most programs piloted a program-level direct measure for the first time. However, my focus here will be on one aspect that seemed key to our development as a faculty and as a college and that might give readers some insight into the particular assessment landscape of community colleges, at least as I’ve experienced it at CCC.

In the span of three months, each faculty team held what we referred to at the time as a “Direct Measure Meeting,” with the goal of increasing shared understanding about the meaning and evaluation of one program-level learning outcome or general education outcome. These meetings were facilitated by me or by one of the faculty coaches I had trained in a step-by-step process. Each meeting was documented using a form with reflection questions, which teams then submitted to our learning management system assessment space for feedback from me and the coaches. The majority of programs chose an embedded assignment, project, exam, or portfolio from a class identified as being at the “Mastery” level on their curriculum map for one program learning outcome or general education outcome. They created a program-level rubric to assess that outcome, and they examined a small sample of student work using the new rubric as a pilot process. Teams relying on external exams, such as state licensure tests (many of our Career and Technical Education programs fall into this category), who were not primarily using rubrics, instead examined how well the external measure aligned with their outcomes and curriculum and how the results might be used to inform program assessment. Again, the coaches and I facilitated these conversations.

For both groups—let’s call them rubric and non-rubric groups, though the labels are inexact—the facilitation of the meetings was key because we asked the same questions consistently across all groups, prompting them to reflect on larger questions about their hopes and intentions for student learning and about alignment and curriculum. For example, we asked whether examining student work and conversing as a team had sparked any new thoughts about the program learning outcome, the assignment prompt or exam, or the curriculum map. For the rubric groups, this was more than a norming session: we tried to guide each team toward these larger questions and defined the process as exploratory and iterative, trying not to focus heavily or exclusively on inter-rater reliability. Instead, the student work served more as a prompt for collaborative inquiry into the existing system and faculty approaches.

Given our rushed development timeline that year, I think this approach was especially important because teams needed time to “live with” program outcomes that they had not fully engaged with in the past. Many needed to revisit their newly created curriculum maps or to understand how a program-level rubric usually differs from a rubric used to give students a grade. Many needed more time to understand assessment generally, and to wrestle with why we were doing this work other than just to "comply."

The facilitated direct measure meetings offered a way for faculty to encounter issues such as alignment and assignment transparency in a supported, structured manner through a process of discovery, inquiry, and discussion. Teams made discoveries through these discussions, based not necessarily on valid and reliable student achievement data (most didn’t have that yet) but rather on reflecting as a group in new ways. Some discoveries led to immediate, relatively evident, low-stakes decisions to try new things, such as changes in course sequencing. Other discoveries led to questions the team planned to explore further, such as whether they should require prerequisites. In some cases, teams discovered that degree curricula (as collections of courses) could use better backward design (designing with the program outcomes in mind) and that outcomes and course sequences were not necessarily transparent to either students or faculty as a group.


What Worked and Why We Believe It Worked

I think collaborative inquiry was key as a foundational step for our college, a way to help spark a shift in culture and to pique interest in the value of assessment. One reason is that this approach directly addressed the culture of faculty isolation, which also exists at four-year institutions but has particular causes and manifestations typical of two-year colleges. In my experience, most faculty at Clackamas care deeply about student success and believe in the community college movement, but various systemic and cultural factors can get in the way of collaboration on matters of pedagogy and curriculum. Related to this is the fact that faculty sometimes don’t feel ownership of courses or pedagogy when they "inherit" courses from a previous instructor. The practice of passing along and inheriting courses can be seen as part of a larger approach to teaching and curriculum as knowledge transmission, implicitly supported by the "cafeteria style" of course delivery dominant in community colleges, as well as in the general education programs of four-year schools. In other cases, standards from the state (such as the Oregon-mandated general education outcomes) or technical licensure standards seem to dictate what faculty can do. The team inquiry process provided a format to begin surfacing these issues and their implications for teaching and learning. In a survey at the end of the 2016-17 academic year, 84% of faculty agreed or strongly agreed that “Working with my assessment team was a valuable learning experience for me,” and 80% agreed or strongly agreed that “participating in program assessment has influenced how I approach teaching in my own classes.” Faculty also reported increases in their understanding of how their program curriculum is designed and sequenced, and they reported learning more about how other faculty think about teaching.

Gaining a better shared understanding of curriculum and teaching through program assessment also seems, anecdotally, to have helped us as faculty establish the why and how for our college’s work on Guided Pathways, a movement that is part of the recent national shift from a focus on access toward a focus on completion and, crucially, equity (for those unfamiliar, one source of information about Guided Pathways is the Community College Research Center). While collaborative inquiry has been central to our assessment work, it is likewise central to the work we need to do as faculty to create the alignment and transparency necessary to help students get on a path, stay on a path, learn on a path, and complete a path—the four pillars of Guided Pathways.


Challenges

Our challenges moving forward are both specific to community colleges and shared widely across all types of institutions. For example, students in a significant number of our programs don’t tend to move through the curriculum in an optimal way even when there is an intentional, aligned, educationally coherent learning sequence. At four-year institutions, this might result from various factors, including large numbers of students transferring in with varying types of preparation. At Clackamas and other community colleges, this course sequencing challenge derives in part from the traditional focus on access, which led to open enrollment and few prerequisites, not to mention the disproportionate number of part-time students, students with significant life challenges, and systemically non-dominant students—all of whom are more vulnerable to dropping in and out, or to dropping out altogether.

Also not necessarily unique to community colleges is the important role of part-time faculty. While Clackamas has provided funding for part-time faculty to participate in our assessment activities, many don’t have enough time to attend assessment meetings regardless of the incentive of extra payment; they hold jobs in industry or the professions and teach on the side, or they are juggling multiple teaching jobs.

Lastly, it will—and should—take us time and iterations to achieve usable data about student learning, even as we feel pressure to speed up the process and do it “right” the first time. At the same time, in my experience, finding gaps and places for improvement initially is not just about student learning “output”; it is also about exploring the “input” (meaning the existing curriculum and the potential curriculum). It is about faculty exploring existing intentional and unintentional systems that impact how we do business. One of the big “Ahas” in my own professional growth as an instructor and faculty developer has been fully recognizing that faculty often are not afforded the same good practice we advocate for students. Just as students need transparency of outcomes and expectations to have an effective and equitable learning experience, faculty need transparency, too, in order to have an effective and equitable opportunity to teach students well. Like our students, we need the time and space to explore the lay of the land, to reflect both collectively and individually on our teaching, our courses, and our students, and to be given the support to understand and act upon this knowledge. As assessment professionals, we know that we need to recognize the systemic factors that interact with assessment and help determine its success. At the same time, assessment can also help surface and address these systemic factors, which for community colleges often seem to reflect the particular history of the community college movement.


Dr. Kathleen Gorski, Waubonsee Community College, Sugar Grove, Illinois

Where We Started

As a higher education assessment professional personally committed to open access, I was very enthusiastic to begin a position at a community college. I believed in the mission to admit and serve students with varied academic and educational backgrounds and was eager to serve and assist with the improvement of learning for a diverse student population.

As I began my community college tenure in 2016, with years of baccalaureate experience behind me, I quickly realized that many traditional assessment practices do not easily apply. Access is important, and to provide access, curriculum and enrollment flexibility are paramount. Long gone are the days of curriculum maps with logically sequenced courses in which program outcomes are introduced, reinforced, and mastered by students. Our transfer disciplines do not have program outcomes because the State of Illinois doesn’t offer transfer discipline-specific degrees; our transfer students are awarded an Associate in Arts or an Associate in Science. Many courses have recommended prerequisites, but most do not require them, in order to give students added flexibility in course scheduling. In addition, many students transfer to four-year institutions without officially completing their two-year degrees. This makes it impractical to offer traditional assessment courses, such as capstone or portfolio courses, because enrollment in them would be substantially challenged. What does this mean for assessment? It means that community college professionals need to look at multiple ways to assess, including a focus on individual courses, using data in new ways, and creating embedded college-level capstone assessment projects in high-enrollment courses. These are the areas that I am currently researching and hope to implement in the future to improve assessment practices at my institution.


What We Did or Plan to Do

  • Focus on individual courses.

    My first faculty development session took place during a pre-semester in-service, and I quickly discovered that traditional mapping is not always applicable. Although the courses appeared to be sequenced by number and could be charted on a map, most did not have prerequisites. The reality is that students take courses that fit their personal schedules, often not in the perfect numerical sequence. This is where I learned the importance of course assessment. We are beginning our assessment process by looking at individual courses, with an emphasis on high-enrollment and gateway courses, and will begin improving curriculum at the individual course level. This is also necessary because our transfer disciplines do not have program outcomes. The opposite is true of our Career and Technical Education (CTE) A.A.S. degree and certificate programs: our terminal career associate degrees follow practices similar to those of four-year institutions, including program outcomes.

    Although we cannot follow traditional curriculum mapping in our discipline areas, we can mini-map courses that have a definitive sequence, such as English 101 & 102 and Anatomy and Physiology 101 & 102. As we begin our course assessment practices, we will also examine the sequencing of related coursework.

  • Multiple Ways to Assess.

    Many factors need to be taken into consideration when planning curriculum at a community college. In our transfer areas, our purpose is to build strong foundations for students to transfer into bachelor’s programs. It is imperative to have assessment systems in place to ensure that we are preparing our students for future study. We also need to ensure that our courses are accepted at transfer institutions. In my experience, four-year institutions offer capstone or portfolio courses as a culminating experience, which is also an opportunity to assess programs. These types of courses would be ideal at a community college, but few institutions would accept them as transfer credit. Keeping in line with the part of our mission that serves students in their transfer to four-year colleges, we need to think of other ways to assess at the discipline and college level.

    We recently piloted assessment at the college/general education level, that is, of our college learning outcomes (CLOs). Our previous exit exam was not aligned with our curriculum, and student participation was low due to its voluntary structure. Our Outcomes Advisory Council (OAC) faculty agreed that we needed to consider other ways to assess our students at this level. Our decision was to use embedded assessments across courses. We piloted all five of our CLOs (Critical Thinking, Information Literacy, Communications, Quantitative Literacy, and Global Awareness) in courses that met general education requirements. Faculty used the AAC&U VALUE rubrics to assess existing assignments in their courses. Additionally, some faculty used tagged exam questions aligned to the CLOs, and our English faculty piloted assessment of all the outcomes in a portfolio. Our preliminary conclusions show that a portfolio including student reflection, embedded in a required course, provided the most comprehensive picture of student learning. However, this approach required the most faculty resources and may not be scalable across the institution. Embedded assessment using the VALUE rubrics gave us the second most valuable information about student learning.

    The pilot results are valuable, but we realize that the most meaningful data required the most faculty resources and that the level of the students assessed varied widely: our results included students ranging from their first semester in attendance to their last. We would expect our critical thinking assessment to indicate higher levels of proficiency for second-year, second-semester students, but because our data were aggregated, the student level was unknown. The pilot led us to consider additional ways to assess and new ways to use data.

  • Capstone Assignments.

    Another area that we are considering for the assessment of our college learning outcomes is the implementation of capstone assignments. It is not practical for community colleges to require portfolio or capstone classes of students planning to transfer to other colleges, because such classes are typically not accepted by four-year institutions; when they are accepted, it is typically as elective credit, which may not contribute to degree completion. We need to ensure that we are creating pathways in our transfer areas that align with the needs of our transfer institutions. There has been discussion of creating capstone assignments that align with all of our CLOs and embedding the assignments in multiple high-enrollment general education courses. We believe this could be as meaningful as our portfolio assessment results in the pilot. We will be discussing this further in the fall as a viable option for achieving meaningful results.

  • Using Data in New Ways.

    We are currently working with our institutional research team to determine how we can identify second-semester students in their third and final required general education concentration course in order to assess CLOs. If we can disaggregate our CLO assessment data by course completion, we will be able to view data from students in their first, second, and third course for a given CLO (Critical Thinking, for example). This would give us an exit view as well as a value-added view of student learning. We currently do not have systems in place that can provide these data to us (see the sketch below). Using data in this fashion was not necessary at my previous institutions because final assessment of college learning outcomes was embedded in senior-level portfolio or capstone courses, all of which followed a prescribed sequence requiring prerequisites.
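
    To make the idea concrete, here is a minimal sketch (in Python, using pandas) of the kind of disaggregation we have in mind. It assumes a hypothetical extract of CLO assessment records already joined to registration data; the column names, the clo_course_number field, and the values are invented for illustration and do not reflect our actual systems.

        # Minimal sketch, assuming a hypothetical extract of CLO assessment
        # records joined to registration data. All names and values below
        # are illustrative, not an actual institutional data model.
        import pandas as pd

        # One row per student per assessed CLO artifact.
        records = pd.DataFrame({
            "student_id": [101, 101, 102, 103, 103, 104],
            "clo": ["Critical Thinking"] * 6,
            # Position of the course in the student's sequence of required
            # general education concentration courses (1st, 2nd, or 3rd).
            "clo_course_number": [1, 3, 1, 2, 3, 1],
            # Score on a 0-4 VALUE-style rubric scale.
            "rubric_score": [1.5, 3.0, 2.0, 2.5, 3.5, 1.0],
        })

        # Disaggregate by course completion: mean rubric score for students
        # in their first, second, and third CLO course. Rising means across
        # the sequence would suggest a value-added view; the third-course
        # mean alone approximates an exit view.
        summary = (
            records.groupby(["clo", "clo_course_number"])["rubric_score"]
                   .agg(["mean", "count"])
                   .reset_index()
        )
        print(summary)

    If our student information system could supply something like the clo_course_number field, a summary of this kind would let us compare first-course and third-course proficiency for each CLO instead of reading a single aggregated number.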


What Worked and Why

My first two years were successful because of faculty collaboration and faculty members’ passion for their students and their students’ learning. Our accomplishments and future successes are, and will be, attributable to faculty involvement. When I arrived, we were in our third year of providing faculty liaisons release time to lead assessment, a new practice at our community college. It has been successful in our assessment practice and is expanding this year in our Office of Teaching and Learning with two full-time faculty liaisons and an adjunct faculty liaison. Although I have built collaborative relationships with faculty, we would not have the same involvement without colleague-to-colleague collaboration among faculty. My mission is to support and guide the process while faculty lead.


Challenges

My biggest challenge was faculty frustration with assessment. I arrived after many years of unsuccessful starts and stops; over the years, continual changes in the process had been confusing to all. Upon my arrival, I met with each Outcomes Council faculty member for insight on how we should move forward. I listened to everyone, and we came together to create a clear plan and a communication strategy for moving forward intentionally, at a pace that worked for all. Our communication included step-by-step maps of each semester’s work. We have followed the same path created two years ago and are making considerable progress.

Another challenge was that assessment had previously been seen as the responsibility of a single administrator with a team of coordinators. Because assessment was seen as only the assessment unit’s responsibility, others were not actively engaged. We are all interested in student success, and for assessment to be successful, it must be a team initiative. Two years in, I am proud of the inquiry and scholarship of the faculty and administrators as we work together to improve learning for our students.


Concluding Thoughts

In collaborating to write this piece, we discovered that we have approached assessment at our colleges in many of the same ways. We both work to create systems to ensure that assessment can be faculty-led, with the support of college resources. We both have worked to support faculty’s ability to do embedded assessment and to choose from multiple methods of assessment. These are practices that many assessment practitioners probably share regardless of the type of institution they work in.

We noticed other shared experiences that seem particular to community colleges compared to four-year schools, or at least to manifest in particular or extreme ways. Here are a few of the factors we’ve found to greatly influence assessment at community colleges:

  • the open-access model;
  • the relatively high percentage of vulnerable students;
  • serving as both a through-way and an endpoint in students’ educational pathways;
  • two very different structures coexisting: Career Technical Education and Transfer;
  • the necessity to adhere to state standards (such as state-mandated general education outcomes) and four-year transfer credit requirements.

Still, we are left with plenty of questions and the desire to continue the dialogue about assessment in community colleges. Are our experiences representative? For practitioners at community colleges that have long-established program assessment systems in place (compared to our colleges which both started building systems more recently), what are their thoughts about what we’ve shared here? We hope to see further conversation about these and other topics among community college assessment practitioners and within organizations like AALHE.


References

American Association of Community Colleges, Commission on the Future of Community Colleges. (1988). Building communities: A vision for a new century. Washington, DC: American Association of Community Colleges.

Bailey, T. R., Jaggars, S. S., & Jenkins, D. (2015). Redesigning America's community colleges: A clearer path to student success. Cambridge, MA: Harvard University Press.

Baime, D., & Baum, S. (2016). Community colleges: Multiple missions, diverse student bodies, and a range of policy solutions. Washington, DC: Urban Institute. Retrieved from https://www.urban.org/sites/default/files/alfresco/publication-pdfs/2000899-Community-Colleges-Multiple-Missions-Diverse-Student-Bodies-and-a-Range-of-Policy-Solutions.pdf

Felix, E. R., Bensimon, E. M., Hanson, D., Gray, J., & Klingsmith, L. (2015). Developing agency for equity-minded change. New Directions for Community Colleges, 2015(172), 25-42.

Jankowski, N. A., & Marshall, D. W. (2017). Degrees that matter: Moving higher education to a learning systems paradigm. Sterling, VA: Stylus Publishing.

Jenkins, D., Lahr, H., Fink, J., & Ganga, E. (2018, April 19). What we are learning about guided pathways. New York, NY: Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/publications/what-we-are-learning-guided-pathways.html

Nunley, C., Bers, T., & Manning, T. (2011). Learning outcomes assessment in community colleges. Urbana, IL: National Institute for Learning Outcomes Assessment. Retrieved from http://www.learningoutcomesassessment.org/documents/CommunityCollege.pdf

National Student Clearinghouse Research Center. (2017). Term enrollment estimates. Retrieved from https://nscresearchcenter.org/wp-content/uploads/CurrentTermEnrollment-Fall2017a.pdf

White House Summit on Community Colleges. (2011). Summit report. Retrieved from http://www.whitehouse.gov/sites/default/files/uploads/community_college_summit_report.pdf