MCC Daily Tribune Archive

League Learning Abstract - In The Trenches: Assessment As If Understanding Mattered


What began in 1994 as an experimental pilot study in the psychology department has evolved into an effective, interdisciplinary approach to learning, teaching, and creative assessment. Read about the Multiple Intelligences/Learning for Understanding initiative at Glendale Community College in the August Learning Abstract.

Published monthly with support from SCT (www.sct.com).

** To view the web version of this abstract, in printer-friendly layout, go to https://www.league.org/publication/abstracts/learning/lelabs0308.htm **

__________________________________________________________
In The Trenches: Assessment As If Understanding Mattered

René Díaz-Lefebvre

Effective assessment of student learning outcomes has been a major issue for higher education for a number of years. Commissions have issued position papers, reports have been disseminated, and there has been an increase in assessment literature (Banta, 1999).

Other issues in community colleges come and go, yet the assessment movement seems to be gaining speed again. Accrediting agencies have incorporated suggested criteria for what constitutes an effective assessment activity. Some states have mandated various forms of assessment ranging from high-stakes testing to the creation of portfolios. Data are collected and made available using traditional measures of student academic performance and progress. Yet a decade of rhetoric and effort has generated minimal effective change in student performance assessment (Alfred et al., 1999).

THE ASSESSMENT MAZE

With so much information available today on assessment, student learning outcomes, data collection, and the dissemination of results, is it any wonder that many at the community college find themselves in a whirlwind of activities attempting to justify how and why assessment is taking place at their campus? Should assessment focus on improving student learning, or on student accountability and the quality of learning produced? Should we look at standardized test scores, portfolios, grades, projects, exit interviews? The list goes on and on.

Many colleges have become caught up in the mechanics of conducting a student survey: (a) administering a test; (b) conducting interviews; (c) collecting the data, which is most likely sitting in storage somewhere on campus; and (d) writing and disseminating an impressive spiral-bound report, which also ends up sitting in storage. Important questions might be obscured by these mechanics, such as: How is this process going to achieve more productive instruction and more effective learning at my campus or in my classroom? Has an emphasis on technique and data collection overshadowed self-examination, reflection, and continuous improvement of the learning and teaching enterprise valued so highly at the community college? These are not popular questions to ask, especially if one is in the trenches, teaching day in and day out. The reality of what goes on in the classroom is sometimes overlooked, or not captured by those far removed from the action.

(This article is continued in the attachment)

Dr. Susan Salvador
Office for Student Services
08/29/2003