At the recent meeting of the National Forum on the Future of Liberal Education, I promised to put together some references to important work concerned with assessment and other approaches to improving liberal education. As I drafted this, I realized that many of the topics raised practical questions about how one might exercise low-key but effective leadership on campuses. Hence the "leadership questions" in the text below:
1. In recent years, many institutions have reformulated their understanding of “liberal education” as the cumulative development of cognitive and personal capacities such as critical thinking, analytical reasoning, cogency of written and oral expression, moral reasoning, civic engagement, etc.
Background: Under the auspices of its “Liberal Education and America’s Promise” program (LEAP), the Association of American Colleges and Universities (AAC&U) has articulated one widely followed set of “essential learning outcomes,” which are commonly known as LEAP outcomes.
Many colleges and universities have similar learning outcomes. An AAC&U poll found that about four out of five institutions have specified such learning outcomes for undergraduate education (see page 3).
Leadership question: Does this match my understanding of “liberal education” and that of my department or institution?
2. While there is broad agreement among faculty members, business leaders, and others about the importance of such capacities, students often have little idea that these are intended outcomes of their education.
Background: The same AAC&U poll surveyed senior administrators and found only about 5% were confident that their students understood the institution’s learning goals (see page 5).
Leadership question: How can I best convey the learning goals that I regard as most important to my students?
3. Most of the goals in point 1 above are cumulative; that is, they are not likely to be achieved in a single course or project. But progress toward many of them can, to some degree, be assessed by currently available tools and instruments.
Background: There are many kinds of assessment instruments. Here are some frequently used types:
Surveys of student responses: The most widely used of these is the National Survey of Student Engagement or NSSE, which your institution may already be using.
Direct measures of student performance: While perhaps not the most widely used, one instrument that is frequently discussed is the Collegiate Learning Assessment or CLA, which uses well-crafted performance tasks to gauge the quality of student critical thinking, problem solving, analytical reasoning, and writing.
Because the CLA is designed to measure student performance at an institutional level, translating results to the individual or classroom level is challenging. CLA in the Classroom is an “on the ground” alternative that provides faculty with the know-how to adapt the CLA performance tasks—or even to develop their own—for use in their own courses.
Another way to directly assess student learning at the individual or course (or even departmental) level is through the use of rubrics. Through its Valid Assessment of Learning in Undergraduate Education or VALUE initiative, the AAC&U has developed a set of rubrics that evaluate student accomplishment of LEAP outcomes.
Worth knowing about as well is the Wabash National Study of Liberal Arts Education, which uses a combination of student surveys and direct measures of performance to get a holistic sense of students over their four years of college.
Leadership question: Does my institution use these instruments, and do faculty learn from them?
4. Data from these instruments, when aggregated and analyzed, can show which practices are most effective, as well as which are not.
Example: Using data and research on student learning, George Kuh of Indiana University has documented a number of “high impact educational practices,” that is, practices that have been proven to benefit many college students. An excerpt of Kuh’s findings, as well as a link to purchase the whole document, can be found here.
Leadership questions: Are all these “high impact practices” in place on my campus, and what percentage of students take advantage of them? In general, does the development of new academic programs or practices on my campus start with data? That is, are data used to determine the kinds of programs or practices that need to be implemented to genuinely improve student learning?
5. Course evaluations can be transformed from “student satisfaction surveys” to student self-assessment of their progress toward learning goals.
Background: Robert J. Thompson and Matt Serra, “Use of Course Evaluations to Assess the Contributions of Curricular and Pedagogical Initiatives to Undergraduate General Education Learning Objectives,” Education 125, no. 4 (Summer 2005): 693-701. Not available online, but see also: http://www.teagle.org/liblog/entry.aspx?bid=1&id=58.
Also worth checking out is the work of the IDEA Center, and in particular, its “student ratings of instruction” initiative, which is focused on “learning and curricular objectives.”
Leadership question: When was our course evaluation questionnaire last revised?
Leadership question: Who on my campus is best informed about such matters? Is there any campus-wide venue for dialogue about such matters?
7. Cognitive science is yielding some insights about student learning at the college level. Our colleague Elise Temple (Dartmouth) can help on this matter, I am sure, but for a start, see also the list of “Findings” from the Teagle Collegium at Columbia University.
Leadership question: Are cognitive psychologists or neuroscientists at your institution doing any work along these lines?
8. Among the many other promising approaches to increasing student learning is the Reacting to the Past pedagogy project run by Mark Carnes at Barnard College.
Leadership question: Has anyone I know tried any of these simulations or role-playing games, and with what results?
9. There are many websites that provide resources and forums for discussion of assessment material. Here are just a few:
Think of this as a draft in need of further input and improvement. I'll be interested in your reactions, which you can email to me at wrconnor@teaglefoundation.org.