Inside Higher Ed
October 26, 2009
INDIANAPOLIS -- By most measures, the National Survey of Student Engagement has no shortage of numbers to demonstrate its success. Ten years after it started, with 140 colleges asking students a series of questions about everything from how much homework they get to how much they talk to members of racial groups different from their own, higher education has embraced the idea. This year 643 colleges participated, and because many colleges do the survey every two or three years, that figure understates the extent of the use of NSSE (pronounced "Nessie").
NSSE -- based at Indiana University -- has spawned additional surveys of faculty members, of community college students (the original NSSE is focused on four-year institutions) and of law students. A gathering here this weekend of the program's leaders, college presidents and assessment experts drew officials from as far away as South Africa, where six universities are about to start their own version of NSSE. College presidents now routinely cite NSSE to accreditors, politicians and others who ask about their commitment to accountability and to improving the student experience.
There was certainly celebration here about the progress of NSSE. But there were also serious questions raised, with many suggesting that colleges use NSSE to pay lip service to improving themselves, but don't necessarily do much beyond administering the survey. That worries those who gathered here because many see the demands for accountability for higher education only increasing in the years ahead.
But if there was some frustration here about the limited use of NSSE to actually improve the academic experience, there was also a sense that there is an emerging body of evidence that may just persuade skeptics that it's worth taking the time to act on the survey. And there is talk of additional reforms -- to NSSE and to higher ed -- that could make the survey more relevant.
The largest challenge facing NSSE is finding ways to be sure "that campuses actually use the results and not just administer the survey," said Stanley O. Ikenberry, former president of the American Council on Education and former (and soon to be interim) president of the University of Illinois, who is also among the leaders of a new research center on the state of assessment in higher education.
Ikenberry’s comment and similar remarks were heard again and again during the discussions here. There is a broad sense that NSSE is one way that colleges show accreditors that institutional self-study is taken seriously, even if just filing the report with an accreditor isn’t actually that serious an act.
Others mentioned the boost given to NSSE by Margaret Spellings, who as education secretary in the last Bush administration talked repeatedly of the need for colleges to use comparable measures to study themselves. NSSE is far less controversial than standardized tests of learning, and so was an easy answer to Spellings and those who agreed with her.
NSSE staff members and college presidents here who fully embrace the NSSE idea joked that they would like to require participating colleges to file reports on what they actually do with the data -- and to actually do something with it. Generally, those involved in NSSE stressed that colleges should use it because it offers good insights, while those involved in the policy world stressed the risks of not taking accountability seriously.
The Dangers of Inaction
Molly Corbett Broad, president of the American Council on Education, warned those gathered here that they would be foolish to think that accountability demands were a thing of the past. She noted that the push for colleges to be more accountable predated Spellings and outlasted her tenure at the Education Department. Given the significant investments the Obama administration is making in higher education, she said no one should be surprised that they are accompanied by demands for accountability.
She said that while she is “impressed” with the work of NSSE, she thinks higher education is “not moving fast enough” right now to have in place accountability systems that truly answer the questions being asked of higher education. The best bet for higher education, she said, is to more fully embrace various voluntary systems, and show that they are used to promote improvements.
The danger, she said, is that without such a shift, government entities will set their own standards. She said that right now she sees that potential coming less from the federal government than from states. Those with very high levels of unemployment, Broad warned, “may be tempted to tie their level of support [for higher education] to very specific outcomes tied to job creation.”
One reason NSSE data are not used more, some here said, was the decentralized nature of American higher education. David Paris, executive director of the New Leadership Alliance for Student Learning and Accountability, said that “every faculty member is king or queen in his or her classroom.” As such, he said, “they can take the lessons of NSSE” about the kinds of activities that engage students, but they don’t have to. “There is no authority or dominant professional culture that could impel any faculty member to apply” what NSSE teaches about engaged learning, he said.
One effort Paris described that his group is considering to get more action on NSSE and other assessment systems is to set up a certification system that would indicate which colleges actually act on the results they get.
New Realities for Measuring Engagement
If everyone here seemed to agree that colleges need to focus more on how they use NSSE, there was much debate on what NSSE should actually measure. And some suggested that it is due for an overhaul to reflect changes in higher education.
Adrianna Kezar, associate professor of higher education at the University of Southern California, noted that NSSE’s questions were drafted based on the model of students attending a single residential college. Indeed many of the questions concern out-of-class experiences (both academic and otherwise) that suggest someone is living in a college community.
Kezar noted that this is no longer a valid assumption for many undergraduates. Nor is the assumption that they have time to interact with peers and professors out of class when many are holding down jobs. Nor is the assumption -- when students are “swirling” from college to college, or taking courses at multiple colleges at the same time -- that any single institution is responsible for their engagement.
Further, Kezar noted that there is an implicit assumption in NSSE of faculty being part of a stable college community. Questions about seeing faculty members outside of class, she said, don’t necessarily work when adjunct faculty members may lack offices or the ability to interact with students from one semester to the next. Kezar said that she thinks full-time adjunct faculty members may actually encourage more engagement than tenured professors because the adjuncts are focused on teaching and generally not on research. And she emphasized that concerns about the impact of part-time adjuncts on student engagement arise not out of criticism of those individuals, but out of criticism of the system that assigns them teaching duties without much support.
She stressed that NSSE averages may no longer reflect any single reality of one type of faculty member. She challenged Paris’s description of powerful faculty members by noting that many adjuncts have relatively little control over their pedagogy, and must follow syllabuses and rules set by others. So the power to execute NSSE ideas, she said, may not rest with those doing most of the teaching.
And of course there is technology. When students today view real engagement as a professor who answers Facebook messages at midnight (as opposed to, say, one with many office hours), is it time for a new set of questions, she asked.
Finally, Kezar noted that there is a relationship between many of the factors she outlined and economic class. The students who are more likely to have to work long hours outside of college, not to experience residential life, to attend colleges with relatively few tenure-track faculty members, and so forth are less wealthy, on average, than other students. Does NSSE, she asked, draw enough attention to these issues?
Similarly, Shaun Harper, an assistant professor of education at the University of Pennsylvania, said that he sees a need for more “race conscious study of student engagement.” In his work, he finds wide variation among members of different racial and ethnic groups in how they perceive such questions as the interest of faculty members in doing research with them. Institutional averages mask these issues, he said.
Validating NSSE
While some sessions here featured critiques of NSSE, others provided key evidence that the surveys relate directly to student learning. One criticism of NSSE over the years is that it measures student behaviors, rather than actual student learning. While NSSE supporters have said that there is lots of evidence that students who do more rigorous work and interact with faculty members more do learn more than other students, the fact remains that NSSE has measured those activities, not whether students learned more biology or history.
Research presented here, however, by the Wabash College National Study of Liberal Arts Education offered concrete evidence of direct correlations between NSSE attributes and specific skills, such as critical thinking skills. The Wabash study, which involves 49 colleges of all types, features cohorts of students being analyzed on various NSSE benchmarks (for academic challenge, for instance, or supportive campus environment or faculty-student interaction) and various measures of learning, such as tests to show critical thinking skills or cognitive skills or the development of leadership skills.
The Wabash study includes a “pre-test” in which students are tested for both their knowledge and attitudes before arriving at college so that the test can focus on what is actually added during college. While there are only preliminary results available, Wabash researchers said that their evidence shows that even in the freshman year, there is a correlation between what NSSE considers positive attributes (such as measures of academic challenge) and learning outcomes (such as gains in critical thinking skills).
Charles Blaich, director of the Wabash project, said that this is a significant finding as it shows that “you have your next move” when you get NSSE results back. Blaich said that Wabash team members have been using the data they collect on NSSE and their other measures to help colleges make specific changes in policies.
For example, he said that one college was getting lower scores than would be desirable on NSSE’s measures of academic challenge, and that those lower scores were accompanied by smaller gains in critical thinking skills. Wabash followed up with in-depth interviews with faculty members, many of whom said that they were holding back on homework out of fear that their students were working too many hours at jobs to handle the homework. Using answers to other NSSE questions, Blaich said he was able to show the faculty members that they were overestimating the hours students at this college were working, and so could add assignments. They did so, and appear to be getting the desired gains, he said.
The irony of the Wabash work with NSSE data and other data, Blaich said, was that it demonstrates the failure of colleges to act on information they get -- unless someone (in this case Wabash) drives home the ideas.
“In every case, after collecting loads of information, we have yet to find a single thing that institutions didn’t already know. Everyone at the institution didn’t know -- it may have been filed away,” he said, but someone had the data. “It just wasn’t followed. There wasn’t sufficient organizational energy to use that data to improve student learning.”
Alexander McCormick, director of NSSE, said he was excited about all of the ideas shared at the meeting, and he said the NSSE team was committed to finding ways to update the test, although that will happen at a slow pace to allow those doing longitudinal studies to adjust. McCormick also said he was pleased to hear so many people committed to getting NSSE used more -- in the sense of acting on its results.
“I want to try to make the point that there is a distinction between participating in NSSE and using NSSE," he said. "In the end, what good is it if all you get is a report?"