
In keeping course evaluations internal, Brown cites privacy, bias concerns

Course evaluations reflect bias against female and underrepresented minority faculty members

In the first meeting of ECON 1530: “Health, Hunger and the Household in Developing Countries” on Sept. 9, Professor of Economics Andrew Foster gave students access to all University course evaluations from the previous year to help them decide whether to take the course.

For students trying to narrow their shopping carts to four or five courses, the results of these evaluations could be a trove of information. But for a variety of reasons — including student and faculty privacy, biases against female and underrepresented minority faculty members and the existence of the Critical Review — the University’s current practice is not to make the results available to students or administrators, said Dean of the College Maud Mandel.

Missing peer reviews on the Critical Review — the student-run publication of course reviews — can make shopping period “more chaotic” as students have less information about what to expect from courses, Foster said.

Tatiana Dubin ’18 said it can be frustrating to look up a course on the Critical Review and find no peer reviews were submitted.

But students also experience the annoyance of filling out two sets of evaluations when a faculty member does distribute the Critical Review surveys. “It’s cumbersome to ask students” to complete both, Foster said.

Dean of the Faculty Kevin McLaughlin P’12 echoed this sentiment, noting that doing both may “seem a little excessive.”

Distribution of the Critical Review surveys is voluntary: They are delivered to each department, but each professor ultimately chooses whether to hand them out. The departmental course evaluations are more widely used, as students often must fill them out before receiving their final grades. All departments except two — the Department of Earth, Environment and Planetary Sciences and the Department of Modern Culture and Media — distributed online course evaluations last academic year, and the University-wide return rate was just under 76 percent, Mandel wrote in a follow-up email to The Herald.

University evaluations are used predominantly for internal departmental purposes and in the tenure evaluation process.

Only the department head, the faculty member and sometimes the department manager are able to view the results, Mandel said. Faculty members value this information as they examine their courses and decide how to change them for future semesters, she said. A faculty member “can think through the strengths and weaknesses of his or her teaching profile and address them,” she said, adding that this process is “about pedagogy and improving your own method.”

These evaluations are also used as one of several factors to determine salaries, contract renewals, reappointments, promotions and tenure decisions, McLaughlin said.

But the quality of information generated by these evaluations could be compromised if their results were accessible to students, he said. “If we were to make those public to the students, we would lose some of the objective kinds of reviews that we get.”

Mandel compared the difference to that between writing in a diary and writing a memoir, adding that “the mode of presentation and the things that people choose to talk about are really different.”

Mandel also expressed concern that any other comprehensive course review platform “would gut the Critical Review, which has a long and wonderful history.”

“I found that the peer-reviewed information that I did have (from the Critical Review) was incredibly useful, and it was one of the main factors in deciding what to take and which professors I chose to take classes with,” said Andy Triedman ’15.5. “It would certainly be a big help if it were available for every class.”

But recent research indicates that these peer reviews might be inaccurate due to the biases of the students who fill them out.

“It is very clear that female instructors get lower ratings on the overall effectiveness score than male instructors,” said Lisa Martin, professor of political science at the University of Wisconsin at Madison and author of “Gender, Teaching Evaluations and Professional Success in Political Science.” “The bigger the class gets, the bigger that gap becomes,” she added.

Other studies reinforce those findings, showing that evaluation results not only suffer from gender biases but also disfavor underrepresented minority faculty members.

Due to this phenomenon, some faculty members are “cautious” about “making the data about their course evaluations public” or even “making it widely available,” Mandel said.

“We’re aware of that literature on that topic,” McLaughlin said, adding that a typical bias that emerges in evaluations of a younger woman teaching a lecture course in the sciences is “a lack of respect for the instructor and a sense that the instructor might not be that knowledgeable.” “You can sort of discard some of those comments,” he added.

But should the evaluations be made public, students might take biased information as fact.

If the University were ever to consider making the departmental course evaluations available to students, the issue of student bias “certainly would shape any conversation going forward about changing” the existing system, Mandel said.
