The book's authors state that based on their recent (2007) study, the majority of professors "say they keep their own politics out of the classroom". Consistent with this, only a minority of college faculty (28%) admit that they openly reveal their political leanings to their students.
But even if the above statistics are true, does it matter whether teachers conceal their political leanings? Another study, conducted by two professors from Pennsylvania State University, may have the answer. In their research-based article, "I Think My Professor Is a Democrat", they published two related findings based on student surveys:
- College students agree that most professors do not reveal their political leanings (corroborating the findings from the book mentioned at the beginning).
- Yet 75% of students were nevertheless able to correctly guess their professor's political leanings.
Finally, the biggest question looming behind this discussion: if the great majority of college professors call themselves liberal, does this influence their students to become more left-leaning as well? The same researchers conducted another study which found that students started shifting slightly to the left under both Republican- and Democrat-voting professors -- not just under liberal-leaning teachers.
What do these studies mean for our class discussion? What could be responsible for the shift toward Democrats? And is there a difference between college classrooms and high school classrooms with regard to these findings?