
Suanne G. answered 11/16/19
Graduate-Degreed History Expert and Writer
I would agree that there's likely some inherent bias on the part of the prof here, though it's possible that s/he is simply playing Devil's Advocate to spark debate. Some general thoughts on this, though: it's true that, historically, the South, taken as a whole, has been more conservative than other parts of the country. A lot of this is based on history and religion, honestly. In the aftermath of the Civil War, local populations were often marginalized during Reconstruction (for example--did you know that many former Confederates were temporarily stripped of their voting rights and barred from holding office?). This created a deep-seated mistrust of the federal government that still lingers to this day. The South was also fertile ground for the more conservative strains of Protestantism, and these churches (especially the Baptist church) retain a strong influence over large segments of the population, especially in rural and outer-suburban areas (and there are likely more such areas in the South than in the more densely populated Northeast, for example).
I would argue, though, that this whole premise is somewhat dated--consider the recent hotly contested Senate election in Alabama (where Roy Moore was defeated despite Trump's backing), or the Louisiana governor's race that's all over the news today. There are now many large and growing cities in the South--Atlanta, Nashville, Dallas--that are, generally speaking, less conservative than the rest of their respective states, and they represent a large enough bloc of votes that politicians are beginning to pay more attention to them.
Hope that helps.