All Debates
American culture tends to depict much of the Deep South as "hillbilly" territory and ignorant. These views are, of course, largely stereotypical and negative. Is this a continuing aftereffect of the Civil War and the stigma with which the Northern states viewed the Southern states?