UglyCasanova
Lifer
I've noticed that a lot of people on here view the South as just this assbackwards part of the country where everyone does nothing all day except get drunk, read the Bible, and have sex with their relatives. Having lived here all my life, I have to say that is not entirely true.
So what do you think of the South? And those of you who don't live in the US, does the South carry this kind of stigma around the world or is it just what the yanks think of us?