LiuKangBakinPie
Diamond Member
In the majority of mainstream films women are sexualised, and the large majority of nudity in films is (thankfully) female. Even in television shows like Boardwalk Empire and Game of Thrones it's all female nudity, often presented very frankly. Countless magazines, films and channels all feature bare breasts and bums, so surely some women must be uncomfortable with it? Or do they just ignore it?