The whole gay culture thing has gotten very mainstream. I grew up in a conservative culture, and back then there was no talk of gays at all. Kids my age and I had no idea it even existed. It's only in the last 10 years that it's become a common thing. So many movies, shows, and everything in between are either about showing the plight of gays or promoting their cause. This is neither a good thing nor a bad thing. It's just what's happening.
Before, people did things that many consider "bad" in secret. Now, doing them in front of everyone is encouraged. Once again, neither a bad thing nor a good one.
I think what these Arab countries are doing is what many people do when they see a danger to their way of life. Some choose to embrace the change, like Western societies, while others don't. That supposed way of life is very important to some people; it's as if they will lose everything if their traditions are altered in any way. In the West, the opposite is true: the crazier, more bizarre, extreme, or obnoxious you are, the more you are looked up to.