I am interested to know which way folks think the country has moved since, say, LBJ. Who is winning? Has the country moved to the right or to the left? Which side has convinced more people? I think the right is winning, that they have done a better job of convincing people that a progressive Great Society is not the way to go. I think people have even become convinced that a much more moderate Democrat like Obama is too far to the left. I think we have moved to the right. I don't even think there's much of a real left anymore; today's left looks to me like a bit left of center. Anyway, no poll, I don't like them. What do you think?