Originally posted by: ntdz
They didn't become pussies until after WW1/WW2. They got decimated by Germany twice, and ever since then they've been against wars big time.
Your facts are wrong. They won WW1. You were probably taught to think the US won WW1, but France and Britain were the main victors and did the heavy lifting. And they didn't suddenly become pacifists. They didn't exactly give up their colonial possessions at the drop of a hat, and they participated in the first Gulf War.