Originally posted by: Feldenak
I know my relatives on the other side of the pond like to refer to the US as "The Colonies"... then again, they're family, so I expect some abuse.
Originally posted by: Atheus
I don't think it gets much of a mention in schools to be honest - there's just too much British history to cover, so many colonies and wars and conflicts, so many important people...
In the history books they just try to tell the truth, I guess. Why? Are we painted in a bad light in US schools? I wouldn't be surprised. I heard they teach that America won the Second World War pretty much by itself.
Originally posted by: Platypus
Probably as one-sided as American textbooks on the subject. Interesting, though; maybe some UK AT'ers can reply about their school experience.
Originally posted by: HeroOfPellinor
Sure, but what do they teach AFTER 8th grade?
They were going in depth on Europe, and specialising in more local history.
Originally posted by: MmmSkyscraper
Originally posted by: Feldenak
I know my relatives on the other side of the pond like to refer to the US as "The Colonies"... then again, they're family, so I expect some abuse.
Never heard the US referred to like that over here.
When I was in school, history was Euro-centric. The two areas of focus were the Romans and WW2. I only did history for the first 3 years of secondary school, but I don't remember my friends ever mentioning American stuff. They were going in depth on Europe, and specialising in more local history.
Originally posted by: DVK916
You don't even learn about the War of 1812?
