A friend of mine is looking for information on how values changed from WWI through the Great Depression. I told him I would help, but I can't really find anything for him. The main things I can find are about changes in women's rights, the influence of different cultures, and a growing mistrust of war and society.
Anyone have any info on this subject? A good book or internet source? Thanks.