Thank you for this. That was more informative than years of public school coverage of WWII and the lead up to it.
I knew about the Treaty of Versailles and how harsh it was, or was at least perceived to be, but not about Hitler’s promise to do away with it.
Do you have any preferred authors/books for further reading on the subject? I’m still trying to undo years of poor education from being taught in a conservative school.
Thank you kindly! I’ll be hitting up the nonfiction section of my local bookshops to try and find them.
I’ve always been curious about that period in history.