Can anyone suggest a book on the rise and demise of the Weimar Republic? I want to know more about Germany during the interwar years: the culture, the daily life, the society, and the factors that ultimately paved the way for the rise of the Nazis and the collapse of the Republic. I have read a lot about the Third Reich and the Second World War, but the history of short-lived Weimar Germany is what interests me more.
Also, how intense was anti-Semitism in Germany after the First World War, during the early days of the Republic, before Hitler and the Nazis attained prominence? Was Hitler just the focal point of the wishes and aspirations of millions of other Germans around him? Did his brand of racism find substantial resonance in the society of the time?