
Linda G. answered 08/08/18
middle and high school American history with 5 years experience
The Treaty of Versailles formally ended the war and imposed harsh terms on a defeated Germany. However, some Germans did not believe they had actually lost the war on the battlefield. This "stab-in-the-back" view was promoted especially by right-wing elements within the Weimar Republic, even though the German military had in fact been collapsing by 1918. The belief persisted into the 1930s and, combined with the economic downturn of the period, helped Hitler gain control.