Tyler R. answered 05/14/19
Current student at Temple University, Music Therapy Major
The world faced many unresolved issues after the end of the First World War. Germany was among the nations that, unfortunately, turned toward Nazism as a means not only of unifying the German people, but of preparing for an effort to advance themselves (in the wrong ways). This was mirrored in Japan, which saw itself as technologically and militarily superior to China and wanted to spread its influence across most of Asia and the Pacific. The United States was not initially invested in entering the war; its nationalistic identity expressed itself instead as a non-interventionist ideology. That shifted dramatically when the Empire of Japan attacked Pearl Harbor, redirecting American nationalism toward waging war against Japan, and subsequently Germany.