Peter F. answered 07/03/19
United States Historian with 15+ Years of Teaching Experience
As an American, and as a teacher of American History at the middle and high school levels, I would agree that Americans and American History textbooks are definitely more focused on the military aspects of our nation's history. The #1 reason is that the United States has been shaped by war from the very beginning. We gained our independence as a nation through war, helped the Allies win World War II, and time and time again our leaders have felt that we need not only to fight but also to appear victorious in a war of some sort in every generation in order to maintain our "superiority" on the world stage. After all, even the lyrics of the Star-Spangled Banner are filled with images of battle.
NOTE: The above is meant as a matter-of-fact observation, not a statement of my personal opinions as an American citizen. Just to let you know. :)