Anees A. answered 03/18/20
Experienced Tutor in English, Writing, History/Social Studies
In a sense, America was never really isolationist: from the moment the United States was established (and even before), it continued to expand across the North American continent, conquering and settling land occupied by Native Americans and, in other cases, contesting territory and eventually claiming it (as in the Mexican-American War, which ended with the United States defeating Mexico and formally acquiring a large portion of its territory). As for the rest of the world, the U.S. generally stayed out of major wars and conflicts such as the Crimean War or the various European invasions of India, Africa, etc. Most histories date formal American "engagement" in foreign affairs to the Spanish-American War of 1898. In that conflict the U.S. supported uprisings against Spanish control in colonies such as Cuba, but then moved to dominate the newly independent Cuban nation and formally colonized Puerto Rico. The U.S. also seized control of Spain's possessions in the Philippines and other island territories, leading to a brutal occupation of the Philippines in particular.
This period was followed in the early twentieth century by a series of invasions and interventions in Central American and Caribbean countries such as the Dominican Republic, Nicaragua, and Honduras. The First World War marked the first major U.S. entry into great-power international affairs. Though the U.S. initially stayed out of the fighting, it declared war in 1917 after Germany resumed unrestricted submarine warfare (the sinking of the Lusitania in 1915 had already inflamed American opinion) and as the Russian Revolution was unfolding. Jumping ahead, during the Second World War the U.S. at first resisted military involvement in Europe and East Asia (though it did offer material support to Allied powers such as Britain), but after the Japanese attack on Pearl Harbor in December 1941 the U.S. declared war on Japan and, following Germany's own declaration of war, on Nazi Germany as well, eventually emerging victorious alongside the Allied powers.
With the U.S. and the Soviet Union emerging as the two dominant powers of the war in 1945, the rivalry between them renewed itself, with the U.S. determined to thwart any expansion of Soviet influence or the emergence of left-wing, socialist, or communist governments around the world. Put briefly, this rivalry played out until 1989-1991, when the Soviet Union dissolved and the formerly Communist-led countries of Eastern Europe became market-oriented, diplomatically friendly partners of the U.S. and its allies in western Europe.
Since 1991 the U.S. has essentially acted as the leading superpower and the enforcer of its own interests and political/economic order around the world, whether through organizations such as the World Trade Organization and the still-existing NATO, or through military interventions against countries such as Panama, Iraq, Afghanistan, and Libya.