My question is: according to conservatism, when is it appropriate for the United States to go to war abroad? Only when American interests are at stake? If so, what are the interests that conservatives believe need to be defended by military force?
Many of today's conservatives believe that the U.S. should use military force whenever democracy is threatened, particularly in strategically important parts of the world. When the U.S. was more dependent on foreign oil, those interests included Middle Eastern states such as Iraq and Kuwait.