

After initially opposing American entry into World War I, President Woodrow Wilson changed his mind and ultimately committed the United States to the war in order to “make the world safe for democracy.” How has American foreign policy changed since WWI? How has Americans’ engagement with the rest of the world been transformed since this conflict, and what impact have those changes had domestically on American society? Your answer may (but need not) include events up to the present day.