American imperialism refers to policies aimed at extending the political, economic, and cultural influence of the United States over areas beyond its boundaries.