The history of the United States is taught with significant gaps and falsehoods. To be fully informed about the United States, its politics, and its history, we must decolonize our understanding. This requires an examination of settler colonialism, Manifest Destiny, and the Doctrine of Discovery, as well as how these hugely influential concepts and policies have infused every aspect of American culture. Together, these three concepts form the foundation of America as a country.
Colonialism is the effort by one country or people to subjugate another for its own gain. Settler colonialism is colonialism in which that gain is specifically the seizure of land for settlement and continued economic advancement. Every step of building America has used white settler colonialism as a guiding premise, if only in action rather than in name: "The history of the United States is a history of settler colonialism: the founding of a state based on the ideology of white supremacy, the widespread practice of African slavery, and a policy of genocide and land theft" (Dunbar-Ortiz, 2015, p. 2). The United States was founded on achieving prosperity for white settlers by claiming land and resources and eradicating Indigenous peoples.
Colonialism on behalf of the United States has not been confined to settler colonialism. The colonization of Hawai'i, for example, was mostly extractive colonialism: Hawai'i offered economic advantage through the financial gain that came from extracting its resources, including its surplus food, most notably sugarcane. It was settled only when settlement became the sole means of keeping control of those resources. Extractive colonialism is also present in North America, most clearly in the tar sands, but it emerged after settler colonialism had taken its toll on the land. The white settler colonial project, the effort to settle and colonize what is now the United States with white people, is the violent, glorified history of the United States.