We are always told that immigration is a positive by seemingly every power center in America: the media, the politicians whether Democrat or Republican, our schools, and so on. We never hear the opposite. Even Trump says he is pro-immigration.
This is worrisome, because it is easy to be wrong when one never even considers the opposite view. Worldwide it is different: plenty of countries are anti-immigration, such as Mexico and Japan, both of which have very restrictive immigration policies.
Searching Google for a while with “Is immigration always a positive?”, I had trouble finding a single article that was anti-immigration, so I switched to “Immigration harms natives” and still had trouble finding one. So immigration must be good, right?
Let's look at American history. What is the worst thing that ever happened to the Cherokee? Yes, that's right: immigration by white Europeans. Disease, being shot, having their lands stolen, the Trail of Tears.
What's the worst thing that ever happened to the Native peoples of California? It was Hispanic immigrants stealing their lands, killing them, and giving them deadly diseases. They were mostly wiped out by immigration.
What is the worst thing that ever happened to the Spanish and Mexicans who immigrated into California? Yes, again it was immigration: they allowed enough Americans to immigrate that they lost their land and the United States took over California. I know two families in California that lost about 90% of their land when the USA took over through immigration.
The truth about immigration appears to be the opposite of what we are told: heavy immigration generally harms those already there and benefits those who come in. Why is this being hidden from us?