A friend linked me to this pic; I think it’s hilarious:
It makes me so angry to think of how many white people believe that America is ‘their’ country. What makes it so? That we were “here first”? Ask the Native Americans how true that is. Europeans came over to someone else’s home and effectively wiped them out so they could stake claim to the place. Does this not make anyone else sick? We want to push Mexicans out; many white people would love to ship all black people back to Africa. What gives them the right?
I know my tirade against an entire race is uncalled for; certainly most white people (hopefully) don’t feel this way. But I have encountered so many who do... it’s so sad.