52ndStreet
I have noticed that white Americans are becoming more fascist and extreme in their thoughts on race in America. More cops shooting unarmed black males, more discrimination in the workplace. What is going on in America? Your thoughts.