An interesting take I recently heard...
Since the left has essentially been in charge of education in this country for at least the last four decades, from K-12 through the universities...
Doesn't it stand to reason that if America truly is a fundamentally racist nation, it has been taught to be fundamentally racist by those leftist-run educational institutions?
Things that make you go...hmmmm...