Wednesday, August 10, 2011

Is it true that in America liberals are racist in their heart of hearts?

They firmly believe that blacks cannot take care of themselves and their families. Liberals have controlled all the major cities in America for three to four generations and remain firmly of the belief that blacks MUST be given handouts in order to survive. Is this true?
