All I want to do is understand why America is trashing itself. In America, every "minority" group is crying for fairness. At what point in your life were you told that life is fair? It is scary and troubling that so many Americans feel entitled to jobs, money, healthcare, and so much else. America, please tell me why.