So many Americans today bury their heads in the sand, popping up from time to time only to cry that "the government needs to fix this or that," then plunging right back into the sand and forgetting what the government is actually doing. Where did that habit come from? When did Americans start believing that government was the answer to their problems?
Sadly, though, that is what most Americans believe and do, and all it accomplishes is ensuring that the very people who cause the problems get reelected to fix them, because our heads are in the sand and we are not paying attention. Government has never fixed anything and never will. Name one thing it has put its hands in that has gotten better. Everything only gets worse, and we are the ones who pay. Now, with nearly half of Americans living on the government dime, that cycle will continue in perpetuity.
The truth of the matter is that we are responsible for our own lot in life, and if we do not like it, we must change it ourselves. We are not owed anything by anyone: not our family, not our friends, not a business, not our town, and sure as heck not the Federal Government.
Stop playing ostrich! WAKE UP AMERICA!!!