I keep hearing "that's not fair" or "we have to be fair."
I wonder where this concept of life being fair came from. I see way too many adults uttering these words. Maybe I don't have to tell you this, but life isn't fair. It happens, and it isn't always right or fair.
Now I agree that if we lived in a happy-go-lucky imaginary world, everything should be fair and would be. But we live in the real world, and nothing is fair. If you look at anything from two different perspectives, one side is always going to feel cheated. Well, maybe not always, but mostly.
You know, I believe that we have been very blessed to have been born in America, the greatest country in the world. We have all been given a chance for greatness, a chance to succeed. No matter what state or city we were born in, no matter our color or race, we can all achieve whatever we dream. This stigma of fairness will only drag you down. "Fair" is quite possibly one of the worst words in the whole English vocabulary, right up there with the four-letter ones.
It starts when you're a kid and your parents tell you that you have to be fair. It is innocent, and your parents mean well, but there the seed has been planted. All through life it grows like the weed that it is.