Element 99
Discussion » Questions » Babies and Kids

When you're a kid you're taught certain things. As an adult you discover some of them were LIES. Why don't parents tell kids the truth?

They tell you that if you are willing to work hard enough, you can achieve anything you want. They don't tell you that there are people out there whose only goal is to thwart you, harm you, be an obstacle or a thorn in your side, and try to sabotage you. You grow up thinking what you can achieve in life is entirely up to you. That's a LIE. When others control the options you have, it isn't all up to you at all. Bah humbug! :(

Posted - December 14, 2017

Responses


    That's the way the world is, Rosie, a very sad fact of life. Maybe parents don't tell kids the truth so as not to scare them out of walking out the front door.
      December 27, 2017 11:04 PM MST