Are you aware of any particular culture, society or country in the world wherein everyday dental care is held as being extremely important?
* I don't mean it's the most important thing to them, or the top priority in their lives; just something they do better than people in other parts of the world.
It's not the U.S.! We go on and on about the yellow teeth of Brits, yet we ourselves can't resist the urge to get right up in someone's face first thing in the morning, thinking "maybe they won't notice."
Funny you should bring this topic up. I went to the dentist on Saturday. One of my upper right molars has been giving me excruciating pain for quite some time. I'm pretty tough, but it became intolerable. Off I trotted, and it turns out I definitely need an extraction or a root canal. I opted for the root canal.
Where I'm going with this is the expense. I have insurance, but it doesn't cover root canals, and the last one I had cost $2,200. The United States is not a place where we can easily take care of our bodies. Insurance is expensive, and if you fall in the middle-class bracket, it bleeds you dry every month you are healthy. It's like a lurking demolition ball just daring you to get sick. And when you do need it, the deductible is so high that you never see the reward for all those years of paying.
To answer your question: no, I do not. What I do know is that people are genuinely interested in being healthier and caring for one another. I'm assuming this includes people in the medical and dental fields, along with the pharmacies. We need to STOP making care so out of reach for us as a whole. None of us wants to be in pain or die a slow, tedious death. Allow us to take care of each other and ourselves, till the end, as comfortably and safely as we can.
Thank you, Randy, for letting me vent.
This post was edited by Merlin at October 22, 2018 9:31 PM MDT