I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected to do so? Do you live in a city?
I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of this as similar to wrinkles, i.e. a natural thing bodies do that I don't want to pay money to fix, since it isn't broken. I still think this. But lately I have been feeling like there might be real social stigma attached to my teeth being discolored. Is that actually the case? Has whitening become an expected thing for all adults to do now? I thought I'd ask how other people feel and think about this, and what the general norm is in your social circle.
My dentist disagrees with the dire warnings about whitening. He recommends moderation, but says it is not nearly as bad as people make it out to be.
I do it myself, about once a year, and I don’t have any issues at all.
My dentist said I could do it more often if I felt I needed to, 3-4 times a year, and my enamel would be fine, as long as I followed the directions.
I have been told by my dentist that it can permanently damage your tooth enamel. I did a quick search and found an NIH study on enamel softening (linked below). It looks at hardness, but that is all; I only read the objective and the abstract, and that part didn't mention enamel thickness. The study notes that hardness is restored after about a week.
I would generally advise caution and just take your dentist's advice about these things. I will admit I am biased about this, but it definitely can be harmful if not done correctly.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4319295/