I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it considered something you’re expected to do? Do you live in a city?
I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of this as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix, since it isn’t broken. I still think this. But lately I have been feeling like there might be real social stigma attached to my teeth being discolored. I’m wondering: is that at all real? Has whitening your teeth become an expected thing for all adults to do now? I thought I’d ask how other people feel and think about this, and what the general norm is in your social circle.
We buy the whitening strips but don’t do professional whitening. I think that’s fairly common here. I live in Dallas, TX.
I don’t even know how common it is to have your teeth whitened professionally. I just asked my dentist and they told me to buy whitening strips and do it myself.