I work at Bath & Body Works, a store that brings in a predominantly female crowd. In the time I've worked there, I've noticed the rise of 'fros in Black America. I had a pretty intense conversation with a young lady the other day who was probably in her early 40s. We discussed products we've used and different regimens. I shared things with her she didn't know, as she did with me. What stood out most from our conversation, though, was when she asked, "Since going natural, have you received more compliments from black men or white men?" I didn't even have to hesitate; the answer was definitely white men. And I've heard that same answer from quite a few of my natural friends as well.
WHY IS THAT???
I know that from black men and black women alike, I get asked on a daily basis when I'm going to press my hair, or whether this is just something I'm going to do for a while. Why is it so difficult for our own to embrace what is becoming the new "norm" in our culture?
A couple of years ago, I read an article (THAT I CANNOT FIND AGAIN *sadface*) about the supposed sudden change in natural black women that causes them to "become white." It stressed that we become vegetarians (or pescatarians), listen to "white music," and become health freaks as a whole. Even if that were the case, why should it concern the next person? When you realize the damage you've done to your hair, you also get a chance to step back and realize the damage being done to your body. As for the "white music," I didn't find that relevant one bit.
I just want to know why being natural has such a negative stigma attached to it at times, especially in our own community. Why?