Every year, I see hundreds of advertisements selling the idea of fairness. I don’t know how “fair” that is in the 21st century, but I do know it’s a symbol of a regressive mindset that we’re still holding onto.
I think it sends a dirty message to women. These advertisements clearly create a divide in people’s minds about women based on the color of their skin. What could be dirtier? And the worst part is that most men and women promote these advertisements, the same people who also speak up for gender equality and non-discrimination.
Neither men nor women are seriously trying to stop this. In fact, both make huge money out of the fairness business.
To my limited knowledge, everyone is a partner in crime: the companies that make such products, the media houses that produce such advertisements, and the people who buy them.
I feel ashamed whenever I see a fairness advertisement. We demean and discriminate against human beings on the basis of skin color, and then we claim to be a cultured and civilized society.
In fact, I think we’re full of hypocrisy, and yet we blame and criticize others for everything. With such discrimination over petty issues, are we seriously hoping to stop rape, abuse, and other crimes, especially those against women? Does the color of one’s skin really matter? Does fair skin really bring prosperity in life?