Today's post is a little different from what is usually on here. I want to speak from the heart about something I care about: feminism.
It's often viewed as a hot topic, and I wonder why. Is it because, even today in 2017, society is still fundamentally male dominated?
For me, this issue is not just about inequality or sexual harassment. It's about the right to "life, liberty and the pursuit of happiness", a phrase from the United States Declaration of Independence. It's not about women's rights or women's issues. It is about human rights and human issues.
The word feminism has become so loaded with ugly cultural baggage that some people will deny they're feminists even though they genuinely care about the issues. Then there are others who use the word as a kind of insult, wielded as a weapon.
Why do we treat feminist issues this way when what we are really talking about is human dignity and human worth? Framed like this, feminism is no longer about us and them, or male and female. It becomes about fairness and being human beings. Full stop. Maybe then we will start to see genuine equality.
What do you think?