It's a great time to be a woman (and why we're seemingly not allowed to say this)
So, I'm not sure whether I need to femsplain this...? :-)
There has never been a better time to be a woman in the West, and yet, if the media and social-media narrative is to be believed, things are getting worse rather than better. Amidst all the talk of #MeToo, the gender pay gap and data showing the woeful, ongoing under-representation of women at the top of business, finance and investment, we are left with a sense of a persistently patriarchal society in which women are being held back. How accurate is this picture, and what are the risks to women (and men) of such a narrative? And why is open debate on this subject so stifled, and so triggering? A positive discussion (please!) of some of the good news, and of how we can all work harder, together, to build resilience in women.