More and more celebrities are being asked whether they’re feminists. Some stars shy away from the ‘F’ word, while others wear the feminist label with pride. We’ve also watched some learn that feminism isn’t a dirty word and change their minds over time. It’s all a sign that pop culture is starting to embrace feminism, and that’s worth celebrating!
Here’s what Hollywood’s most outspoken activists have to say about feminism.