Hollywood’s feminists speak out: part one

Despite all the misconceptions and harmful stereotypes surrounding the ‘F’ word, some celebrities aren’t afraid to speak their mind and demand equality in the world’s top entertainment industry.

More and more celebrities are being asked if they’re feminists. Some stars shy away from the ‘F’ word, but others wear the feminist label with pride. We’ve also watched some learn that feminism isn’t a dirty word and seen their opinions change over time. It’s all a sign that pop culture is starting to embrace feminism, and that’s worth celebrating!

Here’s what Hollywood’s most outspoken activists have to say about feminism.