Hollywood

A New Study Finds White Men Still Talk the Most in Movies
The goal of any society is to get better, to learn from its mistakes and evolve, gradually, over time, into the best version of itself. But for all the woke baes and social justice awareness out there, it looks like movies still have a long way to go. A new study has found that white men…
Study Shows Film Criticism Is a Male-Dominated Field
Particularly avid film fans are likely well aware that the industry is a male-dominated one in which women — both behind the scenes and in front of the camera — are outnumbered and thus outranked by men. But as Meryl Streep pointed out last year, the problem extends beyond Hollywood proper and…