Gender roles have changed from what they were sixty or seventy years ago. Back in the 1940s and 50s, men made the money and women kept the household. That has changed. Plenty of movies now put women in leading roles, and in the real world women hold the same jobs as men as well. I don't think gender roles are switching, just evening out.
The same goes for racial equality. Back in the early 1900s, African-American and Native American roles in movies were very negative. They were never the heroes, always the villains. Thankfully, that is changing. In general, we as a whole are becoming more accepting of others.