Our culture is infused with stereotypes, and their ubiquity can make them hard to spot. But stereotypes contribute to solidifying the status quo and make things hard to change. For example, I’m interested in gender balance in computer science. If people tend to associate men with technical jobs more than women, they tend to appoint and reward men more. And if you’re female, you might unconsciously assume that technical jobs are not for you.
I’m a female computer scientist. Even though I’m a counter-stereotype, when I give lectures and ask audiences to play a game of “housewife or computer scientist”, people tend to start out assuming housewife. It takes a few rounds before it becomes apparent that every one of the women I feature is actually a computer scientist.
Fortunately, stereotypes are malleable. And machine-learning techniques for analysing text and video can help identify them at scale.
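As a minimal sketch of the kind of text analysis I mean, here is a toy co-occurrence count in Python. The corpus and the words chosen are purely illustrative; in practice you would run this over large corpora or use pretrained word embeddings, but even this crude count surfaces a skewed gender association for a job word.

```python
from collections import Counter

# A toy corpus standing in for real text data (illustrative only).
corpus = [
    "he is an engineer",
    "she is a nurse",
    "he became an engineer",
    "she taught the children",
    "she is an engineer",  # a counter-stereotypical sentence
]

def cooccurrence(word, context_words):
    """Count how often `word` shares a sentence with each context word."""
    counts = Counter()
    for sentence in corpus:
        tokens = sentence.split()
        if word in tokens:
            for c in context_words:
                if c in tokens:
                    counts[c] += 1
    return counts

# 'engineer' co-occurs with 'he' twice but 'she' only once here,
# a (toy-scale) measurable stereotype.
print(cooccurrence("engineer", ["he", "she"]))
```

Real systems use far more robust measures (embedding-association tests rather than raw counts), but the principle is the same: the associations are in the data, so they can be measured.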
Overcoming them includes initiatives like inclusion riders in Hollywood and role models in education and workplaces, both of which help to provide counter-stereotypes.
But what about inclusion riders in your browser, nudging stereotypes back towards parity? This would be a short-cut: rather than waiting for society to equalise, then for culture to update to reflect it, then for our stereotypes to update in turn.
What if every time you read a science story, your browser inserted a photo and a quote from a female scientist?
Or figured out where to re-write some “hes” as “shes” (or vice versa) without changing the meaning of the text?
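The pronoun-swapping idea can be sketched very simply. This is a naive prototype in Python (a real browser version would run as JavaScript in a content script); the swap table is deliberately simplified, and as the comment notes, English pronouns are ambiguous enough that doing this properly needs grammatical analysis, not just substitution.

```python
import re

# Simplified, illustrative swap table. Note the ambiguity: "her" can map
# to "him" or "his" depending on grammatical role, so a real system would
# need part-of-speech tagging to choose correctly.
SWAPS = {"he": "she", "she": "he", "him": "her",
         "his": "her", "her": "him", "hers": "his"}

def swap_pronouns(text):
    def repl(match):
        word = match.group(0)
        swapped = SWAPS[word.lower()]
        # Preserve the capitalisation of the original token.
        return swapped.capitalize() if word[0].isupper() else swapped
    pattern = r"\b(" + "|".join(SWAPS) + r")\b"
    return re.sub(pattern, repl, text, flags=re.IGNORECASE)

print(swap_pronouns("He said his code was ready."))
# -> She said her code was ready.
```

Even this toy version shows the two hard parts: deciding *where* a swap keeps the text coherent, and doing it unobtrusively enough that reading is not disrupted.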
Over time, this could update the implicit gender associations in your head. And if it works, the technique could be extended to other areas of implicit bias, such as race, disability and age. The first step is to figure out how to do this in a non-intrusive way; the second is to establish whether it works.