
This AI Tool Corrects Gender Bias In Portrayal Of Females In Movies

A 2019 study coined a very interesting term — ‘the Cinderella complex’. The authors analysed 7,226 books, 6,000 movie synopses, and 1,100 movie scripts and found that the words associated with male and female characters reeked of gender bias. The lives of the male characters were adventure- and aspiration-oriented, whereas the female characters were more passive and oriented towards romantic relationships.

This is just one of countless studies showing how gender bias — often unintentional and a product of societal conditioning — creeps into popular text and media. This, in turn, sets an incorrect narrative. Keeping this in view, researchers at the Allen Institute for Artificial Intelligence, in collaboration with the University of Washington, created an AI-based tool that rewrites text to correct potential gender bias in character portrayals. Christened PowerTransformer, the tool is an encoder-decoder model built on top of a pre-trained language model.

How Does It Work?

The narrative in popular media often assigns a stereotypical colour to gender roles. This problem is widely recognised, and there have been several attempts at removing such biases.

One such method is controllable text revision, which rephrases text to a targeted style or framing. Controllable text revision has to overcome three main challenges: it must edit beyond surface-level paraphrasing; the revision should not make unnecessary changes to the underlying meaning of the text; and the model must learn to debias text without any supervised parallel data, which rules out straightforward machine translation-style modelling.

To overcome the above-listed challenges, the researchers at the Allen Institute for Artificial Intelligence and the University of Washington jointly formulated a new controlled text revision task called controllable debiasing, which studies portrayal biases through connotation frames of power and agency — lexical resources that capture knowledge about the power dynamics implied by verbs.
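Conceptually, a connotation frame attaches power and agency labels to individual verbs, so that a sentence's portrayal of a character can be read off from the verbs it uses. The sketch below illustrates this idea in Python; the lexicon entries and label names are illustrative assumptions, not the actual frame lexicon used by the researchers.

```python
# Hedged sketch: connotation frames map verbs to power/agency labels.
# These example entries and label values are hypothetical placeholders.
CONNOTATION_FRAMES = {
    "pursued":    {"agency": "positive", "power": "agent"},
    "led":        {"agency": "positive", "power": "agent"},
    "daydreamed": {"agency": "negative", "power": "theme"},
    "waited":     {"agency": "negative", "power": "theme"},
}

def agency_of(verb: str) -> str:
    """Look up a verb's agency label; unknown verbs default to 'neutral'."""
    return CONNOTATION_FRAMES.get(verb, {}).get("agency", "neutral")

# A character described with "pursued" is framed with positive agency,
# while "daydreamed" frames the same character as passive.
print(agency_of("pursued"))     # positive
print(agency_of("daydreamed"))  # negative
```

In practice such a lexicon covers thousands of verbs, and the labels are aggregated over all verbs attributed to a character to estimate that character's overall portrayal.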

The researchers have introduced a new controllable debiasing approach called PowerTransformer. In this approach, reconstruction and paraphrasing objectives are combined to overcome the lack of parallel supervised data. The PowerTransformer model uses connotation frame knowledge both at training time, via control tokens, and at generation time, via agency-based vocabulary boosting.
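Agency-based vocabulary boosting can be thought of as nudging the decoder towards high-agency words by adding a bonus to their scores before the next token is chosen. The snippet below is a minimal sketch of that idea; the word list, logit values, and boost constant are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of agency-based vocabulary boosting at decoding time.
# The lexicon and boost value here are hypothetical placeholders.

HIGH_AGENCY_TOKENS = {"pursued", "demanded", "led"}  # illustrative lexicon

def boost_logits(logits, vocab, boost=2.0):
    """Add a constant bonus to the logits of high-agency tokens."""
    return [
        score + boost if token in HIGH_AGENCY_TOKENS else score
        for token, score in zip(vocab, logits)
    ]

# Before boosting, the passive verb "daydreamed" would win;
# after boosting, the high-agency verb "pursued" outscores it.
vocab = ["daydreamed", "pursued", "waited"]
logits = [1.5, 1.0, 0.8]
print(boost_logits(logits, vocab))  # [1.5, 3.0, 0.8]
```

Because the bonus is applied only to a fixed set of tokens, the rest of the model's distribution is left untouched, steering the output style without retraining.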

Further, the model uses an OpenAI GPT transformer model as its base. PowerTransformer is similar to the GD-IQ tool developed by the University of Southern California Viterbi School of Engineering. GD-IQ uses AI techniques to analyse the text of a script to count the male and female characters and check whether they represent the population at large. PowerTransformer improves on GD-IQ by also rephrasing text using machine learning. For example, ‘Alice daydreamed about a doctor’ is rewritten as ‘Alice pursued her dream to be a doctor’. Such rephrasing gives the character more authority.

In their experiments, the researchers studied 16,763 characters from 767 modern English movie scripts and found that 68% of these characters were inferred to be men and the remaining 32% women. They then attempted to mitigate gender bias in these portrayals by using PowerTransformer to re-balance the agency levels of female characters to be on par with those of male characters.

The model proved successful in increasing the positive agency and decreasing the passiveness associated with the female characters. “Our findings show promise for using modern NLP tools to help mitigate societal biases in text,” noted the researchers. They also cautioned that this was a pilot study, and that automatically rewriting an entire movie script would still require human intervention.

Wrapping Up

This tool has the potential to help authors and scriptwriters write stories or movie plots by providing different framings for alternative portrayals of characters. This could help counter stereotypical portrayals of women and debunk gender roles in a society heavily influenced by media.

The post This AI Tool Corrects Gender Bias In Portrayal Of Females In Movies appeared first on Analytics India Magazine.
