IFJ - International Federation of Journalists

07/15/2025 | News release | Distributed by Public on 07/14/2025 21:43

#IFJBlog: Can AI help in gender inclusive journalism?


Artificial intelligence has endangered women's safety in digital spaces and reinforced gender bias across media; however, it also carries the potential to train journalists to understand and apply gender-sensitive journalism and to identify misogynist content, writes Kinza Shakeel.


Artificial intelligence tools have promoted gender stereotypes and are therefore used with heightened caution in newsrooms. Pakistan-based AI tools in development aim to help journalists spot patterns that reveal gender bias. Credit: Kirill Kudryavtsev / AFP

"Brother kills sister over obscene video leak" read the headline of a femicide report in a local Pakistani newspaper. Instead of opting for a more sensitive and objective headline, the reporter himself described the videos as "obscene". The ethically correct term for such videos is "intimate"; however, many journalists end up calling them "obscene" in their reports.

Headlines like these create a victim-blaming narrative: most people in Pakistan are not media literate, and misogyny is a widespread problem in society. Femicide is an epidemic in the country, along with many other gender-based violent crimes. While many journalists deliberately use victim-blaming language to sensationalise violence against women, others unintentionally end up using it due to a lack of awareness of gender sensitivity in journalism.

The practice of producing media content in a way that is fair, accurate, and avoids perpetuating harmful gender stereotypes is called gender-sensitive reporting. It considers gender at all stages of media production, from story selection to the language used, ensuring that media content reflects the diversity of experiences and perspectives of women.

The advent of digital media has encouraged more gender-sensitive news reporting and validated responsible journalism. However, as technology advances and artificial intelligence (AI) takes over the digital world, including newsrooms, the risks for women, as well as for women journalists, have increased.

According to UN Women, AI subjects women to many dangers, including deepfakes, forged content, gender inequalities rooted in biased data, and medical misogyny.

Though AI has been widely misused, it can still be used for the benefit of women and women journalists, and to promote gender-inclusive journalism.

Experts from the fields of technology, research and journalism believe that AI can train journalists and help in developing more gender-inclusive journalism.

A possible friendship between AI and journalists

By identifying and mitigating bias in language and content, promoting balanced representation, and ensuring accountability in reporting, AI has the ability to contribute to more gender-sensitive journalism, according to a report by International Journalists' Network.

AI-powered tools can analyse text and images, track gender representation in media, suggest more inclusive language, and assist journalists in creating more equitable and accurate narratives.
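As a minimal sketch of the kind of language-checking such tools perform (this is an illustration only, not the method of any real tool; the word list and suggestions are hypothetical), a checker might flag loaded terms in a headline and suggest more sensitive alternatives:

```python
# Hypothetical sketch: flag loaded terms in a headline and suggest
# gender-sensitive alternatives. The term list is illustrative only.
LOADED_TERMS = {
    "obscene video": "intimate video",
    "honour killing": "femicide",
    "crime of passion": "domestic violence",
}

def review_headline(headline: str) -> list[tuple[str, str]]:
    """Return (flagged term, suggested alternative) pairs found in a headline."""
    lowered = headline.lower()
    return [(term, alt) for term, alt in LOADED_TERMS.items() if term in lowered]

flags = review_headline("Brother kills sister over obscene video leak")
print(flags)  # [('obscene video', 'intimate video')]
```

A production tool would need context-aware language models rather than a fixed word list, but the feedback loop is the same: detect the pattern, then prompt the journalist to reconsider.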

The Pakistan-based Uks research AI tool, currently in development, is one such resource: it will help journalists spot language and news patterns in the media that reveal intentional and unintentional gender bias.

"The tool can help spot when women are only being shown as victims or are not being represented in a story. This kind of feedback can help journalists adjust their reporting for more balanced coverage," said Tasneem Ahmar, founder of Uks research, a publication centre dedicated to the cause of women's development in Pakistan.

Highlighting the possible positive role of AI in helping journalists do responsible journalism, she said: "One of the main reasons Uks is developing its AI platform is to create more responsible journalism about women's issues in Pakistan. Many times, journalists don't realise that their reporting reinforces harmful gender stereotypes. The AI platform we are building will offer support to content creators by pointing out problematic language and framing. It will work as a guide to the content creator to ensure that the coverage of women's issues is more accurate and respectful."

Female representation in newsrooms and AI

Pakistani newsrooms have disproportionately low female representation, with women journalists making up only about 11% of the total workforce, according to a 2024 gender audit of 15 news organisations in Islamabad reported by Freedom Network.

Many news outlets have no women in leadership positions, which further underscores this low percentage.

In addition to assistance in gender-sensitive journalism, AI can also play a vital role in identifying gender bias in news media organisations.

AI tools like Genei can detect the gender of the people mentioned or quoted in a newsroom's stories, with the ultimate goal of promoting gender equity in coverage, according to the World Press Institute.
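A simplified sketch of how such a source tally might work (an illustration, not Genei's actual implementation; the cue lists are hypothetical and real tools use far more robust methods) is to count gendered honorifics and pronouns in a story's text:

```python
import re

# Hypothetical sketch: estimate the gender balance of people referenced in a
# story by counting gendered honorifics and pronouns. Illustrative only;
# real tools resolve named entities rather than counting raw word cues.
FEMALE_CUES = {"she", "her", "hers", "ms", "mrs"}
MALE_CUES = {"he", "him", "his", "mr"}

def source_gender_tally(text: str) -> dict[str, int]:
    """Return rough counts of female and male cues in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    return {
        "female": sum(w in FEMALE_CUES for w in words),
        "male": sum(w in MALE_CUES for w in words),
    }

story = "Mr Khan said the policy failed. She disagreed, citing her research."
print(source_gender_tally(story))  # {'female': 2, 'male': 1}
```

Aggregated across a newsroom's output, even a rough tally like this can surface a persistent imbalance in whose voices are quoted.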

Additionally, Textio is another tool that uses AI to recommend word choices in companies' job postings, improving the demographic diversity of their applicant pools as well as the speed and efficacy of hiring.

Applied AI is another tool that performs the same task as Textio. Moreover, internal audit tools using AI can also help analyse bylines, speaking time, and content assignment to spot gendered patterns.

On this aspect, Tasneem added: "We do believe that AI has the potential to help organisations make more inclusive decisions, like hiring more women or supporting their career growth."

AI mirrors the real world

Algorithmic bias is evident even in artificial intelligence, which does not treat women equally. The reason is that AI mirrors real-world values: its outcomes depend on its creators and designers. If the designers have no knowledge of gender sensitivity, or if they intentionally promote unequal values, AI will inevitably act on the basis of their design.

The data used to train AI systems and the code written by human engineers are frequent sources of bias, as engineers' inherent biases are likely to pass onto algorithms if left unchecked, according to a report on ScienceDirect.

In the past, popular AI tools like ChatGPT have promoted sexist gender stereotypes in answers to some questions. This underlines the fact that AI's output also depends on the biases of its designers.

Nighat Dad, founder of Digital Rights Foundation, a research-based advocacy organisation in Pakistan, emphasised this point and said: "Technology must be accompanied by strong editorial will and gender equity leadership. Representation is not just about numbers, it's about power, voice, and decision-making. AI can facilitate that journey, but can't change the ecosystem itself, structural change will only happen if we commit to make that change happen."

AI can train journalists but not teach them empathy

As humans, we all carry biases. However, certain biases, when amplified in the media by journalists, cause real harm to women and minorities. Traditionally, research organisations have run many training programmes to teach and inform journalists about gender sensitivity. AI also has the ability to train journalists in this regard, but it cannot implant ethical values such as truthfulness, empathy and consideration in them.

The people willing to learn and those who already understand the importance of women's representation and pro-women news reporting have to take the lead.

"AI can't instil empathy or ethics, but it can support capacity-building. AI-powered training modules like UNESCO's Gender-Sensitive Reporting Toolkit or media education platforms using adaptive AI can help tailor learning journeys for journalists. However, AI must complement, not replace critical human interventions," highlighted Nighat, stressing AI's role in training journalists.

She also emphasised that change has to happen at ground level as well: "Newsrooms must also institutionalise gender desks, bring in feminist trainers, and develop accountability mechanisms. AI can nudge, analyse, and detect but the values have to come from us."

The use of AI, combined with responsible journalism and journalists aware of gender-sensitive reporting, can bring change to global news media. AI-driven analysis can point out the lack of female representation in newsrooms, which can help foster a more gender-inclusive approach to hiring. Thus, a technology can be employed for the good of women and women journalists, if designed and used accordingly.

Kinza Shakeel is a journalist for GEO News based in Pakistan, covering women's rights and climate justice.

Published

15 July 2025
