FWF — Austrian Science Fund
Detecting gender bias in children's books

Laura Vana (ORCID: 0000-0002-9613-7604)
  • Grant DOI 10.55776/TAI517
  • Funding program 1000 Ideas
  • Status ended
  • Start December 1, 2021
  • End November 30, 2023
  • Funding amount € 147,697

Disciplines

Computer Sciences (40%); Psychology (10%); Sociology (10%); Linguistics and Literature (40%)

Keywords

Gender Bias, Natural Language Processing, Children's Literature, Content Analysis

Abstract

When asked to draw a mathematician, girls are twice as likely to draw a man as a woman, while boys almost universally draw a man. A similar tendency to associate professions such as firefighter, surgeon, and fighter pilot with men has been observed in children as young as 5 years old. Gender stereotypes form early in a child's development and are carried through adolescence into adulthood, leaving long-lasting effects on emotional and cognitive development, shaping activity and career choices, and affecting academic performance. In this project, we propose a solution for addressing gender under- and misrepresentation in textual literature for pre-school and primary-school children. In children's books, a crucial element in the child development process, male characters outnumber female characters, non-binary characters are practically absent, and gender roles and stereotypes are reinforced.

The goal of the project is twofold. First, we want to identify and measure different aspects of gender under- and misrepresentation: for example, the proportion of male vs. female characters, gender-assuming pronouns, and language that reinforces stereotypical gender roles could all be relevant in this context. Once reliable measurements are obtained, they will be combined into a gender representation score. The score should be easily interpretable in order to increase public awareness and serve as an aid to parents, educators, and decision-makers. Second, after computing this score, we want to develop best-practice guidelines for its validation, to ensure the transparency and accuracy of the methodology. In this step we will rely primarily on the opinions of gender experts and linguists.

The innovative character of this project lies in the integration of the following quantitative and qualitative research techniques. On the one hand, the measurement procedure will build on modern artificial intelligence (AI) algorithms for the analysis of text. Recent advances in this field allow algorithms to be aware of the context in which words appear, rather than analyzing words in isolation. This context-awareness makes such algorithms promising tools for measuring more complex components of gender bias in textual data. However, since AI techniques are known to have drawbacks in terms of transparency and interpretability, we do not plan to rely on them alone: we will complement them with state-of-the-art qualitative methods for literature review, data collection, and validation.
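One of the simplest measurable dimensions mentioned above is the proportion of male- vs. female-coded words in a text. A minimal sketch of such a count (not the project's actual tooling; the word lists and the `gender_word_ratio` helper are illustrative assumptions, and a real analysis would use fuller lexicons and coreference resolution to count characters rather than surface tokens):

```python
import re
from collections import Counter

# Hypothetical, deliberately small word lists; real lexicons are far larger.
MALE = {"he", "him", "his", "himself", "boy", "man", "father", "king", "prince"}
FEMALE = {"she", "her", "hers", "herself", "girl", "woman", "mother", "queen", "princess"}

def gender_word_ratio(text: str) -> dict:
    """Count male- vs. female-coded tokens and return their proportions."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    male = sum(counts[w] for w in MALE)
    female = sum(counts[w] for w in FEMALE)
    total = male + female
    return {
        "male": male,
        "female": female,
        "female_share": female / total if total else float("nan"),
    }

story = "The prince rode out. He met a girl, and she showed him her village."
print(gender_word_ratio(story))
```

A measure like this is trivially interpretable, which is exactly why the project combines several such measures into a single score rather than relying on any one of them.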

Final report

Gender stereotypes form early in a child's development and are carried through adolescence into adulthood, leaving long-lasting effects that may shape activity and career choices as well as academic performance. Books in particular can have considerable influence, as their characters serve as role models of femininity and masculinity for young children. Gender under- and misrepresentation in children's textual literature can thus contribute to the internalization and reinforcement of negative stereotypes. In this project, we aimed to leverage natural language processing tools to automatically measure different aspects of gender bias in children's literature.

We first reviewed the existing literature in psychology and the social sciences and identified relevant dimensions of gender bias in children's books. The representation of gender among the characters; their centrality to the story; stereotypical portrayal related to occupations, appearance, brilliance bias, emotions, toys and interests, physical attributes, and strength; and the agency vs. passivity of the characters should all be taken into account when measuring the gender bias of a text. The presence of stereotypical language should also be detected.

As part of the project, we employed natural language processing tools to automatically build interpretable gender-bias measures for an extensive collection of the identified dimensions. We furthermore proposed a "data-driven" method of measuring bias using word embeddings, which are patterns learned from a large collection of text. This allows us to compute an overall (albeit less interpretable) bias score for a whole story. Finally, to improve interpretability, we propose using the collection of interpretable measures to explain the rather black-box embedding-based measure, deriving a scoring function that shows which dimensions contribute most to the bias. We illustrate the approach on a collection of 30 classical fairy tales.
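The embedding-based idea can be illustrated with a toy example. The tiny hand-made vectors below stand in for real pretrained embeddings, and the `story_bias` function is an illustrative sketch, not the project's actual scoring method: story words are projected onto a gender direction (the difference of an anchor pair such as "he" and "she") and the projections are averaged.

```python
import math

# Toy 3-dimensional embeddings standing in for real pretrained word vectors;
# in practice these would be learned from a large text corpus.
EMB = {
    "he":     (1.0, 0.1, 0.0),
    "she":    (-1.0, 0.1, 0.0),
    "doctor": (0.4, 0.8, 0.1),
    "nurse":  (-0.5, 0.7, 0.2),
    "castle": (0.0, 0.2, 0.9),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Gender direction: difference of the anchor-pair vectors.
DIRECTION = tuple(x - y for x, y in zip(EMB["he"], EMB["she"]))

def story_bias(words):
    """Mean cosine similarity of known story words to the gender direction.
    Positive values lean male-coded, negative values lean female-coded."""
    scores = [cosine(EMB[w], DIRECTION) for w in words if w in EMB]
    return sum(scores) / len(scores) if scores else 0.0

print(story_bias(["doctor", "castle"]))  # positive: leans male-coded
print(story_bias(["nurse", "castle"]))   # negative: leans female-coded
```

A score like this summarizes a whole story in one number but hides which dimensions drive it, which is why the report proposes regressing it on the interpretable per-dimension measures.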

Research institution(s)
  • Technische Universität Wien - 100%

Research Output

  • 1 Publication
  • 1 Dataset & model

Publications
  • 2024: "Detecting Gender Bias in Fairy Tales," Camilla Damian (Type: Other)

Datasets & models
  • 2024: DGBIAS (Type: Data analysis technique; Public Access)

Contact

Austrian Science Fund (FWF)
Georg-Coch-Platz 2
(Entrance Wiesingerstraße 4)
1010 Vienna

office(at)fwf.ac.at
+43 1 505 67 40

© Österreichischer Wissenschaftsfonds FWF