On 28 May, in the framework of the International Day of Action for Women’s Health, a panel discussion on gender bias and racialisation in biomedical research and artificial intelligence was held in the Marie Curie Room of the Barcelona Biomedical Research Park (PRBB). The activity was part of the Women’s Health Awareness Week (WHAW). The meeting was organised by the Barcelona Institute for Global Health (ISGlobal) and the European project INCLUDE, with the collaboration of the Bioinfo4Women initiative of the Barcelona Supercomputing Center (BSC-CNS), PRISMA, the PRBB and the European project AHEAD.
The roundtable brought together three speakers whose trajectories enter into dialogue from different places of knowledge:
- the multidisciplinary artist Maisa Sally-Anna Perk
- the physician and leading figure in medicine with a gender perspective, Carme Valls-Llobet
- the expert in artificial intelligence applied to health Paula Petrone.
The session was hosted by Clàudia Fernández and Francisca Casas-Cordero from ISGlobal’s Scientific Culture Unit. The intersection of lived experience, clinical practice and technological perspective made it possible to weave a complex, situated and yet accessible conversation about how structural biases – of gender, race, class and age – cut across research, diagnosis and medical care.
Three visions, one reality
The contributions brought to light the persistent invisibility of chronic conditions in women, such as Functional Neurological Disorder (FND) or endometriosis, and how these diagnoses are often minimised or entirely dismissed by conventional medicine. The emotional dimension, pain, and the body as a site of struggle were reclaimed through Maisa Sally-Anna Perk’s testimony, which exposed the institutional neglect faced by many women with poorly recognised conditions.
Carme Valls-Llobet, for her part, highlighted how decades of androcentric research have produced biased knowledge: through the use of male animal models and cell lines, clinical trials that exclude women, and the underrepresentation of key variables such as menstrual cycles, caregiving burdens, or exposure to endocrine disruptors. She also denounced the medicalisation of women’s distress as a structural silencing strategy.
The current healthcare system remains androcentric and intolerable for both patients and professionals. For decades, medicine has medicalised women’s distress instead of investigating it.
Carme Valls-Llobet
Drawing on her experience in data science, Paula Petrone showed how these biases are transferred — and amplified — within artificial intelligence systems. The lack of diverse and properly labelled data, the omission of key variables (such as sex, gender, or race), and the training of algorithms on homogeneous datasets result in tools that perpetuate inequalities. This can lead to inaccurate or even dangerous results for under-represented populations. Despite this, she remains optimistic, defending the transformative potential of AI if it is designed from a critical, intersectional and situated perspective.
We want to develop algorithms that are precise, efficient, reliable, and fair. I believe this is possible, and we’re on that path.
Paula Petrone
Audience contributions further deepened the conversation: How can patients’ qualitative experiences be integrated into algorithmic models? What role can AI play in self-managed healthcare when the health system fails? There was also discussion on the role of patient communities, the need to integrate embodied knowledge into research, and the urgency of rethinking medical education.
The event welcomed 50 participants, including researchers in health and technology, healthcare professionals, PhD students and feminist activists. The diversity of backgrounds and experiences enriched the dialogue and demonstrated a collective demand for spaces that bring together science, lived experience and critical perspectives.
Rather than producing definitive answers, the roundtable deepened our questioning of how we think about health, evidence and technology. It reminded us that the experiences and data rendered invisible — those of women, racialised people, and sick or non-normative bodies — are not anecdotal, but central to building research that is not only fairer but also more relevant and effective for the diversity of human life.
