Legal and Ethical Reform for Gender-Inclusive AI: Addressing Invisible Symptoms and Biased Algorithms

Abstract:

Case-Based Reasoning (CBR) systems are increasingly embedded in clinical decision support tools to enhance diagnostic accuracy and treatment efficiency. CBR systems rely on historical patient data to propose a diagnosis, but when past cases are predominantly male or reflect entrenched gender stereotypes in medical practice, these systems can misclassify or underdiagnose women’s health conditions. Cardiovascular disease, pain management and neurodevelopmental disorders are among the areas in which CBR systems risk perpetuating gender inequality in healthcare: because these conditions were traditionally studied and diagnosed in men, real-world systems have misdiagnosed heart disease in women, undertreated female pain and underdiagnosed ADHD in girls. Drawing on international human rights instruments such as the Convention on the Elimination of All Forms of Discrimination against Women (CEDAW) and Sustainable Development Goal 5, this paper assesses the regulatory gaps and systemic risks of algorithmic bias in healthcare AI through a legal-ethical lens. By integrating legal mandates with ethical AI design principles and emphasising inclusive data practices, transparent case selection and gender-sensitive similarity measures, the paper charts a pathway towards transformative AI systems in medical contexts that are equitable, accountable and gender-aware.
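The retrieval step described above can be made concrete with a deliberately minimal sketch. The following toy example is not any deployed clinical system: the case base, symptom labels, diagnoses, and the `sex_weight` parameter are all invented for illustration. It shows how nearest-case retrieval over a case base dominated by male symptom presentations can return a misleading diagnosis for a female patient, and how one hypothetical gender-sensitive similarity adjustment changes the retrieved case.

```python
# Toy illustration of CBR retrieval bias; all data and weights are invented.
from dataclasses import dataclass


@dataclass
class Case:
    symptoms: frozenset  # observed symptom labels
    sex: str             # "F" or "M"
    diagnosis: str


def similarity(query_symptoms, case, query_sex, sex_weight=0.0):
    """Jaccard overlap of symptom sets, optionally boosted when the stored
    case matches the patient's sex. `sex_weight` is a hypothetical tuning
    knob for this sketch, not a validated clinical metric."""
    overlap = len(query_symptoms & case.symptoms)
    union = len(query_symptoms | case.symptoms) or 1
    score = overlap / union
    if case.sex == query_sex:
        score += sex_weight
    return score


def retrieve_diagnosis(case_base, query_symptoms, query_sex, sex_weight=0.0):
    """Return the diagnosis of the single most similar stored case."""
    best = max(case_base,
               key=lambda c: similarity(query_symptoms, c, query_sex, sex_weight))
    return best.diagnosis


# A case base dominated by male presentations of cardiac disease:
case_base = [
    Case(frozenset({"chest pain", "left arm pain"}), "M", "myocardial infarction"),
    Case(frozenset({"chest pain", "sweating"}), "M", "myocardial infarction"),
    Case(frozenset({"fatigue", "nausea", "heartburn"}), "M", "gastric reflux"),
    Case(frozenset({"fatigue", "nausea", "jaw pain", "shortness of breath"}),
         "F", "myocardial infarction"),
]

# A woman presenting with symptoms common in female cardiac events:
query = frozenset({"fatigue", "nausea"})

# Unweighted retrieval matches the male reflux case (Jaccard 2/3 > 2/4):
print(retrieve_diagnosis(case_base, query, "F"))                  # gastric reflux
# A sex-matching boost retrieves the female cardiac case (0.5 + 0.2 > 2/3):
print(retrieve_diagnosis(case_base, query, "F", sex_weight=0.2))  # myocardial infarction
```

The point of the sketch is qualitative, not the specific weighting scheme: similarity functions encode editorial choices, and the paper's call for transparent case selection and gender-sensitive similarity is precisely about making such choices visible and auditable.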