{
    "title": "Unmasking inequalities in AI: new research reveals how artificial intelligence might reinforce inequality",
    "modified_at": "2025-02-27 10:00:08",
    "published_at": "2025-02-27 10:00:00",
    "url": "https://press.vub.ac.be/unmasking-inequalities-in-ai-new-research-reveals-how-artificial-intelligence-might-reinforce-inequality",
    "short_url": "http://prez.ly/Vjfd",
    "culture": "en",
    "language": "EN",
    "subtitle": "Artificial Intelligence (AI) is often seen as a powerful tool designed to improve our lives, from smartphones to hiring algorithms. But a new study, Unmasking Inequalities of the Code, by Professor Tuba Bircan reveals a deeper issue: AI is not neutral; it mirrors and amplifies existing societal inequalities.",
    "slug": "unmasking-inequalities-in-ai-new-research-reveals-how-artificial-intelligence-might-reinforce-inequality",
    "body": "<p>The researchers challenge the widespread belief that AI-induced bias is a technical flaw, arguing instead that AI is deeply influenced by societal power dynamics. It learns from historical data shaped by human biases, absorbing and perpetuating discrimination in the process. This means that, rather than creating inequality, AI reproduces and reinforces it.</p><p>&ldquo;Our study highlights real-world examples where AI has reinforced existing biases,&rdquo; Prof. Bircan says. &ldquo;One striking case is Amazon&rsquo;s AI-driven hiring tool, which was found to favor male candidates, ultimately reinforcing gender disparities in the job market. Similarly, government AI fraud detection systems have wrongly accused families, particularly migrants, of fraud, leading to severe consequences for those affected. These cases demonstrate how AI, rather than eliminating bias, can end up amplifying discrimination when left unchecked. Without transparency and accountability, AI risks becoming a tool that entrenches existing social hierarchies rather than challenging them.&rdquo;</p><p>AI is developed within a broader ecosystem where companies, developers, and policymakers make critical decisions about its design and use. These choices determine whether AI reduces or worsens inequality. When trained on data reflecting societal biases, AI systems replicate discrimination in high-stakes areas like hiring, policing, and welfare distribution. Professor Bircan&rsquo;s research stresses that AI governance must extend beyond tech companies and developers. Given that AI relies on user-generated data, there must be greater transparency and inclusivity in how it is designed, deployed and regulated. Otherwise, AI will continue to deepen the digital divide and widen socio-economic disparities.</p><p>Despite the challenges, the study also offers hope. 
&ldquo;Rather than accepting AI&rsquo;s flaws as inevitable, our work advocates for proactive policies and frameworks that ensure AI serves social justice rather than undermining it. By embedding fairness and accountability into AI from the start, we can harness its potential for positive change rather than allowing it to reinforce systemic inequalities,&rdquo; Prof. Bircan concludes.</p><p>&nbsp;</p><p><strong>Reference:</strong><br>Tuba Bircan, Mustafa F. &Ouml;zbilgin (2025). <em>Unmasking Inequalities of the Code: Disentangling the Nexus of AI and Inequality</em>. <em>Technological Forecasting and Social Change</em>, Volume 211, 123925. <a href=\"https://doi.org/10.1016/j.techfore.2024.123925\">https://doi.org/10.1016/j.techfore.2024.123925</a>.</p><hr /><p><strong>Contact</strong>:<br><a href=\"mailto:Tuba.bircan@vub.be\">Tuba.bircan@vub.be</a></p><div class=\"release-content-contact\" id=\"contact-bbfc8d0d-1fe9-4cb4-a13e-37ab9e4cbfb6\">\n    <div class=\"release-content-contact__avatar\"><img src=\"https://cdn.uc.assets.prezly.com/96e3692f-39a4-433b-b0ee-9008befa578f/-/crop/1117x1118/419,0/-/preview/-/scale_crop/128x128/center/-/format/auto/\" alt=\"Koen Stein\" class=\"release-content-contact__avatar-image\" /></div>\n    <div class=\"release-content-contact__details\">\n        <strong class=\"release-content-contact__name\">Koen Stein</strong>\n        <em class=\"release-content-contact__description\">Perscontact wetenschap &amp; onderzoek</em>\n        <ul class=\"release-content-contact__details-list\"><li class=\"release-content-contact__details-list-item\"><a href=\"mailto:koen.stein@vub.be\"  class=\"release-content-contact__details-list-item-link\" title=\"koen.stein@vub.be\"><svg class=\"icon icon-paper-plane release-content-contact__details-list-item-icon\">\n                <use xlink:href=\"#icon-paper-plane\"></use>\n            </svg>koen.stein@vub.be</a></li>\n<li class=\"release-content-contact__details-list-item\"><a 
href=\"tel:+32 (0)471517909\"  class=\"release-content-contact__details-list-item-link\" title=\"+32 (0)471517909\"><svg class=\"icon icon-mobile release-content-contact__details-list-item-icon\">\n                <use xlink:href=\"#icon-mobile\"></use>\n            </svg>+32 (0)471517909</a></li>\n<li class=\"release-content-contact__details-list-item\"><a href=\"https://vub.be\" target=\"_blank\" rel=\"noopener noreferrer\" class=\"release-content-contact__details-list-item-link\" title=\"vub.be\"><svg class=\"icon icon-browser release-content-contact__details-list-item-icon\">\n                <use xlink:href=\"#icon-browser\"></use>\n            </svg>vub.be</a></li></ul>\n    </div>\n</div>",
    "mainvisual": {
        "thumbnail": "https://cdn.uc.assets.prezly.com/f74da900-87b0-405a-bcba-8783394f0f18/-/crop/614x713/0,36/-/preview/-/scale_crop/250x250/center/-/format/auto/",
        "large": "https://cdn.uc.assets.prezly.com/f74da900-87b0-405a-bcba-8783394f0f18/-/crop/614x713/0,36/-/preview/-/preview/500x500/-/format/auto/",
        "original": "https://cdn.uc.assets.prezly.com/f74da900-87b0-405a-bcba-8783394f0f18/-/crop/614x713/0,36/-/preview/"
    },
    "header": {
        "large": "https://cdn.uc.assets.prezly.com/e1dc8864-5474-49ce-864a-9d5d5ebae950/-/preview/1200x1200/-/format/auto/",
        "release": "https://cdn.uc.assets.prezly.com/e1dc8864-5474-49ce-864a-9d5d5ebae950/-/preview/1200x1200/-/format/auto/"
    },
    "contacts": [
        {
            "name": "Koen Stein",
            "company": null,
            "description": "Perscontact wetenschap & onderzoek",
            "email": "koen.stein@vub.be",
            "website": "https://vub.be",
            "address": null,
            "telephone": null,
            "mobile": "+32 (0)471517909",
            "twitter": null,
            "facebook": null
        }
    ],
    "author": {
        "id": 27019,
        "username": "koen.stein@vub.be",
        "email": "koen.stein@vub.be",
        "name": "Koen Stein",
        "display_name": "Koen Stein",
        "first_name": "Koen",
        "last_name": "Stein",
        "avatar_url": "https://avatars-cdn.prezly.com/user/27019/9f0643fadfadc952fac04a7d528db74318e913d6b52497b7e8cfd725d0d3879e?v=3&c=8030e50ffd4a1281d6d86a9bf7dc55c9a5a52dc02b856729079ec7a7674f9c73&max=2",
        "use_case_answer": null,
        "sign_in_flow": "password",
        "created_at": "2023-10-03T08:18:35+00:00",
        "is_terms_of_service_accepted": true,
        "last_seen_at": "2026-04-03T08:43:00+00:00",
        "locked_until": null
    },
    "format_version": 5
}