Women’s Day in AI: Building Inclusive Architecture


As artificial intelligence (AI) becomes increasingly integrated into daily life, and into architecture specifically, the question of who participates, and who risks exclusion, is paramount. Beyond efficiency and economic growth, the Organisation for Economic Co-operation and Development (OECD) emphasizes “inclusive growth” as a guiding principle for AI development. This includes responsible stewardship, augmenting human capabilities, advancing inclusion, reducing inequalities, and promoting sustainability.

However, the technology sector, and now AI, has historically been shaped by a culture often dominated by a limited perspective. This International Women’s Day provides a crucial opportunity to reflect on and discuss inclusivity within AI systems used by architects and the broader built environment. To gain valuable insights, we spoke with Katie Fisher (Director at CARD Projects and RIBA J Rising Star), Olivia Stobs-Stobart (Design and AI Lead at Plan A Consultants), and Renee Dobre (Architect and Design Computation Team Leader at NBBJ).

Prioritizing Inclusivity from the Start

Olivia Stobs-Stobart emphasizes the importance of integrating inclusivity from the outset: “Any new technology we adopt must pass key tests, with inclusivity being a core consideration. We prioritize feedback through structured pilot tests, gathering input from diverse team members. Accessibility and usability for neurodiverse colleagues are also crucial. It’s not an afterthought; it’s integral to our process. We aim to empower our team, ensuring technology supports, rather than hinders, their work.”

Addressing Imbalances in AI Development

Katie Fisher points to a significant imbalance in leadership, core engineering, and venture funding within the AI field. “Teams building large language models, BIM-integrated AI plugins, and generative design platforms remain overwhelmingly male. Optimisation tools prioritizing speed and cost often reflect these perspectives. While women are more visible in ethics roles, they are less frequently involved in shaping the underlying code or investment decisions.”

Olivia Stobs-Stobart adds, “When systems learn from our data and reflect our values, we must ask: whose ideas are represented? With only 22% of AI and data professionals in the UK being women, can we truly claim AI is a fair representation of our population? This imbalance compounds existing inequalities, creating harmful biases.”

The Role of ‘Translators’ in Architecture

Renee Dobre highlights a bottleneck in the integration and technical leadership phases within architecture firms. “We need more women in the ‘translator’ space – those leading the charge to turn raw AI capability into applied, human-centric architectural workflows. The training data itself is a foundational issue. LLMs and image-generation models often draw from a historical canon that has systematically elevated male ‘starchitects’ and marginalized female designers. This inherent bias is embedded in the system.”

Katie Fisher echoes this sentiment, noting that AI intensifies existing imbalances in architecture, such as underrepresentation in senior roles and pay gaps. Generative housing tools trained on historic developer-led schemes can replicate existing spatial norms, automating inequalities.

Velocity and Scale: The Danger of Automated Bias

Renee Dobre emphasizes the speed at which AI can amplify biases: “Architecture is historically slow, taking years to manifest biased environments. AI, however, generates concepts and optimizes data in seconds. It’s not just mirroring past inequalities; it’s actively automating and projecting them into our future designs at an unprecedented rate.”

Olivia Stobs-Stobart stresses the importance of broadening the diversity conversation beyond gender, recognizing the value of “soft skills” often associated with women and ensuring these skills are not undervalued. She emphasizes the need for diverse voices at the problem-definition stage, not just the solution stage.

Towards Accountability and Transparency

Katie Fisher advocates for mandatory transparency and reporting: “Every AI tool used in architecture should publish information about its development team demographics and the datasets used for training. This information should be visible during procurement. We have buying power and should demand accountability.”

Renee Dobre proposes an industry-wide AI Governance Framework, similar to building codes, requiring “Equity and Bias Audits” for all new software. This framework would mandate disclosure of team diversity and training data, ensuring tools don’t perpetuate historical biases.

Hope for the Future

Despite the challenges, there is optimism. Olivia Stobs-Stobart believes that conscious, incremental steps towards responsible AI can collectively create positive change. Renee Dobre is encouraged by the younger generation of digital-native architects who view equity, wellbeing, and technology as interconnected.

Further Resources: Download RIBA’s Inclusive Design Overlay

Thanks to Katie Fisher, Olivia Stobs-Stobart, and Renee Dobre. Text by Paul Hirons. This is a professional feature edited by the RIBA Practice team.

RIBA Core Curriculum topics: Inclusive environments, Design, construction and technology
