User Research for AI Governance: The Shift from Tools to Transformation

Building an AI governance ecosystem means embedding responsible AI practices throughout the organization.

October 15, 2024
Author(s)
Yomna Elsayed, Ph.D.

After attending Credo AI’s recent Responsible AI Leadership Summit, I found myself reflecting on how the democratization of GenAI has significantly broadened access to AI capabilities, multiplying AI's touchpoints and interactions across entire organizations. As a result, the conversation around AI governance is evolving from a focus on tools and metrics to a broader framework of organizational transformation. To truly embrace the innovative capabilities of GenAI in a safe and responsible manner, AI governance can no longer be siloed within a single department or function; it must be a cross-organizational effort.

AI governance isn’t just about ethical AI or mitigating societal harm; it’s about embedding responsible AI practices across the organization—from legal compliance and risk management to data handling and internal policies. At its core, AI governance addresses how an organization operationalizes AI responsibly, spanning departments, workflows, and processes. This shift demands a deeper understanding of how governance integrates into the company’s overall framework, with user research playing a key role in guiding this transformation.

Process Over Metrics: Building an AI Governance Ecosystem

Through extensive conversations, interviews, and surveys with industry leaders, we’ve learned that governance goes far beyond compliance checklists and legal frameworks. Customers tell us the biggest challenge in AI governance isn’t just identifying the right Responsible AI (RAI) metrics—it’s figuring out how to begin the process. AI governance is fundamentally about culture and processes, with success depending on how well organizations embed governance into their daily workflows and decision-making.

While governance tools are important for meeting regulatory demands, the real shift comes from integrating those tools across departments, with a focus on people and processes. Building an AI governance ecosystem means embedding responsible AI practices throughout the organization. Key factors like AI maturity, executive buy-in, and risk management expertise are essential, but the biggest barrier is often the resistance to change—a reluctance to shift existing practices toward responsible AI frameworks.

This was echoed at the Summit, where organizations across the Responsible AI ecosystem shared how they’re transforming their cultures through education, incentives, and executive leadership to make responsible AI a core part of their operations.

Governance as an Organizational Change Problem

To truly understand AI governance, it’s essential to stop viewing it as merely a compliance issue and instead treat it as an organizational change problem. According to David Gleicher’s Change Equation (D x V x F > R), successful change occurs when dissatisfaction with the current state (D), a compelling vision for the future (V), and actionable first steps (F) outweigh the resistance to change (R).
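The multiplicative structure of the Change Equation carries a useful intuition: because the three factors are multiplied rather than added, a near-zero score on any one of them sinks the whole effort, no matter how strong the others are. The toy sketch below makes that concrete; the 0-to-1 scores and threshold comparison are illustrative assumptions of mine, not a formal model from Gleicher or this article.

```python
# Toy sketch of Gleicher's Change Equation: D x V x F > R.
# Scoring each factor on a 0-1 scale is an illustrative assumption.

def change_succeeds(dissatisfaction: float, vision: float,
                    first_steps: float, resistance: float) -> bool:
    """Return True when the product D x V x F outweighs resistance R.

    Because the factors are multiplied, a near-zero score on any one
    of them collapses the product, regardless of the other two.
    """
    return dissatisfaction * vision * first_steps > resistance

# Strong vision and clear first steps, but little felt dissatisfaction:
print(change_succeeds(0.1, 0.9, 0.9, 0.3))  # False - low D sinks the product

# All three factors reasonably strong against the same resistance:
print(change_succeeds(0.8, 0.9, 0.7, 0.3))  # True
```

The practical reading for governance work: a compelling platform vision alone is not enough; research also has to surface genuine dissatisfaction with the status quo and identify concrete first steps, or resistance wins by default.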

In the context of AI governance, user research must not only identify technical pain points but also examine the broader factors that drive or hinder adoption. The success of AI governance depends on more than the technical features of a platform—it hinges on how well the platform aligns with an organization’s vision, culture, and operational processes.

The Role of User Research: Understanding the Context

This is where user research plays a pivotal role in supporting change. To drive the necessary transformation, researchers must dig deeper than just identifying gaps in the current system—they need to understand the full context in which AI governance is being implemented. By engaging with stakeholders across the organization, user research helps pinpoint the organizational dissatisfaction (D) and shapes the compelling vision (V) for a governance structure that integrates seamlessly with existing processes. Moreover, by uncovering the operational roadblocks and exploring first steps (F), research guides organizations toward actionable strategies to mitigate resistance (R) and ensure adoption.

This work goes beyond addressing a static set of requirements—it involves helping organizations navigate the complexity of integrating responsible AI into their daily workflows. Success depends on how well governance tools are embedded into these workflows and whether the organization is ready to embrace change. Factors such as AI maturity, executive buy-in, and risk management expertise are critical to this transition. User research helps align governance practices with the needs of the entire organization, ensuring that the solution is both effective and adaptable.

Governance is Dynamic: A Co-Evolutionary Relationship

AI governance isn’t a one-time implementation—it’s a dynamic, evolving process, particularly with the rapid advancements driven by GenAI. What worked just a few months ago may no longer suffice. In a recent Deloitte survey, two-thirds of organizations reported increasing their investments in Generative AI, seeing strong early returns. However, scaling these successes remains a challenge—70% of organizations have moved 30% or fewer of their GenAI experiments into production, with risk, regulations, and governance being the top barriers to progress.

Given these challenges, it’s no surprise that industry leaders at the Summit emphasized the importance of governance in unlocking GenAI's full potential. As organizations rapidly adapt to new technologies, governance frameworks and platforms must evolve in tandem, creating a dynamic relationship where both the organization and its governance tools grow to keep pace with GenAI’s transformative impact.

User research is critical to this evolution. By combining discovery research—to explore future governance challenges—with evaluative research—to test solutions’ effectiveness—researchers ensure platforms remain flexible and forward-thinking. This iterative process allows governance systems to continuously adapt, remaining agile and capable of managing the increasing complexity of AI technologies.

Final Thoughts: From Tools to Transformation

AI governance is not just about providing tools to ensure compliance—it’s about transforming how organizations think and operate around responsible AI. The real challenge lies not in defining the metrics but in fostering processes that encourage companies to embrace change, critically evaluate their AI systems, and align their practices with a broader vision for ethical AI.

In the end, AI governance is an organizational journey, and user research is the roadmap that helps navigate it. By understanding the deeper organizational dynamics at play, we can ensure that governance is not just a technical hurdle, but an integral part of how companies innovate and adopt AI responsibly.

DISCLAIMER. The information we provide here is for informational purposes only and is not intended in any way to represent legal advice or a legal opinion that you can rely on. It is your sole responsibility to consult an attorney to resolve any legal issues related to this information.