While organizations understand that AI governance and controls are important, actual implementation lags significantly behind, a gap attributed not to technology but to leadership and culture.
That is a key finding of a new report from governance, risk and compliance solutions provider AuditBoard.
"AI systems are being deployed faster than oversight structures can keep up, leading to ad hoc governance, uneven accountability, and increased exposure to legal, ethical and operational failures," said the report.

And even when an organization does have a governance program, AuditBoard said the quality can vary greatly. For example, while organizations are prioritizing things like AI usage monitoring (45%), risk assessments (44%) and third-party model evaluations (40%), practices that AuditBoard said are foundational, such as usage logging, model documentation maintenance and enforcement of access controls for AI systems, lag behind.
"These are sophisticated processes that require accurate data inputs, strong accountability frameworks and well-defined governance policies. And yet, many of the building blocks that support such efforts—things like model inventories, usage logging and approval workflows—are either missing or inconsistently applied," said the report.
Compounding the problem is the fact that some organizations are automating these governance processes before the routine controls needed to support them are in place. Rather than building upward from strong foundational practices, many organizations are trying to scale governance from the top down.
"It's an approach that risks embedding inconsistency, rather than eliminating it. What emerges is a kind of governance illusion: Automation gives the appearance of control, but without foundational processes and clear ownership, it may simply replicate gaps at scale," said the report.
This underscores another finding: fewer than 15% of respondents thought the main problem was the technology itself. More often, respondents cited a lack of clear leadership (44%), a lack of internal expertise (39%) and limited resources (34%) as the main factors.
"Most organizations are not struggling to find dashboards or compliance software; they're struggling to determine who's accountable, how teams should coordinate, and what workflows need to change. The issue is less about capability and more about clarity," said the report.
Overall, AuditBoard said organizations should move beyond principles and into execution by defining how policies apply to real-world scenarios (e.g., which teams review AI use cases, how model performance is monitored, and what happens when issues arise), with governance embedded into daily decisions, not just documents.
Because AI governance isn't owned by any one function, the report said risk, compliance, product, legal, security and engineering all need a seat at the table. If an organization does choose to automate its AI controls, it should do so strategically, not prematurely, focusing on core controls such as AI inventories, access approvals and documentation standards before taking other steps. And because even the best frameworks fail without user understanding, organizations are also urged to roll out training tailored by function and seniority. Finally, AuditBoard said organizations need to shift from annual reviews to continuous updates, with teams structured to respond to new tools, risks and regulatory changes as they emerge.
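As a hedged illustration of what automating one such core control might involve, the sketch below gates AI use behind an explicit approval list. The `APPROVED_USES` table and `is_use_approved` function are hypothetical names invented for this example; the report does not specify an implementation.

```python
# Illustrative sketch only: a minimal access-approval check that allows an
# AI use only if it appears on an explicit allow-list. Names are hypothetical.
APPROVED_USES: dict[tuple[str, str], set[str]] = {
    # (team, model_id) -> purposes that team is approved to use the model for
    ("legal-ops", "contract-summarizer-v2"): {"contract summarization"},
}


def is_use_approved(team: str, model_id: str, purpose: str) -> bool:
    """Return True only if this team is approved to use this model for this purpose."""
    return purpose in APPROVED_USES.get((team, model_id), set())


# Example: an unapproved purpose is rejected rather than silently allowed.
assert is_use_approved("legal-ops", "contract-summarizer-v2", "contract summarization")
assert not is_use_approved("legal-ops", "contract-summarizer-v2", "hiring decisions")
```

The design point matches the report's warning about top-down automation: the check is only as good as the approval data behind it, so the inventory and approval workflow have to exist before a gate like this adds real control rather than the appearance of it.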
The survey included 412 respondents who were audit, GRC or IT decision-makers and purchase influencers working at companies with annual revenue of at least $100 million.