The vast majority of professional auditors have expressed concern that AI tools, while helpful, risk undermining professional judgment; nearly half said the use of AI also risks eroding public trust in the profession altogether.
This is according to recent survey data from an audit solutions platform.
This, in turn, could explain why the majority of respondents said they were not comfortable using AI for decision-support analysis and client-facing preparation. While AI is often touted as a way to empower less experienced staff, auditors generally trust a first-year professional to use it only for routine administrative support, information gathering and summarization, and data review and pattern detection. For second- or third-year professionals, trusted uses extend to initial drafting of deliverables and technical research assistance. This caution could be linked, at least in part, to the 79% of auditors who named algorithmic bias in AI tools as a moderate, very significant, or extremely significant risk factor.

Part of this discomfort may stem from auditors feeling let down both by the AI tools they use and by the training provided. The survey found that while firms are seeking more engagement with AI solutions vendors, many are dissatisfied with them: 63% found the tools neither comprehensive nor user-friendly, 47% said the tools do not scale as well as initially advertised, and 65% said the tools lack depth and do not cover the latest advancements in technology. Against this backdrop, 72% said they are unprepared, slightly prepared, or only moderately prepared for upskilling and reskilling staff for AI use.
Given these findings, auditors generally rated AI sophistication as less important than security and safety. The poll found that more than half of auditors (55%) are willing or very willing to trade some level of AI performance for stronger security and safety. In particular, a human in the loop is seen as non-negotiable: a clear majority of auditors (64%) said professionals should always validate AI outputs relied upon in reaching professional conclusions.
Auditors are not necessarily against AI, but they believe processes need to be adjusted to account for it. The poll found that 67% of auditors somewhat or strongly agreed that the profession needs to rethink the entire audit execution model in order to fully embed AI. While the report did not elaborate on the specifics of what that meant, it did say auditors want to use AI to add value for their clients. Over the next two to three years, one of the biggest opportunities is believed to lie in integrating AI and advanced data analytics to enhance risk identification and deliver deeper insights (39%), followed by deepening industry specialization to provide more tailored and strategic advisory services (21%), then enhancing collaboration with clients in the audit cycle (14%) and, finally, redesigning the audit process to be more agile, tech-enabled and client-centric (13%).
However, real progress on these goals will be difficult without a new set of professional standards. The poll found that a clear majority (66%) said there is an urgent need for a globally harmonized AI framework for audit and assurance.
Standards have also been on the mind of the International Auditing and Assurance Standards Board (IAASB).