
CPA Success Index is DOA: NASBA's data no longer holds up

Several years ago, colleagues and I developed the CPA Success Index, a metric designed to improve on raw single-part CPA exam pass rates for evaluating candidate performance at colleges and universities around the country. It estimated how likely candidates from a given school were to complete all four sections of the CPA exam within the crucial 18-month window.

The Success Index was a four-part alternative to the National Association of State Boards of Accountancy's single-part pass-rate rankings. We quickly found it posted on university websites all over the country, and it gave prospective students a clearer signal of program quality.
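To make the idea concrete, here is a minimal sketch of the kind of calculation involved. It is illustrative only, not the published Success Index methodology: it assumes section attempts are independent and that each candidate sits every section exactly once, and the pass rates shown are hypothetical.

```python
# Illustrative sketch only -- not the published CPA Success Index methodology.
# Assumes independent, single attempts at each of the four pre-Evolution
# sections, with hypothetical per-section pass rates for one school.
import math

def naive_success_estimate(section_pass_rates):
    """Probability a candidate passes every section on the first attempt."""
    return math.prod(section_pass_rates)

# Hypothetical pass rates for AUD, BEC, FAR and REG at one school
rates = [0.55, 0.60, 0.48, 0.58]
print(f"{naive_success_estimate(rates):.1%}")  # ~9.2%
```

Even this toy version shows why a completion-oriented metric tells a different story than any single section's pass rate.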

In January, we published the final index based on the pre-CPA Evolution examination model. I anxiously awaited NASBA's 2024 Candidate Performance Book to determine whether a new CPA Success Index could be calculated to fit the CPA Evolution exam model. I anticipated building an index for the three-part core, with separate scoring for each discipline specialty exam and a 30-month exam window, that would show both the probability of passing within that window and the average completion time for candidates from each university.
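As a rough illustration of what such an index might compute, the sketch below simulates candidates retaking failed sections and reports both the completion probability within 30 months and the average completion time. Every modeling choice here (one attempt per quarter, independence across attempts, the pass rates themselves) is my own simplifying assumption, not a published design.

```python
# Illustrative Monte Carlo sketch of a CPA Evolution-era index -- a simplified
# model, not a published methodology. Assumes one section attempt per quarter,
# independent attempts, and immediate retakes of failed sections.
import random

def simulate_index(pass_rates, window_months=30, months_per_attempt=3,
                   trials=100_000):
    """Return (P(all sections passed within window), avg months to finish)."""
    finished, total_months = 0, 0
    for _ in range(trials):
        months, remaining = 0, list(pass_rates)
        while remaining and months < window_months:
            months += months_per_attempt
            if random.random() < remaining[0]:  # attempt the next section
                remaining.pop(0)
        if not remaining:
            finished += 1
            total_months += months
    avg = total_months / finished if finished else float("nan")
    return finished / trials, avg

# Hypothetical pass rates: three core sections plus one chosen discipline
prob, avg_months = simulate_index([0.50, 0.55, 0.60, 0.58])
print(f"P(complete in 30 months) ~= {prob:.2f}; avg ~= {avg_months:.1f} months")
```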

But unfortunately, this year the CPA Success Index is effectively dead on arrival. The problem is not the calculations themselves; it's the glaring inconsistencies I found in NASBA's candidate performance data, the very numbers the Success Index relies on.

NASBA's numbers don't add up

We have historically used NASBA's Candidate Performance data to calculate Success Index scores and rankings. For the Success Index to be reliable, the underlying CPA exam data must be accurate and consistent, and for years NASBA's Candidate Performance Book was consistent enough to support reliable estimates. That confidence is now gone. While combing through the 2024 data, we found large numbers of candidate scores that were either missing entirely or apparently attributed to the wrong colleges and universities.

To explain, let me illustrate the inconsistencies with the data for my school, the University of Northern Iowa, though the problem applies to many, if not most, schools in the report. In 2024, our department began closely tracking every graduate sitting for the exam, including collecting copies of candidates' official score reports. We found we had official score reports from far more candidates than NASBA is reporting. The discrepancies are not trivial; they range from 25% to 40% of the scores, depending on the section. We also found that Iowa community colleges appear in NASBA's reporting, despite the fact that Iowa law requires a bachelor's degree to sit for the CPA exam. It seems logical to ask where all those community college students actually earned their degrees. (If a 2024 CPA Success Index were published, the top performers nationally would include community colleges in states that require a bachelor's degree.)

Meanwhile, states with dozens of community colleges and far more students, such as Illinois, show none. NASBA's reporting on students from graduate programs also contains significant errors, which is why we never included graduate school rankings in our original Success Index reports. Overall, the discrepancies in the 2024 Candidate Performance report are not minor; they signal systemic data integrity problems with candidate reporting at NASBA.

The GIGO effect

The CPA Success Index was a refined and purposeful analytic. It aimed to give academic institutions useful data for benchmarking their programs and a KPI against which learning goals could be measured; to give prospective students reliable data on schools where candidates are likely to pass the exam; and to help firms, particularly public accounting firms, make the most of their recruiting budgets by hiring graduates from schools whose students they are confident will pass the CPA exam. But no matter how sound the model, garbage in equals garbage out. If candidate counts and pass rates are misreported in NASBA's Candidate Performance data, the resulting Success Index scores and rankings will be just as misleading. Programs that appear to be performing poorly may actually be terrific, and those that look stellar could in fact be struggling.

A reliable metric goes beyond university bragging rights; schools use data and rankings to recruit students, justify resources, and support accreditation reports. Prospective students and employers use them to gauge program quality. If the scoreboard is broken, everyone, from future CPAs to faculty to firms, is flying blind.

The timing couldn't be worse

Accounting education has been under intense scrutiny over the past decade as enrollments have declined (though they may be turning a corner) and the profession searches for ways to attract and prepare new CPAs. At the same time, the CPA Evolution overhaul is reshaping exam content and candidate strategies. Reliable outcome metrics have never been more critical. Losing confidence in one of the few useful data points we had, just as the exam itself changed, sets accounting education back.

What must happen next

If the NASBA data, and any metrics that rely on it, are to regain credibility, four things must happen. First, NASBA must supply full, transparent data that can be reconciled. In the past, NASBA offered custom reporting, albeit for a very high fee, so colleges and universities could get verifiable, accurate data; that allowed programs to, for example, differentiate the performance of undergraduate and graduate students. NASBA discontinued the practice a few years ago.

Second, NASBA should improve its processes and controls over data reporting. That may mean adding fields to CPA exam applications to allow for better and more accurate differentiation in reporting, or adding processes that reconcile data and reports against students' transcripts. NASBA theoretically has access to all the data it needs to produce an accurate report, since CPA candidates must submit a transcript from every institution they attended, even one attended for a single dual-credit high school class, when registering to take the CPA exam.

Third, NASBA should refrain from ranking institutions. Set aside the problem of publishing faulty rankings built on bad data: ranking institutions does not take into account the types of students who enroll at particular schools, and reporting outcomes (even correct ones) distorts the impact and quality of the education a student receives by ignoring the inputs. The colleges and universities with the most distorted data in the 2024 Candidate Performance report appear to be those with large numbers of transfer students, and very likely those that serve low-income and minority students. That hurts the profession's efforts to diversify and gives the poorest information to the students with the smallest margin for error in their lives.

Finally, NASBA may need to have its data audited and verified for accuracy by an independent firm. That may seem extreme, given that this is the association for the state boards of accountancy, but the data appears so erroneous and misleading that any future report is presumptively questionable.

It is ironic that the National Association of State Boards of Accountancy produces a report that people rely upon but that cannot be trusted or verified. For the profession, the academy and future CPAs, NASBA needs to get it right.

(Editor's note: The author developed the original CPA Success Index methodology for Accounting Today and published it for several years. NASBA plans to provide updated information to at least the author's school, the University of Northern Iowa.)
