
The Most Widespread Audit Failure

Every year, the PCAOB (the regulatory body that audits the auditors) selects hundreds of audits from public accounting firms for quality review. The bulk of the review of an audit takes place over the course of a week. There’s a kickoff meeting on Monday and a wrap-up on Friday, with daily meetings sprinkled in throughout the week for the PCAOB review team to ask questions and dig into certain areas of the audit. The final report is issued once all of the reviews of the audits for that firm are completed.

The results of these quality reviews are private, unless the PCAOB determines that the firm did not appropriately remediate the issues identified. We looked at the public reports from the top 50 public accounting firms over the last 5 years and noted that one finding in particular appeared at every single firm: failure to validate the completeness and accuracy of reports and data.

Here’s what that means:

Auditors request hundreds, sometimes thousands, of pieces of information when performing their procedures. These requests fall into two broad categories: information produced by the entity being audited (IPE), and information obtained from sources other than the entity being audited. IPE can include reports, data extracts, copies of key documents, bank statements, or anything else that the auditor gets directly from the client.

Because the client is the one being audited, the auditor must approach this data with the appropriate degree of professional skepticism. The history of auditing is filled with stories of fraudsters fabricating and falsifying documents. How much skepticism is appropriate depends on a few factors: the source system, how the data will be used in the audit, the process used to generate the data, and the format of the data itself (PDF, CSV, etc.).

For example, the source itself has to be considered. Is the data from a reliable source system? Is it a standard report from a packaged system widely used in the industry, or an ad-hoc query run against a home-grown application? Was the system covered by IT General Controls during the period, and were those controls designed and operating effectively?

How will the data be used? Is the information a sample record for one test? Or is this information going to be leveraged across multiple tests and be the source for selecting subsequent samples? The more broadly it’s used, the more important it is to make sure you are comfortable with the completeness and accuracy of the data.

What format was the data provided in? CSV? Those are trivial to manipulate. PDFs are a lot more difficult to manipulate, but not impossible.

Here’s what the PCAOB has to say on the topic of IPE:

When using information produced by the company as audit evidence, the auditor should evaluate whether the information is sufficient and appropriate for purposes of the audit by performing procedures to:

  • Test the accuracy and completeness of the information, or test the controls over the accuracy and completeness of that information; and

  • Evaluate whether the information is sufficiently precise and detailed for purposes of the audit.

Failure to evaluate the completeness and accuracy of data or reports was an issue at every large firm and accounted for over 35% of all publicly disclosed audit deficiencies identified by the PCAOB.

So, what should you do as an auditor?

Most of the time, the procedures you perform will involve validation that the source of the data is not fundamentally flawed, and that you can be reasonably certain the data you got was not manipulated in some way.

Data Source & Extraction Method

Typically, this means the system that generates the reports should be covered by IT General Controls, and the auditor should also get screenshots of how the evidence was obtained by the person providing it to the auditor. Within those screenshots, the auditor should look for evidence that the parameters that were used to generate the report are consistent with the request and requirements of the test.

Here are a few parameters that are important to look out for:

  • Date range: does the report cover the right dates (and times)?

  • Run date / cutoff: was the report generated on a date that could exclude items from the report, for example, shortly before or long after the reporting period ended?

  • Filters: were any filters applied to show only certain types of records? For example, on an employee listing, are terminated employees excluded?

Data Format

Now what about the format? PDF is the gold standard because it's reasonably hard to manipulate, but CSV and XLSX are far easier to work with when you need to select a sample. How do you get comfortable that a CSV or XLSX file wasn't manipulated, intentionally or otherwise? That's where screenshots, record counts, and control totals come in.

Screenshots and Screen Recordings

You’ll want the client to provide a screenshot of the record count in the system, which you can then match to the count in the file. Alternatively, you can have the client record their screen so you can observe the “chain of custody” of the evidence you’re getting. You can hop on a Zoom call, observe them in person, or use an audit document request platform like UpLink, which lets you require the client to upload a screenshot or record their screen directly from the app, in addition to uploading evidence.

Completeness & Accuracy Cheat Sheet

Do you have control coverage over the system?

If not, you can’t rely on the data in the system being complete or accurate. Think garbage in, garbage out. Only with the effective design and operation of controls over the processes and systems surrounding this data can you move on to the next evaluation criteria.

Was the report a “standard report from packaged/COTS software”?

If not, you must perform procedures to determine that the query was tested by management for completeness and accuracy. This will typically include evidence of QA testing, UAT, and potentially other procedures to ensure the report does not systematically exclude anything (e.g., limiting results to certain departments or amounts).

Are all the filters visible on the report output itself in a non-editable format like PDF?

If not, then in addition to evaluating that the correct parameters (run date, date range, and filters) were used to run the report, you’ll also need to perform procedures to determine that the parameters selected were actually applied to the output. How do you know the client didn’t give you a different report? Typically, you’ll want to tie control totals from the source system to the report output, or have a screen recording of the report being generated and delivered to you.


The most ubiquitous quality finding from the PCAOB is that auditors are not performing sufficient procedures around the completeness and accuracy of data they obtain from the client.

Auditors need to have confidence in the source of the data, the method of extraction, and the “chain of custody” of that information so they know it wasn’t manipulated by the client. Modern PBC and auditing software like UpLink can help address this issue by allowing the auditor to require a screenshot or screen recording from clients when requesting evidence.

