This article first appeared on radicalcompliance.com October 11th, 2021
The PCAOB published fresh guidance last week about how auditors should handle evidence supplied by others to help the auditor assess financial statements, important performance or valuation metrics, and, well, all the other stuff that can go into an audit report these days.
The guidance is formally titled “Evaluating Relevance and Reliability of Audit Evidence Obtained From External Sources.” While audit firms are the primary audience, internal auditors and compliance managers might want to give the document a read too, so you can better prepare for whatever demands your audit firm might make next time they’re in your conference room.
Why did the PCAOB issue this guidance at all? Because information provided by others (rather than obtained by the audit firm directly) is becoming ever more important to corporate audits, and ever more common, too. So the PCAOB wanted to remind everyone about the standards for evaluating such evidence before bad habits lead to audit failures or acrimonious conversations between audit firm and corporate client.
For example, a real estate business might use data from an external service provider about occupancy rates and rents to determine the value of the buildings it owns. Financial firms might use data provided by stock exchanges to value the financial instruments in their possession. Retailers might use information about product reviews to decide the value of their inventory, or data about historical weather patterns to model foot traffic estimates for the upcoming quarter.
In all the above cases (and many more), those metrics could end up being material to the financial statement audit, or possibly other audits performed for different reasons. So the external audit firm needs to understand how relevant, reliable, and sufficient that data actually is, before issuing an opinion on the company’s financial statements.
Issues to Consider in the Data
If external data is going to be used in an audit, the PCAOB guidance says, the auditor should keep three questions in mind.
Is the data relevant? The data must be relevant to whatever statement the company is making or to the objective of the control the auditor is testing; the more relevant the data, the better.
For example, if you want to assess the values a company is reporting for financial assets it owns, and you use data about actively traded equities from a national stock exchange — that data is relevant. On the other hand, if you’re trying to model possible credit losses by studying historical loss data from peer firms, that data might be less relevant; maybe market conditions have changed, or the peer firms aren’t working with precisely the same customer group as your company. So the auditor needs to evaluate the relevance of that data more thoughtfully.
Is the data reliable? You also need to consider the source of the external data, and how the data was obtained.
A lot of issues can come into play here. Say you’re assessing the value of real estate assets by studying data on rents and occupancy rates in local markets. Maybe the provider of that data is a newbie firm with no reputation or experience; maybe it calculated occupancy rates differently from one quarter to the next; maybe one of your company’s board directors is a senior executive at the firm and there’s a conflict of interest.
An auditor also needs to consider whether it can gather that data directly from the provider (more reliable) or through some complex process involving the company (less reliable). We could keep going with lots of scenarios, but you get the point: reliability is an important issue with lots of variables to consider.
Is the data sufficient? You need enough data to support whatever conclusion you’re trying to make. Where the first two questions were about the quality of evidence, this one is about the quantity.
Hypothetical example: a retailer is using product review data to help determine the value of its inventory, but only has 20 reviews when it’s shipping 1 million units annually. Is that sufficient evidence to justify an impairment of inventory? No.
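To make the quantity point concrete, here's a rough back-of-the-envelope sketch — not from the PCAOB guidance, and the numbers are purely illustrative — using a standard margin-of-error calculation to show why 20 reviews say almost nothing reliable about a million units:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion
    estimated from a simple random sample of size n.
    p=0.5 is the worst case (widest interval)."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical retailer example: 20 reviews vs. a larger sample.
print(round(margin_of_error(20), 2))    # about +/- 22 percentage points
print(round(margin_of_error(1000), 2))  # about +/- 3 percentage points
```

With only 20 data points, any estimate of, say, the share of negative reviews carries an error band of roughly plus or minus 22 percentage points — far too wide to support an inventory impairment decision on its own.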
Why Should We Care About This?
Compliance, audit, and risk professionals need to care about external data because more and more companies are using this stuff, sometimes to calculate enormously important line items in the financial statements. So we should understand how to, you know, determine whether all this data is worth more than a pile of sand.
The PCAOB even says as much in its guidance, in this bland but nevertheless important paragraph:
Advancements in technology in recent years have improved accessibility and expanded the volume of information available to companies and their auditors from traditional and newer external sources … In addition, information from relatively newer, nontraditional external sources, such as web data aggregators and social media platforms, is becoming more prevalent.
Clearly the PCAOB is releasing this guidance now to help audit firms prepare for the audits they’ll be conducting in early 2022; it’s part of a larger project the PCAOB is running to understand how the rise of data analytics and similar technologies is affecting the staid, traditional world of audits.
The rest of us can still appreciate this document, however, because it illuminates the questions we all need to consider as companies place more reliance on data to make decisions. Where is this data coming from? How can we trust the source? How does the source calculate the data? Do we have enough data to make valid decisions?
Those questions are just as important for audit committees, internal auditors, and even compliance officers to consider. Anyone who performs audits or uses them to demonstrate the effectiveness of a compliance program should be conversant in the issues raised here.
We could go even further, and say that these questions about using external data are a junior varsity version of the issues we explored in a post last week about artificial intelligence. In that post, a cybersecurity acquaintance railed about access and change management controls; how do we know that AI supplied by a third party is really doing what it’s supposed to do? How can one audit that?
My friend was asking how a company can trust AI and the resulting data, so management can make sound, well-informed decisions. This PCAOB guidance is raising similar questions, without mentioning AI specifically. But the fundamental question — how can we trust the data? — is the same.
It’s almost like that question will be around for a long, long time.