XBRL 2.0

Digital financial reporting that actually works

XBRL DATA QUALITY – WHISTLING PAST THE GRAVEYARD

I just listened to a recent Toppan Merrill webcast on how XBRL disclosures are being used. The panel included the host software vendor and the SEC. My interest was in data quality and the topic arose when these two questions were addressed to the SEC representative:


  1. Is poor data quality impacting consumption?
  2. What is the SEC planning to do about it?

The SEC rep is a smart and effective spokesman for the Commission, but I was astonished by his answer. He first side-stepped the SEC’s role by declaring that the filers are 100% responsible for the quality of their filings. Legally, of course, he’s correct, but that doesn’t mean the SEC and software vendors should just let poor quality happen.


He then spent several minutes explaining how XBRL filing errors provided investors with very useful insights into a company’s choices, judgment, internal processes, controls and validation procedures.


What?


Instead of decrying the dismal state of data quality and its negative impact on consumption, we’re told that XBRL is a window into the systems, controls, and even the minds, of corporate finance departments. Stop worrying about whether revenue, earnings and EBITDA are reliable and comparable. What’s important is what analysts can glean from the way the data was prepared.

 

Now, if the SEC rep had gone on to say something like “…but the SEC understands that data quality must improve dramatically if XBRL’s benefits are ever to justify its costs. To that end…”, then I would have remained hopeful. But he did not. The proposition that the SEC will exercise real quality control anytime soon is dubious at best.

 

Then the moderator from Merrill weighed in on quality by touting the XBRL US data quality rules promulgated by a group of volunteer accountants and data aggregators. These data quality rules are only suggestions that can be, and often are, violated without consequence. The Merrill rep then pulled out the one graph that’s always used to illustrate quality improvement. It shows that negative value errors have declined. Ironically, this measure doesn’t really matter as long as the relationships between items are correctly specified, as required.
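To see why negative value errors matter less than relationships: in XBRL, most facts are tagged as positive numbers, and the calculation weight (+1 or -1) defined in the relationship supplies the sign when items roll up into a total. A minimal sketch, with hypothetical element names and values:

```python
# Hypothetical illustration: the calculation weight, not the raw sign of the
# reported fact, determines how a value contributes to a total.
facts = {"Revenues": 1000, "CostOfRevenue": 600}    # both tagged as positive
calc = [("Revenues", 1.0), ("CostOfRevenue", -1.0)]  # weight carries polarity

# Roll up the children using their weights, as a calculation check would.
gross_profit = sum(facts[name] * weight for name, weight in calc)
print(gross_profit)  # -> 400
```

If the relationships (and thus the weights) are specified correctly, a consumer can always recover the intended sign; a raw count of negative values tells you little about that.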


Quality issues that matter to end users have not declined significantly in recent years, as demonstrated by XBRLogic’s Quality Score. In addition to the SEC’s lack of enforcement, software vendors share the blame, with solutions that allow the creation of filings that are error-ridden and incomplete.


The SEC and software vendors know the kind of junk that’s being produced. It resides in the metadata, hidden beneath the surface of normal looking statements. Take this balance sheet from a Toppan Merrill client. It looks normal and correct, and for simply viewing the results, it’s fine.



But for investors and analysts who incorporate this data into their models or databases for the purpose of valuation, screening and comparative analysis, the balance sheet looks like this:



This version shows the errors, omissions and non-comparable elements that inhibit the usability of the data.

  1. Items in red are missing relationships that give each element context, enable adjustment of value polarity and allow validation of fact values.
  2. Items in purple are invalid summations, the result of missing relationships.
  3. Items in blue are extensions (custom tags) that render the statement non-comparable. Each could have been tagged to a standard element (shown below) while leaving its label unchanged. This would have preserved both the meaning of the reported element and the statement’s comparability. 
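The first two categories of errors above can be detected mechanically. A minimal sketch of a summation-consistency check, assuming a hypothetical set of facts and a hypothetical calculation tree (this is not Toppan Merrill’s or the SEC’s validator):

```python
# Minimal sketch of a calculation-consistency check. The facts and the
# parent -> (child, weight) calculation tree below are hypothetical.

# Reported facts: element name -> value
facts = {
    "Assets": 500_000,
    "AssetsCurrent": 180_000,
    "AssetsNoncurrent": 300_000,  # 180k + 300k != 500k -> invalid summation
    "Liabilities": 200_000,
}

# Calculation relationships: parent -> list of (child, weight)
calc = {
    "Assets": [("AssetsCurrent", 1.0), ("AssetsNoncurrent", 1.0)],
}

def check_summations(facts, calc, tolerance=0.5):
    """Return (parent, reported, computed) tuples for inconsistent totals,
    or (parent, 'missing relationships', children) when facts are absent."""
    errors = []
    for parent, children in calc.items():
        missing = [child for child, _ in children if child not in facts]
        if parent not in facts or missing:
            errors.append((parent, "missing relationships", missing))
            continue
        computed = sum(facts[child] * weight for child, weight in children)
        if abs(computed - facts[parent]) > tolerance:
            errors.append((parent, facts[parent], computed))
    return errors

print(check_summations(facts, calc))
# -> [('Assets', 500000, 480000.0)]
```

A fact with no calculation relationship at all (the red items) would simply never be reached by a check like this, which is exactly why the missing relationships also break validation of the totals.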


These errors are the responsibility of the filer, but why weren’t they flagged and prevented by Toppan Merrill’s software? The validation I’ve done here is not difficult. Software should prevent this from ever getting close to a final submission to the SEC. And, of course, the SEC should reject this submission as incomplete and incorrect.

Here’s what I know:

  1. XBRL data quality is not improving for the metrics that matter to investors. 
  2. The SEC is unwilling, or unable, to exercise quality control. 
  3. Existing compliance software allows filers too much latitude in the selection of tags, the creation of extensions and the inclusion of required metadata. Toppan Merrill’s software is better than most and still allows this incomplete, non-comparable statement to be created. There are other software vendors that don’t even pretend to comply with XBRL filing requirements.

Data quality needs to be addressed head-on by all of XBRL’s stakeholders. As a commercial product, XBRL data would not have lasted a year, much less 10. As a government mandated product, quality control requires that the SEC reject filings with material errors and omissions, issue fines, and certify filing software as compliant. It may even need to require audits of XBRL filings*. A higher cost of non-compliance will induce companies to take measures to improve their filings, including selecting software that assists in that process. 


*See Charles Hoffman's excellent article on the issue of XBRL audits... http://xbrl.squarespace.com/journal/2019/10/17/auditing-xbrl-based-financial-reports.html






