
DCAA Passes Peer Review


Regular reader EZ-Eric sent us an email recently, incensed that DCAA had passed another external peer review. The DoD OIG posted the results of its peer review here. EZ-Eric called the report and the peer review process a “complete sham.”

Would you like to know more?

According to the report, the objective of the review was to evaluate DCAA’s system of quality control. The report was intended to express an opinion on “the design of the system of quality and DCAA’s compliance with standards and requirements” of that system of quality.

What’s a system of quality? According to the report:

A system of quality control encompasses DCAA’s organizational structure and policies adopted and procedures established to provide it with reasonable assurance of conforming to Government Auditing Standards (GAS). The elements of quality control are described in GAS. DCAA is responsible for establishing and maintaining a system of quality control that is designed to provide it with reasonable assurance that the organization and its personnel comply with professional standards and applicable legal and regulatory requirements in all material respects.

The DoD OIG conducted the assessment in accordance with GAS and CIGIE’s Guide to Conducting Peer Reviews. (CIGIE is the exclusive club comprising 73 Inspectors General from the Federal agencies and departments and independent components. Google it, if you’d like to know more.) Fundamentally, that means that the IG auditors conducted their review with independence and objectivity, and obtained sufficient evidence to support the conclusions reached.

To obtain the evidence, the IG auditors selected 67 audits, all issued before June 30, 2016, for testing. The 67 audits were selected from a cross-section of DCAA FAOs across the country, including from Field Detachment. (If you don’t know what an FAO is or what Field Detachment is, then you should probably do some more research before reading the rest of this article.) The selected audit reports covered a variety of assignments, from Incurred Cost (audits of annual proposals to establish final billing rates) to Forward Pricing Rates to proposals to CAS Disclosure Statements, and more. A couple Pre-Award Accounting System reviews were selected, but we didn’t see any post-award Accounting System reviews, nor did we see any MMAS or Estimating System reviews in the selection. That was a curious omission, since inadequate Business System reviews have supported findings of audit quality system failures in the past.

The report contains a handy chart showing details of the 67 selected audit reports, including which FAO performed the audit, the assignment objective, and even the exact assignment number. In addition, the report has another table (Enclosure 2) showing exactly which of the 67 selected audit reports had deficiencies.

There were 25 of them.

In other words, 25 of the 67 selected audit reports had deficiencies. That’s 37 percent. More than one-third. Or, to put it more bluntly, nearly four out of every 10 reports selected for review were found to have one (or more!) deficiencies.

That’s not good.

In fairness, not all deficiencies have equal significance. Some deficiencies are not a big deal. Others are indeed kind of a big deal. The IG report treated every deficiency equally. But, and this is critical, no deficiency was found to be a significant deficiency.

That’s right. No deficiency was found to rise to the level of “significant deficiency”—the definition of which the OIG was careful to omit from its report, despite quoting the definition of “deficiency” in full. Yes, there were many deficiencies found, but none were deemed to be significant.

Which is why DCAA passed its audit.

Looking at the deficiencies found, three types scream “significance” to us.

Deficiency 1: DCAA did not obtain sufficient evidence to support its audit conclusions. As the IG reported—

GAS 2.09(a) states that an audit consists of acquiring sufficient, appropriate evidence to express an opinion on whether the subject matter is based on the criteria in all material respects or the assertion is presented, in all material respects, based on the criteria. For 18 of 67 audits (27 percent) we selected for review, we found one or more instances in which DCAA auditors did not obtain sufficient, appropriate evidence to support an opinion expressed in the report. We found this deficiency in all six DCAA regions. Among the 18 audits, we found a total of 25 instances when the auditors did not obtain sufficient, appropriate evidence to support DCAA’s opinion that contractor proposed costs were reasonable, allowable, or compliant with contract terms.

Well, gosh. More than a quarter of all selected reports lacked sufficient evidence to support audit conclusions. That seems like kind of a big deal to us. Yet, the IG, using professional judgment, did not deem that deficiency to be a significant deficiency.

Moreover, as the IG itself noted in the report, this is a repeat finding from the external peer review it conducted in 2014. The IG stated that DCAA had taken corrective actions in response to the 2014 finding, but the corrective actions had proven ineffective and “based on the results of our current review, DCAA still needs to consider additional steps to ensure that auditors gather sufficient evidence to support reported opinions.”

If this were a contractor business system review and there had been a repeat finding, we all know what the result would be.

To put this deficiency into perspective, in 2008 the GAO and the DoD OIG both found that DCAA’s audits lacked sufficient evidence to support conclusions and opinions. It was a core finding and the subject of Senate hearings. And it led to DCAA’s quality system lacking an external peer review approval for several years thereafter.

Nine years later, apparently nothing has changed.

Yet, DCAA passed the 2017 quality system external peer review.

Deficiency 2: DCAA did not report on pertinent information or scope restrictions. (AKA “reporting” deficiency.) As the IG reported—

GAS 5.04 requires auditors to communicate pertinent information to individuals requesting the audit. In addition, as discussed in AT 101.73 and .74, restrictions or limitations on the scope of an audit may prevent the auditor from issuing an unqualified opinion. When the reported opinion is qualified or disclaimed, the reasons for doing so should be described in the audit report. For 8 of 67 audits (12 percent), DCAA did not appropriately communicate pertinent information or important limitations to the contracting officer. We found this deficiency in five of the six DCAA regions. … The findings demonstrate a pattern and pervasiveness of issues that reflect the need for improving the reliability of DCAA audit reports.

The IG auditors stated that the root cause of this deficiency was that DCAA auditors were not following their own audit procedures. Although disagreeing with the IG audit finding(s), DCAA agreed to update its procedures and to give incoming auditors extra training in this area. (Apparently the current auditors, the ones not following procedures, were expected to do the right thing without any additional corrective actions.) The OIG considered the matter “closed”—meaning that DCAA’s corrective action was accepted.

Thus, DCAA passed its 2017 quality system external peer review.

Deficiency 3: DCAA did not adequately document the procedures performed. As the IG reported—

GAS 5.16a requires that auditors prepare audit documentation in sufficient detail to enable an experienced auditor, having no previous connection to the examination engagement, to understand from the documentation the nature, timing, extent, and results of procedures performed. In 9 of the 67 audits (13 percent), the documentation taken as a whole was insufficient to understand the nature, timing, extent, or results of work performed by the DCAA auditor. We had to hold extensive discussions with the audit staff to understand the procedures performed and why those procedures accomplished the audit objective. We found this deficiency in three of the six DCAA regions. Each of the nine audits had four or more documentation inadequacies.

Well, this is a problem. If you don’t tell people what you did (procedures performed) and you don’t tell them what you didn’t do (reporting), then they might assume your audit conclusions and opinions were valid, even if you didn’t have sufficient evidence (based on procedures performed/not performed) to support those conclusions and opinions.

Seems fairly significant to us.

What’s going on here in this particular deficiency? According to the IG—

Our review did not disclose any inadequacies with DCAA policies and procedures related to documenting the work performed in accordance with GAS. However, the auditors did not comply with established DCAA procedures. Given the number and significance of documentation deficiencies we found (including additional, less significant documentation issues addressed in our Letter of Comment dated November 17, 2017), DCAA should assess the effectiveness of its controls for ensuring compliance with established Agency policies and procedures and take appropriate corrective action. As part of its corrective action, DCAA should consider the need to provide comprehensive refresher training on the GAS documentation requirements.

So it wasn’t the audit guidance; it was the auditors and their inability to follow the audit guidance.

Again, the OIG accepted DCAA’s promise to provide extra training to auditors on the procedures that they are supposed to follow. Not a big deal, apparently. All forgiven. We’re all moving on.

There were other findings; you can read the report if you are inclined to do so. (Link in the first paragraph.) Some of those findings were repeat findings from the 2014 quality system review. The OIG documented that DCAA had declined to implement the recommended corrective action(s) from that 2014 external peer review, and now in 2017 the same findings show up again. How surprising.

And yet, DCAA passed its 2017 quality system external peer review.

It’s not only that nearly 40 percent of the selected audit reports evidenced deficiencies, it’s also that some audit reports evidenced multiple deficiencies. For example—

Audit Report No. 03231-2009M10100046 (an audit of contractor “incurred costs” by the Salt Lake Valley FAO) exhibited deficiencies in every single category. The IG found lack of evidence, reporting issues, lack of documentation, lack of adequate supervisory review, and lack of professional judgment. Same thing for Audit Report No. 06811-2008U10100007 (an incurred cost audit by the BAE York FAO). Given that these were both audits of contractor annual final billing rate proposals—and that such audits take an average of nearly three years to perform these days—it is difficult to understand why the IG auditors would have found any deficiencies whatsoever. Yet there they were: many upon many of them.

And yet, DCAA passed its 2017 quality system external peer review.

The DoD OIG officially concluded—

In our opinion, except for the evidence, reporting, documentation, supervision, and professional judgment deficiencies described after the Overall Management Comments and Our Response section, the system of quality control for DCAA in effect for the year ended June 30, 2016, has been suitably designed and complied with to provide DCAA with reasonable assurance of performing and reporting in conformity with applicable professional standards in all material respects.

And now you know why EZ-Eric was so incensed at this report and the process of governmental external peer reviews. He thinks it’s all a sham.

And it might be.

We believe that one might reasonably wonder whether the DoD OIG can exercise complete independence and objectivity when evaluating the quality control system of a related department. We say “related” because both the OIG and DCAA report up to the Secretary of Defense, albeit through different channels. Yes, the DoD OIG is an independent activity not subject to day-to-day SECDEF oversight; however, it seems reasonable to believe that phone calls could be made, that pressure could be applied. We can envision a scenario where it was pointed out to the Acting Inspector General that it would be in the best interests of the Department (and taxpayers) to have the DCAA quality system remain approved. After all, there is a mountain of contractor final billing rate proposals to wade through (slowly), as well as ongoing litigation with contractors in which DCAA audit findings form an important component of the government’s cases. In other words, “find a way to let them pass.”

We’re not saying it happened that way. We’re saying it seems reasonable to wonder if the DoD OIG can, in fact and in appearance, be completely free from pressure that could impair independence and objectivity. Certainly, EZ-Eric thinks that’s the case.

Which is why we are suggesting that DoD OIG no longer perform external peer reviews of the DCAA’s quality control system. Have some other entity perform the work, if only to avoid the appearance of impaired independence and objectivity.

Meanwhile, the 2017 DoD OIG audit conclusion of the external peer review of the DCAA audit quality system stands.

DCAA passed.

Effective January 1, 2019, Nick Sanders has been named as Editor of two reference books published by LexisNexis. The first book is Matthew Bender’s Accounting for Government Contracts: The Federal Acquisition Regulation. The second book is Matthew Bender’s Accounting for Government Contracts: The Cost Accounting Standards. Nick replaces Darrell Oyer, who has edited those books for many years.