Apparently, despite Fed Chairman Ben Bernanke declaring that the recession is over, the bureaucracy set in motion by the president's recovery activities is still subject to observing existing regulations. One in particular is the Data Quality Act, and according to this OMBWatch article, there is a risk that the data reported at the end of October will be plagued by inconsistencies and irregularities. Here is a great quote from the article:
"OMB, however, points to the agencies and Recovery Act fund recipients. According to OMB's guidance documents, the agencies must take responsibility and work with their recipients to ensure comprehensive and accurate data reporting."
What is striking is not just the focus on data quality, but specifically the focus on the quality of the reporting. In other posts I have looked at the question of managing the quality of output reports, which takes on a different flavor than validating the data that is input. Some specifics: we often concentrate on persistent data as the object of our concern, but reports are more likely to be transient; they may only be calculated and materialized on demand; they may depend on embedded computations that should also be subjected to review; they may have certain presentation constraints depending on the delivery mechanism; and, most importantly, they carry the semantics associated with the consumers of the report and must abide by those directives.
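To make the distinction concrete, here is a minimal sketch of what report-level validation might look like, as opposed to input-data validation. The report structure, field names, and the two checks (recomputing an embedded total, and enforcing a presentation constraint on currency amounts) are all hypothetical illustrations, not any actual agency reporting schema:

```python
# Hypothetical report-level validation sketch: the fields and rules below
# are illustrative assumptions, not a real reporting specification.

def validate_report(report: dict) -> list:
    """Return a list of quality findings for a materialized report."""
    findings = []

    # 1. Re-run the embedded computation and compare it to the figure
    #    actually printed in the report.
    computed_total = sum(item["amount"] for item in report["line_items"])
    if computed_total != report["reported_total"]:
        findings.append(
            "embedded computation mismatch: reported %s, recomputed %s"
            % (report["reported_total"], computed_total)
        )

    # 2. Check a presentation constraint tied to the delivery mechanism,
    #    e.g. amounts must be expressible as two-decimal currency values.
    for item in report["line_items"]:
        if round(item["amount"], 2) != item["amount"]:
            findings.append(
                "amount %s violates the two-decimal currency rule"
                % item["amount"]
            )

    return findings


report = {
    "line_items": [{"amount": 1200.50}, {"amount": 800.25}],
    "reported_total": 2000.75,
}
print(validate_report(report))  # an empty list means the report passed
```

The point of the sketch is that both checks operate on the materialized report itself, at the moment it is generated, rather than on the persistent source data it was derived from.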