TheCRE.com
CRE Regulatory Action of the Week

CRE Requests Withdrawal Of The National Assessment On Climate Change On Data Quality Act Grounds
On February 11th, CRE requested that the United States Global Change Research Program and the Office of Science and Technology Policy withdraw the first National Assessment on Climate Change because it violates the objectivity, utility, and reproducibility requirements of the Data Quality Act and OMB's guidelines implementing the Act. More specifically, the National Assessment violates the Act and OMB's guidelines because:

  • It relies solely on two inaccurate and unreliable computer models;

  • The two models used in the National Assessment were of foreign origin, despite the National Research Council's admonition against relying heavily on foreign modeling centers to assess climate change in the United States;

  • The National Assessment was published without either development of the underlying science or adequate peer review; and

  • Numerous Government reviewers of the National Assessment emphasized that it is not ready for release without major changes.

    CRE's request for withdrawal of the National Assessment is reproduced below.

     


     

    February 11, 2002

    Mr. John H. Marburger III, PhD
    Director
    Office of Science and Technology Policy
    Eisenhower Executive Office Bldg.
    17th St. & Pennsylvania Ave., N.W.
    Washington, D.C. 20502-0001

     

    Dear Dr. Marburger:


    For the reasons detailed below, the Center for Regulatory Effectiveness ("CRE") respectfully requests that you withdraw dissemination of the first National Assessment on Climate Change ("National Assessment") (http://www.usgcrp.gov/usgcrp/nacc/default.htm). Withdrawal of the National Assessment is necessary because: (1) it contains numerous data quality and scientific flaws; and (2) it does not comply with the Federal Data Quality Act ("FDQA"), 44 U.S.C. 3516 note, and with the guidelines promulgated by the Office of Management and Budget ("OMB") implementing the FDQA, 67 Fed. Reg. 369 (Jan. 3, 2002). A copy of the OMB guidelines is attached to this letter.

    The National Assessment does not comply with the objectivity, utility, and reproducibility requirements of the FDQA and the OMB guidelines because, e.g.:

  • It relies solely on two inaccurate and unreliable computer models developed outside of the United States;

  • It was published without either development of the underlying science or adequate peer review; and

  • Numerous Government reviewers of the National Assessment have emphasized that it is not ready for release without major changes.
    In short, the National Assessment is based on inadequate and incomplete science and models; it has never been subject to adequate peer review, and it failed the limited peer review that did occur. Yet it has been released and publicly disseminated in its current incomplete and flawed form. Continued Government dissemination of the National Assessment violates the objectivity, utility, and reproducibility requirements of the FDQA and OMB's implementing guidelines. Withdrawal of the National Assessment now could obviate the need for a formal administrative petition, and potential judicial review, once those processes are established by October 2002 through agency implementing guidelines.

    We ask that you respond to this request within 45 days from your receipt of it.

    I. BACKGROUND ON THE NATIONAL ASSESSMENT

    Pursuant to, and under the auspices of, the Global Change Research Act of 1990 ("GCRA"), 15 U.S.C. 2921 et seq., the United States Global Change Research Program ("USGCRP") is assigned the responsibility of producing the scientific assessment that is the subject of this request for withdrawal and correction. Specifically, "On a periodic basis (not less frequently than every 4 years), the Council, through the Committee, shall prepare and submit to the President and the Congress an assessment which:

  • integrates, evaluates, and interprets the findings of the [USGCRP] Program and discusses the scientific uncertainties associated with such findings;

  • analyzes the effects of global change on the natural environment, agriculture, energy production and use, land and water resources, transportation, human health and welfare, human social systems, and biological diversity; and

  • analyzes current trends in global change both human-inducted [sic] and natural, and projects major trends for the subsequent 25 to 100 years."
    15 U.S.C. 2934.

    The Government first disseminated the National Assessment to the public as final information in December 2000. The Government is still disseminating it to the public.

    II. BACKGROUND ON THE FDQA

    The FDQA establishes stringent new standards governing the quality of information disseminated to the public by federal agencies. 44 U.S.C. 3516 note. OMB's FDQA guidelines further refine these standards and require administrative petition mechanisms to correct information that does not comply with them.

    OMB's guidelines broadly define the universe of agency information that is subject to the FDQA standards. This definition encompasses the National Assessment. See 67 Fed. Reg. 377-78; 66 Fed. Reg. 49720-21, 49274-75.

    The data quality standards required by the FDQA and OMB's guidelines are more specific and more stringent than the "arbitrary and capricious" and "rational basis" standards imposed during judicial review under the Administrative Procedure Act. With the possible exception of the Safe Drinking Water Act, the FDQA standards are also more specific and more stringent than the standards imposed by federal environmental statutes. For example, under the FDQA, as interpreted by the OMB guidelines, agency information disseminated to the public, such as the National Assessment, is subject to the following new quality standards:

  • It must be based on the best available, peer-reviewed science and supporting studies conducted in accordance with sound and objective scientific practices; and on data collected by accepted methods or best available methods (if the reliability of the method and the nature of the decision justify use of the data).

  • It must have utility. Utility requires that the information be useful to its intended users, which for the National Assessment include Congress and the Executive Branch as well as the public.

  • It must have objectivity. Objectivity requires that agency information (e.g., the National Assessment) be accurate, reliable, unbiased, and developed using sound statistical and research methods.

  • As the statutorily designated steering document for policymaking, the National Assessment is "influential scientific or statistical information"; therefore, it must also meet a "reproducibility" standard that requires transparency regarding data and methods of analysis, "as a quality standard above and beyond some peer review quality standards."

    67 Fed. Reg. 376-77; 66 Fed. Reg. 49719, 49721-22, 49724.

    All federal agencies must now develop their own data quality guidelines in accordance with the following schedule:

    April 1, 2002 - Deadline for each federal agency to propose its data quality guidelines for public comment.

    July 1, 2002 - Deadline for each agency to submit its proposed data quality guidelines to OMB for final review.

    October 1, 2002 - Deadline for each federal agency to promulgate its data quality guidelines as final agency action.

    The federal agencies' data quality guidelines must be consistent with OMB's guidelines, but they could and should also be more program specific. They must also create processes for the public to petition agencies to change information that does not comply with the FDQA, as implemented by OMB's and individual agencies' guidelines. 66 Fed. Reg. 49718. OMB's latest guidelines require an administrative appeals process for petitions and deadlines for agency action on a petition. 67 Fed. Reg. 376.

    Under the OMB guidelines, for information first disseminated after October 1, 2002, federal agencies have the burden of demonstrating compliance with the FDQA, OMB's guidelines, and the agencies' guidelines. By contrast, for information first disseminated before October 1, 2002, an outside party has the burden of demonstrating an agency's noncompliance with the FDQA, OMB's guidelines, and/or the agency's guidelines. This demonstration would be made through the administrative petition process, and there is no cutoff date for information subject to the petition process. In other words, all information currently being disseminated by an agency, for example through availability on the agency's website, is subject to an FDQA petition, regardless of when the information was first disseminated. 67 Fed. Reg. 376; 66 Fed. Reg. 49724.

    All "affected persons" may file a FDQA Act petition. 67 Fed. Reg. 376. The OMB guidelines define "affected persons" as "people who may benefit or be harmed by the disseminated information." The OMB guidelines encourage the federal agencies "to consider how persons (which includes groups, organizations and corporations, as defined by the Paperwork Reduction Act) will be affected by the agency's information" in developing their own FDQA guidelines. 66 Fed. Reg. 49721.

    An agency's final action on an administrative petition under the FDQA would be final agency action subject to judicial review under the Administrative Procedure Act, unless Congress precluded judicial review under the Act. 5 U.S.C. 704. Congress did not preclude judicial review of FDQA issues.

    III. THE FDQA AND ITS IMPLEMENTING GUIDELINES APPLY TO THE NATIONAL ASSESSMENT

    The National Assessment is information disseminated to the public that is subject to the FDQA requirements, provided the USGCRP is a federal agency subject to the FDQA requirements. 67 Fed. Reg. 376; 66 Fed. Reg. 49720-21, 49274-75. Because it is an extension of the Executive Office of the President (EOP), the USGCRP is a federal agency subject to the FDQA's requirements.

    The FDQA covers the same entities as the Paperwork Reduction Act ("PRA"), 44 U.S.C. 3501, 3502(1). The EOP is subject to the PRA and therefore also to the FDQA. Id. By statute, the President serves as Chairman of the National Science and Technology Council ("NSTC"), operating under the White House Office of Science and Technology Policy ("OSTP"). The OSTP has under its authority the Committee on Environment and Natural Resources ("CENR") (originally the "Committee on Earth and Environmental Sciences"). 15 U.S.C. 2932. All of these offices are, therefore, EOP entities, subject to the PRA and the FDQA.

    Under 15 U.S.C. 2934, the President, as Chairman of the Council, shall develop and implement, through CENR, a U.S. Global Change Research Program. The Program shall advise the President and Congress, through the National Assessment on Climate Change ("NACC"), on relevant considerations for climate policy. The composite USGCRP is an "interagency" effort staffed in great part by employees seconded from federal agencies that are themselves subject to the FDQA. It remains under the direction of the President and is therefore a "covered agency" pursuant to the PRA and FDQA. 44 U.S.C. 3502(1).

    Collectively, pursuant to statutory authority and under the direction of these Executive Office entities, the USGCRP directed an effort statutorily dedicated in part to studying the state of the science, and its uncertainties, surrounding the theory of "global warming" or "climate change," ultimately producing the National Assessment. Though originally produced prior to enactment of the FDQA, the National Assessment was issued in final form in December 2000 and is currently being disseminated. See http://www.usgcrp.gov/usgcrp/nacc/default.htm. Therefore, the National Assessment is subject to the requirements of the FDQA.

    IV. THE NATIONAL ASSESSMENT IN ITS CURRENT FORM DOES NOT MEET THE REQUIREMENTS OF THE FDQA AND THE OMB GUIDELINES

    As demonstrated by an extensive record obtained in part through the Freedom of Information Act (FOIA), the National Assessment violates the FDQA and OMB guidelines because it inappropriately uses computer models. Further, the National Assessment was published as final before the necessary science underlying regional and sectoral analyses had been performed. Congress mandated completion of the necessary science as a condition precedent to the release of any National Assessment (even a draft). In addition, the purported internal "peer review" of the draft National Assessment did not in fact occur. As the obtained documents demonstrate, commenting parties expressly informed USGCRP that they were rushed and were not given adequate time for substantive review or comment. Nevertheless, USGCRP published and continues to disseminate the National Assessment in violation of the FDQA's "objectivity" and "utility" requirements. As "influential scientific or statistical information," the National Assessment also fails the FDQA's "reproducibility" standard, which requires transparency regarding data and methods of analysis, "a quality standard above and beyond some peer review quality standards." 66 Fed. Reg. 49722.

    A. The National Assessment Relies on Improper Use of Computer Model Data

    As demonstrated in detail below, the climate models upon which the National Assessment relies cannot simulate the current climate. They predict greater and more rapid warming in the atmosphere than at the surface when the opposite is happening (see, e.g., http://wwwghcc.msfc.nasa.gov/MSU/hl_sat_accuracy.html). They predict amplified warming at the poles. In fact, the net temperature change averaged across Antarctica in recent decades has been negative (Nature, 2002); and temperatures from 70 to 85 degrees North have been declining for decades (International Journal of Climatology, 2000). On top of this demonstrable lack of utility for their purported purpose, the National Assessment knowingly misuses the models.

    Numerous Government reviewers commented that the National Assessment improperly used computer models. For example, William T. Pennell of Pacific Northwest National Laboratory, in his report entitled "Improper use of climate models," submitted through DOE (John Houghton) to Melissa Taylor at USGCRP, stated (emphasis added):

    Although it is mentioned in several places, greater emphasis needs to be placed on the limitations that the climate change scenarios used in this assessment have on its results. First, except for some unidentified exceptions, only two models are used. Second, nearly every impact of importance is driven by what is liable to happen to the climate on the regional to local scale, but it is well known that current global-scale models have limited ability to simulate climate effects at this degree of spatial resolution. We have to use them, but I think we need to be candid about their limitations. Let's take the West [cites example]... Every time we show maps that indicate detail beyond the resolution of the models we are misleading the reader.

    Other Government reviewers criticized the National Assessment on this ground. For example:

  • "Also, the reliance on predictions from only two climate models is dangerous". Steven J. Ghan, Staff Scientist, Atmospheric Sciences and Global Change, Pacific Northwest Laboratory.

  • "This report relies too much on the projections from only two climate models. Projections from other models should also be used in the assessment to more broadly sample the range of predicted responses." Steven J. Ghan, Staff Scientist, Atmospheric Sciences and Global Change, Pacific Northwest Laboratory.

  • "Comments on National Assessment. 1. The most critical shortcomings of the assessment are the attempt to extrapolate global-scale projections down to regional and sub-regional scales and to use two models which provide divergent projections for key climate elements." Mitchell Baer, US Department of Energy, Washington, D.C.

  • "General comments: Bias of individual authors is evident. Climate variability not addressed... Why were the Hadley and Canadian GCMs used? Unanswered questions. Are these GCM's (sic) sufficiently accurate to make regional projections? Nope." Reviewer Stan Wullschleger (12/17/99).

  • William T. Pennell, Manager, Atmospheric Sciences and Global Change, Pacific Northwest Laboratory, emphasized that the use of "only two models" is a "limitation" on the product.

    Both models were developed outside of the United States despite warnings by the Climate Research Committee of the National Research Council ("NRC") that "it is inappropriate for the United States to rely heavily upon foreign centers to provide high-end modeling capabilities." The NRC explained that use of foreign models is inappropriate for "a number of reasons," including:

    "1. U.S. scientists do not necessarily have full, open, and timely access to output from European models, particularly as the commercial value of these predictions and scenarios increases in the future.

    2. Decisions that might substantially affect the U.S. economy might be made based upon considerations of simulations (e.g., nested-grid runs) produced by countries with different priorities than those of the United States.

    ***

    While the leading climatic models are global in scale, their ability to represent small-scale, regionally dependent processes (e.g., hurricanes and extreme flood events) can currently only be depicted in them using high-resolution, nested grids. It is reasonable to assume that foreign modeling centers will implement such nested grids to most realistically simulate processes on domains over their respective countries which may not focus on or even include the United States."

    Capacity of U.S. Climate Modeling to Support Climate Change Assessment Activities, Climate Research Committee, National Research Council, pages 4, 13-14 (National Academy Press 1998).

    In addition, and on behalf of this petition, Patrick Michaels, Professor of Environmental Sciences at the University of Virginia, has prepared the following excerpts from his review of the National Assessment dated and submitted to USGCRP on August 11, 2000, along with additional explanatory text. The additional explanatory text is in italics and was prepared after August 11, 2000.

    August 11, 2000 Michaels Review

    "The essential problem with the [National Assessment] is that it is based largely on two climate models, neither one of which, when compared with the 10-year smoothed behavior of the lower 48 states (a very lenient comparison), reduces the residual variance below the raw variance of the data. The one that generates the most lurid warming scenarios-the Canadian Climate Centre (CCC) Model-produces much larger errors than are inherent in the natural noise of the data. That is a simple test of whether or not a model is validand both of those models fail. All implied effects, including the large temperature rise, are therefore based upon a multiple scientific failure. The USNA's continued use of those models and that approach is a willful choice to disregard the most fundamental of scientific rules. (And that they did not find and eliminate such an egregious error is testimony to grave bias).

    For that reason alone, the [National Assessment] should be withdrawn from the public sphere until it becomes scientifically based."

    Additional Explanatory Text by Michaels

    "The basic rule of science is that hypotheses must be verified by observed data before they can be regarded as facts. Science that does not do this is 'junk science,' and at minimum is precisely what the FDQA is designed to bar from the policymaking process.

    The two foreign climate models used in the National Assessment make predictions of U.S. climate change based upon human alterations of the atmosphere. Those alterations have been going on for well over 100 years. Do the changes those models "predicted" for U.S. climate in the last century resemble what actually occurred?

    This can be determined by comparison of observed U.S. annual temperature departures from the 20th century average with those generated by both of these models. It is traditional to use moving averages of the data to smooth out year-to-year changes that cannot be anticipated by any climate model. This review used 10-year running averages to minimize interannual noise.

    The predicted-minus-observed values for both models were then compared to the result that would obtain if one simply predicted the average temperature for the 20th century from year to year. In fact, both models did worse than that base case. Statistically speaking, that means that both models perform worse for the last 100 years than a table of random numbers applied to ten-year running mean U.S. temperatures.

    There was no discernible alteration of the NACC text in response to this fatal flaw. However, the National Assessment Synthesis Team, co-chaired by Thomas Karl, Director of the National Climatic Data Center, took the result so seriously that they commissioned an independent replication of this test, only more inclusive, using 1-year, 5-year, 10-year and 25-year running means of the U.S. annual temperature. This analysis verified that in fact both models performed no better than a table of random numbers applied to the U.S. Climate Data. Mr. Karl was kind enough to send the results to this reviewer."
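
    For illustration only, the variance comparison Michaels describes above can be sketched in a few lines of Python. This is a minimal sketch using hypothetical placeholder series for the observed and model-predicted annual U.S. temperature departures; it is not the analysis actually performed by Michaels or the Synthesis Team, and the function and variable names are assumptions chosen for clarity.

    import numpy as np

    def running_mean(series, window=10):
        # Trailing running mean used to smooth out interannual noise.
        return np.convolve(series, np.ones(window) / window, mode="valid")

    # Hypothetical annual U.S. temperature departures (deg C) from the
    # 20th-century average, 1900-1999; placeholders, not real data.
    rng = np.random.default_rng(0)
    years = np.arange(1900, 2000)
    observed = 0.002 * (years - 1950) + rng.normal(0.0, 0.2, years.size)
    modeled = 0.015 * (years - 1900) + rng.normal(0.0, 0.2, years.size)

    obs_10yr = running_mean(observed)
    mod_10yr = running_mean(modeled)

    # Baseline ("raw") variance: error incurred by simply predicting the
    # long-term mean every year.
    raw_variance = np.mean((obs_10yr - obs_10yr.mean()) ** 2)

    # Residual variance: predicted-minus-observed error of the model,
    # computed on the smoothed series.
    residual_variance = np.mean((mod_10yr - obs_10yr) ** 2)

    print(f"raw variance (mean-only baseline): {raw_variance:.4f}")
    print(f"residual variance (model):         {residual_variance:.4f}")
    if residual_variance < raw_variance:
        print("Model reduces the error relative to the climatological-mean baseline.")
    else:
        print("Model does worse than simply predicting the long-term mean.")

    On this test, a model whose residual variance exceeds the raw variance adds no predictive skill beyond the long-term mean, which is the failure Michaels attributes to both models used in the National Assessment.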

    August 11, 2000 Michaels Review (continued)

    "[T]he problem of model selection. As shown in Figure 9.3 of the Third Assessment of the United Nations Intergovernmental Panel on Climate Change, the behavior of virtually every General Circulation Climate model (GCM) is the production of a linear warming, despite assumptions of exponential increases in greenhouse forcing. In fact, only one (out of, by my count, 26) GCMs produces a substantially exponential warming-the CCC [Canadian Climate Centre] model [one of the two used in the National Assessment]. Others may bend up a little, though not substantially, in the policy-relevant time frame. The USNA specifically chose the outlier with regard to the mathematical form of the output. No graduate student would be allowed to submit a thesis to his or her committee with such arrogant bias, and no national committee should be allowed to submit such a report to the American people.

    Even worse, the CCC and [UK] Hadley data were decadally smoothed and then (!) subjected to a parabolic fit, as the caption for the USNA's Figure 6 makes clear. That makes the CCC appear even warmer because of the very high last decadal average.

    One of the two models chosen for use in the USNA, the Canadian Climate Center (CCC) model, predicts the most extreme temperature and precipitation changes of all the models considered for inclusion. The CCC model forecasts the average temperature in the United States to rise 8.1°F (4.5°C) by the year 2100, more than twice the rise of 3.6°F (2.0°C) forecast by the U.K. model (the second model used in the USNA). Compare this with what has actually occurred during the past century. The CCC model predicted a warming of 2.7°F (1.5°C) in the United States over the course of the twentieth century, but the observations show that the increase was about 0.25°F (0.14°C) (Hansen, J.E., et al., 1999: GISS analysis of surface temperature change. Journal of Geophysical Research, 104, 30,997-31,022), or about 10 times less than the forecast [Hansen has since revised this to 0.5°C, which makes the prediction three times greater than what has been observed]. The CCC forecast of precipitation changes across the United States is equally extreme. Of all the models reviewed for inclusion in the USNA, the CCC model predicted more than twice the precipitation change of the second most extreme model, which, interestingly, was the U.K. model [the other model used in the NACC]. The U.K. model itself forecast twice the change of the average of the remaining, unselected models. Therefore, along with the fact that GCMs in general cannot accurately forecast climate change at regional levels, the GCMs selected as the basis for the USNA conclusions do not even fairly represent the collection of available climate models.

    Why deliberately select such an inappropriate model as the CCC? Thomas Karl [co-Chair of the National Assessment synthesis team] replied that the reason the USNA chose the CCC model is that it provides diurnal temperatures; this is a remarkable criterion given its base performance..."

    "The USNA's high-end scenarios are driven by a model that 1) doesn't work over the United States; 2) is at functional variance with virtually every other climate model. It is simply impossible to reconcile this skewed choice with the rather esoteric desire to include diurnal temperatures..."

    Additional Explanatory Text by Michaels (continued)

    "It is clear that the National Assessment chose two extreme models out of a field of literally dozens that were available. This violates the FDQA requirements for 'objectivity.'"

    These data quality flaws in the National Assessment were never corrected. These flaws justify CRE's request that USGCRP cease present and future National Assessment dissemination unless and until its violations of the FDQA are corrected.

    B. The National Assessment Was Published Without Allowing Sufficient Time for Development of the Underlying Science and Peer Review

    Congress sharply criticized USGCRP's development of the National Assessment. For example, leaders in the United States House of Representatives repeatedly attempted to ensure that USGCRP and its subsidiary bodies follow the scientific method regarding particular matters, specifically the regional and sectoral analyses. Indeed, the concerns had become so acute that these legislative leaders successfully enacted a restriction prohibiting relevant agencies from expending appropriated monies upon the matter at issue, consistent with the plain requirements of the GCRA of 1990, through language in the conference report accompanying Public Law 106-74:

    None of the funds made available in this Act may be used to publish or issue an assessment required under section 106 of the Global Change Research Act of 1990 unless (1) the supporting research has been subjected to peer review and, if not otherwise publicly available, posted electronically for public comment prior to use in the assessment; and (2) the draft assessment has been published in the Federal Register for a 60 day public comment period.[1]

    USGCRP did not satisfy the conditions precedent for valid science set out in that language. Instead, USGCRP produced and now disseminates a National Assessment knowingly and expressly without the benefit of the supporting science. This data quality flaw not only violates substantive requirements, it also violates Congress' repeated insistence that the supporting science be performed and that a draft national assessment be subject to adequate peer review before it is publicly disseminated as final.

    These data quality flaws in the National Assessment were never corrected. These flaws justify CRE's request that USGCRP cease present and future National Assessment dissemination unless and until its violations of the FDQA are corrected.[2]

    Given USGCRP's refusal to wait for completion of the underlying science, and given its response to the relevant oversight Chairmen, it is manifest that USGCRP ignored or rejected these lawmakers' requests. Instead of heeding Congress' data quality concerns and mandates, the USGCRP published a deeply flawed "final" Assessment without having complied with Congress' direction to incorporate the underlying science styled as "regional and sectoral analyses."[3]

    Finally, the National Assessment suffers from having received no authentic peer review. After an advisory committee was chartered pursuant to the Federal Advisory Committee Act (FACA) in 1998, a January 8, 1998 communication from Dr. John Gibbons to the first Designated Federal Officer (DFO), Dr. Robert Corell, indicates that political officials conveyed a sense of urgency to the panel. Further, statements in the record and in major media outlets, including but in no way limited to those from certain anonymous if purportedly well-placed sources, indicate a perception among the scientists involved that political pressures drove the timing and even the content of this draft document. This is manifested by the lack of opportunity to comment afforded to parties whose comments were formally requested as part of a "peer review" of the National Assessment.

    This sense of urgency is reflected in, among other places, comments the Cooler Heads Coalition obtained via the Freedom of Information Act. These comments were made by parties at the National Laboratories who were asked by the Department of Energy to comment on the draft National Assessment. They criticize the emphasis on speed as opposed to deliberation, the report's emphasis on "possible calamities" to the detriment of balancing comments that were widely offered, and its reliance on only two significantly divergent models. These concerns are exemplified by the following samples from well over a dozen such complaints accessed through FOIA:

  • "This review was constrained to be performed within a day and a half. This is not an adequate amount of time to perform the quality of review that should be performed on this size document" (Ronald N. Kickert, 12/08/99).

  • "During this time, I did not have time to review the two Foundation Document Chapters" (Kickert, 12/20/99).

  • "Given the deadline I have been given for these comments, I have not been able to read this chapter in its entirety" (William T. Pennell).

  • "UNFORTUNATELY, THIS DOCUMENT IS NOT READY FOR RELEASE WITHOUT MAJOR CHANGES"(CAPS and bold in original) (Jae Edmonds).

  • "This is not ready to go!" (William M. Putman).
    These comments reflect grave concern about an emphasis on time over substance, and about a product whose final content appears predetermined regardless of the science and the facts. The National Assessment was released and continues to be disseminated without an actual peer review and without otherwise addressing these important concerns.

    V. RELIEF REQUESTED

    For these reasons, CRE requests immediate cessation of dissemination of any document purporting to represent a "National Assessment on Climate Change," unless and until its data quality flaws are corrected in accordance with the FDQA standards. We further ask that you respond to this request within 45 days of your receipt of it.

    Sincerely,

    Jim J. Tozzi
    Member, CRE Board of Advisors

     

    cc (w/attach.):

    Dr. John D. Graham
    Office of Management and Budget
    Eisenhower Executive Office Building
    17th and Pennsylvania Avenue, N.W.
    Washington, D.C. 20503


    Margaret S. Leinen, PhD
    Chair, USGCRP and
    Assistant Director
    National Science Foundation
    4201 Wilson Boulevard
    Arlington, VA 22230


    1. House Report 106-379 (conference report accompanying H.R. 2684, the Departments of Veterans Affairs and Housing and Urban Development, and Independent Agencies Appropriations Act, 2000 (Pub. L. No. 106-74)), p. 137.

    2. In a series of letters during June and July 2000, Congress detailed for USGCRP some of the obvious scientific and data quality flaws in the National Assessment and sought assurance that these flaws would be remedied. USGCRP, via OSTP, drafted a response to House Science Committee Chairman Sensenbrenner that failed to specifically address the concerns raised by Congress. Chairmen Sensenbrenner and Calvert took issue with these non-responses in a July 20, 2000 letter to Neal Lane, reiterating their request for compliance with the law's requirements. These scientific and data quality flaws in the National Assessment have never been corrected.

    3. See the Congressional correspondence with OSTP discussed in footnote 2, supra. This flaw persists despite the fact that the two principal National Assessment sections are "Regions" and "Sectors." (See http://www.gcrio.org/nationalassessment/.)