B-247844

Our general results are indicative of the overall status of performance measures in federal agencies. We did our work from June to December 1991 in accordance with generally accepted government auditing standards. Our results are based on interviews with selected agencies as well as survey information; we did not attempt to verify the information provided by each of the 102 agencies. Appendix II contains more details regarding our scope and methodology, and appendix III is a copy of the survey sent to agencies.

Most Agencies Had Strategic Plans and Collected a Wide Variety of Performance Measures

Most of the agencies reported that they had strategic plans and collected a wide variety of program performance measures. About two-thirds of the agencies (67) said they had a single long-term plan that contained goals, standards, or objectives for the entire agency or program. In addition, over three-quarters of the agencies (78) indicated they had long-term plans at the subcomponent level to set goals, standards, or objectives for their programs.

Nearly all agencies said they measured a range of performance, such as program inputs, work activity levels, and program outputs. Over 80 percent said they also collected internal quality and timeliness measures, and more than half measured external customer satisfaction, equity of service availability, or program outcomes. In all, over 82 percent of the agencies said they collected measures covering at least parts of their activities in 7 or more of the 11 broad categories of measures listed in the survey. Figure 1 shows the number of agencies using each of the different kinds of measures that the 102 responding agencies reported.

Note: The number of agencies responding was 102. Definitions of the measures in this figure appear in appendix III.
Most of the performance measurement data agencies collected were put to limited use in managing programs. A Department of Labor study of federal agencies administering education and training programs reported that even in cases where program outcome data were collected, they appeared to serve no more than informational purposes. This finding is supported more broadly by our survey results, as well as by our interviews and a Department of Labor study of the use of employment and training data. The following examples, taken from our visits to the Federal Transit Administration (FTA) and the Federal Aviation Administration (FAA), illustrate how agencies used the measures they collected.

Federal Transit Administration

FTA, in the Department of Transportation, provides grants to states and localities to help develop, maintain, and operate their mass transit systems. An official said that, to track its grant-making activities, FTA created a series of indexes that served as standards to assess grant-making status among its regional offices. The indexes were based on measures of specific work activities, such as the number of grants developed, grants managed, transportation improvement program reviews, and triennial reviews. While these measures were related to the efficiency and compliance efforts of the agency's grant-making activities, they were not used to assess progress toward the agency's strategic plan or that of the Department.

Federal Aviation Administration

As a regulatory agency within the Department of Transportation, FAA said it used a system of performance measures to assess overall organizational accountability toward its mission of fostering a safe, secure, and efficient aviation system. Existing data provided information for general management purposes rather than for control of individuals or units. FAA reported that it focused on programs and activities that promote safety by using performance indicators based on ratios and comparisons. FAA compared these historical trend measures with annually set targets to measure agency progress.
Typical indicators included year-to-year comparisons of security inspections, air traffic delays, and pilot deviations. FAA said it delivered electronic monthly reports in an executive information system and prepared quarterly paper reports that contained concise information and were widely circulated among senior management. Managers were to use this information to get an overall sense of how FAA was doing.

Many Agencies Used Performance Measures to Manage Operations

In order to see how well resources were being managed to accomplish tasks, many of the agencies we visited used performance measures to help make budget decisions, assess employee performance, and provide incentives. On our visits, we found more ties to individual employee assessment than to any other management use. This was supported by survey results, which show that over one-third of the agencies required the use of performance measures in senior management performance contracts. Three examples of agencies that used performance measures to manage operations are the Department of Defense (DOD), the National Archives and Records Administration, and programs under the Job Training Partnership Act (JTPA).

Department of Defense

At the time of our visit, the Office of the Comptroller of DOD had begun to determine the unit costs of selected support activities, such as recruiting and supply management, in order to identify efficiencies and to make budget decisions. With operation and maintenance outlays of about $85.7 billion in fiscal year 1992, “unit cost resourcing” was intended to help DOD reduce the costs of doing business by helping managers identify the costs of delivering service outputs and understand the long-term and indirect costs of producing specific outputs. With this knowledge, unit costing was intended to serve as a decision support system that could put DOD closer to budgeting a specific set of activities on the basis of what it actually costs to do the job.
According to a senior DOD official, unit cost resourcing was used as a tool to improve management with a focus on output: it requires employees to know what they produce, identifies customer-provider relationships, causes workers to examine processes for needed changes, and creates better cooperation between management and employees. According to DOD's Comptroller's Office, these efforts called for a change in the …

National Archives and Records Administration

The work of the Archives consists of responding to requests for historical …
According to officials, operational technicians in two offices, the Office of …
Officials said technicians were expected to meet the established standards, …
The Archives' standards for routine tasks were the basis for quarterly …
The Archives had contracted for a series of studies to develop their …