
tives." This is fundamental to the PPB system. Without alternatives, the analyst is likely to become the decision maker. If a study is presented with costs, details, and recommendations, but no alternatives, then the agency head's choice is to accept the study, stick with the status quo, or perhaps suffer a budget cut. The reason a study is usually commissioned in the first place is that the agency head (or perhaps the President) has already decided that the status quo must be changed.

PPB does not envision all-powerful analytic staffs. Rather, these staffs are expected to come up with alternatives, study their pros and cons, quantify their implications, and present them to the agency head. Under PPB the agency head may become much more involved in decision-making than he has been previously.

The Budget Bureau is interested in seeing how well alternatives have been studied. A good review of alternatives is fundamental to the justification of major expenditures. The Bureau of the Budget does not and will not make the agency's decisions. It may make recommendations regarding programs, and it may send proposals back for further study. The Bureau does not anticipate doing studies for the agencies, and it does not intend to develop a vested position in special analysis. The analysis will be done in the agencies.

In the near term, we hope to improve major decision-making. In addition, we hope to see improvement in daily management. If we can develop better measures of agency output, in the long run we can use these measures as primary means of management control. This will also affect the budget-making process. In the long run, perhaps in the 1980's or later, I anticipate that PPB will not only give us better program planning, but also help us achieve more rational control of day-to-day functions.

Appendix

In accordance with the instructions in the now well-known Bureau of the Budget Bulletin 66-3 and its Supplement, Government agencies are busily engaged in developing their Planning-Programming-Budgeting (PPB) systems. The Program Evaluation Staff, which I represent, has been deeply involved in the Budget Bureau's recent task of overseeing the establishment of the PPB system, offering guidance, and arranging training opportunities for agency staffs. In this paper, I will emphasize the "analysis" and "planning" aspects of the documentation and conclude with some implications of the advent of PPB.

PROGRAM STRUCTURES. Agencies have been defining their activities in a systematic, formal manner through categories, subcategories, and program elements which describe what the agency does. For this year's spring preview, that job is now finished.

In some instances, however, the existing structures still contain obvious shortcomings. A good program structure should define agency outputs in a way that will be of most use to agency top management, and some structures can be considerably improved in this regard. Future refinements must, therefore, seek to accomplish this.


An example is outdoor recreation, which involves Agriculture, Interior, and the Corps of Engineers. All three agencies now show the recreational opportunity they provide by geographic region, but the regions do not coincide. If each of these agencies reported the recreational opportunities it offers along similar geographic lines, it would then be very easy to construct a total Federal recreational budget showing exactly, by area, what the total program amounts to. In many such areas interagency uniformity of this type is desirable. This year, though, there simply has not been enough time for any of this to be accomplished.

PROGRAMMING. This system formally establishes the base five-year plan in detail, dictates how program change proposals will be processed, and determines when and how the base five-year plan will be extended.

A better grasp of what programming is about can easily be obtained by considering some of the questions which programming staffs have been grappling with recently. For example: What information should be requested from submitting bureaus or agencies? How much detail should the initial plan contain? How much detail should the plan ultimately contain? What forms should be used? To what extent do Department of Defense forms and procedures apply to an agency, and to what extent is DOD experience not applicable?

The programming system must provide for a "crosswalk" which will convert the budget in the new program structure into the present appropriation structure. But how much accuracy and detail are needed in this conversion? To what extent is accounting precision necessary, and to what extent are statistical cost-finding procedures acceptable? Should we try to allocate any joint costs? Which joint costs? What allocation procedure should be used? Or should all joint costs be put in a "general support" category? In passing, let me note that there are no pat answers to these questions. Good programming systems must be individually tailored to the particular needs and circumstances of each agency, and good programming systems therefore demand a lot of hard work.
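The "crosswalk" described above is, at bottom, a mapping from program-structure elements to appropriation accounts, together with some rule for splitting joint costs. As a purely modern illustration (the element names, account names, dollar figures, and fifty-fifty joint-cost rule below are all invented for the example, not drawn from any agency's actual plan), the conversion might be sketched as:

```python
# Hypothetical crosswalk: re-state a budget expressed by program element
# (the new PPB program structure) in the existing appropriation structure.
# All names and dollar figures are invented for illustration.

# Budget by program element, in $ millions.
program_budget = {
    "recreation.parks": 40.0,
    "recreation.reservoirs": 25.0,
    "general_support": 10.0,  # joint costs not tied to any one element
}

# Crosswalk table: each program element maps to one or more
# appropriation accounts, with an allocation fraction for each.
crosswalk = {
    "recreation.parks": [("Operation of Facilities", 1.0)],
    "recreation.reservoirs": [("Construction", 0.6),
                              ("Operation of Facilities", 0.4)],
    # One possible convention: spread joint costs evenly.
    "general_support": [("Construction", 0.5),
                        ("Operation of Facilities", 0.5)],
}

def to_appropriations(budget, table):
    """Convert a program-structure budget into appropriation terms."""
    out = {}
    for element, amount in budget.items():
        for account, share in table[element]:
            out[account] = out.get(account, 0.0) + amount * share
    return out

appropriations = to_appropriations(program_budget, crosswalk)

# The crosswalk only reallocates dollars; it never creates or loses them.
assert abs(sum(appropriations.values()) - sum(program_budget.values())) < 1e-9
```

The open questions raised in the text, such as how precise the fractions must be and whether joint costs deserve a "general support" account of their own, are exactly the choices embedded in the `crosswalk` table.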

Definition of the "costs" to be shown in the PFP has been something of a problem. Some financial people have interpreted "cost" very precisely. However, the Supplement to Bulletin 66-3 was written chiefly by economists who were thinking of costs in an analytic sense, that is, the cost of the resources that go into the plan. Now it so happens that one of our BOB Circulars contains a precise, accrued-accounting definition of costs which includes depreciation, changes in inventories, accrued annual leave not taken, and so on. In application, many have interpreted costs as used in the Supplement to Bulletin 66-3 in terms of this formal definition, while our intention was that costs be interpreted variously for different agencies, in whatever way makes the most sense from an economic point of view. For this year, we must live with whatever confusion we have generated. In the future, these multi-year program and financial plans should be tailored to suit the needs and requirements of the individual programs.

One comment about costs which I feel can be made with some degree of assurance is that in future years the multi-year plan will show investment and operating costs separately for each program element. Many of the problems that we are now trying to solve will then be a lot easier to handle.

MULTI-YEAR PROGRAM AND FINANCIAL PLAN. Probably the two most important parts of the whole PPB effort are output data and analyses. These are covered in two basic documents: the multi-year Program and Financial Plan and the Program Memoranda.

The Program and Financial Plan (PFP) will consist of three parts, all tables. Part I will be a table of outputs, showing measures for each program element expressed as physical units of what an agency is building, producing, or encouraging, year by year. Part II will show the financial implications of Part I, e.g., how much the units of the building plan are going to cost each year. Part III will contain relevant supplementary tables.

Submission of a multi-year Program and Financial Plan to the Bureau of the Budget will be new to all agencies. The degree of additional planning required will vary considerably among agencies, since some have been planning ahead only on a year-to-year basis.

The PFP is not intended to be an analytic document and it should not be interpreted as a refined cost-benefit analysis. We are not going to be able to take the cost in Part II, divide it by unit outputs in Part I, and automatically come up with something that says this is the "cost per unit of output" or that the reverse shows "benefits per dollar."

The PFP is a summary planning document. As a quantified expression of an agency's plans, it becomes a valuable document for dialog among the bureau chief, the head of the agency, the Director of the Budget Bureau, and the President.

PROGRAM MEMORANDA. This is the analytic document. It should state the basis and objectives of the analysis, describe the concepts and assumptions that were used, and contain all the narrative associated with the budget submission. Just as the PFP states what is planned, the Program Memoranda explain why.

This year, of course, we do not expect to receive a thorough analysis of all programs. But we do feel there is time to (1) formulate reasonably clear and precise program objectives, (2) define the major issues, and (3) at least mention some of the principal alternatives which were weighed against the selections reflected in the FY 1968 budget. Our plan at BOB is to give each agency a critique of its PM's and have them reworked during the summer and resubmitted in the fall. The final version of this year's PM's will be the springboard for those to be written for the following spring preview. Through this process, we hope to inject a note of continuity into the budget review process.

Let me elaborate for a moment on the subject of alternatives and their role in the budget review process. Past agency submittals to the Budget Bureau rarely suggested program alternatives (except for Defense, which now makes this a regular habit). The fact that the Budget Bureau is now asking for alternatives to be discussed explicitly has caused many people to ask me whether BOB intends to start "running the agency" or "making major agency decisions." The answer to such questions is decidedly NO! If you will reread the instructions carefully, you will note that they do not ask the agency to submit alternatives to us for a decision. Budget submissions should continue to reflect decisions made by the agency head.

However, when significant sums of money are involved, decisions should reflect the agency's judgment of what constitutes the best choice among the most imaginative alternatives possible. That is, major decisions with large spending implications should be the consequence of a systematic search process. (NOTE: The Airlift/Sealift memo contained in the Supplement to Bulletin 66-3 illustrates a proper procedure for handling alternatives in budget submittals.) A tenet of program budgeting is that until the best and most efficient means of achieving a stated goal has been thoroughly and systematically searched for and analyzed, major program spending decisions are not justified. I repeat: the Budget Bureau is not asking any agency to abdicate its decision-making responsibility. The Budget Bureau is asking each agency to justify its decisions by showing that it has diligently searched for the best and most efficient means of achieving stated goals and objectives. In the past, justification of this nature has been, by and large, noticeable by its absence. The old form of "justification," which rarely contained the rational basis for any decision whatsoever, is no longer considered acceptable.

ROLE OF THE ANALYST IN PPB. Let me distinguish between decision-makers and analysts in the PPB system. They may, in rare cases, be the same individuals, but there is a conceptual difference between the two roles which is worth noting.

PPB assumes that the decision-makers are agency heads and their principal assistants. Analysts are people trained to dissect problems, to look at them in different ways, to develop alternatives, to employ quantitative analysis where practicable, and to make suggestions for the decision-maker's consideration.

The phrase "develop alternatives" is really the key to the distinction between decision-makers and analysts. All too frequently a proposal submitted to the decision-maker describes but one course of action. The only choice the decision-maker then has is to accept it or reject it. This procedure is neither logical, necessary, nor desirable. It de facto turns the analyst into the decision-maker. Program budgeting strives to separate the two functions by having analysts develop and present choices among meaningful and imaginative alternatives.


Operations Research Research and Government O.R.

Dr. Alan J. Goldman

National Bureau of Standards


1. Introduction

I'd like to lead up to my "message for today" with a little analogy. It concerns, on the one hand, the typical relationship between the manager and the operations researcher and, on the other hand, that between everyman and his physician. What I have in mind is that, unfortunately, most of us don't visit a doctor unless and until we're already feeling uncomfortably ill, and when we do arrive, what we long for is that famous "fast, fast relief." Similarly, an operations analyst's introduction to a new task is all too often in terms like these: "We have this problem. It's hurting us badly, very badly. Please give us a guaranteed optimal solution by yesterday."

Under these circumstances, it's pretty clear what will happen. Perhaps, by rare good fortune, the problem will happen to fall neatly into one of those areas in which both theory and practice are especially well developed. But most of the time, the "solution" obtained will necessarily be of the "quick and dirty" variety.

Such a solution may (quite properly) be accepted and acted upon as "best available." However, it may in fact be so crude as to lead to really unsatisfactory results; we all know that even our best efforts can't ensure the soundness of an O.R. study's conclusions. Or the quick and dirty solution may be technically satisfactory, but considerably more expensive to implement than some alternative that was missed. The proposed solution may even be fairly acceptable on all counts, but the manager will never know how much better an answer might have been forthcoming if only a little more time or effort could have been spared, or (and this is the possibility I want to stress) if only the state of the art had been a little more advanced.


This brings me close to my message. If most government practitioners of operations research are feverishly racing the clock in attempts to solve pressing practical problems, then just who is going to advance the state of the art? There is a quite obvious need for the analog of medical research-for groups whose dominant concern is not the rapid resolution of specific agency problems, but rather the systematic improvement, extension and creation of the methods of operations research. This is what I like to call "operations research research," i.e., research into the techniques and tactics of operations research. Abbreviation: O.R.R.

A natural reaction is that O.R.R. activities are properly the province of the universities. And so they are largely, but certainly not exclusively. It's easy to see why:

(a) We're concerned here with research which is applied rather than basic,3 i.e., takes its main stimulus from present and anticipated real-world problems rather than the general urge to advance human knowledge.

(b) More specifically, we're concerned with methods of operations research needed in government O.R. studies. Appraisal of these needs requires extensive exposure to such government studies; bringing the new methods into effective use quickly also requires close contact with the broad stream of government O.R. activity. I suggest that any university group which finds itself in such a position needs to do some earnest soul-searching about the primary functions of a university.

(c) Any university outfit will (and often should) be deeply concerned with issues distinctly peripheral to the needs of governmental O.R. A not infrequent occurrence is the tendency to slide the research toward some area sufficiently fashionable academically as to permit the guiding professors to publish rather than perish, and the guided graduate students to achieve acceptable theses.*

3 This rather fuzzy distinction is inappropriate in many of the contexts in which it appears, but seems genuinely relevant here.

* The phrase "not infrequent" was carefully chosen; "typical" would have been unjust.

