The modern world is a highly
complex one. It is, moreover,
in a constant state of change. As testimony
to these facts, one need only glance at the
complexities and changes of today's demographics,
economies, technologies, and environmental surroundings.
Governments are highly aware of these intricate and mutable realities
and are striving, as best they can, to keep in step.
Public budgeting is one area in particular that governments
are giving attention to in order to respond to a changeable
world. To do this, governments are attempting to provide
reliable and complete information to budgeters and
policy-makers alike so that substantive budget choices
can be made.
Governments today are especially trying to ascertain how well
public organizations and programs are doing in providing
services and products to their citizenry. Governments
are asking: "What kind and how many services
are we getting from allocated dollars?" "Are
these public services of good value?" "Are
they making a difference in citizens' lives?"
To answer these questions, and other equally significant
ones, governments are developing and implementing
"performance-based budgeting" systems. No
longer satisfied with traditional budgeting processes,
governments are taking a new and, in some cases, renewed
interest in linking planning and performance measurement
to budgeting. Governments are looking beyond inputs
or line-item expenditures to make informed decisions,
choices that address long-term effects or outcomes,
and choices that are grounded in measurable progress.
In this section, performance measurement and budgeting
will be discussed. Performance budgeting will be defined
and its various uses will be reviewed. Next, the main
characteristics or criteria associated with successful
performance measurement will be examined, as well
as several of its strategic aspects. Lastly, lessons
learned from governments' prior implementation efforts
will be scrutinized to determine the "dos and
don’ts" of performance-based budgeting.
There is no single definition of performance-based budgeting
(PBB). A review of the literature does, however, suggest
what it commonly means. Most observers of -- and experts
on -- public budgeting do agree that, generally speaking,
PBB is the allocation of funds to achieve programmatic
goals and objectives as well as some indication or
measurement of work, efficiency, and/or effectiveness.
(See Snell and Hayes, November 1993, p. 1; Garsombke
and Schrad, February 1999, p. 9; Epstein, 1984, p. 2).
John Mikesell, for example, states that performance
budgets are basically the linking of inputs or costs
to program activities and goals. He states that performance
budgets may, and most often do, contain one or more
of the following elements: workload data (units
of activity provided), productivity data (cost per
activity), and effectiveness information (level of
goal achievement) (Mikesell, 1999, pp. 185-186).
Joyce, while acknowledging that "no standard
definition" of PBB exists, states that it "involves
a sophisticated web of relations, from inputs to outputs,
to outcomes… the connecting of resources to results
for budgeting purposes” (Joyce, 1999, p. 598). Similarly,
Charles Dawson describes performance measurement and
budgeting "as general terms applied to systemic
efforts to assess government activity and enhance
accountability for progress and outcomes in achieving
results” (Dawson, 1995, p. 1).
A 1994 report published by the National Conference of
State Legislatures defined PBB in the following way:
"Performance budgets use statements of missions, goals and objectives
to explain why the money is being spent… .[It is a
way to allocate] resources to achieve specific objectives
based on program goals and measured results. …Performance
budgeting differs from traditional approaches because
it focuses on spending results rather than the money
spent -- on what the money buys rather than the amount
that is made available." (Carter, 1994, pp. 2-3).
As the literature implies, therefore, performance-based
budgeting has four primary characteristics. First,
PBB sets a goal, or a set of goals, to which monies
are "connected," i.e. allocated. From these
goals, specific objectives are delineated and funds
are then subdivided among them. Second, PBB provides
information and data on past performance and thereby
proceeds to allow for meaningful comparisons between
"expected" and "actual" progress.
Third, adjustments to programs are made either at
this point or during a future budget preparation cycle
to close any performance gaps that may exist. Fourth,
as an ancillary yet important characteristic, PBB
provides an opportunity for regular or special (ad
hoc) program evaluations. When utilized, these evaluations
are valuable in that they give independent and verifiable
information to budget decision-makers and program
managers alike.
Definitions of PBB Terms
INPUT MEASURES. These are the volume of resources used or total expenditures
(costs) consumed to achieve a given output.
OUTPUT MEASURES. These
are the quantifying of goods and services performed
or delivered to customers.
EFFECTIVENESS MEASURES. These are the indices that assess how well a program
achieved its goals and objectives; e.g., percent
of wetlands preserved as a result of permit
issuance; percent of inmates convicted of another
crime after release, percent of placements successful
after 30 days, etc.
EFFICIENCY MEASURES. These
are indices that assess or compare how
much output was achieved per unit of input (costs);
e.g. cost per complaint processed, cost per
license issued, cost per prisoner incarcerated, etc.
WORKLOAD MEASURES. These
are indices that assess the level of effort
required to carry out an activity; e.g., number
of applications processed, number of inspections
completed, number of miles patrolled, etc.
See Garsombke, H. and Schrad, Jerry. (February
1999). Performance measurement systems: results
from a city and state survey. Government
Finance Review. Chicago, IL: Government Finance
Officers Association.
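As a rough illustration, the measure types defined above can be computed from basic program data. A minimal sketch, in which the program, its figures, and its target are all invented for illustration; only the formulas follow the definitions in the box:

```python
# Hypothetical annual data for a licensing program (all figures invented).
total_cost = 500_000.0            # input measure: total expenditures consumed
licenses_issued = 10_000          # output measure: services delivered to customers
applications_received = 12_500    # workload measure: level of effort required
target_licenses = 11_000          # objective used by the effectiveness measure

# Efficiency measure: output achieved per unit of input, here cost per unit.
cost_per_license = total_cost / licenses_issued

# Effectiveness measure: how well the program achieved its objective.
pct_of_target = 100.0 * licenses_issued / target_licenses

print(f"Cost per license issued: ${cost_per_license:.2f}")      # $50.00
print(f"Percent of licensing objective met: {pct_of_target:.1f}%")
```

The same handful of raw inputs yields all five measure types; what distinguishes them is the question each ratio answers, not the underlying data.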
It is important to note that many experts in public
finance believe that the cardinal aim of PBB is accountability.
Performance information and data used in budgeting
holds public officials, especially program managers,
accountable for service quality, cost-efficiency,
and program effectiveness. The focus for PBB is, once
again, on results, not simply inputs. Hence, governors,
legislators, service or program recipients, and the
public generally can determine accountability with
a degree of certainty using PBB methods,
something not possible with traditional or
line-item approaches. This ability to assess performance
and hold managers accountable serves as a powerful
incentive to ratchet up quality or positive service results.
The implications of PBB for stirring agency and program officials
to meet or exceed performance expectations have, of
course, two sides -- one that "rewards"
and the other that "penalizes." With regard
to "good" or "excellent" performance,
rewards can be several. For one thing, agencies or
programs might be recompensed with additional or increased
funding, or may be allowed to carry "saved"
funds, due to efficiencies, forward to the next fiscal
year. Agencies or programs could also be absolved
from burdensome paperwork requirements, awarded some
form of bonus, or perhaps even have agency responsibilities
enhanced in some fashion. On the other side, penalties
could be handed out to agencies or programs that perform
poorly. This might include a reduction in funding
or elimination of a program altogether. It might additionally
be cause for ordering a management audit or evaluation,
transferring program responsibilities to another agency,
or the firing of the agency director (Snell and Hayes,
1993, p. 4).
The use of performance-based budgeting systems can be understood
in two ways. One way to make sense of the use of PBB
is to speak broadly to the subject matter as it relates
to its primary aims. This would include PBB's twin
aims of improving decision-making and enhancing service
delivery. The other way to understand the use of PBB
would be to examine its application or implementation
among specific states. This examination would allow
for a grasp of PBB's use in an actual or "hands-on"
way. In this section, a brief discussion of these
uses will be presented with the purpose of
gaining a fuller comprehension of performance-based
budgeting.
Experts have pinpointed, according to the literature, two
utilitarian aims that PBB readily strives to fulfill.
These aims are (1) improved decision-making, and (2)
enhanced service delivery (see Epstein, 1984, pp.
6-7; Joyce, 1999, pp. 600-601; Lee and Johnson, 1998).
Public budgeting is essentially about making choices. To
make better choices, decision-makers need qualitative
and complete information and data. PBB can provide
this through its various components or devices; e.g.,
the setting of goals and objectives, the prioritizing
of these ends, and the measuring of performance levels
(via the indices of efficiency and effectiveness).
First and quite simply, measurement of performance assists
government officials in assessing "what" and
"how well" a program is doing. For instance,
what is Program X intended to do? Is Program X achieving
these intended ends? Are Program X’s activities or
operations cost-efficient? Asking and answering these
and other similar questions will permit decision-makers
to make wiser, more intelligent program and policy decisions.
Consider further, for example, how the decision-making
process improves with the addition of
differing levels or tiers of performance information
and data. Let's assume that a state's mental health
agency wants to fund a program to treat individuals
with alcohol and drug addiction. The first budget
tier of information might speak only to personal services,
operating expenses, and assistance payments. A decision-maker
(a governor, a legislator, a program manager, etc.)
has little to work with here -- just broad expenditure
classifications. However, add a second tier of performance
information, i.e., "program objectives,"
and the decision-maker has more meaningful information
with which to base and make a judgment. Objectives
for an alcohol and addiction program might include,
for example, "(1) to increase the number of community
programs available for involuntary alcohol and drug
services by three by June 30, 2002, and (2) to provide
six scheduled treatment hours per patient per day
for inpatient alcohol and drug addiction programs."
Now consider adding a third tier of performance data.
With another tier the decision-maker gets a picture
of the effectiveness of the program as measured from
previous fiscal years. Effectiveness indices show,
for example, "that the number of community programs
providing involuntary services increased from three
in FY 1999 to six (6) in FY 2000." Also, indices
for "scheduled treatment hours per patient" indicate
that these have increased from 4.5 to 5.5 for the
same period. Add information as to the "number
of voluntary alcohol and drug admissions" and
the decision-maker has workload data – a fourth tier.
Add information on "cost per patient per day"
and a fifth tier of efficiency data is made available.
The upshot of these multiple tiers of information
is clear -- the more tiers of information
available, the more informed and intelligent the decisions
will be.
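The five tiers described above can be sketched as a simple data structure. A minimal illustration using the hypothetical mental health program from the example; the dollar and admission figures not given in the text are invented:

```python
# Illustrative sketch of the five tiers of budget information described
# above, for the hypothetical alcohol/drug treatment program.

# Tier 1: broad expenditure classifications (inputs only).
tier1_inputs = {
    "personal_services": 2_400_000,    # invented dollar amounts
    "operating_expenses": 650_000,
    "assistance_payments": 950_000,
}

# Tier 2: program objectives (targets), from the example in the text.
tier2_objectives = {
    "community_programs_by_june_2002": 3,
    "treatment_hours_per_patient_per_day": 6.0,
}

# Tier 3: effectiveness data (actuals from prior fiscal years).
tier3_effectiveness = {
    "community_programs": {"FY1999": 3, "FY2000": 6},
    "treatment_hours": {"FY1999": 4.5, "FY2000": 5.5},
}

# Tier 4: workload data; Tier 5: efficiency data (figures invented).
tier4_workload = {"voluntary_admissions": 1_850}
tier5_efficiency = {"cost_per_patient_day": 212.0}

# With tier 3 in hand, a decision-maker can compare "expected" against
# "actual" progress -- the performance gap the text describes.
actual = tier3_effectiveness["treatment_hours"]["FY2000"]
gap = tier2_objectives["treatment_hours_per_patient_per_day"] - actual
print(f"Treatment-hours gap to objective: {gap:.1f} hours/day")
```

The point of the sketch is the layering: tier 1 alone supports only line-item questions, while each added tier supports a new class of comparison.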
The other functional aim of PBB is, once again, enhanced
service delivery. Performance-based budgeting strives
to provide individuals with quality services that
meet individual needs in a prompt and complete fashion.
It does this through the establishment of specific
objectives and measurement indices. In other words,
the budgeter or decision-maker establishes performance
targets and simultaneously provides for some measuring
device to monitor efficiency and effectiveness of
service delivery efforts. The intent or design is
again enhanced service delivery.
Sometimes, so-called "industrial engineering" techniques
are employed to improve operations and service delivery.
"Quality management" approaches are also
used in many organizations to increase service performance.
Particularly in the public sector, program evaluations
or "performance audits" are a commonly used
method to evaluate service delivery and recommend
improvements.
Many authors and practitioners of public budgeting especially
believe that the focus of a PBB process should center
on the notion of “quality” services. The emphasis
here is placed on the satisfying of customer (constituent,
client, consumer or user) needs. According to the
Council of State Governments, 32 states have some
form of statewide quality management program in place.
Total Quality Management (TQM) and similar quality schemes
are, and have been for some time, a leading philosophy
and management process embraced and utilized at all
levels of the public sector. TQM and other quality
programs concentrate on customer satisfaction, teamwork
and employee participation, performance measurement,
and open or flexible organizational structures (Young,
September 1998, pp. 4-5).
The Elements of Quality
Top management leadership.
Focus on the customer.
Enhanced rewards and recognition.
Commitment to training.
Employee empowerment and teamwork.
Effective and renewed communications.
Application of statistical process analysis.
See Federal Quality Institute. ( May 1991).
Introduction to total quality management
in the federal government. Washington,
DC: Federal Quality Institute.
According to the National Association of State Budgeting Officers
(NASBO), the number of states using PBB systems is on the increase.
Additionally, governors and legislators, and in some
cases both, are finding performance-based budgeting
mechanisms useful. Currently, all 50 states, plus
Puerto Rico, utilize some form of performance measures.
Twenty-five states surveyed by NASBO indicate that
they actively use performance measures at some stage
in their budgeting and appropriating processes. An
additional eight states attest that while they do
not use PBB per se, they do use some "public
accountability" or "goal /priority building"
processes in their funding decisions. Finally, of
the 50 states and Puerto Rico using some form of performance
measurement, with or without linkages to budgeting
practices, 38 maintain that they do regularly monitor
performance (NASBO, 1999, p. 48).
Figure 3. Use of Performance Measurement among the States
[Figure: state-by-state summary of performance measurement use;
GP…Goal/Priority Building]
See National Association of State Budgeting Officers.
(October 1999). Budget processes in the states.
Washington, DC: National Association of State Budgeting Officers.
As Figure 3 above shows, each state uses performance
measurement in a differing or particular way. This
can be ascribed to the different economic, social,
cultural, and political aspects of the states. NASBO
(1999) describes, for example, Minnesota, Oregon and
Texas as having "broad, comprehensive PBB approaches"
that have a long history. By contrast, Virginia and
California are relative newcomers to the practice
of PBB and are more limited in their applications.
A more recent study of performance measurement systems
among the states was published by H. Perrin Garsombke
and Jerry Schrad in 1999. In this study, 105 questionnaires
were sent to state auditors, controllers or treasurers,
in early 1998, to inquire as to their "extent
of use" and "satisfaction with" performance
measurement. According to their survey results based
on a 60% response rate (63 questionnaires completed
and returned), 67% of state governments used performance
measurement systems "to some degree." Eight
additional states said that while they did not use
performance measurements, they were in the planning
stages to do so. The Garsombke and Schrad study also
found that 45% of the states responding to the survey
linked performance measurement, in some way, to budgeting
and planning. Thirty-eight percent of these stated
specifically that they linked performance measures
or incorporated them into their strategic planning
processes (Garsombke and Schrad, 1999, pp. 9-10).
Another set of survey findings was published in the spring
of 2000 by Robert Lee and Robert Burns in the Journal
of Public Budgeting and Finance (see Lee and Burns,
2000). In this article, a determination of the extent
of "advancement" or "backsliding"
of PBB methods and uses was made for the period between
1990 and 1995. One key question of Lee and Burns was:
"What changes occurred between 1990 and 1995
in the use of performance measurement in state budgeting?"
Another critical question was, if there was significant
change, "What were the reasons for either advancement
or backsliding?"
The findings were diverse. The survey results found
that, between 1990 and 1995, states requiring agencies
to submit program effectiveness and efficiency measures
when proposing new programs dropped 16%, from 39 states
requiring this information in 1990, to 31 in 1995.
The survey conversely found that 6% of the states
indicated that they had revised their performance
measures for the same period. Additionally, effectiveness
measures actually utilized in budget documents increased
by 8%, from 19 to 23 states, for the same 1990-1995
period. Efficiency measures used in budget documents
also increased by 8%, from 15 (in 1990) to 19 (in
1995) states (Lee and Burns, 2000, pp. 38-32).
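The percentage changes Lee and Burns report correspond to shares of all 50 states, which the state counts in the passage confirm. A quick check of the arithmetic (counts taken from the text above):

```python
# State counts reported for 1990 vs. 1995 (from the passage above).
TOTAL_STATES = 50

requiring_measures = {"1990": 39, "1995": 31}     # new-program submissions
effectiveness_in_docs = {"1990": 19, "1995": 23}  # used in budget documents
efficiency_in_docs = {"1990": 15, "1995": 19}     # used in budget documents

def pct_change(counts):
    """Change expressed as a percentage of all 50 states,
    which is how the article reports it."""
    return 100 * (counts["1995"] - counts["1990"]) / TOTAL_STATES

print(pct_change(requiring_measures))     # -16.0: the reported 16% drop
print(pct_change(effectiveness_in_docs))  # 8.0: the reported 8% increase
print(pct_change(efficiency_in_docs))     # 8.0: the reported 8% increase
```

In other words, the "16%" drop is 8 of 50 states, and each "8%" increase is 4 of 50 states; the percentages are not relative to the 1990 counts.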
As the literature generally suggests, a few states have
been specifically recognized for intensive
or well-developed PBB approaches. Among those frequently
mentioned are Arizona, Florida, Minnesota, Oregon,
Texas, Virginia and North Carolina.
Arizona, for example, uses a budgeting system that combines
strategic planning, performance measurement, and program
evaluation. The system, called Program Authorization
Reviews (PAR), requires all agencies to submit a one-page
overview of their performance measurements for the upcoming
fiscal year along with their regular detailed budget
request. The recent FY 1998-1999 budget also required
an extensive PAR budget submittal from 14 select agencies
that included complete performance information and
data on 30 programs and subprograms (Freidman, 1997).
More specifically, PAR required these 14 agencies to answer
four main questions in their budget submittals. One
question addressed how programs and their objectives
related to their agency mission statements. Another
question asked was how efficient and effective programs
were in carrying out their activities and in attaining
their objectives. The two remaining questions inquired
as to how well programs measured up in comparing expected
to actual results and, additionally, as to the use
of cost-effective alternatives (Freidman, 1997, p.17).
Arizona's PBB approach has been applauded for not “overloading”
its budget document with superfluous performance information
and data. Providing decision-makers with a manageable,
yet thorough, set of performance data for making good
spending choices is a time-consuming and hard-won
endeavor. Arizona appears to have proven that this
can be done.
Another state that has worked diligently to establish a working,
useful PBB system is North Carolina. In 1991 North
Carolina instituted a statewide agency performance-based
budget system. Most recently, its FY 1998-1999 budget
request process produced performance measures for
more than 3,000 agency or departmental activities.
These measures were “outcome focused,” with particular
emphasis placed on the “effectiveness” of programs
(Freidman, 1997, p. 18).
North Carolina’s PBB system includes all state and federally
funded activities. These activities are grouped into
ten mammoth programmatic areas that are distinguished
mainly by program recipients and analogous program
outcomes. These ten broad program categories include,
for instance, area designations such as “human services,”
“education,” “commerce,” and “justice and public safety.”
By using this budget method, decision-makers are forced
to make spending choices based on programs – their
activities and performance – without relation necessarily
to agency jurisdictions or boundaries (Freidman, 1997).
North Carolina decision-makers are, therefore, freed – relatively
speaking -- from contrived budget decisions due to
agency advocacy or similar organizational influences.
This is perceived as a plus, since concentration and
judgments about budgets are oriented towards programs
and results, not by favored or well-liked agency lobbyists.
At this juncture, a few brief observations on "choosing"
performance measures will be useful. In nearly all
cases, performance-based literature speaks to the
necessity and priority of establishing "specific
criteria" to discriminate between "good"
measures and "bad" ones. Based on common
sense and practical experience, most observers and
experts believe that in order to have a truly successful
PBB system, performance measures must meet pre-established,
written criteria that are set early in planning stages
prior to actual PBB implementation. (See Mikesell,
1999, p. 189; Lee and Johnson, 1998, pp. 103-104;
Joyce, 1999, pp. 613-615; Grizzle, 2001, pp. 357-361.)
What is meant by performance criteria? Criteria are a set
of standards, guidelines or "yardsticks"
by which performance measures are determined to be
adequate or satisfactory. These criteria, for example,
would include such standards as "relevance,"
"validity," "clarity," and so on.
Each criterion (of the total set of criteria) would be
individually defined and, in turn, this definition
would aid the budgeter or decision-maker with the
task of determining if each performance measure satisfactorily
met the criterion. For instance, if "relevance"
is a criterion, these questions would be asked of
the performance measure: "Is this information
or data constituting the performance measure useful
to decision-makers?" "Is there a clear,
logical connection between the measure and the program
objective?"
Criteria to determine the sufficiency and adequacy of performance
measures have been identified fairly well given the
history, relatively speaking, of PBB. All governments,
however, decide which set of criteria most benefits
their unique circumstances and preferences. Governments
also usually place emphasis on some individual criterion
or criteria more than others. Thus, one state's
PBB system may have eight criteria, another a dozen
or more. In one state, "validity," "clarity,"
and "quantification" may be much more valued
or weighted greater than other criteria, while in
another state these may be important but other criteria
such as "reliability" and "accuracy"
may rank much higher in terms of significance to decision-makers.
Criteria for Selecting "Good" Measures
Relevance-- Is this information useful to decision-makers?
Is there a clear, logical relationship between
the measure and the program objective?
Validity-- Does it accurately gauge what is supposed to be measured?
Significance -- Is it
important? Is it worth measuring?
Uniqueness -- Does it contain information not supplied by any other measure?
Clarity-- Will decision-makers understand it? Is it lucid,
distinct and intelligible?
Timeliness -- Will the
information be collected in time for decision-makers
to use it for making policy?
Reliability -- Will the
data be collected from the same source in the
same way every time?
Quantification-- Can the measure be expressed in the form of a numerical
value? Does it measure quantity and/or quality
of service provided?
Practicality-- Can the data be collected and processed on a regular
basis, at a reasonable cost, without undue strain
on staff resources?
Completeness-- Does the measure(s) cover all major elements of
the program objective(s)?
Control-- Is the activity or objective being measured within
the control, authority or power of the agency?
See State Reorganization Commission. (1984). Program
Performance Workshop Manual. Columbia, SC:
State Reorganization Commission.
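The criteria in the box above amount to a pre-established, written checklist that a government applies to each candidate measure before PBB implementation. A minimal sketch of such a screening; the particular criteria chosen, the candidate measures, and their ratings are all invented for illustration:

```python
# Screening candidate performance measures against written criteria,
# as the literature recommends. Criterion names follow the box above;
# which criteria a government requires is its own choice.

CRITERIA = ["relevance", "validity", "clarity", "timeliness",
            "reliability", "quantification", "practicality", "control"]

# Each candidate measure is rated True/False against every criterion
# (a real review would record the answer to each criterion's question).
candidates = {
    "percent of wetlands preserved": {c: True for c in CRITERIA},
    "number of staff meetings held": {**{c: True for c in CRITERIA},
                                      "relevance": False,
                                      "validity": False},
}

def passes(ratings, required=CRITERIA):
    """A 'good' measure satisfies every required criterion."""
    return all(ratings[c] for c in required)

for name, ratings in candidates.items():
    verdict = "keep" if passes(ratings) else "reject"
    print(f"{name}: {verdict}")
```

A state that weights some criteria over others, as the text describes, would simply pass a shorter `required` list or score criteria on a scale rather than pass/fail.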
One study of performance measurement criteria consisted
of a review of 24 books and articles to determine
which criteria were most frequently used and how they
were generally defined. The criterion most cited,
and presumably most used, was "validity."
It was cited 15 times and was defined commonly as
a "measure that logically represented the concept
or construct to be measured" (Grizzle, 2001).
The other criteria most frequently cited were "clarity"
(14), "reliability" (13), "relevance
to objectives or decisions" (11), "accuracy"
(10), and "sensitivity" (8). The criterion
sensitivity was defined as "the distinguishing
power of the measurement procedure or operation that
is ample enough to capture the change and diversity
that occurs in the object, event, or situation being
measured" (Grizzle, 2001, pp. 358-359).
The study further identified a total of 22 separate criteria
used to determine the sufficiency of performance measures.
Of the varying criteria, the study's author organized
them into broad categories that captured their group-type
similarities. For example, "practicality"
was a category used to capture the sense of both "cost"
and "ease of data collection." Another category
was "utility-user independent" which
encompassed the criteria of "comparability,"
"sensitivity," and "clarity."
Alternatively, the category of "utility-user
dependent" included "relevance,"
"timeliness," and "controllability"
(Grizzle, 2001, pp. 358-359).
What is strategic planning? Drawing on Olsen and Eadie (1982),
I define strategic planning as a disciplined effort
to produce fundamental decisions and actions that
shape and guide what an organization is, what it does,
and why it does it. To deliver the best results, strategic
planning requires broad yet effective information
gathering, development and exploration of strategic
alternatives, and an emphasis on future implications
of present decisions. Strategic planning can help
facilitate communication and participation, accommodate
divergent interests and values, foster wise and reasonably
analytic decision making, and promote successful implementation.
In short, at its best strategic planning can prompt
in organizations the kind of imagination--and commitment--that
psychotherapist and theologian Thomas Moore thinks
are necessary to deal with individuals' life conundrums.
-- John M. Bryson, 1995
Performance-based budgeting, when linked to strategic planning methodologies,
is a powerful and advantageous decision-making tool.
Today, many states are utilizing PBB systems along
with strategic planning, recognizing that the two
systems taken or applied together are a logical and
practical fit. In fact, according to the survey conducted
in 1998 by Garsombke and Schrad, 38% of states using
PBB systems were also combining such efforts with
formal strategic planning practices (Garsombke and
Schrad, 1999, p. 10.)
Strategic planning is a process of developing a long-term plan
to guide an organization, for example, a state agency,
department or commission, towards a clearly articulated
mission, goals and objectives. It is a process of
assessing where an organization is presently, ascertaining
the challenges and opportunities that present themselves,
and determining what destination is most desirable
and how to get there. PBB adds or emphasizes the critical,
additional step of measuring progress (see Southern
Growth Policies Board, 1996, pp. 6-7).
A review of recent literature suggests that states could
benefit from the strategic planning process mainly
for the reason that the development of multi-year
policy plans could link present situations or circumstances
with a more meaningful vision of the future. In other
words, a strategic planning process would enable,
let's say, the governor and the legislature, to understand
more clearly where their state is now and where they
would like it to be in the future. Basically, a strategic
plan would indicate to state leaders -- more lucidly
-- what is state government’s (or more particularly
an agency's) overall mission, its goals and objectives,
its strategic or programmatic activities, and its
resources (people, monies, technologies, facilities,
etc.). This process would further allow state officials
to have a solid grasp of the state’s on-going performance
and what results are actually being achieved (Young,
1998, p. 13). More specifically, the benefits of a
statewide strategic planning process would be:
the establishment of a long-range, unified and broad
direction – a “plan” – for state government in the
policy areas of education, health and human services,
transportation, public safety, commerce, natural
resources, and criminal justice;
the facilitation of the governor and legislature in
being more responsive and accountable to the current
and emerging needs of their state;
the allocation of limited resources, via the state’s
budgetary process, in a more rational and “results-producing”
manner;
the improvement of communication among all state leaders
and better coordination of the “omnibus” policy/fiscal
agenda; and
the measurement of the progress of statewide strategic
efforts, by all planning participants, and the updating
or revision of these efforts as warranted (Young, 1998).
What are the key elements or steps of the strategic planning
process? Strategic planning is simply a formal yet
flexible process to determine where an organization
is currently and where it should be in the future.
There is agreement, as evidenced in recent literature,
in both theory and practice, on the general steps
that are involved in a strategic planning process.
By and large, there are six steps that can be summarized
as follows:
(1) an “environmental scan” or a situational analysis of
the strengths and weaknesses of one’s organization,
including an analysis of external threats and opportunities;
(2) the formation of or the “putting into words” of a vision
for the future and an accompanying mission statement
which defines the fundamental purpose of an organization,
its values, and its boundaries;
(3) the development of general goals, specific targets or
objectives, and performance measurements to gauge progress;
(4) a set of strategies to indicate what will be done
to accomplish the goals and objectives;
(5) the implementation of detailed operational or tactical
plans that provide for staff assignments and schedules; and
(6) an evaluation component to monitor and revise the overall
strategic approach as it unfolds (Southern Growth
Policies Board, 1996, pp. 6-7).
The Council of State Governments published a paper in
1997 examining state trends and models of state strategic
planning and "benchmarking" (see Council
of State Governments, April 1997). Several statewide
planning initiatives were highlighted including those
in Utah, Oregon, Minnesota, Florida, Texas, Connecticut,
Pennsylvania, Kentucky and Michigan. The Council found
that each state’s strategic planning process contained
unique characteristics. Most states did, however,
attempt to set into place the key steps that constitute,
by generally accepted practices, a strategic plan.
Strategic Planning Process Model
Where Are We Now?
Mission and Principles -- Statement of the Organization’s Purpose,
Values, Actions to Achieve Mission
Inventory/Environmental Scan
Where Do We Want To Be?
Vision -- Image of Desired Future
Goals -- Desired Result After 3 or More Years
Objectives -- Specific & Measurable Targets for Accomplishment
How Do We Get There?
Strategies -- Used to Accomplish Goals and Objectives
How Do We Measure Progress?
Performance Measures -- Used to Measure Results, Monitor Progress
& Compile Management Reports
See Office of Strategic Planning and Budgeting. (1995). Managing
for results: strategic planning and performance measurement
handbook. Phoenix, AZ: Office of Strategic Planning and Budgeting.
Two states that are particularly noted for their long-run
and exceptional efforts at linking PBB with strategic
planning methods are Oregon and Minnesota. A few comments
on these two states' experiences will be worthwhile.
Oregon is recognized, arguably, as the most sophisticated
or highly evolved state in terms of model strategic
planning and PBB initiatives. Called "Oregon
Benchmarks" – and alternately "Oregon
Shines" – the model system was introduced in
1989 when over a hundred citizens and policy-makers
came together to develop a multi-year strategic plan
for the state. The state legislature also created
that year the Oregon Progress Board to maintain, revise
and oversee the implementation of the comprehensive
state's strategic plan "well into the twenty-first
century" (retrieved January 3, 2003 from http://www.econ.state.or.us/opb/jttestim.htm).
In 1991, with plentiful input from all levels of government
and the people of Oregon, the Progress Board adopted
158 indices or "benchmarks" that they considered
of the greatest priority to the progress of the state.
These measures were oriented to performance and not
effort. The Progress Board was interested, for example,
not in measuring or monitoring school expenditures
to assess school performance, but rather, in measuring
student achievement as predicated on standardized
testing (retrieved January 3, 2003 from http://www.econ.state.or.us/opb/jttestim.htm).
In 1994, the Progress Board implemented a program to
facilitate performance by restructuring many of the
state's intergovernmental and programmatic relationships.
For instance, it managed to relax federal guidelines
and restrictions to implement more efficiently and
effectively programs dealing with child services,
disabled employees, wildlife preservation, juvenile
justice, and welfare recipients. As of 1997, 32 agencies
were participating in the Progress Board's "restructuring"
program (retrieved January 3, 2003 from http://www.econ.state.or.us/opb/jttestim.htm).
In 1997, Oregon's legislature mandated that the Progress
Board's strategic planning/PBB process be a permanent
fixture of the state's government. The law required
that the Progress Board report to the state legislature
as to the general status of efforts in strategic planning
and PBB among Oregon's agencies. A detailed and "complete
update" of Oregon Benchmarks is to be completed
and reported to the legislature every six years (retrieved
January 3, 2003 from http://www.econ.state.or.us/opb/jttestim.htm).
All agencies in Oregon's state government are required
to develop "results-oriented" performance
measures that are tied directly to both agency strategic
plans and budgets. Input is encouraged not only from
internal agency personnel but also from other state
agencies, elected officials, service delivery clients,
interest groups, and the public at-large. Participants
and observers alike believe that this input is invaluable
to the planning and budgeting process and ultimately
reflects the values and priorities of all Oregonians
(retrieved January 3, 2003 from http://www.econ.state.or.us/opb/jttestim.htm).
Minnesota is another state that has received much attention
with regard to its strategic planning and performance
measurement efforts. "Minnesota Milestones"
was begun in 1991 as a planning/performance measurement
system that established some 20 goals and 79 milestones
to measure progress. Termed a "citizen-based
planning process," literally thousands of Minnesotans
came together in a series of meetings across the state,
and through participation in mail-in surveys, to contribute
to the first Minnesota Milestones document. The planning
document, containing dozens of measurement indicators,
was extended to a 30-year time frame with designated
yearly milestones to indicate progress. According
to the Minnesota Planning Division, the planning document
was and continues to be a centerpiece for developing
the state's budget recommendations (retrieved January
3, 2003 from http://www.mnplan.state.mn.us/pdf/mm98-2.pdf).
In 1997 and 1998, the public came together once again
to review, update and adjust the Minnesota Milestones'
master plan. The result was the determination of 19
major goals and 70 measurement indicators. According
to Minnesota's planning division, from 1990 to 1998,
the state has achieved marked success on seven goals,
relapsed slightly on two, and had "mixed"
results on five goals (retrieved January 3, 2003
from http://www.mnplan.state.mn.us/pdf/mm98-2.pdf).
[Table: Assessing Progress of the Goal Categorical Areas. Source: The State of Minnesota]
So what has Minnesota learned from its decade-long effort
at implementing a strategic planning and performance-based
budget system? The Minnesota Planning Office says
that on several fronts it is still too early to tell.
They do know two things with certainty, however.
One is that forming a mission or vision with broad
public participation works for them. Policy experts,
public administrators, public officials and the citizenry
have worked well together and have been able to reach
agreement on milestones. On the down side, the Planning
Office admits that some legislators are not actively
involved in using the planning/measurement process.
Though some legislators were involved in the planning
series of meetings, other legislators are hesitant
or simply lax in using the Minnesota Milestones in
budget decision-making. Many believe that this hesitancy
is based on partisan politics. Nevertheless, many
involved in the budget process are optimistic that
political wrangling can be minimized in certain situations
where good performance data are available, and cannot
be ignored (retrieved January 3, 2003 from http://www.mnplan.state.mn.us/pdf/mm98-2.pdf).
As discussed earlier, performance-based budgeting has
been around for some time. Indeed, the late 1980s
and early 1990s saw a renewed interest in PBB processes
at all levels of government. The experiences of cities,
states and the federal government have resulted in
an amassed compilation of literature on lessons learned
from devising and implementing PBB systems, the "dos
and don'ts" so to speak. In this short analysis,
a few of these lessons are conveyed for those who
wish to understand or perhaps profit from these past
hard-won trials and experiences.
National Performance Review
In 1993, President Clinton and Vice President Gore began
a government-wide review process to overhaul federal
government administrative and management practices.
The "National Performance Review" (NPR)
was aimed at "reinventing government" as
based on the premises discussed in the much celebrated
David Osborne and Ted Gaebler book published in 1992
(see Osborne and Gaebler, 1992). NPR hoped to foster
the Osborne and Gaebler reinvention themes of cooperative
and systemic benchmarking, identifying best practices,
adapting them, and replicating them where and when possible.
Four years later, in early 1997, NPR published a study
on best practices documenting problems with the bureaucracy
and how to go about fixing them. In this comprehensive
study, entitled "Benchmarking: Best Practices
in Customer Driven Strategic Planning," was a
mini-report on best practices in performance measurement.
This mini-report summarized best practices across
the U.S., discussed a model methodology for establishing
and using performance measurements, and offered strategies
on how to implement a successful PBB process.
NPR identified nine findings -- "lessons learned"
-- that were critical in its study of performance
measurement. The following summary touches upon each
of these findings.
The first finding or lesson is that "leadership"
is of the utmost importance in developing and implementing
PBB systems. Agency heads, senior managers, line
managers, supervisors, and other management personnel,
including commission or board members if applicable,
should be actively involved in PBB efforts. Agency
leadership should be clearly present at each stage
of developing and executing the organization's vision,
mission, objectives, strategies, and performance measures.
An organized, comprehensible "framework"
for PBB should be established, according to NPR.
As such, governmental entities should put together
a written, easily understood document that shows
how the PBB process works and a calendar that clearly
states when milestones, target dates and so forth,
should occur. This framework should link logically
with missions, goals and objectives.
NPR found that those governmental entities which foster
communications, both externally and internally,
are the most successful ones. Agency and departmental
employees should regularly and frequently communicate
with one another regardless of specific duties and
responsibilities within the organization. Agencies
should also take proactive steps to communicate
with stakeholders and agency clients -- inquiring
as to needs, satisfaction with services, etc.
It was found that accountability is a primary factor
in PBB initiatives. Government employees must not
only be held accountable for "actions"
but also for "results." PBB is crucial
for determining accountability.
PBB, it was discovered, is about "intelligence gathering,"
not simply data collection. PBB indices must be
substantive, informative and meaningful to all participants.
NPR found additionally that rewards, whether in the
form of compensation or recognition, should be related
to performance. If employees are held accountable
for achieving certain measures, successful attainment
should clearly and unambiguously result in the receipt
of rewards.
At the same time, and without contradiction, NPR discovered that PBB
systems should not be punitive in nature. Non-attainment
of agency performance goals and objectives should
be constructively reviewed, problems should be identified,
adjustments should be made, and re-implementation
should be instituted, as necessary.
Successes and the accomplishment of results should especially
be communicated to all participants and to the public
at-large. Governmental entities should report the
outcomes and advances made through all available media.
The final general finding of the NPR study is that PBB
systems are not a panacea. PBB systems are useful
to managers and decision-makers for accumulating
the best information and data possible, accurately
measuring performance, and readjusting and fine-tuning
programs and agency operations and outcomes. PBB
processes are not a cure-all that makes all things
happen in a precise way, nor a special elixir
that achieves complete and flawless results. PBB is
a process that requires continuous upkeep and adaptation.
It will never deliver perfection, only improvements
based on the efforts of all participants (retrieved
January 2, 2003 from http://www.npr.gov/library/papers/benchmkr/nprbook.html).
Why is Performance Budgeting Important?
ACCOUNTABILITY TO THE PUBLIC. In
the public sector, resources are borrowed from
the shareholders. As stewards of these resources,
governments are required to deliver some product in return.
DRIVES REDESIGN OF PROGRAMS (FOCUSES
ON IMPROVEMENT). Performance-supported
budgeting can be a driving force in the redesign
of programs and in their integration within
agencies and across agencies. If it is focused
on improvement, the overall plan will be more effective.
RATIONALIZE BUDGET ALLOCATIONS (USES PERFORMANCE
AS A BASIS OF EVIDENCE). Performance
budgeting ensures that performance information
is part of the budget and resource allocation
debate. Performance information needs to be, in
some manner, part of the resource allocation process.
PROMOTES UNDERSTANDING OF CROSSCUTTING PROGRAMS IN GOVERNMENT. It
is possible to better understand the total costs and
benefits of comparable crosscutting programs
if they are framed so as to be understandable
to each other, outsiders, and stakeholders.
HELPS AGENCIES LINK THEIR DAILY ACTIVITIES TO OVERALL
GOVERNMENT OUTCOMES AND SIMILAR ACTIVITIES OF
OTHER AGENCIES. Tracking
costs and performance-based budgets against goals
helps agencies understand their roles in achieving
government goals. Process outputs need to be identified
and measured at each step, then tracked against the
outcomes. Also, each agency has to reassess its role.
ALIGN GOVERNMENT SPENDING WITH OVERALL GOALS. There
needs to be an assurance that the resources provided
by the public are spent sensibly for public purposes.
Performance has to be an integral part of that
equation of getting it done right.
COMPARES COST EFFECTIVENESS BETWEEN PROGRAMS. More
can be achieved if we know which activities are
most effective. It is a cost-effectiveness argument
where there are similar measures for different programs.
Kathryn Newcomer and Sharon Caudle presented a research paper,
in 1999, at the American Society for Public Administration's
(ASPA) 60th National Conference in Orlando,
Florida. The paper dealt with the subject of support
("technical" and "cultural") for
performance-based measurement and management systems
and, additionally, summarized lessons learned as relates
to "what is currently happening at various levels
of government in the United States" (see Newcomer
and Caudle, 1999, pp. 1-15).
Newcomer and Caudle identified ten lessons learned
about performance-based measurement and management
systems. Some of these lessons included those found
by NPR such as do communicate to all participants
and stakeholders, do provide incentives or
rewards for using PBB processes correctly, and do
ensure committed leadership at all phases of PBB.
Other lessons disclosed in the ASPA presentation,
and not addressed fully or only in a limited fashion
in the NPR study, include:
Make strategic planning the PBB point of departure. Strategic
planning embraces all the requisite mechanisms that
are essential to initiating successfully a PBB approach
-- vision and mission statements, objectives, environmental
scans, action strategies, etc.
Form genuine "partnerships" with stakeholders.
To make PBB work, partnering with customers and
other stakeholders is important. Stakeholders, therefore,
must be real participants and not given just lip
service. PBB cannot be viewed as an "insiders'
game"; it must be an open process where program
recipients or clients, interest groups, and the
public generally participate, in some way, in goal
setting, quality measurement, and improvement strategies.
Clarify responsibilities, especially if these are split
among differing levels of government. If responsibilities
for performance activities and results are divided
among federal, state and local authorities, sort
out early in the PBB developmental stages who will
be accountable for what. If changes in activities
occur during implementation, adjust responsibilities accordingly.
Focus on the "right" performance objectives
and measures. PBB systems should focus on only vital
goals, targets, objectives, and measures. Too much
or superfluous information and data will "overload"
the system and ultimately "turn off" decision-makers.
Think long-term. First, PBB systems take time to implement.
A minimum of three to four years should be considered
to get a system up and running reasonably well.
Second, performance goals and objectives cannot
always be achieved overnight. Depending on the complexity
of the organizational or agency program, its clientele,
funding levels and so on, it may take several years
to reach the goals envisioned. Long-term and strategic
planning methods will enhance an agency's or department's
chances of achieving future successes.
Use a centrally directed approach and provide sufficient
resources. PBB systems are most successful when
they are coordinated and assisted by a government's
central administrative unit. This approach lends
itself to consistency, coherence and uniformity
of PBB processes among several and variant governmental
entities. It also helps in providing on-demand,
or "timely," technical expertise or assistance
to individual agencies at critical stages during
PBB implementation. Adequate resources (sufficient
staff, equipment, and funds) are essential to PBB
success, second only perhaps to the requirement
of "good and sustained leadership" (Newcomer
and Caudle, 1999, pp. 1-15).
National Conference of State Legislatures
Published in 1993, Dianna Gordon's short article for
the National Conference of State Legislature's (NCSL)
magazine, State Legislatures, is recognized
among many state budget practitioners as an excellent
overview of both the dos and don'ts of budget reform.
(Gordon, October 1993, p. 17). Based on interviews
with the top budget officials from the states of Arizona,
Texas, and Mississippi, Gordon deciphered what appeared
to be working as well as what seemed not to be working
for these three states as they reformed their respective
budget systems in the early 1990s.
The "dos" are distilled into nine brief observations,
lessons or statements by Gordon. For example, states
should adopt realistic expectations when introducing
budget reforms. Some states expect too much too soon,
and are quickly disappointed when things fail to meet
anticipated outcomes. Also, Gordon gleaned from the
three states that it is imperative to study other
states' experiences in undertaking PBB and budget reform.
Such a review will have all the usual benefits, i.e.,
saving money, time and effort. Additionally, using a
pilot approach to budget reform will provide a testing
ground for all aspects of the budget reform initiative.
It is a more manageable and reliable way to get sophisticated
PBB systems, for example, off to a good start. Other
dos include: Consider the application of new reform
ideas or concepts on an individual basis; provide
training related to budget reform to all participants,
especially lawmakers; and be prepared to develop more
reliable data and information as the budget reform
effort evolves (Gordon, 1993, p.17).
The "don'ts" stated by Gordon are, in their
own fashion, sound advice on what to avoid or not
do. For instance, states don't want to repeat the
blunders, errors and oversights that other states
have made. "To err is human, but to repeat the
mistakes of other states is foolish and wasteful."
Moreover Gordon says don't go "overboard"
with complex and "impressive sounding" reform
schemes. Focus rather on budget reforms, such as PBB
and its various components, which seem right for the
needs and wants of your state's particular set of
circumstances and players. "Things that work
in some other states may not be right for your state."
Also, don't give up on budget reform mechanisms that
are working. If they work, then "build on them."
States such as Texas and Arizona agree that persistence
with and expansion of those things that work pays
off in the short and long runs. And finally, Gordon
re-emphasizes a recurrent admonition among budget
reform practitioners: avoid or don't attempt a "massive
overhaul" of the current system. Stalinist-type
schemes and directives to reform complete budget systems
within a given timeframe are doomed to failure. Realistic
and well-planned approaches to budget reform are a
more sensible and ultimately workable design for governments
(Gordon, 1993, p.17).
In 1998, the California Legislative Analyst's Office
(LAO) published a report on the state's performance
budgeting pilot project. California's governor initiated
the pilot project among select agencies in January
1993 after he had declared the state's budget process
as "dysfunctional." The legislature followed
the governor's remarks and actions by establishing
in law a PBB system and providing for periodic review.
Hence, the LAO study comes five years after the beginning
of the pilot project. (Cornett, 1998, pp. 1-4).
The LAO findings speak to the lessons learned from 1993
through 1998. Many of the findings are duplicative
of the NPR, ASPA, and NCSL conclusions. These include,
by way of illustration, the previous findings "that
focusing on missions, goals and objectives and linking
these to performance measurement is a time-consuming
and difficult task that requires ample agency resources."
The LAO, however, did emphasize some conclusions not
fully touched upon or discussed in the other studies
summarized above. For example, California's pilot
agencies have observed a heightened sense of enthusiasm
among their employees. This is due, according to the
LAO, to their ability to clearly recognize what they
are aiming for in terms of program goals and objectives,
and their new ability to gauge their performance in
achieving them. Pilot participants have likewise welcomed
the new managerial flexibility resulting from relaxed
administrative regulations. Employees are able now
to administer their programs with greater ease and
concentrate on programmatic results rather than excessive,
redundant and tedious compliance requirements (Cornett, 1998).
The LAO also found that, on the downside, agencies have
not been able to "redirect savings" due
to PBB techniques and implementation. The pilot project
designers had anticipated that any savings due to
PBB analyses would be shifted to other agency needs.
It is not clear that this has, in fact, occurred.
Additionally, the LAO concluded that, overall,
many remnants of the traditional budgeting approach
have remained despite the best efforts to get decision-makers
to use PBB-generated information and data. The LAO
believes that this will require more time and some
targeted training to special users (Cornett, 1998).
North Dakota Study
The North Dakota Legislative Council compiled a listing
of key lessons learned for presentation to that state’s
Budget Committee on Government Finance. Presented
in June of 1998, the list identifies a number of findings
other states have discovered about using PBB systems.
The Legislative Council, for instance, discovered that
the fundamental or primary aim of performance-based
budgeting is "not simply to measure but to improve
agency services." The Council found that many
states and their organizational units strive to put
measures in place and that this becomes their focus
rather than actually improving program activities
and results. In other words, some states have become
so caught up in the conceptual framework, methods,
and technicalities of PBB that the process itself,
and its "machinations," take priority over
producing results or improving services (North Dakota
Legislative Council, 1998).
The Council also found that a few states have published
their performance measures before anyone had been
instructed on how to use them. This can be catastrophic
and bring the PBB process to a grinding halt in terms
of meaningful usage among decision-makers. States
must give attention when planning and implementing
PBB systems to the "educational process"
that is required to bring about maximum utilization
of performance measures (North Dakota Legislative Council, 1998).
Reward and penalty clauses in laws establishing PBB systems may cause
the inaccurate reporting of performance or progress,
according to the Legislative Council. This is so because
state agencies are tempted to present performance
in a favorable light to avoid punishment. Agencies
can do this in several ways including publishing meaningless
workload data, "low-balling" performance
targets or objectives, or presenting data and information
that may be "interesting" but speaks little
to efficiency or effectiveness (North Dakota Legislative
Council, 1998). Given these and other findings, the
North Dakota Legislative Council recommends seven
important considerations for those states contemplating
the use of PBB:
Limit measures to only the most significant ones.
Tie measures, to the fullest extent possible, to the budget.
Develop baseline data and information for comparison purposes
when assessing data reported.
Assist agencies with mediocre or poor performance rather
than punish them.
Ensure that measures are understandable to everyone involved,
especially the governor, legislators and the public.
Provide training to all participants in the PBB process.
Verify the correctness and accuracy of the measures on
a regular basis (North Dakota Legislative Council, 1998).
Bryson, J. M. (1995). Strategic planning for public
and nonprofit organizations: A guide to strengthening
and sustaining organizational achievement. San
Francisco, CA: Jossey-Bass Publishers.
K. (1994). The performance budget revisited: A
report on state budget reform. Legislative Finance
Paper No. 91. Denver, CO: National Conference of State Legislatures.
Cornett, C. (1998). California's performance budgeting pilot
project: A view from the legislature. Cal-Tax Digest,
Council of State Governments. (1997). Managing for success:
A profile of state government for the 21st
century. Lexington, KY: Council of State Governments.
C. S. (1995). Performance measurement and budgeting:
Relearning old truths. Albany, NY: Legislative
Commission on Government Administration.
Epstein, P. D. (1984). Using performance measurement in
local government: A guide to improving decisions,
performance, and accountability. New York, NY:
Van Nostrand Reinhold Company, Inc.
Friedman, M. (1997). A guide to developing and using performance
measures in results-based budgeting. Baltimore, MD:
Fiscal Studies Institute.
H. P. & Schrad, J. (1999). Performance measurement
systems: Results from a city and state survey. Government
Finance Review, February.
General Accounting Office. (1997). Performance budgeting:
Past initiatives offer insights for GPRA implementation.
Washington, DC: General Accounting Office.
Gordon, D. (1993). The dos and don'ts of budget reform. State Legislatures, October.
Grizzle, G. A. (2001). Performance measures for budget justifications:
Developing a selection strategy. In G. J. Miller,
W. B. Hildreth, & J. Rabin (Eds.),
Performance-based budgeting. Boulder, CO: Westview Press.
Joyce, P. G. (1999). Performance-based budgeting. In R. T.
Meyers (Ed.), Handbook of government budgeting.
San Francisco, CA: Jossey-Bass Publishers.
Lee, Jr., R. D. and Burns, R. C. (2000). Performance measurement
in state budgeting: Advancement and backsliding from
1990 to 1995. Public Budgeting and Finance, Spring.
Lee, Jr., R. D. and Johnson, R. W. (1998). Public budgeting
systems. (6th ed.). Gaithersburg, MD:
Aspen Publishers, Inc.
Mikesell, J. (1999). Fiscal administration: Analysis and
applications for the public sector. (5th
ed.). Fort Worth, TX: Harcourt Brace College Publishers.
Miller, G. J., Hildreth, W. B., and Rabin, J. (2001). Performance-based
budgeting. Boulder, CO: Westview Press.
National Association of State Budget Officers. (1999). Budget
processes in the states. Washington, DC: Author.
National Conference of State Legislatures. (1999). Legislative
budget procedures: A guide to appropriations and budget
processes in the states, commonwealths and territories.
Denver, CO: Author.
Newcomer, K. & Caudle, S. (1999, April). Structural support
for performance-based management: Cracks in the foundation.
Paper presented at the American Society for Public
Administration 60th National Conference, Orlando, FL.
Osborne, D. & Gaebler, T. (1993). Reinventing government:
How the entrepreneurial spirit is transforming the
public sector. New York: Dutton/Plume.
Southern Growth Policies Board. (1996). Results-oriented
government: A guide to strategic planning and performance
measurement in the public sector. Research Triangle Park, NC: Author.
R. & Hayes, K. (1993, November). Performance
budgeting and the states. Presented to the Nebraska
Legislative Appropriations Committee, Lincoln, NE.
Young, R. D. (1998). A statewide strategic planning process:
The need for the state's leadership in South Carolina
to plan ahead. Unpublished manuscript. University
of South Carolina.
Richard D. Young has been a senior research associate with
the Institute for Public Service and Policy Research
at the University of South Carolina since 1998. He
conducts research on a myriad of public policy and
public administration topics relating to state and
local governments. Mr. Young previously worked with
the Senate of South Carolina and the State Reorganization
Commission in various positions of research. Prior
to this, Mr. Young taught at the University of Louisville,
Hanover College, Indiana University Southeast, and
the University of Kentucky Campus in Louisville, Kentucky.
Mr. Young has a B.A. (1973) and M.A. (1975) from the
University of Louisville. Mr. Young has written and
published several papers and reports on public policy
issues and public management theory, including A
Brief Guide to State Government in South Carolina
(1999), A Guide to the General Assembly of South
Carolina (2000), and Perspectives on Public
Budgeting: Budgets, Reforms, Performance-based Systems,
Politics and Selected State Experiences (2001).
Mr. Young has recently published State Reorganization
in South Carolina: Theories, History, Practices, and
Further Implications (2002). Along with Dr. Luther
F. Carter, President of Francis Marion University,
he is author of The Governor: Powers, Practices,
Roles and the South Carolina Experience, scheduled
for publication in 2003. Mr. Young heads up the Institute’s
Project for Excellence in Government, a project to
foster and promote scholarly research on Baldrige
concepts and criteria as they relate to state and
local governments. Mr. Young can be contacted at YOUNG-RICHARD@sc.edu
Richard D. Young, Editor in Chief
Public Policy & Practice
Institute for Public Service and Policy Research
University of South Carolina
Columbia, SC 29208
Phone: (803) 777-0453
Fax: (803) 777-4575