Customer-Driven Focus and Excellence in the

Public Sector

 

By

 

Richard D. Young

 

 

Introduction

 

This paper is about customer excellence. More specifically, it is about the Baldrige National Quality Program’s criterion for achieving organizational success through “customer and market focus.” According to Baldrige, customer excellence involves ascertaining and attaining three key things:

1)      Customer and “Market” Knowledge. This involves an organization’s determination of the requirements of customers and markets (customer groups). How are customers and groups targeted? How are customers listened to and what is learned from them in this “listening” experience? How is this knowledge of customer preferences and expectations kept current?

2)      Quality Customer Relationships. This encompasses an organization’s relations (dealings, associations, interactions, etc.) with customers at every point of contact. Simply put, how are customers acquired and, time after time, satisfied? More importantly, how do customers become “loyal to the organization” (repeat customers, customers who make positive referrals, etc.)? How are customers’ complaints resolved and, in turn, how are these complaints (and their various resolutions) used to improve an organization’s performance? How are customer relationships regularly maintained and improved?

3)      Determination of Customer Satisfaction. Closely related to # 2 (above), this deals with an organization’s assessment of customer approval and fulfillment. How are an organization’s customers deemed satisfied or dissatisfied? More specifically, what measurements of satisfaction are used by an organization? Do such measurements provide indications or an understanding of changing customer needs? What follow-up and feedback approaches are used to ascertain customer satisfaction? Is “benchmarking” utilized to assist an organization to measure satisfaction? How are customer satisfaction determination methods updated? (See http://www.quality.nist.gov).

 

Figure 1

Customer Excellence Process Components

(Diagram showing three interrelated components: Knowledge of Customer Requirements, Customer Relations Interface, and Customer Satisfaction & Loyalty.)

Public sector organizations can benefit by proactively taking these Baldrige principles of customer excellence into consideration. According to current literature, understanding, attracting, and satisfying the public and its needs and preferences is crucial to the success of governmental programs and organizations. In fact, many public management and administration experts argue that the citizenry and its needs, as customers seeking public services, are of the greatest civic importance and are the centerpiece of any governmental action taken, service delivered, or product received. Good government is government that best meets the wants and preferences of the public. Such a government instills public pride, trust, and confidence.

 

Yet government has much to do to improve its image among the people it serves. Consider, for example, the survey results reported in the National Performance Review's (NPR) Putting Customers First: Standards for Serving the American People. The report states that confidence in government to address satisfactorily the public's needs and problems fell 58 percentage points, from roughly 75% in 1963 to a low of 17% in 1993. (NPR, 1997, p. 1). More recently, the Brookings Institution found that:

Trust in government has been eroding since the 1960s. In the aftermath of September 11, however, this long-term trend was sharply reversed, with 57 percent of Americans saying that they trusted the federal government to do what is right just about always or most of the time. By May, this number had dropped 17 percentage points, to 40 percent. (Brookings Institution, January 1, 2002, available at http://www.brook.edu/comm/news/20020530post911surge.htm).

 

With this ongoing lack of confidence prevalent among the public, studies indicate that government (at all levels) in the United States needs to “reinvent” itself. The crux of this reinvention is to improve customer or constituency service and relations. The public appears dissatisfied with government, and the NPR, like other studies since, concluded that steps need to be taken to restore public confidence. This is perhaps why President Clinton issued Executive Order 12862 in 1993, which called for “putting customers first” and being resolute in the establishment of a “customer-driven government” that equals or surpasses private or corporate standards of excellence. (NPR, 1997, p. 1).

Figure 2

Executive Order 12862—Overview

Setting Customer Service Standards

· Identify customers who are, or should be, served by an agency, department, or other unit of government.

· Survey customers as to the type and quality of services they need and their degree of satisfaction.

· Make customer service standards known to all and assess results based on these standards.

· Use benchmarking techniques for comparative purposes to enhance service standards.

· Survey front-line employees to determine obstacles to quality service delivery and solicit their ideas for improvements.

· Where possible, provide customers with a menu of services to address needs appropriately.

· Make services easily accessible and provide adequate complaint processes.

· Address customer complaints promptly, courteously, and satisfactorily.

Source: National Performance Review, 1997, p. 1.

 

As a result of the NPR study, presidential intervention, and increased awareness and understanding at the state and local levels as well, agencies and governmental units of every kind in the early part of the 21st century are becoming more and more focused on customer excellence. For example, at the federal level, agencies such as the Departments of Transportation, Commerce, and Energy and the U.S. Department of Education are setting hundreds of “customer service standards” to improve public customer relations and to attain greater customer satisfaction.

 

Indeed, early successes can be pointed out. For example, DALBAR Financial Services Inc. has designated the Social Security Administration’s customer telephone calling system “the best among world class number systems among private or public sectors.” (DALBAR Financial Services, May 3, 1995, press release).

 

State government in South Carolina is also making strides in focusing on customer excellence. The Budget and Control Board, for instance, has adopted the Baldrige criteria and, in particular, the guiding principle “to provide consistently outstanding and excellent customer services, as defined by customers, and to improve constantly this customer service process.” (Sponhour, 2002, p. 5). In January 2002, 30 members of the BCB’s top management staff attended a two-day assessment planning session conducted by David McClaskey, a nationally recognized Baldrige expert. The session dealt with several Baldrige themes, including customer service, and the participants eventually adopted the rallying cry “We Make Government Better.” (Ibid.).

 

At the local level of government in South Carolina, a benchmarking project also stands out as an exemplary demonstration of measuring and improving services for city residents. In mid-June of 1996, 11 South Carolina municipalities1 embarked on a project to improve performance and customer services through the implementation of several innovative approaches. Called the “South Carolina Municipal Benchmarking Project,” the results so far indicate a greater comparative understanding of service delivery in fire, police, and solid waste services across local jurisdictions. Of the several categories of measures being employed in the project to determine efficiency and effectiveness, one is “quality measures.” Quality measures determine how well municipal services are meeting the needs and expectations of customers and stakeholders; e.g., the percentage of citizens who indicate they are satisfied with fire services, the percentage of citizens who indicate they feel safe or very safe in their neighborhood at night, and so on. (Berger, 2002, p. 28).

 

Customer and “Market” Knowledge

 

In this section, attention is given to how an organization assesses customer and market segment requirements, desires, or preferences (Baldrige Criterion 3.1). As stated in the literature, the aim of an organization should include the identification of customers and customer groups and their changing needs, using various data-gathering approaches, in order to ensure the ongoing relevance of governmental services and products. To accomplish this, the Baldrige program states that three questions should be asked: “How do you target customers and accompanying market or group segments? How do you listen and learn about customer requirements or needs? How do you keep current these customer targeting and listening approaches?” (NIST, 2002, p. 16).

 

Targeting Customers and Markets

 

Osborne and Plastrik (1997) state that public sector customers are typically defined as “an individual or entity that is directly served by a governmental department or agency” (p. 182). But this definition, they believe, is too broad; Osborne and Plastrik think some distinctions should be made. They state that “primary customers” are the persons or groups that an organization is chiefly designed to aid or help. “Secondary customers” are other persons or groups an organization is to assist indirectly. “Compliers” are persons or segments of the population that are required to comply with laws and regulations; e.g., developers who must comply with permitting agencies, or utilities that are overseen by regulatory agencies. And finally, “stakeholders” are a remaining distinction among customers; namely, persons or groups that have “an interest” in the structure and performance of an organization. This would include, for example, teachers and public school systems, or bankers and government financial oversight boards. (Ibid.).

 

Figure 3

Customer Typologies

(Diagram showing four customer types: Primary Customers, Secondary Customers, Compliers, and Stakeholders.)


Thus, while secondary customers, compliers, and stakeholders are important in their own right, targeting primary customers is and should be the essential task for a government organization. Indeed, many governmental organizations are “haphazard” or indiscriminate in determining their customers. Unfortunately, government officials, management, and line staff sometimes do not communicate among themselves, or at least not often enough, to determine who their customers are and what their changing requirements and needs may be. The Internal Revenue Service was an example of this failure to reach out and understand and meet shifting customer needs, at least until a few years ago. In the fall of 1997, the IRS began to address numerous taxpayer service complaints and criticisms directed at what many consider to be the most onerous of federal agencies. Some of the problems that were cited included: “Trying for hours-on-end to reach a live employee on a service line to answer a single question; being charged penalty and interest for a simple error on your return; or having an error from three years ago snowball into three years’ worth of penalties and interest." (See http://www.house.gov/moranks01/prirs102.htm). Today, thanks to “the efforts of the NPR panel and many dedicated IRS employees, tens of thousands of American taxpayers have already received better service including extended phone and walk-in hours and special Problem Solving Days.” (Available at http://www.irs.gov/irs/display/0,,i1%3D46&genericId%3D16941,00.html).

Thus, targeting customers and groups involves mainly collecting information and data about their needs and desires. The quality of such information and data depends of course on the type of information sought, and the frequency and methods of collection. Data should be objective and valid. Many sources of information and data are available to a governmental unit or agency about customers. Such sources may include the media, focus groups, customer preference information, feedback from users, satisfaction survey systems, complaints, and so forth.

 

Listening and Learning about Customer Requirements

 

The Baldrige Criterion 3.1, concerning “customer and market knowledge,” also places significant emphasis on listening and learning about customer requirements. Whether the agency is federal, state, or local, people who come into contact with the government “want to be listened to.” They also, according to the literature, want and care about: courteous and respectful treatment, dealing with informed and competent government workers, getting things done quickly, and getting things done right. Government employees, therefore, must ask questions of their customers, find out what is important to them, act deliberately to ensure resolution of problems, and follow up to guarantee that the customer is satisfied with service, outcomes, and/or results. (See NPR, 1997, pp. 8-12).

 

Indeed, there are several ways of listening and learning. First and foremost is, of course, the direct contact of a customer with an agency representative or worker. But to prepare for and build on this first-hand contact experience, other things can and should be done to enhance listening and learning. These might include:

· Working with focus groups with “demanding” or leading-edge customers;

· Interviewing or surveying “lost” customers;

· Determining and analyzing key factors affecting customers;

· Training line workers in customer listening skills;

· Monitoring external conditions and situations that affect customers and their needs. (Heaphy and Gruska, 1995, p. 116).

 

How do governmental agencies and their workers listen and learn? For example, the Federal Emergency Management Agency (FEMA) mails customer survey cards to people who have applied for assistance. The Veterans Administration conducts customer surveys at all 172 of its hospitals. The National Park Service likewise surveys its visitors, and its park rangers are trained in customer service relations. (NPR, 1997, p. 9).

 

State agencies and departments in South Carolina are also making numerous efforts to put customers—the citizenry—first. For instance, all agencies are required to report on customer focus efforts and results in their annual agency accountability reports. In 2001, the S.C. Department of Revenue (SCDOR) reported implementation of several customer service activities. These included workshops to update taxpayers and tax practitioners on tax law changes and form revisions, a 1-800 Tax Helpline, employment of a Taxpayer Advocate (customer ombudsman), intensive and regular customer service training directed and overseen by a Taxpayer Education Coordinator, monthly customer satisfaction interviews, customer comment cards, and an annual customer satisfaction survey conducted by the USC Survey Research Lab. (See http://www.lpitr.state.sc.us/reports/aar2000/r44.doc).

 

The S.C. Department of Parks, Recreation and Tourism is another example of a state agency that is approaching customer focus through proactive listening and learning activities. The agency surveys overnight guests to ascertain their level of satisfaction and future requirements. Also, agency staff members meet regularly with special interest groups (e.g., equestrian and bike enthusiasts) to assess specific needs and concerns. Additionally, similar to SCDOR efforts, Parks, Recreation and Tourism maintains a toll-free customer service number, trains its park rangers and welcome center personnel in customer relations, and conducts an annual customer satisfaction survey. (See http://www.lpitr.state.sc.us/reports/aar2000/p28.doc).

 

At the local level, in 2002, citizens in several South Carolina cities and towns were surveyed by the University of South Carolina’s Institute for Public Service and Policy Research, on behalf of its Municipal Benchmarking Project, to determine customer requirements and satisfaction. Generally, customers were asked to rate municipal services from “very poor” to “excellent.” Specific service areas, including law enforcement, fire, and sanitation and recycling, were also addressed as to customer satisfaction. In this way, 19 cities across South Carolina have gained greater knowledge about customer requirements and customer satisfaction with select municipal services.

 

Figure 4

South Carolina Municipal Benchmarking Project—Overview

The South Carolina Municipal Benchmarking Project provides a forum for South Carolina’s cities of varying sizes to share performance information on four service areas: police, fire, solid waste services and parks and recreation. By benchmarking their performance and meeting with counterparts across the state, participants are able to better monitor their performance and learn more efficient and customer-friendly ways of doing business.

Project staff at the Governmental Research and Services unit of the Institute for Public Service and Policy Research (University of South Carolina) has spent the last year focusing on how the performance measurement results are being and can be used by the participating cities. At the end of this year's Project cycle, staff will be better equipped to assist the participating organizations in realizing the tangible impacts this benchmarking initiative can have on their operations and communities.

Source: Available at http://www.iopa.sc.edu/cfg.

 

Keeping Customer Knowledge Current

 

Once a public sector agency or unit has gauged customer needs and preferences (including market or group segments), and customer listening and learning approaches are in place, the agency must keep abreast of changes. According to Baldrige expert and author Mark Graham Brown (2001), keeping knowledge current usually involves:

· The continuous evaluation and improvement of methods to determine customer requirements;

· The conduct of research to identify potential future markets/customers and their needs;

· The identification of potential customers or customers of competitors (p. 25).

 

Keeping customer knowledge current involves a government agency (or unit) using valid research methods to predict accurately long-range trends. Just because something has met customer requirements in the past, and appears to be working at the moment, doesn’t mean it will work in the future. Projected changes in state and local demographics can mean real and substantive changes in customer requirements in 5 to 10+ years.

 

For instance, the so-called “baby boomers” (those born between 1946 and 1964), a population cohort of approximately 75 million in the U.S. according to recent census data, have the full attention of many workforce planners, retirement advisors, and public benefit aging program managers. The diversity of baby boomers, their anticipated longer life span, and their unprecedented number create challenges for both public policy and the “mature marketplace.” The question here is: “Is government keeping its knowledge current about this coming ‘aging bulge’ and the customer needs that will inevitably accompany growing old?” In testimony presented to the U.S. Senate Special Committee on Aging, November 8, 1999, Fernando M. Torres-Gil, Director of UCLA’s Center for Policy Research on Aging, stated:

Baby Boomers are the key generation that will redefine a politics of aging--their collective strength will come from their numbers: 75 million strong. To the extent that they have a collective sense of priorities and need, they will be an extraordinarily influential part of the electorate. On the other hand, we know that they are quite diverse: racially, economically, and generational. At least one quarter are non-white--African-American, Hispanic and Asian--and at least 18 million are considered to be economically "at risk:" non-home owners, single women, low education levels. Baby Boomers as a group have two distinct sub-cohorts: those born between 1946 and 1954 and those born between 1955 and 1964. The first wave tends to be our focus and they have a greater sense of themselves as a cohort. The second wave tends to share some of the insecurities about downsizing, technological advances and housing costs that affect younger groups such as Generation X. Thus we need to recognize this diversity in developing a long-term agenda for the aging of this group. And of course, we cannot over generalize about their views: some reflect the popular conception of a liberal and activist group while most others are like everyone else: struggling to build a life, take care of their families, and pay the bills.

 

Consequently, in accordance with Baldrige approaches, the final question at this juncture is, “How do you keep your listening and learning methods current with organizational needs and directions?” This question is a familiar one in the Baldrige sequential process and asks a government organization to provide evidence of systematic evaluation and improvement of customer knowledge—and, most importantly, to keep up-to-date and be progressive in determining future customer expectations.

 

Customer Relations

 

Once an assessment of customer requirements has been completed and is ongoing, the actual relationship with the customer(s) takes on particular importance for a governmental agency. By relationship, it is meant that a government organization (or, most importantly, its employees) seeks to exceed or surpass basic customer requirements and, where possible, to offer exceptional customer services and/or products. There is a proactive attempt to distinguish the organization and its relations with its customers from all competitors. As a government organization, among all other organizations, there is an effort to create a special connection with the customer. Again, the goal is not simply to satisfy a customer but, more importantly, to make the customer a loyal and trusting one—a customer who will “spread the word” about the excellence of the agency, will be a “repeat” customer, and will “feel good” about the whole experience that has transpired.

 

Figure 5

Baldrige Customer Relationships

Three Key Questions as to Customer Relations

 

  1. How do you build customer relationships (acquire and satisfy customers)?
  2. What is your complaint management process?
  3. How do you keep customer relationship approaches current?

 

Source: Available at http://www.quality.nist.gov.

 

Building Relationships with Customers

 

Building relationships with customers has several meanings. According to Baldrige processes, for example, an organization should be fully cognizant of how it provides information and easy access to allow customers to seek assistance, to make comments, and to complain if necessary. An organization—especially a public one—should have in place customer contact measures and service standards. An organization should also be equipped to deploy these measures and standards readily and, therefore, build positive relations with customers from the immediate point of contact. To accomplish this, new and unique services or rapid service standards are most useful.

 

By way of illustration, the U.S. Census Bureau has adopted the standard, or motto, “The customer is always right!” The Census Bureau promises that each customer will be satisfied with its products or it will refund the customer’s money within 30 days. Additionally, the U.S. Office of Personnel Management has initiated “anytime” electronic access for job seekers by providing online information about federal job openings. This Internet service also allows for downloading job applications and applying for vacancies online, day or night. Other customer relation improvement standards at the federal level include:

· Consumer Product Safety Commission—24-hour “hotline” for consumer complaints;

· Environmental Protection Agency—public recognition of the achievements of businesses in reducing pollution;

· Social Security Administration—replacement of SSN cards within 5 days;

· Internal Revenue Service—refunds issued within 40 days for paper returns and within 21 days for electronic returns.

 

State government in South Carolina is likewise being practical and positive in building customer relations. Take for example the state’s Budget and Control Board.  The Budget and Control Board recognizes that customer feedback is critical to the Board’s commitment to the highest levels of customer service quality. Beginning in FY 2000, the Board surveyed all of its customers during the fourth quarter and has continued to do so regularly since. 

 

In doing so, Board management created an innovative, easy-to-complete, one-page survey using standardized questions derived from current best practices. Additionally, the Board’s specialized units (General Services, Retirement Service, Human Resources, etc.) added questions on the back of the one-page survey addressing the particular functions unique to each unit.

 

The standardized questions of the survey captured five areas common to all customer relations.

These included:

· Reliability—the ability to perform the promised service dependably and accurately.

· Responsiveness—the willingness to help customers and provide prompt service.

· Empathy—the demonstration of caring and individualized attention.

· Assurance—employees are knowledgeable and courteous and are able to convey trust and confidence.

· Tangibles—the physical appearance of facilities, equipment, people. (See http://www.lpitr.state.sc.us/reports/aar2000/bcb.doc).
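
To make these five dimensions concrete, the following sketch shows one way responses on such a survey might be tabulated. It is a minimal illustration only, not the Budget and Control Board’s actual instrument: the sample data, the 1-to-5 rating scale, and the field names are assumptions made for the example.

# Illustrative tabulation of a five-dimension customer service survey
# (reliability, responsiveness, empathy, assurance, tangibles).
# Hypothetical data and 1-5 scale; not the Budget and Control Board's instrument.
from statistics import mean

DIMENSIONS = ["reliability", "responsiveness", "empathy", "assurance", "tangibles"]

# Each response rates every dimension on an assumed 1 (poor) to 5 (excellent) scale.
responses = [
    {"reliability": 5, "responsiveness": 4, "empathy": 5, "assurance": 4, "tangibles": 3},
    {"reliability": 4, "responsiveness": 3, "empathy": 4, "assurance": 5, "tangibles": 4},
    {"reliability": 5, "responsiveness": 5, "empathy": 3, "assurance": 4, "tangibles": 4},
]

def dimension_averages(survey_responses):
    """Average score for each service-quality dimension across all responses."""
    return {dim: mean(r[dim] for r in survey_responses) for dim in DIMENSIONS}

def weakest_dimension(averages):
    """Dimension with the lowest average score, a candidate for improvement effort."""
    return min(averages, key=averages.get)

if __name__ == "__main__":
    averages = dimension_averages(responses)
    for dim, score in averages.items():
        print(f"{dim:>15}: {score:.2f} / 5")
    print("Lowest-scoring dimension:", weakest_dimension(averages))

A unit-specific page of questions, like those the Board’s specialized units added to the back of the form, could be scored the same way and reported alongside these common dimensions.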

 

Today, the Budget and Control Board additionally continues to address “good” customer relations through extensive training, planning, and supervisory oversight. (Ibid.).

 

Assessing the Complaint Management Process

 

A first-rate complaint management system is also critical to maintaining customer relations. Paying attention to customer complaints, however trivial; understanding each complaint fully; and working aggressively to address it quickly and completely will satisfy customers and build a relationship of trust and customer commitment. The Baldrige process asks, “What is your organization’s complaint process?” (NIST, 2002, p. 17). This question entails, by its very nature, according to quality experts, the following characteristics: 1) the provision of access to seek assistance, 2) the procedural ease of making a complaint, 3) the simplicity yet thoroughness of complaint forms, 4) the friendly and considerate personal one-on-one contact with customer representatives or other organizational employees, and 5) the follow-up procedures to ensure that a complaint has indeed been addressed to the complete satisfaction of the customer.

 

Figure 6

Complaint Process

(Diagram of the complaint process: Complaint Form or Contact – Friendliness, Ease, and Simplicity; Identifiable Complaint Access Point; Complaint Resolution and Follow-Up.)


Many experts, especially those familiar with Baldrige techniques, feel that the complaint management system gets at the very heart of excellent customer relations. One obvious reason for this perception is that no matter how good a government service or product is, occasions will invariably arise that result in consumer criticisms, objections, etc. Acting professionally, systematically, and promptly to address the problem or complaint is again essential to retaining good customer relations. But there is also another benefit to a well-conceived and well-implemented complaint management system. Data and information on complaints can be compiled and analyzed, which in turn provides a useful database that gives management and other key personnel insight and direction on what problems are recurring and what needs to be done to fix them or prevent them from happening in the future. As such, new service or product standards can be put into place and repetitive or cyclical problems can be put right. Examples of such service standards might include “all complaints are resolved within a 24-hour period or sooner if possible,” “all customer complaints will be coordinated by one individual or by a single point of contact,” and “telephone customers with complaints will not be put on hold for more than 30 seconds.”
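
As a modest illustration of how complaint data might be compiled against standards of this kind, the sketch below checks a handful of hypothetical complaint records against a 24-hour resolution target and tallies recurring complaint categories. The record fields, categories, and threshold are assumptions for the example, not any agency’s actual system.

# Illustrative check of complaint records against a 24-hour resolution standard,
# plus a tally of recurring complaint categories. Hypothetical data and fields.
from collections import Counter
from datetime import datetime, timedelta

RESOLUTION_STANDARD = timedelta(hours=24)  # assumed service standard

complaints = [
    {"id": 1, "category": "billing", "received": datetime(2002, 6, 3, 9, 0),
     "resolved": datetime(2002, 6, 3, 15, 30)},
    {"id": 2, "category": "permits", "received": datetime(2002, 6, 3, 10, 0),
     "resolved": datetime(2002, 6, 5, 11, 0)},
    {"id": 3, "category": "billing", "received": datetime(2002, 6, 4, 8, 0),
     "resolved": datetime(2002, 6, 4, 16, 0)},
]

def overdue(records):
    """Complaints whose resolution exceeded the assumed 24-hour standard."""
    return [c for c in records if c["resolved"] - c["received"] > RESOLUTION_STANDARD]

def recurring_categories(records):
    """Complaint counts by category, showing which problems keep coming back."""
    return Counter(c["category"] for c in records)

if __name__ == "__main__":
    late = overdue(complaints)
    print(f"{len(late)} of {len(complaints)} complaints missed the 24-hour standard")
    for category, count in recurring_categories(complaints).most_common():
        print(f"  {category}: {count} complaint(s)")

A compilation of this sort is what allows management to see which problems recur and where new service standards are needed.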

 

For instance, the Occupational Safety and Health Administration (OSHA) has shortened its customer complaint response time from 7 days to 1 day. The National Performance Review (1997, p. 6) reported that OSHA began listening to complaints of “slow turnaround” as early as 1994. Based on data from individuals, focus groups, and various survey mechanisms, OSHA began to set new standards for handling customer complaints. OSHA has not only expedited its response time, but has also instituted “direct call-backs” for follow-up purposes to assess fully the resolution of complaints. OSHA now also prefers to deal with a complainant by personal contact rather than by letter.

 

Similarly, the federal Bureau of Land Management (BLM) received plentiful and frequent complaints about the agency’s slow response in processing permit applications. The former, self-described “snail-paced” system required a letter from the customer, after which BLM would write the customer with a cost estimate for the permit. The customer would then respond in writing once again and include payment. The typical turnaround (depending on the type of permit) could take 15 days. Today, customers required to get permits from BLM can simply fax their requests and call BLM with credit card payment. Customers can also simply go online at BLM’s Web site and complete the permitting process, often in minutes. (Ibid.).

 

At the state level, South Carolina offers a similar innovation to answer customer demands. The Office of the Secretary of State handled a total of 88,938 business filings last year (FY 2000-01). This figure is more than 20,000 higher than the previous year, a 32% increase. Nevertheless, the number of customer service staff remained constant at 11 employees.

 

According to the most recent annual report of the Secretary of State’s office, it is placing greater emphasis on customer relations.  Currently, the office promises a 24-hour turnaround for all corporate filings. The secretary states that “this is essential to the business community; when a filing is made, then it should be on our records within 24 hours. This is a difficult task given the increase in volume and a limited staff. Additionally, most states do not offer this service and many charge special fees for ‘expedited’ service.  Still, the South Carolina Secretary of State sees this as a fundamental requirement of state government and provides customers with a prompt and quality service at no additional cost.” (See http://www.lpitr.state.sc.us/reports/aar2000/e08.doc).

 

Similarly, the S.C. Department of Alcohol and Other Drug Abuse Services (DAODAS) “calculates that some 310,000 persons in South Carolina are currently experiencing substance abuse problems that call for intervention and treatment.  However, the DAODAS provider system (county alcohol and drug abuse authorities) has only been able to reach just over 53,000 of these South Carolinians during FY 2000-01.

The specific customer standards used by DAODAS include:

· To ensure timely access to care and to engage clients in the continuum of care, clients should receive at least one unit of assessment within 2 calendar days from intake;

· Clients with an assessment should have at least one unit of service within 6 calendar days from assessment;

· Detoxification client episodes should be followed by at least one unit of service immediately (1 calendar day) after the detoxification episode;

· Residential client episodes should be followed by at least one unit of services within 6 calendar days after the end of the residential episode.” (Available at http://www.lpitr.state.sc.us/reports/aar2000/j20.doc).
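
Time-based standards of this kind lend themselves to straightforward compliance checks. The sketch below shows, under assumed client records and field names (not DAODAS’s actual data system), how the share of clients meeting each interval standard might be computed.

# Illustrative compliance check for time-based client service standards, e.g.,
# "at least one unit of assessment within 2 calendar days from intake."
# Hypothetical records and field names; not DAODAS's actual data system.
from datetime import date

STANDARDS = {
    # (start field, end field): maximum allowed calendar days between the two dates
    ("intake_date", "first_assessment_date"): 2,
    ("first_assessment_date", "first_service_date"): 6,
}

clients = [
    {"intake_date": date(2001, 7, 2), "first_assessment_date": date(2001, 7, 3),
     "first_service_date": date(2001, 7, 8)},
    {"intake_date": date(2001, 7, 5), "first_assessment_date": date(2001, 7, 9),
     "first_service_date": date(2001, 7, 12)},
]

def compliance_rate(records, start_field, end_field, max_days):
    """Share of clients whose interval from start_field to end_field met the standard."""
    met = sum(1 for c in records if (c[end_field] - c[start_field]).days <= max_days)
    return met / len(records)

if __name__ == "__main__":
    for (start, end), max_days in STANDARDS.items():
        rate = compliance_rate(clients, start, end, max_days)
        print(f"{start} -> {end} within {max_days} days: {rate:.0%} of clients")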

 

“Additionally, the county substance abuse authorities, in cooperation with DAODAS, utilize various survey instruments to gauge customer relations.  Most of these instruments measure the approval of clients with their facilities, accessibility, courtesy, professionalism, and treatment results.  To determine an overall rating of customer relations or satisfaction with services provided by the county authorities, DAODAS reviews the county plans submitted to the department each year.  With regard to these plans, the following objective has been established—at least 95% of the client-customers should rate ‘satisfactory or above’ the services received from the county substance abuse authorities.” (See http://www.lpitr.state.sc.us/reports/aar2000/j20.doc).

 

Figure 7

Customer Satisfaction Objective and Results

Customer Satisfaction Objectives:

FY98: 89%   FY99: 90%   FY00: 95.8%   FY01: 95%   FY02: 95%   FY03: 95%   FY04: 95%   FY05: 95%

(Note: The FY 2000 rating of 95.8% surpassed the objective of 90% and is now the benchmark for achieving customer satisfaction. The objectives for FY 2001-05 are for the local substance abuse authorities to continue to achieve this benchmark.) Source: DAODAS, available at http://www.lpitr.state.sc.us/reports/aar2000/j20.doc.

 

Keeping Customer Relation Approaches Current

 

The Malcolm Baldrige National Quality Program asks specifically, “How does your organization keep its approaches to building customer relations and keeping customer access current?” (NIST, 2002, p. 17). Again, it is important to realize that keeping “up-to-date” and “forward-thinking” is imperative in customer service and relationships. Changing circumstances, needs, and preferences must be continuously monitored, and adaptation must be the rule for achieving and maintaining excellence—results—in any government undertaking. This means that as internal and external situations change, an agency, department, or other organizational unit of government must be prepared to retain customers and keep them satisfied.

 

How is this done? As stated above, it is imperative to keep up with customers by building operational and management systems that are designed to keep customers happy once contact with them has been established. Also, it is important to have systems in place that solicit and address complaints in a prompt, friendly, and “personal” manner. One expert in Baldrige processes argues for the following “actions” to be taken on a continual, recurring basis:

· Recruit the “best and brightest” customer contact people, compensate them suitably, train them well, and give them authority to solve problems;

· Delineate customer service standards and then compare them with results;

· Provide toll-free numbers for customers to access representatives of the agency or department;

· Pursue all complaints, however trivial, and speedily resolve them to the satisfaction of the customer;

· Collect and analyze information and data on customers in order to build a database for continuous improvement purposes. (Brown, 2001, pp. 27-28).

 

Customer Satisfaction

 

Customer satisfaction is, or rather should be, the primary aim of an organization, private or public, which provides services and/or products. Those organizations that ignore this essential and fundamental fact will inevitably not succeed. Indeed, one thing that all literature on administration and management theory and practice appears to agree on is this—“a satisfied customer is a happy and contented customer.” Even Aristotle in his Nicomachean Ethics (c. 350 B.C.) states “happiness is ultimately the goal of humankind.” (Book 1, Section 7).

 

Gathering Accurate and Useful Customer Data

 

Continuous improvement efforts in an organization must be anchored in customer needs and satisfaction. A frequent mistake in the United States has been to focus the continuous improvement effort on what managers and employees assume is important to customers. Managers will be the first to say, “We have contact with our customers all the time; we know what they want.” I usually have the managers write down the top 10 needs of their customers. The overlap between the manager’s list and that of the customers is nominal. (Kessler, 1995, p. 59).

 

In the Baldrige framework, an organization should have in place various methods to ascertain customer satisfaction. Two chief methods are ordinarily suggested as sound ways to assess customer satisfaction: (1) comment or feedback surveys and (2) an annual survey of all customers or of a sample of customers. Even though these are normally deemed valid, the study of customer “behaviors” is an additional good way of determining customer satisfaction. For example, repeat use of services or products is equally indicative of customer satisfaction. Take, for instance, a person who returns frequently to a particular Social Security office for assistance with benefits. This person may find that working with this particular office is friendlier and more productive than working with another Social Security office closer to home. In sum, multiple measures, including measures of opinions and of behaviors, may be a better way to grasp individual satisfaction and general customer satisfaction trends.

 

Thus the determination of customer satisfaction is multifaceted and should take on a number of measurement approaches. These approaches may consist of:

· Indicators of loyalty to an organization and its services and/or products;

· Numerous sources and methods of data on customer approval (surveys, complaints, feedback);

· Objective data on “repeat” business and the “reasons” for such recurring business;

· Collection and analysis of opinion data using various methods to estimate or pinpoint customer endorsement;

· Regularity of surveying the satisfaction of customers.
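
One of these approaches, tracking “repeat” business as a behavioral indicator, can be computed directly from contact or transaction records. The sketch below is an illustrative example only; the record layout and customer identifiers are assumptions, not any agency’s actual data.

# Illustrative behavioral indicator: share of customers who made repeat use of a
# service, computed from hypothetical transaction records.
from collections import Counter

# Assumed records: one entry per service contact, identified by a customer id.
transactions = [
    {"customer_id": "C-101", "service": "benefits assistance"},
    {"customer_id": "C-102", "service": "benefits assistance"},
    {"customer_id": "C-101", "service": "benefits assistance"},
    {"customer_id": "C-103", "service": "records request"},
    {"customer_id": "C-101", "service": "records request"},
]

def repeat_customer_rate(records):
    """Fraction of distinct customers with more than one contact."""
    visits = Counter(r["customer_id"] for r in records)
    repeat = sum(1 for count in visits.values() if count > 1)
    return repeat / len(visits)

if __name__ == "__main__":
    rate = repeat_customer_rate(transactions)
    print(f"Repeat-customer rate: {rate:.0%}")

Read alongside survey scores and complaint data, an indicator like this helps distinguish stated satisfaction from what customers actually do.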

 

In state government, for example, customer satisfaction is an important part of the management system at the S.C. Department of Education (DOE). “It serves as the foundation of DOE’s continuous improvement efforts and includes all attributes that contribute value to internal and external customer satisfaction.” (See http://www.lpitr.state.sc.us/reports/aar2000/h63.doc). Further, “DOE determines near- and long-term requirements and expectations of its customers both formally and informally. Both methods provide feedback that is used to update strategic plans and action plans, design appropriate training services, provide technical assistance, and develop new products and procedures directed to improve learning and educational opportunities.” (Ibid.). Customer satisfaction at DOE is assessed, for instance, by:

· On-site assessment of student/teacher satisfaction of curricula, classroom equipment, etc.;

· Surveying and feedback (evaluation) of instructional performance of teaching staff;

· DOE bimonthly meetings with local superintendents and other school district officials. (Ibid.).

 

The S.C. Department of Social Services (DSS) is another state agency that is focusing on customer satisfaction by meeting the social services requirements, expectations, and preferences of its customers/clients. DSS is giving particular attention, according to its most recent annual report, “to defining who their customers are and realizing that new strategies and initiatives must be implemented to carry out the agency’s mission.” (See http://www.lpitr.state.sc.us/reports/aar2000/l04.doc). DSS states that:

With the establishment of agency outcomes focusing on improving the lives of children and families that we assist, the agency is moving away from monitoring activity or outputs and instead reviewing quality service delivery.  Goals established in the past focused on how many people we placed in jobs or whether or not we completed a treatment plan.  Under our outcome-based focus we need to know: that the jobs in which we are placing clients are helping our clients become self-sufficient; that the provision of services that have been indicated in the treatment plan were mutually agreed upon; and that they will lead clients to improving their health and well-being. (Ibid).

 

DSS has developed customer service surveys to measure the attainment of its mission, goals and objectives. For example, data from the surveys show that 80% of parents, once on welfare, have found employment.  In response to a query as to “whether life was better on welfare or off, 75% of families responded that life was better after leaving welfare.” (Ibid.). Similar to other survey trends now being utilized by governments to gauge customer satisfaction, a 24-hour “helpline” number has also been established at DSS. This toll-free helpline number allows those with concerns or needs regarding the care of children under the agency’s foster care programs to call. Children are also surveyed and participate in focus groups regularly to determine satisfaction levels. (Ibid). DSS also indicates that other surveying techniques and forms are being contemplated to ascertain customer satisfaction in an accurate and meaningful way.

 

Following-up and Customer Feedback

 

Important to Baldrige quality assessment is how an organization “follows up” with customers on services and/or products. This follow-up provides useful and timely information to an organization and allows for “quick and actionable” customer feedback. (NIST, 2002, p. 17). These things, taken together, are, by all accounts in the current literature, basic to achieving customer satisfaction.

 

One significant aspect of “quality-laden” follow-up and feedback is the frequency with which they are carried out. Obviously, a once-a-year customer satisfaction survey, while useful, is not ideal. Baldrige and customer survey experts generally agree that the more follow-up and feedback techniques utilized by a public agency or organization, the better. Frequent contact with customers is best for determining and sustaining satisfaction levels. Depending on circumstances, many experts suggest monthly contacts, or at a minimum, quarterly ones.

 

Additionally, survey or follow-up/feedback methods such as “transaction cards,” phone calls, mail surveys, incident documentation, etc. are recommended by Baldrige experts. For each instance of contact between an agency and a client, customer satisfaction should ideally be determined at that time and some reasonable follow-up should occur to verify satisfaction of customer needs and expectations. It should also be noted that such follow-up and feedback should be unobtrusive, sensible, and timely so as not to become somehow problematic or cause customer aggravation.

 

At the federal level, many government agencies and departments are concentrating on follow-up and feedback to ensure customer satisfaction. The following are indicative of this federal attempt to ensure satisfaction:

· Administration on Aging (AOA).  AOA tracks all grants on a monthly basis for those customers desiring closeout;

· Health Care Financing Administration (HCFA). HCFA measures satisfaction with recipients of Medicaid and Medicare benefits frequently through surveys, public comments, meetings with customer representatives, and focus groups;

· Rural Housing and Community Development Service (RHCDS). Per the Community Facilities Loan Program, RHCDS ensures loan applicants are provided appropriate forms in five working days, provides check-up services to assist in completing forms, and makes certain financial pre-loan requestors are contacted as to eligibility within 45 days. (Brown, 2001, pp. 10, 11, and 27).

 

In South Carolina, the Department of Health and Environmental Control provides a wide range of services (health and environmental programs) to the public. According to the agency, for example, “customer/client (user) satisfaction is one of the agency’s key values.  Each unit of DHEC is expected not only to survey its specific customer group, but also, to use that customer feedback to reshape and refocus what the unit does based on customer input. Staff has developed customized customer service training for the different types of employees at all levels throughout DHEC to meet this requirement.” (See http://www.lpitr.state.sc.us/reports/aar2000/j04.doc). DHEC states that: 

DHEC targets customers, customer groups and market segments by utilizing census data, input from the State Chamber of Commerce and Community Relations Council, statutes, laws, regulations, collaborations, feedback/assessments, partnerships/focus groups, and referrals. Some of the longer-term requirements are also mandated by laws, statutes and regulations.  As an agency we participate in many professional organizations that allow DHEC to benchmark services.  Participating in these organizations also affords staff the opportunity to take part in panel discussions, focus groups and forums where consideration of current and new trends occur. (Ibid.).

 

Using Data In Relation to Benchmarks

 

Customer satisfaction can be determined by a method generally referred to in the management and administration literature as “benchmarking.” Benchmarking is typically a measurement or process that designates a superior service or product of one organization that another organization desires to emulate, duplicate, or, even better, exceed. Benchmarking techniques are a significant part of the Baldrige process in assessing customer satisfaction. (See NIST, 2002, p. 17). “Additionally, states like Oregon, Minnesota, Texas, and many others have adopted benchmarking practices to improve their performance and service delivery areas. Though each state utilizes benchmarking concepts and approaches similar to those of corporations as well as other states, it should be noted that each state uses methods that are unique and appropriate to its own cultures, economies and politics.” (Berger, 2002, p. 25).

 

According to the literature, benchmarking as used by governmental agencies can mean one of three things. First, corporate-style benchmarking is a frequently used benchmarking approach. It is an adaptation of “best practices,” which entails the act of identifying or endeavoring “to copy” and “go beyond,” if possible, the finest processes, products, and/or services as established by standards of superlative quality. A second kind of benchmarking entails what is called “targeting.” This is the process of setting goals and objectives to be achieved generally through strategic planning actions. Two states exemplify this type of benchmarking—Oregon Benchmarks and Minnesota Milestones. A third category of benchmarking is the comparison of performance statistics as benchmarks. In this form of benchmarking, organizations or governmental agencies partner with other comparably designated organizations and identify what services are to be benchmarked. Benchmarks constitute, as such, the comparative performances of services and/or products among partnering governmental units (Ibid).

 

Figure 8

What to Benchmark?

Ten Questions

  1. What is the most important factor associated with the organization’s success (customer/client satisfaction)?
  2. What factors are causing the organization the most problems?
  3. What services are provided by the organization?
  4. What factors constitute customer satisfaction?
  5. What operational problems appear to be occurring in the organization?
  6. Where in the organization is competition being brought to bear?
  7. What are the significant costs in the organization?
  8. What organizational functions, areas, etc. indicate need for substantial improvement?
  9. What functions distinguish the organization from other organizations (competitors)?
  10. What is necessary for “buy-in” or commitment from organizational employees and stakeholders?

Source: Best Practices Benchmarking. The Innovation Groups. (nd).

 

When using benchmarking processes or methodologies, the use of several comparative sources of data is recommended. This provides for greater accuracy and comprehensiveness in measuring satisfaction levels. Also, objectivity of data is essential to the soundness of determining customer satisfaction. Hence, data sources should be balanced, complete, and reliable to the extent possible. Further, the amount of data collected is of consequence: the more reliable data collected, the greater the likelihood of accurate and meaningful analysis and comparison.
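
As a small illustration of the comparison-of-performance-statistics form of benchmarking described above, the sketch below compares one quality measure across several partnering jurisdictions and flags those falling below the group median. The city names and figures are invented for the example and are not Benchmarking Project data.

# Illustrative comparison of one quality measure across partnering jurisdictions,
# flagging those below the group median. City names and figures are invented.
from statistics import median

# Assumed measure: percentage of surveyed citizens satisfied with fire services.
satisfaction_with_fire_services = {
    "City A": 91.0,
    "City B": 84.5,
    "City C": 88.0,
    "City D": 79.5,
    "City E": 93.0,
}

def benchmark_report(measure_by_city):
    """Print each jurisdiction's result against the group median benchmark."""
    benchmark = median(measure_by_city.values())
    print(f"Group median benchmark: {benchmark:.1f}%")
    for city, value in sorted(measure_by_city.items(), key=lambda kv: kv[1], reverse=True):
        flag = "" if value >= benchmark else "  <- below benchmark"
        print(f"  {city}: {value:.1f}%{flag}")

if __name__ == "__main__":
    benchmark_report(satisfaction_with_fire_services)

In practice a partnering group would track many such measures over several years; the point here is simply that a shared benchmark gives each jurisdiction a comparative reference for its own results.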

 

The U.S. Air Force is a good example of benchmarking for quality improvement purposes. In 1995, the Air Force initiated a benchmarking program for its basic training program at Lackland Air Force Base in San Antonio, Texas. The Air Force command structure wanted to update and improve its training of recruits, and benchmarking was chosen as the method for doing this. First, competitors were identified. These included six basic military training programs throughout the armed forces branches of the U.S. and the United Kingdom. After extensive analysis of the basic training standards and processes of these “competing” programs, particularly the identification of performance gaps or differences, the Air Force set benchmarks for improvements in its own basic training program. One benchmark was to improve classroom performance by reducing class size from 100 recruits to 50. The result of this class reduction was a 312% increase in instructor-recruit participation (Q & A interchange), and recruit inattentiveness decreased by 32%. The Air Force additionally set a benchmark for its physical training component. This benchmark involved segregating recruits into groups based on physical readiness. Out-of-shape recruits were given special attention by placing them into intensive physical training units. The result of implementing varying physical fitness levels was improved performance for Air Force trainees; for instance, two-mile run times improved by an average of one minute. (Available at http://www.apqc.org/free/articles/dispArticle.cfm?ProductID=649).

 

Closer to home, in South Carolina, the S.C. Municipal Benchmarking Project provides a forum for South Carolina’s cities of varying sizes to share performance information. Performance information or statistical data on four service areas (police, fire, solid waste services, and parks and recreation) are being shared among project participants. By benchmarking their performance and meeting with counterparts across the state, participant municipalities are monitoring and comparing their performance to learn more efficient and effective ways of providing local services. (See http://www.iopa.sc.edu/cfg/Benchmark/index.htm).

 

Keeping Customer Satisfaction Methods Current

 

Customer satisfaction expectations and preferences are in a constant state of flux. Therefore, organizations, especially public ones, should keep abreast of customer change. This is very important in determining customer satisfaction, and satisfaction methods should be kept current. The question for a governmental agency or unit is thus: “In what ways are you keeping customer satisfaction methods up-to-date to reflect customers’ changing needs and desires?”

 

As emphasized earlier in this discussion on customer needs, relations and so forth, frequent reviews of customer data and information are required to understand and meet customer demands. To keep up with, maintain and even surpass customer satisfaction levels a public agency should review and modify its data-gathering and analytical processes on a regular basis.

 

For example, again, South Carolina’s Budget and Control Board management team reviews and updates its customer/stakeholder satisfaction determination approaches often. The senior management of the Board reviews its post-service questionnaires, feedback cards, and complaint systems on an almost constant basis to ensure the reliability and timeliness of customer satisfaction data. (See 2000-01 BCB Annual Accountability Report, pp. 14-16). Other state agencies in South Carolina appear to be doing the same. (See http://www.lpitr.state.sc.us/reports/aar2000/aar2000.htm).

 

In closing, a key to keeping satisfaction determination methods current is being flexible. As customers change, so should the organization. To do this, and to do it consistently, a public or governmental unit must not “straitjacket” itself with outdated tools for measuring customer satisfaction. Flexibility within an organization permits employees and processes to adjust to changing external environments and to customers’ shifting needs, whims, and levels of satisfaction. The gathering and analysis of customer satisfaction data should change as the customers themselves, and their varying, shifting expectations, change.

 

Conclusion: Customer Knowledge, Relations and Satisfaction

 

In this paper the concepts and criteria associated with customer-driven focus and excellence have been discussed in relation to public sector entities. Using the Baldrige National Quality Program as a foundation, the significance of customer requirements, relations, and satisfaction has been explored in a general sense. A few examples were pointed to at the federal, state, and local levels of government, and it is abundantly clear in the literature that much of American government is seriously striving to identify and comprehend the needs of its citizenry and is attempting to offer customer excellence where and when it can. In many cases, governmental entities are succeeding in meeting citizen needs. In other cases, unfortunately, public organizations are falling short of meeting all customer/client/citizen needs and expectations. What is of import here, however, is that agencies and departments, and even the smallest units of government, are listening and learning, following up on complaints, taking corrective action, and so on. This is encouraging and indicates that government does care about those whom it serves. Hopefully, the future will offer greater and greater evidence that this “customer-driven” focus will continue and grow.

 

References

 

Berger, Anna. (2002). “The S.C. Municipal Benchmarking Project.” June 2002. Public Policy & Practice. Vol. 1, No. 3: pp. 25-32.

 

Brown, Mark. G. (2001). The Pocket Guide to Baldrige Award Criteria. 8th Edition. Portland, OR: Productivity, Inc.

 

Brown, Mark G. (2001). Baldrige Award Winning Quality: How to Interpret the Baldrige Criteria for Performance Excellence. 11th Edition. Portland, OR: Productivity, Inc.

 

Day, George S. (1999). The Market Driven Organization: Understanding, Attracting, and Keeping Valuable Customers. New York, NY: The Free Press-Simon & Schuster, Inc.

 

Fisher, Donald C. (1994). Measuring Up to the Baldrige: A Quick & Easy Self-Assessment Guide for Organizations of All Sizes. New York, NY: AMACOM Publishers.

 

Gitomer, Jeffrey. (1998). Customer Satisfaction is Worthless—Customer Loyalty is Priceless. Austin, TX: Bard Press.

 

Heaphy, Maureen S. and Gruska, Gregory F. (1995). The Malcolm Baldrige National Quality Award: A Yardstick for Quality Growth. Reading, MA: Addison-Wesley Publishing Company.

 

Hutton, David W. (2000). From Baldrige to the Bottom Line: A Road Map for Organizational Change and Improvement. Milwaukee, WI: ASQ Quality Press.

 

Kessler, Sheila. (1995). Total Quality Service: A Simplified Approach to Using the Baldrige Award Criteria. Milwaukee, WI: ASQC Quality Press.

 

National Institute of Standards and Technology. (2002). Criteria for Performance Excellence: Baldrige National Quality Program. Washington, DC: U.S. Department of Commerce.

 

National Performance Review. (October 1997). Putting Customers First: Standards for Serving the American People. Washington, DC: NPR, Office of the Vice President.

 

Osborne, David and Plastrik, Peter. (1997). Banishing Bureaucracy: The Five Strategies for Reinventing Government. Reading, MA: Addison-Wesley Publishing Company, Inc.

 

Sponhour, Michael. (2002). “Turn Your Annual Report into an Engine of Change.” October 2002. Public Policy & Practice. Vol. 2, No. 1: pp. 5-9.

 

U.S. Department of Education. (December 2000). Customer Focus at DOE: The Way We Do Business: A Manager’s Guide for Action Planning. Washington, DC: U.S. Department of Education.

About the Author

 

Richard D. Young

 

Richard D. Young has been a senior research associate with the Institute for Public Service and Policy Research at the University of South Carolina since 1998. He conducts research on a myriad of public policy and public administration topics relating to state and local governments. Mr. Young previously worked with the Senate of South Carolina and the State Reorganization Commission in various research positions. Prior to this, Mr. Young taught at the University of Louisville, Hanover College, Indiana University Southeast, and the University of Kentucky campus in Louisville, Kentucky. Mr. Young has a B.A. (1973) and an M.A. (1975) from the University of Louisville. Mr. Young has written and published several papers and reports on public policy issues and public management theory. The Institute for Public Service and Policy Research has also published Mr. Young's A Brief Guide to State Government in South Carolina (1999) and A Guide to the General Assembly of South Carolina (2000). The Institute has additionally published Mr. Young's book entitled Perspectives on Public Budgeting: Budgets, Reforms, Performance-based Systems, Politics and Selected State Experiences (2001). He and Dr. Luther F. Carter, President of Francis Marion University, have recently co-authored a paper, due to be published in 2002, entitled The Governor: Powers, Practices, Roles and the South Carolina Experience. Mr. Young heads the Institute’s Project for Excellence in Government, a project whose aim is to foster and promote scholarly research on Baldrige concepts and criteria as they relate to state and local governments. He is currently editor of the e-journal Public Policy & Practice (See http://www.iopa.sc.edu/ejournal/ ).

 

 



1 As of the publication of this paper, 19 municipalities were participating in the project.