Planning Without Politics

Nov. 1, 2004
Empirical benchmarking enables administrators to back up their facility requests with objective data.

It's a common way to plan new facilities: principals, teachers, maintenance staff, parent groups and others are asked to compile lists of needed improvements. A consultant is hired to assess the facilities and estimate the costs. A task force of citizens, parents, students and community leaders is assembled to review, modify, prioritize and endorse the list. The school board tweaks the list a little more; sometimes it sends the list back to the task force for changes. Sometimes the citizens are polled to gauge the level of support. Finally, the board approves the plan and presents it to the public.

Why does this method often fail? It's laced with politics and subjective judgment. Principals and teachers have differing ideas of adequacy — they ask for what they want, which is often more than they need. Maintenance staff will lobby for replacing as many systems as they can to minimize future repair work. Coaches and fine-arts instructors try to get as much as they can. The task force prioritizes by subjective ranking. The board often will add a few pet projects. The citizen poll is based on subjective judgment. The public will be asked to support a bond based on the opinions of the task force.

The intentions of those involved are sincere, but the process has few objective controls. The district staff typically knows most of what is needed, but it takes only a few frivolous improvements or a perception of unfairness to sink an improvement program and damage the district's credibility.

The way to avoid these pitfalls is to use empirical benchmarks. Empirical benchmarks take the politics out of facility planning. Be warned, though: many don't like using a benchmark system because it is difficult to manipulate.

The process

Step 1: Establish benchmarks

The board should approve benchmarks before an assessment. Benchmarks should be fairly simple and spelled out in terms the public can understand. They should be measurable. The following are major categories of benchmarks. There often are exceptions, but they must be justified with objective evidence:

  • General building area (square feet per student)

    Sub-categories such as space utilization rates or circulation ratios also can be used, but don't get too complicated. Compare district facilities with planning standards and peer districts.

  • Individual space size

    Compare classrooms, labs and support space with established standards.

  • Number of spaces at each campus (such as labs, gyms and rehearsal rooms)

    Look at similar school districts to establish facilities benchmarks (such as one gym per 500 students, one music rehearsal area per 650 students, or one science lab per 180 students). This step is important because it provides a control on the number of spaces, and it serves as a reference that citizens relate to better than planning standards.

  • Systems/material age

    Compare the actual age of a facility's systems with their life expectancy. Agree that only systems beyond or near their normal lifespan are to be replaced. If a system is worn out well before it should be, that is evidence of a malfunction or a maintenance problem. Correct these problems quickly to head off a perception of inferior maintenance.

  • Code and regulatory compliance

    There's no measurement here — a facility either complies or it doesn't. Ask the board to agree to put any non-compliant items on the list of needed improvements. This includes building codes, accessibility, life-safety codes and state facility standards.

  • High-impact features

    This measures facility features that have proven to affect student achievement, such as lighting, temperature control, indoor air quality, acoustics, space and technology. These can and should be measured in quantifiable terms.

  • Site features

    Measure site features such as the age of materials, the size of the site area, the number of parking spaces and playing fields, and the level of accessibility.

  • Operating cost

    Compare utility cost per square foot with industry standards and peers. It also is important to consider the cost per student. Consider the number of custodians (square feet per custodian).

It's important to use peer district comparisons where possible. Citizens relate to peer comparisons more readily than they do to comparisons with state or professional planning standards.
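The benchmark categories above reduce to simple, measurable comparisons. As a hypothetical sketch only, the metric names, benchmark values and district figures below are invented for illustration; a real district would substitute its own adopted benchmarks and peer data:

```python
# Illustrative benchmark comparison. All names and numbers are
# hypothetical examples, not figures from any actual district.

BENCHMARKS = {
    # metric: (district value, benchmark value, unit)
    "building_area_sqft_per_student": (135.0, 150.0, "sq ft/student"),
    "students_per_gym": (620, 500, "students/gym"),
    "students_per_science_lab": (210, 180, "students/lab"),
}

def flag_deficits(metrics):
    """Return the metrics where the district falls short of the benchmark.

    For area metrics, a shortfall means fewer square feet per student
    than the benchmark; for students-per-space ratios, it means more
    students per space than the benchmark allows.
    """
    deficits = {}
    for name, (district, benchmark, unit) in metrics.items():
        if name.endswith("per_student"):
            short = district < benchmark   # lower is worse
        else:
            short = district > benchmark   # higher is worse
        if short:
            deficits[name] = (district, benchmark, unit)
    return deficits

for name, (district, benchmark, unit) in flag_deficits(BENCHMARKS).items():
    print(f"{name}: district {district} vs benchmark {benchmark} ({unit})")
```

Because each comparison is a plain ratio against an adopted number, the output is easy to present to a board or task force without interpretation.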

Step 2: Benchmark education

Educate those involved in planning about the benchmarks, and emphasize that they need to be followed. Common knowledge that benchmarks are in place will discourage most frivolous requests. Require anybody requesting features that exceed the benchmarks to justify their proposals. Even more effective is requiring those seeking additional features to persuade the board to support improvements that exceed benchmarks.

Step 3: Assessment/planning process

Execute the planning and assessment process preferred by the district, but apply the benchmarks to the process. Poll the district staff, have the assessment done, assemble task forces, hold community meetings, have tours, hold board workshops and price the improvements. Keep the benchmarks in simple terms, and stay within the benchmarks. Complete the improvement plan, and show how each improvement is justified based on the benchmarks. Prioritize the suggested additional improvements based on how much they deviate from the agreed-upon benchmarks. Resist the temptation to slip in a few projects that aren't justified.
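The prioritization rule in Step 3 can be made mechanical: rank each proposed improvement by how far the current condition deviates from the agreed-upon benchmark. The following sketch uses made-up figures purely to show the arithmetic:

```python
# Hypothetical example of ranking improvements by percent deviation
# from the adopted benchmark. All figures below are invented.

improvements = [
    # (name, current value, benchmark value)
    ("science labs (students per lab)", 210, 180),
    ("standard classrooms (students per room)", 31, 22),
    ("gyms (students per gym)", 620, 500),
]

def deviation(current, benchmark):
    """Percent by which the current condition exceeds the benchmark."""
    return (current - benchmark) / benchmark * 100

# Largest deviation first = highest priority.
ranked = sorted(improvements,
                key=lambda item: deviation(item[1], item[2]),
                reverse=True)

for name, current, benchmark in ranked:
    print(f"{deviation(current, benchmark):5.1f}%  {name}")
```

In this example the classrooms rank first, echoing the "Classrooms" case below: a politically favored project can turn out to be a smaller deviation from benchmark than a quieter need.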

Step 4: Present improvements to the public

Show the empirical justification for each improvement in simple terms. It is important to advertise the use of benchmarks to the public. Emphasize that the benchmarks will be kept in place after improvements are completed. This demonstrates initial and long-term fiscal management.

Step 5: Keep the benchmarks in place

Resist the tendency to ignore benchmarks once funding has been secured and detailed planning begins. Use the benchmarks as an annual monitoring system. Provide annual reports to the board and public.

Case in point

Here are some cases in which benchmarking led to more informed planning decisions:

  • Classrooms

    A science department was lobbying hard for more labs. Because of political connections, proponents had convinced almost everyone that new labs were the most critical need. Empirical benchmarks determined that there was a moderate need for science labs, but there was a more critical need for standard classrooms. The outcome? Both are under construction.

  • Gymnasiums

    A district was not having much success convincing citizens that elementary schools should have play gyms separate from cafetoriums. Peer comparison revealed that 90 percent of the schools in peer districts have separate play gyms. The gyms now are under construction.

  • Tennis courts

    A district was having difficulty persuading the public to support having more tennis courts in a bond program. Benchmarking found that the district's court count was 60 percent below that of peer districts. The tennis team now has new courts.

  • Fewer classrooms

    A principal was lobbying hard for 12 more classrooms. Benchmarks indicated only eight were needed, and the board approved only eight. People commented during the bond campaign that this demonstrated fiscal responsibility. The eight rooms constructed six years ago have proven to be adequate to this day.

  • Meeting basic needs vs. political limits

    A district historically had polled citizens to determine the size of the bond they would support and then developed improvements up to that amount. In its most recent proposal, the district identified $60 million as this “political” limit. Benchmarking demonstrated a basic need of much more. Citizens were persuaded to support $89 million to meet basic needs.

  • Maintenance credibility

    One district was having a problem with the perception that facilities had not been maintained properly. Its bond campaign was struggling. Benchmarks demonstrated that the district was getting more life out of systems than normal. The district was able to demonstrate the quality of its maintenance program, and the bond passed.

  • Facility program redo

    A district used a subjective planning process as the basis for a bond election. The campaign was plagued by unanswered questions and a perception of excess and pet projects, which damaged the district's credibility. A subsequent proposal, based on empirical benchmarks, passed by 20 percentage points.

Sidebar: Benchmarking benefits

  • Ensures most critical needs are met.

  • Ensures facility equity.

  • Ensures basic needs are met without overbuilding.

  • Ensures credibility.

  • Corrects deferred maintenance.

  • Eliminates unwarranted pet projects.

  • Demonstrates fiscal responsibility.

  • Provides annual monitoring tool.

  • Improves facility efficiency and effectiveness.

Making a case

By using normal-life benchmarks for materials and systems, a district can demonstrate to the public why materials and systems need replacement. A simple bar graph can demonstrate that major systems exceed their normal life expectancy. This is helpful in convincing the public that the buildings have been maintained properly and that some systems wear out after a normal life expectancy.
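The normal-life comparison behind that bar graph is a simple ratio of actual age to life expectancy. As a hedged sketch, the systems, ages, life expectancies and the 90 percent "near end of life" threshold below are all invented for illustration:

```python
# Hypothetical normal-life comparison for building systems.
# Systems, ages, lifespans and the threshold are example values only.

def classify(age, life, threshold=0.9):
    """Classify a building system by its age relative to normal life."""
    ratio = age / life
    if ratio >= 1.0:
        return "replace"           # beyond normal life expectancy
    if ratio >= threshold:
        return "plan replacement"  # near the end of normal life
    return "keep"                  # well within normal life

systems = [
    # (system, age in years, normal life expectancy in years)
    ("roof membrane", 22, 20),
    ("chiller", 18, 23),
    ("boiler", 28, 30),
]

for name, age, life in systems:
    print(f"{name}: {age}/{life} yrs -> {classify(age, life)}")
```

A table or bar graph of these ratios makes the same point to the public at a glance: systems past 100 percent of normal life wore out on schedule, not from neglect.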

Hunter, AIA, is president of Hunter Corral Associates, Educational Facility Consultants, Odessa, Texas.
