Architecture Benchmarking Survey Team

Team ID: 778
College / Administrative Unit: Arts and Architecture, College of
Date Started: February 2007
Objective: 
Identify institutions with undergraduate programs and facilities similar to those of the PSU Department of Architecture. The following process was used:
  1. Contact those institutions with requests for specific data (enrollment, FTE faculty, permanent budgets, etc.) regarding their programs.
  2. Compile respondents’ data in tabular form.
  3. Distribute the compiled data to respondents whose institutions agreed to participate in sharing data.
  4. Analyze the data and evaluate the survey instrument.

The benchmarking survey was prompted by inquiries within the Penn State Department of Architecture about how its programs, resources, and facilities compare with those at similar institutions. The survey's objective was to collect pertinent data for analysis and thereby direct improvement efforts.
Desired Results: 

The objective was accomplished within the limitations of the survey items: the data gathered did inform certain comparisons (e.g., the student/faculty ratio appears to be high at Penn State). Furthermore, by one measure of a survey's success independent of data quality, the response rate, the instrument performed well at slightly above 40% (cf. Kaplowitz, Hadlock, & Levine, 2004, http://poq.oxfordjournals.org/cgi/content/full/68/1/94). This is not a trivial consideration, since some survey items required coordinated effort among several individuals and departments at participating institutions to produce a response.

However, the data also revealed that the survey instrument, as developed and administered, did not collect everything that was desired. Items related to endowments were, as a rule, left unanswered, and those responses would have been of great interest.

One concluding suggestion is that the benchmarking survey be followed up with direct questions to respondents about their institutions' satisfaction with the data shared. Those additional responses might support efforts to replicate or even expand the survey when the data need updating.

Contact Person: Dan Willis
Members:
  • Dan Willis, Project Director
  • Charlie Cox, Member
  • Jodi La Coe, Member