"Assessment of Student Learning: A Dialogue"

May 2006

At the May 16, 2006, Quality Advocates session, Renata Engel, Chair of the Coordinating Committee on University Assessment (CCUA), Executive Director of the Schreyer Institute for Teaching Excellence, Associate Dean for Teaching Excellence, and Professor of Engineering Science and Mechanics and Engineering Design, led a conversation among members of the CCUA and faculty and staff from University Park and several campuses about implementation of the University’s Assessment Plan for Student Learning. Charged by the Provost in 2005, the CCUA serves as a standing committee of the University with initial responsibility for developing and implementing a university-wide plan for the assessment of student learning.

Development of Our Assessment Plan

Dr. Engel began by providing an overview of how the CCUA developed the plan. The Committee’s early discussions focused on what it was trying to assess: first, what it would want to know about a program and why that information was important, and only after that, what data would be needed to provide it. The Committee also recognized that different units within the University were in different places on the assessment path. For example, even though General Education and cocurricular programs both cut across the University, General Education had program goals and identified outcomes, the foundation for assessment, whereas cocurricular learning outcomes had not yet been identified when the Committee began its work. Many academic programs are formally accredited through external associations such as ABET and AACSB and already follow those associations’ outcomes-based assessment processes. Many other programs are not formally accredited but have assessment processes in place. The Committee concluded that the most effective approach is to recognize existing good processes and practices and share them across the University. Each program will need to develop an approach to assessment that answers the questions that will lead to improvement in learning outcomes. The key is to ensure that all units are moving along the path from course assessment (where much information is already available) to program assessment to University-level assessment. Dr. Engel pointed out that in the cover letter to the plan, President Spanier wrote that “the plan will continue to evolve…”

Assessment Approaches

Jack Selzer, Associate Dean for Graduate and Undergraduate Studies, talked about how he is approaching assessment for undergraduate programs in the College of the Liberal Arts. The College has for some time had a good approach to assessing its graduate programs, and this has contributed to the advancement of those programs over the last few years. For undergraduate studies, Dr. Selzer is now asking department heads and undergraduate coordinators how their programs are doing on indicators of undergraduate program effectiveness. The indicators he is using are analogous in some ways to those used for the graduate programs, and they include items such as:

  • Number of students in a major and minor, and the demographics of those students (e.g., gender, ethnicity)
  • Number and percentage of honors students
  • Quality of advising
  • Evidence of student satisfaction
  • Number and percentage of courses taught by tenure-line faculty (overall and upper division)
  • Study abroad, internship, and research involvement of students
  • Involvement of students in the intellectual life of the department, such as attending presentations by guest speakers and the presence of clubs and organizations
  • Outcomes, such as graduate school placement and types of jobs held by graduates

Departments also suggest indicators relevant to their specific disciplines.

Lisa Shibley, Institutional Research and Assessment Officer, Penn State Berks, discussed several initiatives at that campus. Starting several years ago, Berks asked program coordinators to design their own assessment, with learning objectives tied to programs, the college, and the University. Six to eight objectives were defined for each program, to keep the project manageable, and two to three strategies were identified to assess each objective. Curricular maps are used to identify both overlaps and gaps in program curricula. Units are systematic in their approach, spreading their efforts over several years if necessary and using multiple means to gather data, including exit interviews, focus groups, portfolios, alumni surveys, internship assessments by both student participants and field supervisors, and student presentations to program advisory councils.

Dr. Shibley gave two specific examples of how programs use assessment to improve the curriculum: sometimes the assessment initiative leads to further exploration of a particular issue, while other times faculty use the information to make improvements within program courses. First, a preliminary assessment revealing that Chemistry 12 had the highest failure rate led Penn State Berks to complete a full assessment of the course. The full assessment includes gathering data about student attitudes toward chemistry, exploring hybrid and blended learning and peer teaching, and assessing learning within the classroom. Improvements are being implemented in this course in phases over several years. Second, Penn State Berks identified opportunities for improvement in its Professional Writing program courses. Through exit interviews, portfolios, and student presentations to the program’s advisory council, the Berks Professional Writing faculty discovered that students wanted less theory and more practice in the program. Their changes focused on helping students understand why they needed the theory they were getting and on seeking appropriate opportunities to enhance students’ practical professional writing experiences.

Using Student Portfolios

The opportunity to use student portfolios, including e-portfolios, as a means to drive assessment and improvement was addressed by several units. The College of the Liberal Arts is exploring using student portfolios as the basis for awarding a certificate in communication excellence, similar to the Teaching with Technology certificate for graduate students, to students whose portfolios demonstrate effective communication. Recent advances in e-portfolios were also discussed, including an e-portfolio system that would allow not only review of an individual student’s work but also let administrators review a ‘dashboard’ matrix of the items uploaded to fulfill particular learning objectives and the feedback provided to students by faculty, and then use this information to plan special programs, invite speakers, or make presentations to accrediting agencies. Participants also noted that the novelty of a portfolio, whether on paper or electronic, can start a discussion of assessment through the question of what evidence should be in a student’s portfolio after one, two, three, or four years in a program. Penn State is part of Cohort III, along with 12 other universities and university systems, of the National Coalition for Electronic Portfolio Research. This research group will explore linking classroom and cocurricular learning outcomes through e-portfolios.


Data Collection and Resources

The conversation touched on several points about data collection. First, different ethnic and cultural groups may respond differently to different methods of data collection. Second, there may be interest in measuring changes in attitudes and behaviors as well as in knowledge and skills. Third, it is still difficult to determine how to collect meaningful data in some cocurricular areas. Finally, the relationship between student satisfaction and learning outcomes remains an open question, both in the near term (at or shortly after graduation) and in the longer term (a year or several years after graduation).

Participants shared experiences with the use of multiple concurrent methods of data collection. This can include asking for self-reports in several areas, or using focus groups while at the same time gathering related quantitative data such as GPAs. It can also mean using control groups, in which students who did and did not have a particular experience are compared. It can mean involving peers in data collection and using several approaches to reach out and let underrepresented populations know that there is interest in their responses.

There was also discussion about making the most effective use of available resources. The CCUA will provide information on good practices and build linkages on its Web site. In fall 2006, the Schreyer Institute for Teaching Excellence will offer the Assessment Academy for faculty. Dr. Shibley mentioned that relating assessment to strategic planning provides a link to resources. Penn State Berks and Penn State Altoona are sharing ideas for understanding and improving assessment initiatives and resources at their respective campuses. Penn State DuBois uses teaching portfolios to track outcomes and improvements. Penn State Hazleton surveys graduates one year after graduation. Health Policy and Administration uses a Web application for surveys that is straightforward enough for staff to set up surveys themselves.

As the group discussed approaches to resources, Louise Sandmeyer, Executive Director, Office of Planning and Institutional Assessment, pointed out that the University is building a culture of evidence because it is the smart and right thing to do. Dr. Selzer observed that resources are likely to find and follow assessment initiatives because of their importance. Dr. Engel stressed the importance of moving beyond the session’s discussion to action and of closing the loop: using the data developed during assessment to make improvements.

The Quality Advocates Network meets several times each semester to share ideas and examples of improvement and change. To join the Quality Advocates Network mailing list or to learn more about the meetings scheduled, contact the staff at psupia@psu.edu.

The Quality Advocates Network is open to all Penn State faculty, staff, administrators, and students.