24:1 (2009:03) Evaluation & Assessment Committee Annual Report

March 18, 2009 at 4:29 pm | Posted in Evaluation & Assessment

EVALUATION & ASSESSMENT COMMITTEE ANNUAL REPORT

Committee members:  Lori Terrill, Chair (University of Wyoming), Ann Doyle Fath, Co-Chair (Getty Research Institute), Carole Bell (Temple University), Jana Brubaker (Northern Illinois University), Sarah Corvene (Harvard Business School), Susan Davis (State University of New York, Buffalo), Janice Lindquist (Rice University), Martha Spring (Loyola University of Chicago), Christina Torbert (University of Mississippi)

Board Liaisons:  Alison Roth (January-June), Anna Creech (July-December)

The committee spent the first part of the year preparing for the conference evaluation survey, reviewing standard questions and consulting with the Program Planning Committee (PPC), Conference Planning Committee (CPC), Site Selection Committee (SSC), and the Executive Board in order to identify any needed updates to those questions.  Low response rates in 2007 for preconference and poster session evaluations were discussed.  The committee decided to incorporate the separate forms for those sessions into the main conference evaluation form in an effort to increase the response rates.

After reviewing the features and capabilities of ArcStone’s survey software, the committee recommended to the Executive Board that SurveyMonkey continue to be used for evaluation forms, and the board concurred.  2008 was the first year the evaluation forms were available only online, and individuals who completed the evaluation were eligible for a drawing for a free conference registration in 2009 or 2010.  The survey was available on the conference website, and announcements were made via e-mail on NASIG-L and SERIALST and via a blast message.

In July, the committee analyzed the results of the evaluation survey.  The committee was pleased to have one of the best response rates in NASIG history:  328 of the 520 conference attendees filled out the online evaluation form—a 63% response rate.  The single form led to better response rates for preconference and poster session evaluations.  The winner of the free conference registration was selected in a random drawing, and the results were announced in the NASIG Newsletter and via the NASIG discussion forums in mid-July.  The confidential report of the conference evaluation results was sent to the Executive Board and to the PPC and CPC co-chairs on August 1, 2008.  An abridged version of the report was submitted to the NASIG Newsletter and published in vol. 23, no. 3 (Sept. 2008).  Twenty-eight conference presenters requested and received individual evaluation results, which were sent out in late July and early August.

In June, the committee was asked to prepare a ranked list of the last two years’ most popular themes and programs, based on conference evaluation results, for use by the Continuing Education Committee (CEC) and PPC.  The report on our findings was completed and submitted to CEC and PPC on August 29th.

In the committee’s report for the fall board meeting, we recommended that the board establish a policy for access to the SurveyMonkey account given that many of the groups using it are collecting confidential information.  Possible points to include in the policy were recommended and a decision is in process.  Once the board has established a policy, it will be incorporated into the committee’s procedure manual.

In the fall, the committee developed a survey to go out to the membership on issues related to conference attendance.  The survey was submitted to board members for input prior to dissemination.  The survey was live for four weeks (October 27-November 21, 2008) and garnered 593 responses.  Results of the survey were reported to the Executive Board on December 8, 2008, and were published in the NASIG Newsletter, vol. 23, no. 4 (Dec. 2008).

As a wrap-up to the committee’s work, the chair and co-chair reviewed the E&A Committee Procedure Manual and made updates.

Given the reduced workload resulting from the transition to online-only conference evaluations (the committee no longer has the time-consuming task of manually entering data), the committee recommends that the Executive Board reduce the committee’s size from nine members to five.  A gradual reduction may be necessary in order to keep terms staggered.

ACTION ITEM: Board should consider gradually stepping down the size of the E&A Committee from nine to five members.
