EGLS3 – Evaluation for Greater Learning Student Survey: System Implementation at Houston Community College
February 23, 2012
Dr. Alan Ainsworth, Former HCC Faculty Senate President
Mario Heredia, Director, Research & Support Services
Dr. Mac Adkins, President, SmarterServices LLC
Jeff Worford, Lead Developer, SmarterSurveys™ Product
Houston Community College
● A comprehensive community college serving the Houston area
● Over 75,000 credit and non-credit students in Fall 2011
● One of the highest international student enrollments of any community college in the country
Student Evaluation of Instruction – History
● The paper-based Student Evaluation of Instruction (SEOI) survey had been in place for 27 years, revised incrementally.
● For face-to-face classes, the SEOI was HCC's paper method for administering the student evaluation.
● For online classes, Distance Education used its own online survey.
● Staffing: four part-time staff plus a supervisor working for two months, in a large work room.
● Given current technology, Institutional Research sought:
  - Online administration methodologies
  - A new instrument
HCC's SEOI Survey Paper Distribution System
● IR manually prepared and routed more than 5,000 packets and 100,000 SEOI instruments each semester. Each packet contained the SEOI instrument and instructions for faculty and proctors.
● The packets were sent to their respective campus administrators.
● Campus administrators were responsible for distributing the materials to the instructors and collecting them from the proctors (who administered the survey to the students).
● Once IR received the packets, they were entered manually into the system to track the return rate.
● IR cleaned the packets, scanned the surveys, and generated a data file. The report was run in SPSS.
● The final report was saved on disks and sent to the college administrators for manual distribution. Students' handwritten comments were sent back to faculty.
● The process was cumbersome, labor intensive, and costly.
● SEOI means-report results were not used systematically in the evaluation process.
● Return rates had been an issue, but in Spring 2010 the return rate increased to 95% and the response rate was 73%.
Student Evaluation of Instruction (SEOI) – Paper-Based Instrument
EGLS3 Student Evaluation Survey Instrument
Houston Community College EGLS3 – Evaluation for Greater Learning Student Survey System
Organization and Explanation of Materials (Strongly Agree / Agree / Disagree / Strongly Disagree)
● My Instructor explains difficult material clearly.
● My Instructor communicates at a level that I can understand.
● My Instructor makes requirements clear.
● My Instructor identifies relationships between and among topics.
Learning Environment (Strongly Agree / Agree / Disagree / Strongly Disagree)
● My Instructor establishes a climate of respect.
● My Instructor is available to me on matters pertaining to the course.
● My Instructor respects diverse talents.
● My Instructor creates an atmosphere in which ideas can be exchanged freely.
Self-Regulated Learning (Strongly Agree / Agree / Disagree / Strongly Disagree)
● My Instructor gives assignments that are stimulating to me.
● My Instructor encourages me to develop new viewpoints.
● My Instructor arouses my curiosity.
● My Instructor stimulates my creativity.
Overall Opinions (Strongly Agree / Agree / Disagree / Strongly Disagree)
● I like this instructor.
● I am interested in this subject.
● I think the classroom was appropriate for this class.
● I would recommend a course taught by this instructor.
General Information
● This class is: A Required Course / An Elective / I am not sure
● What grade do you expect to earn in this course? A / B / C / D / F
Important EGLS3 Features from UNT's SETE
● These are the factors, or elements of teaching effectiveness, that are measured:
  - Factor 1: Organization and Explanation of Materials
  - Factor 2: Learning Environment
  - Factor 3: Self-Regulated Learning
● Four questions are used in the EGLS3 validity analysis. Scale scores (the weighted responses) are reported for each of the three domains, and an overall score is also computed. The following are sample scale ranges for EGLS3.
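As a rough illustration of how weighted scale scores per domain might be computed, the sketch below maps the four response options to numeric weights and averages the four items in each factor. The 4-point weighting, the simple averaging, and the item identifiers are assumptions for illustration only, not the published SETE/EGLS3 scoring algorithm.

```python
# Hypothetical sketch of domain scale scoring from Likert responses.
# The 4-point weighting and simple averaging are illustrative assumptions,
# not the published SETE/EGLS3 algorithm.
from statistics import mean

LIKERT_WEIGHTS = {
    "Strongly Agree": 4,
    "Agree": 3,
    "Disagree": 2,
    "Strongly Disagree": 1,
}

# Four items per domain, as on the EGLS3 instrument (item ids are hypothetical).
DOMAINS = {
    "Organization and Explanation of Materials": ["q1", "q2", "q3", "q4"],
    "Learning Environment": ["q5", "q6", "q7", "q8"],
    "Self-Regulated Learning": ["q9", "q10", "q11", "q12"],
}

def domain_scores(responses):
    """responses: list of dicts mapping item id -> Likert label."""
    scores = {}
    for domain, items in DOMAINS.items():
        weighted = [
            LIKERT_WEIGHTS[r[item]]
            for r in responses
            for item in items
            if r.get(item) in LIKERT_WEIGHTS
        ]
        scores[domain] = round(mean(weighted), 2) if weighted else None
    # Overall score: mean of the three domain means (illustrative only).
    valid = [v for v in scores.values() if v is not None]
    scores["Overall"] = round(mean(valid), 2) if valid else None
    return scores

# Example usage with two partial student responses.
sample = [
    {"q1": "Strongly Agree", "q5": "Agree", "q9": "Agree"},
    {"q1": "Agree", "q5": "Strongly Agree", "q9": "Disagree"},
]
print(domain_scores(sample))
```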
SEOI vs. EGLS3 Survey Instrument Comparison
Student Evaluation of Instruction (SEOI)
● Paper-based, home-grown survey
● Included non-instructional questions (library and course content questions)
● No validated means report
● IR cleaned the packets, scanned the surveys, and generated a data file; the report was run in SPSS.
● The final report was saved on a disk and sent to the college administrators; handwritten comments were returned to faculty.
● The process was cumbersome, labor intensive, and costly.
● The results were not used in a systematic evaluation process.
EGLS3 Survey Instrument
● Implemented online to allow flexibility, transparency, and accountability
● Provides a measure of teaching effectiveness as perceived by students
● Allows college-wide comparison in key areas
● Facilitates student evaluation of instructors (not the course or the content)
● EGLS3 scale scores can be used to make assessments for continual improvement of the institution
● Results available online for individual faculty; users build their own reports
Early Stages
● Vice Chancellor of Planning and Institutional Effectiveness has Faculty and OIR representatives visit the University of North Texas to discuss SETE.
● HCC Administration and Faculty Senate decide to move forward with assessing SETE for possible online evaluations.
Faculty Governance and Endorsement
● Faculty Senate President requests evaluation of SETE as a replacement for the SEOI by the Educational Affairs Committee.
● Faculty Senate reviews reports on SETE and votes unanimously to support a pilot program for Spring 2011.
Seeking Additional Endorsement
● Approval of the pilot from the Vice Chancellor of Instruction and the Deans' Councils.
● Forums held for information and feedback.
Student Evaluation Process: The Transition from Paper to Online
● IR partnership with Faculty Senate
● IR responsibility: develop and implement the operational plan
● Faculty Senate President: navigating faculty governance and endorsement
● Pilot implementation, Spring 2011: two locations were selected
● Outcome: 20% response rate
● Follow-up: student focus groups
● Finding: the login process was cumbersome
Solutions for the Login Process Found with PeopleSoft Integration
● HCC–SmarterServices PeopleSoft login integration of EGLS3 – Student Center
Student Evaluation Process: The Transition from Paper to Online
● HCC–SmarterServices PeopleSoft login integration of EGLS3 – Faculty Center
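A minimal sketch of one way a portal-to-survey single sign-on handoff like this can work, assuming the portal embeds a time-limited, HMAC-signed link to the survey system. The shared secret, parameter names, and URL below are illustrative assumptions, not the actual HCC–SmarterServices PeopleSoft integration.

```python
# Hypothetical single sign-on handoff sketch: the student/faculty portal
# builds a signed link into the survey system so users do not log in twice.
# The secret, parameter names, URL, and HMAC scheme are assumptions.
import hashlib
import hmac
import time
from urllib.parse import urlencode

SHARED_SECRET = b"replace-with-secret-shared-between-systems"
SURVEY_BASE_URL = "https://surveys.example.edu/egls3/launch"  # placeholder

def build_survey_link(student_id: str, term: str) -> str:
    """Portal side: return a time-limited, signed URL to embed for a student."""
    params = {"student_id": student_id, "term": term, "ts": str(int(time.time()))}
    payload = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    params["sig"] = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{SURVEY_BASE_URL}?{urlencode(params)}"

def verify_survey_link(params: dict, max_age_seconds: int = 600) -> bool:
    """Survey side: check the signature and reject stale links."""
    sig = params.pop("sig", "")
    payload = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    expected = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    fresh = time.time() - int(params.get("ts", 0)) <= max_age_seconds
    return fresh and hmac.compare_digest(sig, expected)
```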
Integration and Responsibility of IR in the EGLS3 Implementation
● Institutional Research served as liaison between SmarterServices and the HCC departments participating in the EGLS3.
● Developed the timeline and monitored progress.
● Developed and reviewed documentation with SmarterServices and IT/Customer Support.
● Developed and reviewed the communication plan with the Communications Department.
● Several media channels were used to notify the institution: meetings, email, paper, social media, web page, plasma screens, posters, etc.
Fall 2011 – Full Implementation
Facts
● Only regular-term and second-start classes were surveyed, including Distance Education and Dual Credit.
● Data submitted: 6,460 classes with 140,800 enrollments.
● The system ran smoothly.
● 23,005 responses, for a response rate of 16%.
Technical Issues
● Pop-ups blocked in college computer labs and on other computers.
● System slow due to a lack of services running on the web server.
Reporting
● Available on our web site
● Faculty and administrator access
● Administrators can build their own reports
Follow-up
● Faculty focus groups
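For reference, the 16% response rate follows directly from the counts reported on this slide:

```python
# Response-rate check using the Fall 2011 figures above.
responses = 23_005
enrollment = 140_800
print(f"{responses / enrollment:.1%}")  # -> 16.3%
```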
Fall 2011 Implementation
Full implementation, Fall 2011, district-wide
● Steering committee in place
● Teams and strategy developed
● Operations strategy: How do we implement? What processes and procedures need to be developed?
● Communication strategy: How do we communicate? What and where? What are the benefits? How is it different? How does it relate to our vision?
● Feedback strategy: How will this improve student learning? How will it improve teaching effectiveness?
Spring 2012 – Next Steps
● Immediate: increase the response rate.
● As survey reports become available, faculty and administrators will become more involved in the process.
● Develop committees to work on integrating this student evaluation with the HCC faculty/staff evaluation system (PEP).
● Redevelop PEP.
● Research project.
About SmarterServices
● Incorporated in 2002.
● Has served over 500 educational institutions, including K-12, technical colleges, career colleges, community colleges, private colleges, universities, and corporations.
● Other solutions provided by SmarterServices include SmarterMeasure, SmarterFaculty, and SmarterProctors.
About UNT's SETE™ Instrument
● Benefits from extensive research at the University of North Texas.
● Controls for the bias of 24 variables, including:
  - Student grades
  - Time of course
  - Delivery system
● Tracks improvement with a standardized effectiveness score (1-1000 scale).
● Useful for benchmarking and longitudinal analysis of faculty effectiveness.
● Meets state reporting guidelines.
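As a rough illustration of a 1-1000 standardized effectiveness score, the sketch below linearly rescales a raw Likert-based score onto that range. The min-max rescaling is an assumption for illustration only, not UNT's actual SETE standardization, which also adjusts for the bias variables listed above.

```python
# Illustrative rescaling of a 1-4 Likert-based scale score onto a 1-1000
# range for benchmarking. Simple linear min-max rescaling is an assumption;
# the actual SETE standardization also adjusts for bias variables such as
# expected grade, time of course, and delivery system.
def to_effectiveness_scale(raw_score: float,
                           raw_min: float = 1.0,
                           raw_max: float = 4.0) -> int:
    """Map a raw domain score in [raw_min, raw_max] to the 1-1000 scale."""
    clamped = min(max(raw_score, raw_min), raw_max)
    fraction = (clamped - raw_min) / (raw_max - raw_min)
    return round(1 + fraction * 999)

# Example: a domain mean of 3.4 maps to roughly 800 on the 1-1000 scale.
print(to_effectiveness_scale(3.4))
```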
Student's Perspective within SmarterSurveys™
Student's Perspective within SmarterSurveys™
● 18 standard SETE questions
● 16 Likert-scale questions (4 per category)
● 2 general information questions
Student's Perspective within SmarterSurveys™
● Custom questions can be added to SETE.
● Custom questions can be Likert, open ended, multiple choice, etc.
● HCC chose to add 5 open-ended questions to its evaluation.
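A small sketch of how a mix of standard Likert items and custom question types could be represented when assembling an evaluation. The class names and fields are illustrative assumptions, not SmarterSurveys' internal schema.

```python
# Illustrative data model for mixing standard Likert items with custom
# open-ended or multiple-choice questions. Names and fields are assumptions.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Question:
    prompt: str
    kind: str                            # "likert", "open_ended", or "multiple_choice"
    choices: Optional[List[str]] = None  # needed only for multiple_choice

@dataclass
class Evaluation:
    standard_items: List[Question]
    custom_items: List[Question] = field(default_factory=list)

    def add_custom(self, question: Question) -> None:
        self.custom_items.append(question)

# Example: an HCC-style open-ended addition appended to the standard items.
evaluation = Evaluation(standard_items=[
    Question("My Instructor explains difficult material clearly.", "likert"),
])
evaluation.add_custom(Question("What did you like most about this course?", "open_ended"))
```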
Student's Perspective within SmarterSurveys™
● A confirmation pop-up is presented to students for verification upon submission of a survey.
Student's Perspective within SmarterSurveys™
● After completing a survey, the course is marked as "Completed."
● Any remaining surveys can be taken immediately or at another time.
Faculty's Perspective within SmarterSurveys™
● HCC chose to suppress results until the end of the survey period.
● Faculty can still see their completion percentages, survey status, and results release date.
SETE™ Data Processing Flow
Survey responses + faculty and student attributes + grades → data processed → EGLS3 scores generated → faculty ID, course ID & scores
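A hedged sketch of the join-and-aggregate step implied by this flow: survey responses are merged with faculty, course, and grade attributes, then rolled up into per-faculty, per-course scores. The pandas approach, column names, and simple mean aggregation are assumptions for illustration, not the vendor's actual processing.

```python
# Illustrative processing pipeline: merge responses with faculty/student
# attributes and grades, then aggregate to per-faculty, per-course scores.
# Column names and mean aggregation are assumptions, not the actual process.
import pandas as pd

def build_score_file(responses: pd.DataFrame, attributes: pd.DataFrame) -> pd.DataFrame:
    """responses: one row per answered item (student_id, course_id, item, weight).
    attributes: one row per enrollment (student_id, course_id, faculty_id, expected_grade)."""
    merged = responses.merge(attributes, on=["student_id", "course_id"], how="inner")
    scores = (merged
              .groupby(["faculty_id", "course_id"], as_index=False)["weight"]
              .mean()
              .rename(columns={"weight": "egls3_score"}))
    return scores
```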
Faculty's Perspective within SmarterSurveys™
● Once survey results open, faculty can view course responses and also pull aggregate reports for the term.
Faculty Reports within SmarterSurveys™
● Individual Course Response Distribution Report – per-course header and Likert responses
Faculty Reports within SmarterSurveys™
● Individual Course Response Distribution Report – per-course open-ended responses
Faculty Reports within SmarterSurveys™
● EGLS3 Score Report – per term