West Texas A&M University

Buff Transit Tracker
SSR Addendum Standard 1

West Texas A&M University’s Self-Study Report (SSR) Addendum In Response to the Formative Feedback Report (FFR)

As presented in the Self-Study Report (SSR) and in response to the Formative Feedback Report (FFR), the mission of the Department of Education is to prepare educators who are confident, skilled, and reflective professionals. This is accomplished through the coursework, field, and clinical experiences of our teacher candidates. Throughout the progression of the EPP at admission, during development, and upon completion, our candidates gain confidence through coursework, develop skill in their field and clinical experiences, and continue to grow as reflective practitioners. At the conclusion of the program, our candidates are prepared to impact student learning and development in diverse P-12 settings.

The EPP’s concerted drive is for all coursework, field, and clinical experiences to prepare our candidates to impact P-12 student learning. This is why we embrace evidence-based continuous improvement in all we do within the EPP to produce confident, skilled, and reflective professionals. Once our completers are employed as in-service teachers in diverse P-12 settings, the outcome that we seek to achieve above all others is for our teachers to have a positive impact on student learning and development. In this addendum, the EPP’s evidence, data, and exhibits are organized to reflect the progression of the program (admission, development, and completion) and provide evidence of how we are achieving this all-important outcome.

For admission to WTAMU, applying freshmen must meet expectations for ACT and/or SAT scores for acceptance into the university. Cohort averages on the ACT and/or SAT, both nationally normed tests, represent scores at or above the 50th percentile at admission. Based upon admission data, the first-time, full-time, degree-seeking freshman cohort for Fall 2013 included 950 freshmen averaging 21.19 on the ACT and 603 freshmen averaging 965.32 on the SAT. In Fall 2014, 958 freshmen averaged 20.78 on the ACT and 614 averaged 970.90 on the SAT. In Fall 2015, 1019 freshmen averaged 20.71 on the ACT and 683 averaged 967.07 on the SAT.

A score of 967 on the 1040 SAT exam is considered an average score. The average score on the 1600 SAT exam is 740, and 810 is considered above average. Cohort averages of 21 on the ACT and 967 on the SAT represent scores above the 50th percentile for all WT test takers on these nationally normed tests and meet CAEP requirements. Texas Student Success Initiative (SSI) data are limited but will be available along with ACT and SAT data onsite. [For more information concerning the Texas Student Success Initiative (SSI), please see http://www.wtamu.edu/advising/texas-success-initiative.aspx].

At admission to the EPP, the Grade Point Averages (GPAs) of candidates following their core course work are evidence of candidate competencies in their selected areas of study (i.e., English Language Arts, Mathematics, Science, Social Studies, etc.). Candidates demonstrate competence in their chosen core areas by earning a cumulative GPA of 2.75 or higher when they are accepted into the program, which exceeds the state requirement of 2.50. The mean GPA of each cohort from 2013 to 2015 meets or exceeds the CAEP requirement of 3.0.

Also at admission, the EPP begins with a threefold approach to candidate preparation that includes an introduction to the Program Educational Outcomes (PEOs), Ethical and Professional Dispositions, and the TExES Competencies for Effective Teaching. Evidence shows alignment with state-selected standards, research, and our conceptual framework, which identifies who we are, what we are about, and the kind of professional educators we strive to produce: critical and creative thinkers, effective communicators, advocates for diverse learners, users of technology, lifelong learners, and stewards of the profession.

During development, candidates are required by state statute to complete 40 hours of field observations in diverse settings that are preselected according to Texas Academic Performance Reports (TAPR) for districts/campuses within our service area [See Texas Academic Performance Reports at https://rptsvr1.tea.texas.gov/perfreport/tapr/]. The majority of campuses served are Title I campuses with 50% low socioeconomic student populations within Free/Reduced Lunch Programs. Evidence from Methods syllabi, coursework, KEI assignments, assessments, and TExES content and PPR certification exam results, together with additional field experiences for candidates such as the Opportunity School (http://opportunityschool.com/), the Center for Learning Disabilities (http://www.wtamu.edu/academics/learning-disabilities-contact-us.aspx), the Go Global Study Abroad Initiative (http://www.wtamu.edu/academics/go-global.aspx), and others, ensures the development of candidates as future professional educators.

Innovations within the EPP over time provide evidence of consistent and continuous improvement. Some of these innovations include the following:

  • the development of EdCamps through the Texas Panhandle Professional Learning Network or TXPPLN (http://www.wtamu.edu/academics/txppln.aspx);

  • the i3 CORE Investing in Innovation National Study (https://www.corepartners.org/) in collaboration with seven other states and universities from across the United States in providing technology, technology support, and continual professional development to small, rural schools within our service areas (Drs. Hindman, Garcia, Coneway, and Williams);

  • the Panhandle Math and Science Conference promoting STEM best practices (http://www.wtamu.edu/academics/math-and-science-conference.aspx);

  • the Region 16 ESL/Bilingual Institute (faculty and teacher candidates);

  • the Region 16 Literacy Conference (faculty and teacher candidates);

  • the Williams’ Children’s Literature Collection and Reading Room (http://www.wtamu.edu/academics/the-williams-childrens-literature-collection-and-reading-room.aspx);

  • the Williams’ Early Childhood Model Classroom (Old Main, Room 207) equipped with state-of-the-art technology;

  • the Center for Learning Disabilities at the Amarillo Center (http://www.wtamu.edu/academics/learning-disabilities-contact-us.aspx);

  • the Helen Piehl Distinguished Annual Lecture Series;

  • the Dr. Geneva Schaeffer Distinguished Annual Lecture Series;

  • the funding and construction of the Dr. Geneva Schaefer STEM Lab (Old Main, Spring 2017);

  • ongoing faculty research studies in the area of Children’s Literature (Drs. Sharp, Coneway, and Diego-Medrano);

  • the Windows on the Wider World (WOWW) research study in art instruction and integration (Drs. Sharp, Coneway, Hindman, Garcia, and Bingham);

  • the Ronald E. McNair Inclusion Study (faculty and candidates) (http://www.wtamu.edu/academics/mcnair-scholars.aspx);

  • the Attebury Honor Program (http://www.wtamu.edu/academics/honors.aspx) Scholar’s Retention Study (faculty and candidates);

  • Standards-Based Instruction (Drs. Hindman, Coneway, Garcia, and Williams);

  • an Exploration of the Effects of Experiential Learning on Teacher Candidates’ Perceptions of Preparedness in Teacher Preparation Program research study (Dr. Diego-Medrano and Dr. Hughes);

  • the ESL Experiential Study of Evaluating Pre-service Teacher Preparedness for Teaching Culturally and Linguistically Diverse Student Populations by engaging in experiential learning in a short-term study abroad at Machu Picchu in Peru (Dr. Garcia, Dr. Green, and Dr. Castillo, Summer 2016);

  • the Faculty Recruitment Research Study (Dr. Coneway and Dr. Garcia);

  • the development of a new Secondary Methods course for ESL (EDSE 4331) offered in Fall 2016;

  • Go Global Study Abroad Initiatives to Costa Rica, Hong Kong, China, and Peru;

  • the studies in Special Education that include dyslexia (http://www.wtamu.edu/academics/learning-disabilities-resources.aspx) and Attention Deficit Hyperactivity Disorder or ADHD (http://www.wtamu.edu/academics/learning-disabilities-adhd.aspx); and

  • the Doctoral Task Force Proposal for a Doctor of Education (Ed.D.) Program to be offered at WTAMU.

Endowed chairs and professorships held by faculty from 2013 through 2016 and beyond (i.e., the John G. O’Brien Distinguished Chair in Education, the Helen Piehl Professor of Education, and the Geneva Schaeffer Professor of Education and Social Sciences) continue to improve the quality of instruction our teacher candidates receive in the EPP. (http://www.wtamu.edu/academics/college-of-education-and-social-sciences-endowed-chairs-and-professorships.aspx).

The SSR Addendum provides evidence of the field observation experience of our candidates during their Methods courses. Prior to field observation placement, our candidates attend a mandatory Methods Field Observation Orientation provided by the EPP each semester. The Methods Chair and Director of the Office of Teacher Preparation and Advising review the Field Observation Handbook, ethics, procedures, paperwork, and expectations of the EPP with all teacher candidates. During candidate development in Methods, candidates complete the state-required 40 hours of field observation experience in diverse settings. Feedback from the assessment and evaluation of candidates during these experiences is provided by the school-based cooperating teacher and EPP-based university faculty. The EPP provides opportunities of sufficient depth and breadth for candidates to observe and work with diverse P-12 student populations. Through the EPP’s analyses of the Texas Academic Performance Reports (TAPR) of all districts and campuses in our service area or state prior to field observation placement, the EPP ensures field observations occur in settings with richly diverse P-12 student populations. The school-based clinical teachers (cooperating teachers) and EPP-based university faculty complete assessments and/or evaluations of the field observation experience and provide feedback to candidates. Candidates reflect on the experience in a variety of ways, including writing reflection papers and participating in class discussions.

The culminating experience for our candidates within the EPP is their clinical teaching experience in diverse settings. Just as previously described for field experience, candidates attend a mandatory Clinical Teaching Orientation prior to student/clinical teaching placement. The Director of the Office of Teacher Preparation and Advising reviews the Clinical Teaching Handbook, ethics, procedures, PDAS/T-TESS evaluations, paperwork, requirements, and expectations of the EPP with all candidates. After fulfilling all EPP requirements, candidates are placed in our partnering schools with diverse P-12 student populations that were analyzed from the TAPR data. As the culminating experience, our candidates complete thirteen weeks of clinical teaching in their specialty licensure/certification areas. The cooperating teacher, the University Field Supervisor (UFS), and the Director of the Office of Teacher Preparation and Advising assess candidate progress throughout the clinical teaching experience. Candidates also assess their cooperating teachers, University Field Supervisors, the Director, and the EPP. Assessment evidence demonstrates candidate proficiencies in teaching, learning, and professional behaviors. Additionally, letters from University Field Supervisors reflect the quality and impact that our teacher candidates are having in the classroom during their clinical experiences.

For our completers who are currently serving as in-service teachers, the addendum presents evidence from our major partnering districts of the hiring, retention, and promotion of our completers for three academic years. Superintendents, Human Resource (HR) Directors, Principals, and Teachers have provided letters of support and Teacher Work Samples (TWS) that demonstrate the positive impact our completers have upon P-12 student learning and development.

Overarching Concerns/Questions/Issues in the FFR

In response to the overarching concerns, questions, and/or issues raised in the FFR by the CAEP Offsite Review Team, this addendum provides narratives, comprehensive responses, explanations, and amendments as corrections, clarifications, and/or empirical evidence in Addendum Exhibits (AE) that support the EPP in meeting all CAEP Standards and Components. The EPP used the CAEP Accreditation Manual of February 2015 in the preparation of our Self-Study Report (SSR). Although several references to the CAEP Accreditation Handbook, Version III of March 2016 and the CAEP Evaluation Rubric for Visitor Teams (March 2016) were made in the FFR, these resources were not yet available to the EPP during our preparation and submission of our SSR; we had not seen either document prior to our submission. In response to the FFR for this Addendum, the EPP has again used the CAEP Accreditation Manual of February 2015 in our preparation, as directed by CAEP staff and the collaborative direction of our CAEP Team Chair.

Due to a misunderstanding of the “Specialty Licensure Area Data” on page 16 of the Self-Study Report, the State Program Review (State-selected standards) option was selected. Because Texas is neither a partnership state nor a SPA state, the Program Review Option that should have been checked is the EPP Review with Feedback (State-selected standards). The EPP has selected the EPP Review with Feedback as our State Program Review.

A list of all programs of the EPP (listed as Program Characteristics in AIMS) is provided in the SSR Addendum along with the disaggregation of three cycles of specialty licensure/ certification data. Rubrics for the Program Educational Outcomes (PEOs), the Candidate Evaluation Instrument (CEI) for Ethical and Professional Dispositions of Candidates, and the Syllabi Analyses I and II are also provided with validation and reliability studies. The Crosscuts of Diversity and Technology and the Selected Improvement Plan (SIP) are addressed, explained, and clarified. Each one of the CAEP Standards and all Components have been addressed in this SSR Addendum. Addendum Exhibits (AE) have been uploaded in AIMS.

Note: Addendum Exhibits (AEs) highlight summaries of analyses of evidence for the EPP; however, references in the FFR indicated requests for complete documents. Therefore, as per the Team Chair’s suggestion, links are provided within the Addendum for convenience to the CAEP Review Team. Also, additional printed copies of specific evidence and empirical data described for specialty licensure/certification areas are housed in program notebooks onsite and will be available for review during the CAEP Onsite Visit in November 2016. Thank you.

A Very Special Thank You

The EPP would like to offer our warmest thank you to Dr. Ana Maria Schuhmann, our CAEP Team Chair; to Offsite Team Members Dr. Shawn A. Faulkner, Dr. Mary Jo Finney, Dr. Anthony J. Kirchner, and Ms. Rowena Shurn; and to Offsite Team Observers, Mr. Cole Bowers and CAEP Staff, for all of your diligent, comprehensive, and committed work on our review. We wish to extend to each of you our deepest appreciation for your dedication to the CAEP Accreditation Process, to our noble profession of education, to ensuing generations of diverse American youth, and to all of the teachers who teach them. Through this accreditation process and beyond, our work as an Educator Preparation Program (EPP) has become more dynamic and more impactful, and we remain genuinely committed to excellence through continuous improvement.

On behalf of Dr. Eddie Henderson, Dean of COESS, Dr. Judy Williams, Department Head/Associate Dean, Dr. Janet Hindman, Director of Accreditation, and the WTAMU Department of Education:

Thank you!

West Texas A&M University’s Self-Study Report (SSR) Addendum In Response to the Formative Feedback Report (FFR)

Special Note: All italicized lettering used in this SSR Addendum is intended as direct quotes from the CAEP Review Team’s Formative Feedback Report (FFR). Thank you.

  I. INTRODUCTION

  1. Brief overview of the EPP:

(FFR, p. 2, paragraph 1)

A copy of the award of regional accreditation was not uploaded in the self-study instead the EPP uploaded the Texas Education Agency (TEA) letter accrediting the education program.

Response:

West Texas A&M University is accredited by the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC), a regional agency recognized by the United States Department of Education. This is the highest accreditation a university can receive and signifies that WTAMU has "a purpose appropriate to higher education and the resources, programs and services sufficient to accomplish and sustain that purpose." SACSCOC is the regional body for the accreditation of degree-granting higher education institutions in Alabama, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Texas, Virginia, and Latin America.

Due to a misunderstanding of the “Specialty Licensure Area Data” on page 16 of the Self-Study Report (SSR), the State Program Review (State-selected standards) option was selected. Because Texas is neither a partnership state nor a SPA state, the Program Review Option that should have been checked is the EPP Review with Feedback (State-selected standards). The EPP has selected the EPP Review with Feedback as our Program Review option.

[Please see Addendum Exhibit (AE1) SACSCOC Regional Accreditation of West Texas A&M University].

[See Addendum Exhibit (AE2) Letter from Dr. Tim Miller, Director of Educator Preparation, Testing, Program Accountability, and Program Management of the Texas Education Agency (TEA)]. [See also http://www.zoominfo.com/p/Tim-Miller/50572021 and http://mansfield.tea.state.tx.us/TEA.AskTED.TSD/TSDfiles/tsd2015/not_tagged/tea_hierarchy_org_charts.pdf].

[See Addendum Exhibit (AE3) List of Texas Approved EPP Programs]. [SBEC Screenshot. See also Texas Approved EPP Programs at https://secure.sbec.state.tx.us/SBECOnline/approvedprograms.asp].

Note: Dr. Tim Miller from the Texas Education Agency has tentatively accepted our invitation to attend the CAEP Site Visit as a state representative on Monday, November 14, 2016.

Program Review: EPP Review with Feedback

With the selection of the Program Review Option of EPP Review with Feedback, the Addendum provides the EPP’s answers to the following three questions from pages 16-17 of the SSR:

(SSR, p. 16, bottom of page and p. 17, top of page)

  1. Based on the analysis of the disaggregated data, how have the results of specialty licensure area or SPA evidence been used to inform decision-making and improve instruction and candidate learning outcomes?

Response:

Based upon evidence to improve instruction and candidate learning outcomes, the EPP transitioned in 2015 from the previously adopted Student-Centered Proficiencies and Behaviors of Candidates (2013-2014) to Program Educational Outcomes (PEOs) as Candidate Learning Outcomes (CLOs). In addition to candidate instruction in the Texas Code of Ethics, the EPP enhanced instruction and candidate learning outcomes in 2015 by incorporating the Ethical and Professional Dispositions of Candidates into our teacher preparation program.

Specific examples of our analyses of disaggregated data of specialty licensure/certification areas that were used to inform our decision-making and to improve instruction and candidate learning outcomes span multiple areas throughout the progression of the EPP. Based upon the evidence at admission in EPSY 3341, the GPAs of candidates were 2.75 or higher from 2013 to 2015. In 2014, candidates were required to pass one state certification exam. In 2015, candidates were required to pass both the TExES Content Certification Exam and the TExES PPR Certification Exam prior to their placement in clinical teaching. Evidence demonstrates candidates successfully passed both state certification exams.

The EPP continuously improves instruction and candidate learning outcomes through programmatic changes and improvements in all programs and at all levels. With a change of administrative leadership in Fall 2013, the EPP disaggregated data from the TEA Compliance Audit; the deficiencies it detailed prompted the EPP to make bold and decisive programmatic changes and improvements. The EPP disaggregated and analyzed the TEA 2012-2013 Compliance Audit and ASEP certification exam data, implemented new curriculum changes, improved test preparation, hired a remediation specialist to work with struggling candidates, and compiled numerous online test preparation resources for our candidates posted on the EPP’s website. These changes began to make a significant difference in learning outcomes and improvement in the program, as evidenced by increased ASEP score results in 2014 and 2015. Additionally, these programmatic decisions had a positive impact on candidate learning outcomes and improved instruction in Elementary Education, Early Childhood EC-6 (CORE Subjects EC-6), Grades 4-8, and Reading. It is important to note that for certification purposes, Grades 4-8 is considered part of the Elementary Education Program.

Based upon evidence and input from partnering stakeholders and candidates during development, changes in all Methods courses were implemented for continuous improvement of the EPP. In 2013, candidates were required to complete ten hours of field observations in EPSY 3341 and thirty hours of field observations in their Methods courses. In 2014 and 2015, the requirement changed so that candidates complete all forty hours of field observations during the first semester of the clinical experience year, when Methods courses are completed. This allows for increased consistency, guided monitoring or supervision of candidate progress and development, and more focused candidate reflections upon their field experiences. Specific Key Effectiveness Indicator (KEI) assignments, candidates teaching three twenty-minute mini-lessons in collaboration with cooperating teachers during the field experience, and guided reflections have strengthened and improved the field observation experience for all candidates.

All Methods candidates attend a mandatory Field and Clinical Experience Orientation each semester. The orientation prepares candidates for the field observation experience with a careful review and discussion of the Field Methods Handbook, procedures, ethics, evaluations, and expectations of the EPP. In Fall 2016, new and improved Field Observation documents and assessments developed collaboratively among faculty and the newly hired Director of Teacher Preparation and Advising will be implemented in all Methods courses to improve instruction and candidate learning outcomes.

All Clinical Teaching candidates attend a mandatory Clinical Teaching Orientation each semester. The orientation prepares candidates for the clinical teaching experience with an in-depth review and discussion of the Clinical Teaching Handbook, procedures, ethics, PDAS/T-TESS evaluations, and expectations of the EPP. Candidates complete essential paperwork for fingerprinting, background checks, and requests for clinical teaching assignments. At the conclusion of the orientation, teacher candidates or clinical teachers meet with their assigned University Field Supervisors (UFS). Working collaboratively, the candidates and their University Field Supervisors schedule the pre-conference and first clinical teaching observation, discuss expectations and procedures, and answer any questions that candidates may have.

Additionally, for Fall 2016 the EPP has combined the two routes of initial certification (traditional and alternative certification) into a single unit with one Director of the Office of Teacher Preparation and Advising. Based upon available evidence, the EPP believes the combination of these offices will improve efficacy, consistency, and productivity; enable tracking; and provide streamlined services for all our candidates from both traditional and alternative certification routes.

Evidence collected from the Student/Clinical Teacher Exit Evaluations and from the Dean’s meetings with area superintendents and principals indicated that candidates did not feel prepared in classroom management, and this evidence informed the EPP’s decision-making to improve instruction and candidate learning outcomes. As a result of the EPP’s analysis, the unit implemented a new policy of a Methods/Clinical Teaching year: one semester of Methods courses with the required 40 hours of observations for candidates, followed by student/clinical teaching. The classroom management course (EDPD 4340) was changed from a three-week course offering to a full-semester course to improve the learning outcomes of our candidates in the area of classroom management.

In Fall 2016, the complete cycle of course offerings for all candidates was streamlined, improved, and implemented. Assessment training, evaluation practice, and regularly scheduled meetings for University Field Supervisors (UFS) were implemented to improve instruction and the mentoring of candidates in their clinical experiences. To improve candidate learning outcomes, all candidates who are ready to begin their clinical experiences will attend the August Experience, in which candidates are assigned to diverse P-12 settings in partnering area schools to observe the first day and first week of school. After these August Experience in-school observations, candidates return to the WTAMU campus, submit reflections on their experience, and attend training seminars in Technology, Diversity/Poverty, School Safety and School Violence, and Mental Health prior to their clinical teaching. For those candidates seeking alternative certification through internships, ACP Orientation is also provided by the EPP.

Clinical teachers are required to complete and submit Weekly Progress Reports each Friday, in collaboration with their cooperating teachers, for each of the twelve weeks of clinical teaching and the one week of the August Experience, a total of thirteen weeks of clinical teaching experience. These reports are aligned with the state’s T-TESS evaluation instrument. From 2013 to 2015, University Field Supervisors (UFS) evaluated student/clinical teachers with the PDAS Appraisal Form for three 45-minute lessons during their clinical teaching experience, as required by state statute. For 2016, the T-TESS Evaluation Form will be used by UFSs to evaluate the progress and development of all clinical teachers in three 45-minute lessons. Upon the completion of their clinical experience, candidates in all specialty licensure/certification areas are required to submit a Teacher’s Notebook with specific requirements and expectations to the Director of the Office of Teacher Preparation and Advising. Examples of these Teachers’ Notebooks are available onsite in the Director’s office.

With the successful completion of all of the requirements for the state and for the EPP, candidates are approved to apply for educator certification upon graduation from West Texas A&M University. [These underlined items are located in the Addendum Exhibits (AEs) for CAEP Standard 2].

Based upon the analyses of disaggregated data results of specialty licensure/certification areas, the EPP’s efforts through an evidence-based and informed decision-making process have continually improved instruction and candidate learning outcomes in these specialty licensure/certification areas.

Multiple Evidence Sources

[See Addendum Exhibit (AE4) New Testing Policy of the EPP].

[See Addendum Exhibit (AE5) New Department Head].

[See Addendum Exhibit (AE6) GPA Data].

[See Addendum Exhibit (AE7) Specialty Licensure/Certification Data].

[See Addendum Exhibit (AE8) TEA Compliance Audit Report (2012-2013). (http://tea.texas.gov/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=25769807819&libID= 25769807821)].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibits for CAEP Standard 2 for Weekly Progress Reports, PDAS Appraisal Form, T-TESS Evaluation Form, and Clinical Teacher’s Notebook].

  2. Based on the analysis of specialty licensure area data, how have individual licensure areas used data for change?

Response:

Based upon the analyses of specialty licensure/certification area data, some examples of how data were used for change in individual licensure/certification areas in the EPP include changes implemented in Elementary Education (See Early Childhood EC-6 or Core Subjects EC-6, Grades 4-8, and Reading as addressed in the previous response), Bilingual/English as a Second Language (ESL) Education curriculum [See Texas Education Agency http://tea.texas.gov/bilingual/esl/education/], Secondary Education for Middle/High School, Special Education, and MAT/ACP programs.

Bilingual/ESL Education uses research data to change curriculum and to improve instruction. Based upon current research, the combination of a double helix and readiness spiral that measures six levels of candidate readiness to teach culturally and linguistically diverse (CLD) P-12 populations (Herrera and Murray, 2011) was implemented. For continuous improvement, faculty administered a quantitative pre-assessment questionnaire to candidates before they take their ESL courses. The purpose of this assessment is to evaluate candidates’ level of readiness to work with CLD populations. Each item on the questionnaire is linked to one of the six levels of the accommodation readiness spiral. After candidates take the two ESL courses (EDPD 4378 and EDPD 4388), the identical post-assessment is administered a second time to measure any changes and increases in candidate readiness for working with CLD populations.

Emerging trend data indicate an increase in content and pedagogical knowledge, confidence, and levels of readiness for working with CLD populations among candidates who have taken the two ESL courses. Developed as a research study with an approved IRB, the first assessment of the study was administered in May 2016 (Spring 2016) and the second assessment in July 2016 (Summer I, 2016). Results of the study are currently being analyzed, and the study will continue in the Fall 2016 semester in Bilingual/ESL Education.

From 2013 to 2015 in Secondary Education, curricular changes and a new course were developed and implemented in the EPP. In 2013, candidates in the program previously had only one required Secondary Methods course offered (EDSE 4320 Teaching in the Secondary School), primarily because Secondary Education candidates were completing the required content area hours provided by other colleges within the university [See Degree Checklists by Applicable Calendar Year at http://www.wtamu.edu/advising/degree-checklists.aspx].

In 2014 and 2015, curricular changes in EDSE 4320 and the creation/development of a new course (EDSE 4330 Teaching in the Secondary School II) were added to EPP course offerings. Based upon the data, these curricular changes integrated focused ESL and Special Education instruction within the two courses, resulting in strengthened and improved preparation of our secondary candidates for middle and high school instruction.

In EDSE 6333 Secondary Methods for MAT/ACP, candidates use research data to write papers, to contribute to bulletin board discussions, and to apply their research findings. Some of these MAT/ACP candidates have already taught for one semester or more. Drawing on the applied research data, candidates implement what Marzano, Dean, and others have suggested to impact student learning and development in their lesson planning. Multicultural education is also emphasized in EDSE 6333 for MAT/ACP candidates. ASEP scores of candidates in specialty licensure/certification areas for both traditional and alternative certification routes continue to improve as a result of these curricular changes in Secondary Education.

In Fall 2016, the newly developed Secondary Methods course (EDPD 4331) will be offered for the first time. This course prepares non-traditional secondary candidates who are interested in teaching ESL and culturally and linguistically diverse P-12 student populations.

In the specialty licensure/certification area of Special Education (EC-12), faculty and candidates are using data in EPSY 6350 Exceptional Children in the Mainstream of Society and EDSP 6357 Teaching Secondary Students with Mild Disabilities. Candidates are required to complete case studies of functional behavior assessments to develop a behavior improvement plan based upon the data that they have collected. Candidates work with a child of their own selection within a school district (ideally a student who has emotional disabilities). In EDSP 6357 Teaching Secondary Students with Mild Disabilities (SpEd Secondary Methods), the KEI assignment is a class project of developing lesson plans from one of the core academic areas for students with a particular disability (candidates’ choice of disability; e.g., autism spectrum disorders (ASD)). The lesson plan is required to include accommodations and modifications for use in general education settings. Candidates are also required to attend regular parent and community meetings at the Center for Learning Disabilities and to assist faculty, special speakers, parents, and children in areas where there is need. Special Education courses provide instruction in Admission, Review, and Dismissal (ARD) and Individual Education Plans (IEPs) to improve instruction and candidate learning outcomes. [See http://texasprojectfirst.org/node/163].

Based upon the evidence, the EPP identified needs within the department and sought and acquired outside funding from private donors to meet those needs. The Williams Early Childhood Model Classroom construction project has recently been completed for classes to be offered in Fall 2016. The EPP will begin construction in Spring 2017 on the new, privately funded STEM Lab for Grades 4-8 Math and Science. Each of these model classrooms will be equipped with the latest state-of-the-art technologies for twenty-first-century teaching and learning.

For the WTAMU Distinguished Lecture Series on September 29, 2016, the EPP will host the Dr. Geneva Schaeffer STEM Presentation for Educators, presented by Dr. Tricia Berry of UT Austin, who has been named one of the Top 100 Leaders in STEM by STEMconnector. The Center for Learning Disabilities will host the 5th Annual Helen Piehl Lecture, Executive Functions: What Are They, Why Are They Important, and How Can I Help?, presented by Dr. Cheryl Chase, Licensed Clinical Psychologist, Chasing Your Potential, LLC, on October 14, 2016. Planning for these Lecture Series is based upon data and the learning needs of our candidates and communities.

In sum, based upon the evidence presented in the SSR Addendum, the EPP’s analyses of data inspired enhanced changes in the specialty licensure/certification areas of Elementary Education, Grades 4-8, Secondary Education for Middle/High Schools, Special Education, and the MAT/ACP programs, and in the construction of new learning labs and model classrooms for our candidates.

Through the EPP’s Advisory Council and Teacher Educator Unit (TEU) Meetings with our school and university partners, the EPP continues to collaborate with stakeholders in teacher preparation of admitted candidates in our program.

[See also previous responses to Question 1 of the SSR Addendum on pages 8-11]. [See Addendum Exhibit (AE7) Specialty/Licensure/Certification Data].

[See Addendum Exhibit (AE10) Changes in Bilingual/ESL Memorandums].

[See Addendum Exhibit (AE11) Traditional and Alternative (ACP) Routes for Initial Certification].

[See examples below of Secondary Education Course Syllabi links for the following academic years].

Fall 2013

EDSE 4320 Teaching in the Secondary School http://syllabus.wtamu.edu/syllabi/u/2/7/8/2013FA_EDSE_4320_01.pdf

EDSE 4321 Teaching in Agriculture http://syllabus.wtamu.edu/syllabi/?d=9&s=34

[Note: Please click on the EDSE 4321 link to access the course syllabus].

Spring 2014

EDSE 4320 Teaching in the Secondary School http://syllabus.wtamu.edu/syllabi/u/9/0/8/2014SP_EDSE_4320_01.pdf

Fall 2015

EDSE 4320 Teaching in the Secondary School I (Secondary Methods I): http://syllabus.wtamu.edu/syllabi/u/0/1/9/2015FA_EDSE_4320_01.pdf

EDSE 4330 Teaching in the Secondary School II (Secondary Methods II): http://syllabus.wtamu.edu/syllabi/?d=9&s=49

[Note: Please click on the EDSE 4330 link to access the course syllabus].

EDSE 6333 Secondary Methods (for MAT/ACP) http://syllabus.wtamu.edu/syllabi/?d=9&s=49

Summer 2016

EDSE 6333 Secondary Methods (for MAT/ACP) http://syllabus.wtamu.edu/syllabi/u/1/9/6/2016SU1_EDSE_6333_70.pdf

Fall 2016

EDSE 4320, EDSE 4330, and EDSE 6333

http://syllabus.wtamu.edu/syllabi/?d=9&s=56 (upon availability for the semester).

  3. For Program Review with Feedback only: How does the specialty licensure area data align with and provide evidence for meeting the state-selected standards?

Response:

The alignment of the specialty licensure/certification area data with the state-selected standards of the Texas Higher Education Coordinating Board [See http://www.thecb.state.tx.us/], the Texas Essential Knowledge and Skills (TEKS) Curriculum Standards, Texas Teaching Standards, and Texas Approved Educator Standards provides evidence that the EPP is meeting the state standards. Compliance reports for educator preparation in Texas ensure that each Educator Preparation Program (EPP) is held accountable for compliance with Texas Administrative Code Chapter 229 for the certification of candidates completing the programs.

Additionally, based upon the evidence, Texas releases annual accountability ratings for each EPP within the state, results of candidate certification examinations, annual EPP performance reports, and data on the performance of beginning teachers on the Appraisal System through Principal Survey results (first year). [See http://tea.texas.gov/Texas_Educators/Preparation_and_Continuing_Education/Consumer_Information_about_Educator_Preparation_Programs.aspx].

West Texas A&M University’s Department of Education (or EPP) is currently “Accredited” by the Texas Education Agency in meeting all state-selected standards through the alignment of all specialty licensure/certification areas within the EPP. All EPP course syllabi have been carefully aligned with state-selected standards [including English Language Proficiency Standards (ELPS) and Technology Application Standards], InTASC Standards, ISTE Standards, current research, and university General Learning Objectives (GLOs) to ensure the success of our candidates in meeting state requirements for educator certification.

The state-selected standards of the EPP are described as follows:

Texas State Standards

The Texas State Board for Educator Certification creates standards for beginning educators. These standards are focused upon the Texas Essential Knowledge and Skills, the required statewide public school curriculum. They reflect current research on the developmental stages and needs of children from Early Childhood (EC) through Grade 12. [For additional information, please see http://tea.texas.gov/texas_educators/preparation_and_continuing_education/approved_educator_standards/].

Additionally, the Commissioner of TEA has adopted new rules pertaining to Texas teaching standards:

Texas Teaching Standards Adopted in Chapter 149

Approved Educator Standards

[See Texas Approved Educator Standards at http://tea.texas.gov/texas_educators/preparation_and_continuing_education/approved_educator_standards/].

Curriculum Standards

The Texas Education Agency’s (TEA) website describes state curriculum standards in this way:

“Because of student mobility, Texas has adopted curriculum standards that are to be used in all the state's public schools. The current standards, which outline what students are to learn in each course or grade, are called Texas Essential Knowledge and Skills (TEKS). The standards are adopted by the State Board of Education, after extensive input from educators and other stakeholders.”

Links to the current standards (as well as information about the adoption process):

TEKS Texas Essential Knowledge and Skills

TEKS in Spanish

TEKS Review

TEKS-Related Documents

The following links provide access to TEKS-related documents:

English Language Proficiency Standards (English Language Learners)

Prekindergarten Guidelines

College Readiness Standards

Technology Applications Standards for All Teachers

State Board for Educator Certification (SBEC)

Standard I. All teachers use technology-related terms, concepts, data input strategies and ethical practices to make informed decisions about current technologies and their applications.

Standard II. All teachers identify task requirements, apply search strategies and use current technology to efficiently acquire, analyze, and evaluate a variety of electronic information.

Standard III. All teachers use task-appropriate tools to synthesize knowledge, create and modify solutions and evaluate results in a way that supports the work of individuals and groups in problem-solving situations.

Standard IV. All teachers communicate information in different formats and for diverse audiences.

Standard V. All teachers know how to plan, organize, deliver and evaluate instruction for all students that incorporates the effective use of current technology for teaching and integrating the Technology Applications Texas Essential Knowledge and Skills (TEKS) into the curriculum.

[See https://www.txstarchart.org/standards.html].

The 2013 to 2015 evidence of accreditation ratings, ASEP and PPR program specialty licensure/certification examination results (which reflect state content and pedagogical criteria), LBB certification and annual performance reports of the EPP, and Principal Survey data for beginning teachers are included as Addendum Exhibits (AEs) to demonstrate how specialty licensure/certification data align with and provide evidence that the EPP meets the state-selected standards. Additional evidence from PDAS evaluation data also shows alignment with InTASC Standards and Texas Educator Standards.

In summary, based upon the analysis of the disaggregated data of specialty licensure/certification areas, the EPP has demonstrated how evidence has been used to inform decision-making, to improve instruction and candidate learning outcomes, to make changes in individual licensure/certification areas, and to show alignment with and provide evidence that the EPP meets the state-selected standards.

[See Addendum Exhibit (AE7) Specialty/Licensure/Certification Data].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE13) LBB Certification Reports].

[See Addendum Exhibit (AE14) Accreditation Ratings].

[See Addendum Exhibit (AE15) Annual Performance Reports].

[See Addendum Exhibit (AE16) Principal Survey Results].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

(FFR, p. 2, paragraph 2)

The Department of Education consists of 21 faculty members for undergraduate and graduate programs, ten part-time instructors, and eight staff members including the Director of Teacher Preparation and Advising, the Director of the Panhandle Alternative Certification for Educators, the Director of Accreditation, and nine graduate teaching assistants.

Response of Correction and Clarification: The Director of Accreditation is not considered staff, but is included in the 21 faculty members.

  2. Summary of programs offered:

(FFR, p. 2, paragraph 5, p. 3, paragraph 1)

According to the WTAMU website the EPP offers the following programs which are not included in Table 2 (Program Characteristics) of the self-study:

Response:

The EPP offers two routes of initial licensure/certification, Traditional and Alternative Certification, that include our programs: Elementary Education (Core Subjects EC-6, Reading, and Bilingual/ESL); Grades 4-8; Secondary Education (Middle/High Schools); Special Education EC-12; and MAT/ACP. The WTAMU website, handbook, and catalog describe these programs, and Addendum Exhibits (AE) demonstrate the program characteristics of each of the programs the EPP offers to candidates. The identified course of study for candidates under a specific program determines the area of certification completed. The EPP’s website, handbook, and catalog provide information for candidates and potential candidates interested in our program.

[See Addendum Exhibit (AE2) Letter from Dr. Tim Miller, TEA].

[See Addendum Exhibit (AE3) Texas List of Approved EPP Programs].

[See Addendum Exhibit (AE11) Traditional and Alternative (ACP) Routes for Initial Certification].

WTAMU Colleges and Departments (Includes Instruction of Teacher Candidates):

College of Agriculture and Natural Sciences

College of Business

  • College of Business, accredited by the Association to Advance Collegiate Schools of Business (AACSB), including the Computer Information Systems (CIS) program, accredited by the ABET Computing Accreditation Commission in the disciplines of applied science, computing, engineering, and engineering technology:

  • <http://www.wtamu.edu/academics/college-business.aspx>

College of Education and Social Sciences

College of Nursing and Health Sciences

School of Engineering, Computer Science, and Mathematics

Sybil B. Harrington College of Fine Arts and Humanities

School of Music

Master of Arts in Teaching (MAT) / Panhandle Alternative Certification for Educators (MAT/PACE)

Graduate School (Not included in this review)

Additional Link of Interest for Learning Outcomes:

Office of Learning Assessment

(FFR, p. 3, paragraph 2)

The EPP also offers a Master of Arts in Teaching for alternative certification. According to the self-study, this program is offered on campus and on-line.

Response of Correction and Clarification:

The Panhandle Alternative Certification for Educators (PACE) or Alternative Certification Program (ACP) for initial certification is managed on the main campus; however, candidates with baccalaureate degrees seeking initial certification are only provided online coursework with no face-to-face course offerings.

[See Master of Arts in Teaching (MAT) / Panhandle Alternative Certification for Educators (MAT/PACE) at http://www.wtamu.edu/academics/pace.aspx].

(FFR, p. 3, paragraph 3)

There is a letter dated June 2015 from the Texas Education Agency (TEA) indicating that the EPP was “accredited”, however, the state program reports were not uploaded as required for the EPP’s under the State Review option.

Response:

Please see the EPP’s previous response in the SSR Addendum to the FFR on pages 4-5, pages 7-11, and the Addendum Exhibits (AE1), (AE2), and (AE3). Thank you.

II. CAEP Standards and Evidence

STANDARD 1: Content and Pedagogical Knowledge

  1. Summary of preliminary findings

  a. Narrative summary of preliminary findings

(FFR, p. 3, paragraph 4)

A letter dated June 2015 from the TEA indicating that the EPP was ‘accredited’ is provided; however, the state program reports were not uploaded as required under the State Review option.

Response:

Please see our previous responses in the Addendum to the FFR on pages 4-5, pages 7-11, and the Addendum Exhibits (AE1), (AE2), and (AE3). Thank you.

(FFR, p. 3, paragraph 4)

With few exceptions, data presented in the self-study and exhibits are not disaggregated for the programs cited above.

Response:

All course syllabi, KEI assignments, and assessments are aligned with InTASC Standards and course outcomes. When candidates pass each course and maintain a 2.75 GPA in Elementary Education, Grades 4-8, Secondary Education (Middle/High Schools), Special Education EC-12, and MAT/ACP programs, candidates have met InTASC Standards and state-selected standards. Candidates in all programs at all levels are required to complete clinical teaching in the traditional route and are regularly assessed by the University Field Supervisors with the state PDAS evaluation instrument (and T-TESS beginning in Fall 2016) and by their cooperating teachers and the Director of Teacher Preparation and Advising. Both the PDAS and T-TESS evaluation instruments are aligned with InTASC Standards.

Addendum Exhibits (AEs) provide disaggregated programmatic evidence for three cycles of triangulated data of candidate GPAs, specialty licensure/certification exam results in content and pedagogy, field observation evaluations by university Methods faculty and cooperating teachers, and University Field Supervisor (UFS) PDAS evaluations. Evidence demonstrates the number (N) of participants, percentages in specialty licensure/certification areas, and comparison data.

All data presented in the Addendum Exhibits for CAEP Standard 1 (AES1) have been disaggregated by program and by specialty licensure/certification area for initial certification.

Addendum Exhibits for CAEP Standard 1:

[See Addendum Exhibit (AES1.1.1) Deep Understanding of InTASC Standards].

[See Addendum Exhibit (AES1.1.2) Research and Evidence].

[See Addendum Exhibit (AES1.1.3) Completers Apply Content/Pedagogical Knowledge in Outcome Assessments].

[See Addendum Exhibit (AES1.1.4) Access to Rigorous College and Career Readiness Standards].

[See Addendum Exhibit (AES1.1.5) Model and Apply Technology Standards].

[See Addendum Exhibit (AE6) GPAs Data].

[See Addendum Exhibit (AE7) Specialty Licensure/Certification Data].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE18) Field Observation Evaluations].

[See SSR Exhibit 1.1.1 Program Educational Outcomes (PEOs), Ethical and Professional Dispositions, and Standards Alignment of the EPP].

(FFR, p. 3, paragraph 5)

In addition, Praxis I and/or the Core Praxis scores are provided as evidence of candidate content knowledge.

Response of Clarification:

Texas does not administer the Praxis I and/or Core Praxis exams, but instead administers the TExES Content Exams per content or specialty licensure/certification area and the TExES Pedagogical and Professional Responsibilities (PPR) Exam for pedagogy.

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE12) PPR Exam Results].

(FFR, p. 4, paragraph 2)

Data, however are not disaggregated by specialty licensure area.

Response:

All data presented in the Addendum have been disaggregated by program and by specialty licensure/certification area for initial certification.

Please see our previous response in the SSR Addendum on data disaggregated by program and specialty licensure/certification areas. Thank you.

[See also Addendum Exhibit (AE7) Specialty/Licensure/Certification Data].

[See Addendum Exhibit (AE9) ASEP Reports].

(FFR, p. 4, paragraph 2)

PPR data are also disaggregated by Alternate Route and traditional programs, but not by the different licensure fields.

Response:

All data presented in the Addendum have been disaggregated by program and by specialty licensure/certification area for initial certification.

Please see our previous response in the SSR Addendum. Thank you.

[See Addendum Exhibit (AE7) Specialty/Licensure/Certification Data].

[See Addendum Exhibit (AE12) PPR Exam Results].

(FFR, p. 4, paragraph 2)

All the data on Table 1 (Exhibit 1.1.9) for 2013, 2014, and 2015 for the traditional program on Table 1 appear under EC-12. Most of the data for the alternate certification program on Table 2 of 1.1.9 also appear under EC-12.

Response:

By way of explanation, the SSR presented data as state protocol requires EPPs to report them and as the state releases them to the EPPs. Through the Addendum Exhibits (AEs) in the SSR Addendum, the EPP has provided disaggregated data per specialty licensure/certification area to meet CAEP expectations.

[See Addendum Exhibit (AE12) PPR Exam Data].

(FFR, p. 4, paragraph 4)

In the fall 2015, the EPP began a pilot implementation of the PEO in the Reading program; data from two courses in this program are the only data are presented in Table 2 of exhibit 1.1.2.

Response:

Although the SSR presented data from the Elementary Education Program’s Reading Pilot Study, assessment of PEOs occurred in all programs of the EPP in the Fall 2015 semester. The EPP provides additional data from the Reading Pilot Study in Addendum Exhibit (AE36) and other examples of programmatic PEO assessments in the SSR Addendum.

For Elementary Education and 4-8 programs, in addition to EDRD 3301 and EDRD 4301, data for PEO assessments in EDEC 2383, EDEC 3301, and EDEC 3384 are housed in EPP Program Notebooks and will be available onsite.

In Secondary Education and MAT/ACP programs, PEO assessment data in EDSE 4320, EDSE 4330, and EDSE 6331 are available onsite in EPP Program Notebooks. Candidate examples of the Teacher’s Notebook and other KEI Assignments will also be available in the EPP Program Notebooks.

In Special Education, PEO assessments in Weekly Reflection Papers and a Final Reflection Paper in EDSP 4369 and EDSP 4358 are housed in EPP Program Notebooks and will be available for review onsite. The EPP Notebooks contain the syllabi and samples of candidate KEI Assignments for all courses in all programs, the rubrics and/or scoring guides faculty used in the assessments, and LARS program evaluations.

In summary, because all courses have been aligned with the state-selected standards that are assessed on the state certification exams (the TExES Content and TExES PPR exams that all candidates must pass prior to clinical teaching), because all courses have been aligned with the PEOs, the Ethical and Professional Dispositions of Candidates, InTASC, and PDAS, and because all candidates must maintain a 2.75 GPA in all education courses throughout the program, the EPP demonstrates that our candidates are assessed on the Program Educational Outcomes (PEOs) in multiple ways.

The SSR Exhibit 1.1.2 has been revised in Addendum Exhibit (AE20).

[See Addendum Exhibit (AE20) Revised SSR Exhibit 1.1.2].

[See Addendum Exhibit (AE21) PEOs Additional Data].

[See Addendum Exhibit (AE32) Revised SSR Exhibit 1.2.1].

[See Addendum Exhibit (AE36) Reading Evaluation Reports].

[See SSR Exhibit 1.1.1 Program Educational Outcomes (PEOs), Ethical and Professional Dispositions of Candidates, and Standards Alignment of the EPP].

(FFR, p. 4, paragraph 5)

There are no rubrics for the PDAS; the instrument is a check list with a four level scoring guide.

Response:

Professional Development and Appraisal System (PDAS)

PDAS remained in place during the 2015-16 school year as the State's approved instrument for appraising its teachers and identifying areas that would benefit from staff development. Cornerstones of the process include 45-minute observations and completion of the Teacher Self-Report form. The University Field Supervisors complete three 45-minute lesson observations of our clinical teachers during their student/clinical teaching experience.

PDAS includes 51 criteria within eight domains reflecting the Proficiencies for Learner-Centered Instruction adopted in 1997 by the State Board for Educator Certification (SBEC). The domains are:

  1. Active, Successful Student Participation in the Learning Process

  2. Learner-centered Instruction

  3. Evaluation and Feedback on Student Progress

  4. Management of Student Discipline, Instructional Strategies, Time/Materials

  5. Professional Communication

  6. Professional Development

  7. Compliance with Policies, Operating Procedures and Requirements

  8. Improvement of All Students' Academic Performance

Included in the appraisal system are Instructional Leadership Development (ILD) and Administrator Appraisal.

PDAS required that new teachers and teachers new to a district receive an orientation. In addition, the PDAS Teacher Manual was to be given to ALL teachers (see Letter to the PDAS Trainer Addressed).

The Proficiencies for Learner-Centered Instruction adopted by the Texas State Board of Educator Certification (SBEC) include the following criteria:

Learner-Centered Knowledge

  • The teacher possesses and draws on a rich knowledge base of content, pedagogy, and technology to provide relevant and meaningful learning experiences for all students.

Learner-Centered Instruction

  • To create a learner-centered community, the teacher collaboratively identifies, plans, implements, and assesses instruction using technology and other resources.

Equity in Excellence for All Learners

  • The teacher responds appropriately to diverse groups of learners.

Learner-Centered Communication

  • While acting as an advocate for all students and the school, the teacher demonstrates effective professional and interpersonal communication skills.

Learner-Centered Professional Development

  • The teacher, as a reflective practitioner dedicated to all students’ success, demonstrates a commitment to learn, to improve the profession, and to maintain professional ethics and personal integrity.

http://www4.esc13.net/uploads/pdas/docs/LearnerCenteredSchools.pdf

Source: SBEC publication, Learner-Centered Schools for Texas, A Vision of Texas Educators, July 1997.

[See the Professional Development Appraisal System Framework for Texas at http://www4.esc13.net/pdas/ for the PDAS Framework, the Appraisal Form, and an explanation of the state criteria used for scoring]. Hard copies will be available for the onsite visit.

[See Addendum Exhibit (AE22) PDAS Appraisal Instrument] for CAEP Standard 2.

Texas Teacher Evaluation and Support System (T-TESS)

Note: In Fall 2016, the state will roll out the new Texas Teacher Evaluation and Support System (T-TESS) to replace the PDAS Appraisal System in evaluating teacher progress and development. All Texas administrators and teachers are in the process of being trained in this new system. Our University Field Supervisors attended a two-day T-TESS training at WTAMU on August 18 and 19, 2016, led by two state-trained and T-TESS-certified faculty members. New evaluation instruments aligned with the state’s T-TESS criteria have been developed and will be implemented in Fall 2016 in the evaluation of our clinical teachers.

[See Addendum Exhibit (AE23) T-TESS Evaluations].

[For more information, see also http://tea.texas.gov/Texas_Educators/Educator_Evaluation_and_Support_System/Texas_Teacher_Evaluation_and_Support_System/].

(FFR, p. 4, paragraph 5)

The data, aggregated by domains, is presented in raw numbers (not percentages) for ‘Proficient or Exceeds’ combined categories.

Response:

The data presented in Table 2 of the SSR Exhibit 1.1.3 have been disaggregated by specialty licensure/certification areas and now report percentages rather than raw numbers for each category of the PDAS appraisal instrument in Addendum Exhibit (AE17). The EPP has provided an in-depth explanation of the PDAS Appraisal Instrument (AE22), the criteria, and the PDAS Appraisal Framework in our previous response in the SSR Addendum on pages 25-26.

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE22) PDAS Appraisal Instrument].

(FFR, p. 4, paragraph 5)

The PDAS data are not disaggregated by specialty licensure area.

Response:

PDAS data have been disaggregated by specialty licensure/certification areas. Please see the EPP’s previous response in the SSR Addendum on pages 25-26. Addendum Exhibit (AE17) provides evidence of the disaggregated data and Addendum Exhibit (AE22) shows the PDAS Appraisal Instrument that is used by the EPP for the assessment of clinical teachers and statewide to assess the performance of all in-service teachers in Texas.

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE22) PDAS Appraisal Instrument].

(FFR, p. 4, paragraph 6)

The Student/Clinical Teacher Evaluation is the candidates’ assessment of their preparation at the completion of their clinical teaching. Data for four consecutive semesters documenting content knowledge are not disaggregated by specialty licensure area, but are presented under “All level/Secondary” (Table 7, 1.18).

Response:

The candidates’ Student/Clinical Teacher Evaluations for four consecutive semesters that document content knowledge have been disaggregated by specialty licensure/certification areas including secondary programs in Addendum Exhibit (AE17). The specific certification areas were not available at the time of submission of the SSR.

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

(FFR, p. 4, paragraph 6)

Data from this assessment are disaggregated by specialty area (although data for secondary programs are combined) when documenting pedagogical knowledge (Table 6, 1.1.9).

Response:

Assessment data from the Student/Clinical Teacher Evaluations for four consecutive semesters that document pedagogical knowledge in secondary programs have been disaggregated by specialty licensure/certification areas as evidenced in Addendum Exhibit (AE17).

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

(FFR, p. 4, paragraph 7)

The Methods Experience Assessment is an instrument used by both the cooperating teacher and university field supervisor to assess candidates in their field work.

Response and Clarification:

The Methods Experience Assessment is an instrument used by both the cooperating teacher and university faculty to assess candidates in their fieldwork.

[See Addendum Exhibit (AE24) Methods Field Experience Assessment].

(FFR, p. 4, paragraph 7 and p. 5, top of page)

The form (Methods Field Experience Assessment) is a checklist with a three level scoring guide. No rubric was provided.

Response:

For continuous improvement of the EPP, university faculty, the Director of the Office of Teacher Preparation and Advising, and stakeholders have collaboratively revised the Methods Field Experience Assessment and the Methods Field Assessment rubric. Addendum Exhibits (AE24) and (AE25) provide the improved assessments and rubric that will be used to evaluate our candidates during their forty hours of Methods field observations.

[See Addendum Exhibit (AE24) Methods Field Experience Assessment].

[See Addendum Exhibit (AE25) Methods Field Experience Assessment Rubric].

(FFR, p. 5, top of page)

The evaluations are kept in candidates’ individual folders in the Office of Teacher.

Response of Correction:

The evaluations are kept in candidates’ individual folders in the Office of Teacher Preparation and Advising.

(FFR, p. 5, top of page)

There were no data provided for this assessment for the EPP or for the individual programs (Exhibit 1.1.11).

Response:

The Methods Field Experience assessments are housed in the Individual Candidate Folders in the Office of Teacher Preparation and Advising. EPP data for this assessment and for individual programs have been provided in Addendum Exhibit (AE26). The data presented in SSR Exhibits 1.1.11, 1.1.8, and 1.1.9 have been disaggregated by specialty licensure/certification areas and are combined for clarity.

[See Addendum Exhibit (AE26) Revised Program Data].

(FFR, p. 5, paragraph 1)

The data (Exhibits 1.1.8 and 1.1.9) are not disaggregated by specialty licensure area.

Response:

At the time of the EPP’s submission of the SSR, the state had only released the Teacher Preparation Effectiveness Survey (or Principal Survey) for 2012-2013. The EPP also experienced a TEA Compliance Audit in 2012-2013. As a result of the audit findings, many of the EPP’s current efforts for continuous improvement began in earnest in 2013. With a change in administrative leadership of the Department Head and the appointment of a new Director of Accreditation, the EPP began the journey toward CAEP Accreditation.

Since that time in 2012-2013, the state has released only the raw data from Principal Surveys for 2014 and 2015 in Excel spreadsheets. The EPP has reviewed, analyzed, and disaggregated the data by specialty licensure/certification areas as far as is possible for the surveys and developed a similar format to the 2012-2013 survey. In this format, the EPP provides evidence that our completers possess content and pedagogical knowledge in their effectiveness as first year teachers. Data are compared with the state and another Texas EPP from a university of similar size to WTAMU.

Please see the EPP’s previous response in the SSR Addendum. Thank you. [See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE26) Revised Program Data].

(FFR, p. 5, paragraph 2)

There are no rubrics provided for the assignments or data to provide evidence of growth or development (Exhibit 1.1.9).

Response:

The intent of the EPP was to create, develop, and provide data or evidence notebooks for our use before and during the onsite visit that represent multiple evidence sources, supported by data, for each of the five CAEP Standards, all Components, and the Crosscuts that were used and developed in the EPP’s preparation of the Self-Study Report (SSR). The Elementary Education, 4-8, Secondary Education, Special Education, and Alternative Certification (MAT/ACP) Programs have also developed data or evidence notebooks to encourage continuous improvement of all programs of the EPP and to meet CAEP Standards. Candidate exemplars of KEI Assignments, assessments, and rubrics are housed within the program notebooks for each program and will be available onsite. A purposive sample of representative Methods course assignments was presented in the SSR because of the exhaustive nature of each program’s KEI course assignments. Data have been provided in the SSR Addendum to demonstrate the proficient growth and development of our candidates, drawn from samples of candidate coursework and the rubrics faculty used to evaluate candidate progress, in Addendum Exhibit (AE27). This Addendum Exhibit presents revised data from SSR Exhibit 1.1.9.

[See Addendum Exhibit (AE27) Rubrics and Development Data].

(FFR, p. 5, paragraph 2)

The Analysis Rubric used to review, analyze, and evaluate syllabi was not found in the exhibits.

Response:

The Syllabi Analysis Rubrics I and II developed by the EPP and used to evaluate the EPP’s syllabi are provided in Addendum Exhibits (AE28) and (AE29).

[See Addendum Exhibit (AE28) Syllabi Analysis I].

[See Addendum Exhibit (AE29) Syllabi Analysis II].

(FFR, p. 5, paragraph 3)

Course descriptions and candidate learning outcomes for 10 representative courses are listed to demonstrate candidates receive instruction in InTASC standards 1, 2, and 3. There are no data, however, to provide evidence that candidates achieve those outcomes.

Response:

The EPP conducted faculty interviews of all faculty of initial certification programs in Spring, Summer I, and Summer II semesters of 2016. Of the fifteen questions from the interviews, data from specific questions that address the ten InTASC Standards demonstrate candidates are not only receiving instruction in the four categories and ten InTASC Standards in all EPP programs, but also are achieving InTASC outcomes. The questions included: “How does your program ensure that candidates demonstrate an understanding of the ten InTASC standards?” and “What data or evidence do you have to support this?”

For faculty responses in Elementary Education, Grades 4-8, Secondary Education, Special Education, and Alternative Certification Programs, the EPP used a qualitative methodology for analysis of the collected interview data. Emerging categories depicted alignment with InTASC Standards in all syllabi in all programs, in KEI Assignments or capstone projects, and in the 2.75 GPAs candidates maintain in all courses within all programs.

In addition to syllabi alignment in Elementary Education and 4-8, KEI Assignments provided evidence of learning outcomes of InTASC Standards (i.e., EDEC 2383 Dispositions and Philosophy Paper/Presentation; EDEC 3384 Lesson Plans; and EDEC 4385 Literacy Backpacks) and Reading (i.e., EDRD 3301 Author Illustrator Presentation; EDRD 3302 Balanced Literacy Project and Paper; EDRD 3304 Structured Literacy Project and Paper; and EDRD 4302 Diagnosis and Remediation Projects).

For the KEI Assignment or capstone project in EDRD 4302, the Reading Evaluation Report, candidates find a child (K-12th grade) and administer three tests (IRI, DRA, and Running Record) with their selected child. Candidates are required to reflect on how they conducted each assessment and to collect hands-on activities for their student based on the observation. Upon completion, candidates present their findings with PowerPoint slides and a demonstration of the best activity. At the end of each semester, candidates completed a self-designed survey, and their responses were thoroughly analyzed to improve the project.

Candidates provided constructive feedback, and the project was revised accordingly each semester. The data demonstrated significant increases in almost all areas across the Fall 2015, Spring 2016, and Summer 2016 semesters. In Fall 2015, approximately 50-60% of candidates scored either “Distinguished” or “Proficient,” while in Spring and Summer 2016, approximately 70-90% of candidates scored “Distinguished” or “Proficient” on the three test administrations. The alignment of EDRD 4302 and its capstone project with the InTASC Standards (as in all courses and programs of the EPP) and the resulting data from the project provide evidence that candidates not only receive instruction but also achieve learning outcomes in the InTASC Standards.

Syllabi alignment in Secondary Education and MAT/ACP and the KEI Assignments demonstrated evidence of InTASC Standards instruction and learning outcomes for secondary candidates. In EDSE 4320 Secondary Methods I and EDSE 4330 Secondary Methods II, the Teacher’s Notebook requirements address all ten InTASC Standards. EDSE 6333 Secondary Methods and EDSE 6311 Psychological Foundations of Education for MAT/ACP candidates provided evidence of InTASC instruction and learning outcomes through the Diversity/Micro Cultures Research assignments; the study of Marzano, Dean, and Lemov’s Teach Like a Champion; and the thirteen TExES Competencies for Effective Teaching, which include lesson planning, assessment, technology, and working with low-socioeconomic students, ELLs, and students with disabilities. Professional behaviors and the Texas Code of Ethics are also taught in Secondary Education.

In Special Education, syllabi alignment with the InTASC Standards and the KEI Assignments demonstrate instruction and candidate learning outcomes. The KEI Assignments in EDSP 4369 Special Education Methods, EDSP 4358 Classroom Management of Exceptional Learners, and EPSY 3350 Characteristics of Exceptional Learners, along with the parent/community Center for Learning Disabilities meetings, conference, and special speakers, provide evidence of Special Education candidates’ InTASC learning outcomes.

Candidate data, including GPAs of 2.75 or higher in all education courses (encompassing end-of-course grades and KEI Assignment results), Methods Field Observation Evaluations, PDAS evaluations, Clinical Teacher Exit Surveys, and TExES Content and TExES PPR exam results, provide evidence that candidates achieve learning outcomes in InTASC Standards 1, 2, 3, and 4. When candidates pass their state certification exams in content and pedagogy (the TExES Content and TExES PPR exams), which are based upon the state competencies for Texas educators and are aligned with the InTASC Standards, they have mastered the thirteen state teacher competencies and achieved the learning outcomes of the InTASC Standards.

[See Addendum Exhibit (AE7) GPA Data].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE24) Methods Field Experience Assessment].

[See Addendum Exhibit (AE25) Methods Field Experience Assessment Rubric].

[See Addendum Exhibit (AE30) PPR and TExES Competencies Alignment].

[See Addendum Exhibit (AE34) Faculty Interview Questions Data].

[See Addendum Exhibit (AE36) Reading Evaluation Reports].

[See Addendum Exhibit (AE42) PEO and CEI Data, Spring 2015].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Exit Evaluations].

[See Addendum Exhibit for Standard 1 (AE1.1.3) Completers Apply Content/Pedagogical Knowledge in Outcome Assessments].

(FFR, p. 5, paragraph 4)

Exhibit 1.1.5, Table 2 provides grades for 12 courses for three academic years, 2012-2015, not disaggregated by licensure area.

Response:

EPP data in the revised Table 2 of the SSR Exhibit 1.1.5 are disaggregated by specialty licensure/certification areas for 2012-2015 in the Addendum Exhibit (AE31).

[See Addendum Exhibit (AE31) Revised SSR Exhibit 1.1.5].

(FFR, p. 5, paragraph 5)

There is little evidence presented to demonstrate that candidates use research and evidence to develop an understanding of the profession and use both to measure P-12 and their own professional practice (CAEP 1.2). This component does not seem to be addressed.

Response:

To fully address CAEP 1.2, the qualitative research study of Faculty Interview Questions Data included the question: “What evidence do you or your program have that candidates use research and evidence to develop an understanding of the profession and use both to measure P-12 and their own professional practice?”

The research study findings demonstrate the comprehensive use of research and evidence by EPP candidates and faculty in all programs and at all levels. For example, data from the study in Elementary Education and 4-8 indicate that in Bilingual/ESL coursework, candidates access the TELPAS, STAAR, and TEA websites to research articles for analysis and synthesis; in EDEC 4385, candidates research Early Childhood journals and peer-reviewed research articles to write research papers and participate in the Opportunity School field observations; and in 4-8 Math and Science, candidates participate in research field experiences and hands-on experiential learning at the Amarillo Botanical Gardens, Don Harrington Discovery Center, Wild Cat Bluff Nature Trails, Palo Duro Canyon research studies, and the Panhandle Math and Science Conference.

In Secondary Education and MAT/ACP, candidates and faculty alike conduct current research studies on school reform, best practices and innovations, direct and interactive instruction, innovative technologies, 1-1 laptop initiatives, Marzano’s Nine High-Yield Instructional Strategies and Dean’s adaptation of those strategies, multiculturalism, student, parent, and community engagement, Web 2.0 tools, school violence, and working with diverse P-12 student populations that include gifted and talented, low-socioeconomic, struggling, and at-risk students.

Research and evidence in Special Education are demonstrated in candidate development of lesson plans that include scientifically-based and peer-reviewed interventions for students with disabilities. Candidates research specific disabilities and develop a Functional Behavior Assessment (FBA) and Plan. They work with students with disabilities throughout each semester and participate in the Center for Learning Disabilities parent/community meetings and conferences.

The Faculty Interview Questions Data research study found that all course syllabi describe the use of research and evidence in program assignments. Coursework, field observations, and clinical experiences provide opportunities for candidates to grow in confidence, skill, and reflection through research and evidence. Candidates use research to develop a personal understanding of the profession and evidence to measure P-12 and their own professional practice, and, most importantly, use research to impact P-12 student learning and development.

Note: Additional candidate examples of the use of research and evidence in KEI Assignments and coursework are housed in EPP Program Notebooks onsite and will be available during the CAEP Onsite Visit in November 2016. Thank you.

[See Addendum Exhibit (AE32) Revised SSR Exhibit 1.2.1].

[See Addendum Exhibit (AE34) Faculty Interview Questions Data].

(FFR, p. 5, paragraph 5)

One of the EPP’s 2013 program learner outcomes is ‘The student-centered educator understands how to solve problems critically using research-based and reflective pedagogy’ (Exhibit 1.2.1). Since then the outcomes have been revised. The current outcomes include ‘Critical Creative Thinkers’. The PEO rubric to measure this outcome was only administered in the fall 2015 semester in two courses, but the data did not indicate how many or what percentage of candidates scored at the different levels of the rubric (Accomplished, Proficient, etc.).

Response:

Please see the EPP’s previous response in the SSR Addendum on pages 21-22. Thank you.

The SSR Exhibit 1.2.1 has been revised in Addendum Exhibit (AE32).

[See SSR Exhibit 1.1.1 Program Educational Outcomes (PEOs), Ethical and Professional Dispositions of Candidates, and Standards Alignment of the EPP].

[See Addendum Exhibit (AE20) Revised SSR Exhibit 1.1.2].

[See Addendum Exhibit (AE21) PEO Additional Data].

[See Addendum Exhibit (AE32) Revised SSR Exhibit 1.2.1].

(FFR, p. 5, paragraph 5)

The team found no evidence to document effective candidate use of data to assess P-12 progress and to modify instruction based on data, with the exception of item 11 on the Principal Survey: ‘use the results of formative assessments to guide instruction.’ The EPP average score was 2.14 compared to the state-wide average score of 2.16. Only one cycle of data was available.

Response:

The EPP provides evidence of effective candidate use of data to assess P-12 progress and to modify instruction based on data in Elementary Education, 4-8, Secondary Education, Special Education, and MAT/ACP Programs as follows:

As an example, candidates in Bilingual/ESL EDPD 4378 and EDPD 4388 use rubistar.com to create rubrics to assess student progress. An online data analysis chart is created for assessment of student growth. Candidates use data on the Texas Education Agency’s (TEA) TELPAS website to watch TEA videos and to practice rating Second Language Learners’ linguistic skills. Candidates have to submit their ratings and rationales to an online discussion board on Blackboard with instructor and peer-to-peer feedback.

In EDPD 4388, candidates write lesson plans that include Sheltered Instruction Observation Protocol (SIOP) components to address CLD student needs in content area lessons, with instructor feedback. Candidates research venues in which to teach their lesson plans (schools, ESL classes, church groups, after-school programs, etc.). After candidates teach their lesson plans, they write a reflection paper explaining and evaluating the lesson, how the SIOP components supported the CLD students’ learning, and what changes they would make, and why, if they were to teach the lesson again. To come full circle, candidates then create an assessment that also includes SIOP support and administer it to the P-12 students they taught. Candidates evaluate the students’ scores and performances with the rubric they created and analyze what the data tell them about their teaching and the SIOP support: Was their teaching effective? Candidates also write reflections on their analyses, data results, and the effectiveness of their lesson plans.

The focus of EDRD 4302 Diagnosis and Remediation is candidate data analysis and evaluation in reading through diagnosis and remediation. The KEI Assignment or capstone project in the course is the “Reading Evaluation Report.” Candidates find a child (K-12th grade) and administer three tests (IRI, DRA, and Running Record) with their selected child. Candidates are required to reflect on how they conducted each assessment and to collect hands-on activities for their student based on the observation. At the end, candidates present their findings to the class with PowerPoint slides and a demonstration of the best activity.

In EDEC 3384, candidates design lesson plans based upon student data and research that must include differentiation for at least one subpopulation of their choosing. Through field observations and volunteer opportunities at the Opportunity School, Eastridge Elementary of Amarillo ISD, and Lakeview Elementary of Canyon ISD, which have highly diverse student populations, candidates add to their data and research through observations and by working with the teachers and young students at these locations.

In EDEC 4384, candidates write papers on formative and summative assessments by examining TEA’s STAAR website, videos, accommodations explorations, and current research to reflect on and demonstrate their understanding of the differences between formative and summative assessments of students. Candidates analyze student data on the STAAR state assessments and learn what the data truly mean. For example, candidates practice linguistic accommodations and analyze the different tests offered by the state. In this course, candidates explore different ways to use data, learn how accommodations are necessary for different student populations, and learn how important making modifications to their own teaching is for positively impacting their students.

In Special Education courses, candidates learn and practice behavior management. Candidates gather data on students to see how their behavior develops and progresses over time. In these classes, candidates collect data with apps on their cell phones for use in classroom instruction in their Methods course. Candidates research various behavior or social skills apps that are appropriate for the classroom and that teachers can use. Modifications are continuous in the Special Education Program through differentiation. Candidates research disabilities and interventions for particular content areas and then develop lesson plans using their research. Candidates are encouraged to interview teachers during the development of their lesson plans. Through these assignments, candidates understand how to apply their research findings and experiences in the classroom and monitor their instruction.

In Secondary Education and MAT/ACP courses, candidates research and analyze student data from TEA’s STAAR website on End-of-Course exams in their specific content or specialty licensure/certification areas, in journal articles, and the Texas Academic Performance Reports (TAPR) of districts and campuses where they are doing field observations and will have clinical experiences. Interns in MAT/ACP courses are constantly using formative and summative evaluations to monitor student progress and their own instruction. Through coursework and KEI Assignments, candidates and interns alike understand how to analyze data and apply their research findings to experiences in the classroom.

In all Methods courses, candidates use research and evidence to analyze student data and current research on best practices in their specialty licensure/certification areas, and they complete KEI Assignments, group projects, class presentations, and field experiences in our EPP and in other colleges of the university in Agriculture, Art, Bilingual/ESL, English Language Arts/Social Studies, Sports/Exercise/Sciences (SES) or Physical Education (P.E.), Math, Music, Science, Theatre Arts, and Writing. EPP Program Notebooks onsite provide data on the use of research and evidence in all programs and specialty licensure/certification areas.

As the EPP has indicated previously in the SSR Addendum, EPP Program Notebooks that house data, examples of lesson plans, reflective practice, Weekly Progress Reports during the clinical teaching experience, PDAS data, and candidate exemplars from all programs and courses will be available onsite for the November CAEP Onsite Review. Examples of Clinical Teacher’s Exit Notebooks and all Individual Candidate Folders will also be available onsite in the Office of Teacher Preparation and Advising.

At the time of the EPP’s submission of the SSR, the state had only released one cycle of the Teacher Preparation Effectiveness Survey (or Principal Survey) for 2012-2013. Since that time in 2012-2013, the state has released only the raw data from Principal Surveys for 2014 and 2015 in Excel spreadsheets.

The EPP has reviewed, analyzed, and disaggregated the data by specialty licensure/certification areas as far as is possible for the surveys and developed a similar format to the 2012-2013 survey. Data are compared with the state and another Texas EPP from a university of similar size to WTAMU.

Please see the EPP’s previous response in the SSR Addendum on pages 24-25. Thank you.

[See Addendum Exhibit (AE16) Principal Survey Results].

[See Addendum Exhibit (AE26) Revised Program Data].

[See Addendum Exhibit (AE34) Faculty Interview Questions Data].

(FFR, p. 5, paragraph 5, p. 6, top of page)

The CEI item 5 states ‘Engages in continuous self-evaluation and improvement’. The CEI was piloted in the fall 2015 semester in two courses in the Reading program. In Exhibit 1.1.4 means were provided for candidates in the two courses, but the data did not indicate how many or what percentage of candidates scored at the different levels of the rubric (Accomplished, Proficient, etc.).

Response:

Just as the EPP engages in continuous evaluation and improvement, our candidates engage in continuous self-evaluation and improvement throughout the progression of the EPP. Important areas for candidates to continually evaluate for improvement are the Ethical and Professional Dispositions. Beginning in 2013 and 2014 and continuing currently, the EPP has provided instruction in the Texas Code of Ethics for Educators for all candidates in all programs. Candidates sign Affirmations for the Code of Ethics that are housed in their Individual Candidate Folders in the Office of Teacher Preparation and Advising. To go beyond the ethical behaviors required of teachers, the EPP developed the Ethical and Professional Dispositions of Candidates and the Candidate Evaluation Instrument (CEI) to assess the ethical and professional behaviors of our teacher candidates.

To enhance and enrich the ethical and professional behaviors of our teacher candidates, the EPP implemented the CEI in Fall 2015. Program faculty assessed the Ethical and Professional Dispositions of Candidates in a variety of ways, including use of the CEI (as in the Reading Pilot Study). Examples of how the dispositions were assessed in Fall 2015 (in EDEC 2383, EDEC 2384, and all Methods courses; in the Weekly Reflection Papers of Special Education EDSP 4369 and EDSP 4358; in PDAS evaluations during the clinical teaching experience; and in reflection writings in all courses) are housed in the EPP Program Notebooks that will be available onsite for the review.

The 2014 and 2015 Principal Surveys released by TEA also demonstrate dispositional behavior of our completers at the end of their first year of teaching.

The SSR Exhibit 1.1.4 has been revised in Addendum Exhibit (AE35). Also, Addendum Exhibit (AE36) provides data on the percentages of candidates who scored at the different levels of the CEI rubric (Accomplished, Proficient, etc.).

Please also see our previous response in the SSR Addendum on pages 29-30. Thank you.

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE35) Revised SSR Exhibit 1.1.4].

[See Addendum Exhibit (AE36) Reading Evaluation Reports].

(FFR, p. 6, paragraph 1)

There is little evidence to demonstrate that the EPP ensures that completers apply content and pedagogical knowledge as reflected in outcome assessments in response to standards of Professional Specialized Associations, the National board of Professional Teaching Standards, state, or other accrediting bodies (CAEP 1.3).

Response:

The Texas Education Agency (TEA) is the state accrediting body of Texas. As stated previously, Texas is neither a CAEP partnership state nor a SPA state; therefore, the EPP must follow state statute and TEA protocols as our accrediting body. Required state certification exams (TExES Content and TExES PPR) indicate candidates’ proficiency in their specialty licensure/certification content and pedagogy. In the SSR Addendum, the EPP provides additional evidence of the application of the content and pedagogical knowledge of completers as reflected in outcome assessments in Addendum Exhibits (AE1.1.3) and (AE33).

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit for Standard 1 (AE1.1.3) Completers Apply Content/Pedagogical Knowledge in Outcome Assessments].

(FFR, p. 6, paragraph 1)

The EPP provided a letter from the TEA indicating that it was ‘accredited’; however, state reports for the different licensure areas were not included in the self-study.

Response:

Please see the EPP’s previous responses in the Addendum to the FFR on pages 3-4, pages 5-6, and the Addendum Exhibits (AE1), (AE2), and (AE3). Thank you.

(FFR, p. 6, paragraph 1)

In addition, the majority of the data presented was not disaggregated by licensure area and there were no comparisons or trends across specialty areas based on data made in the Self-study.

Response:

The EPP has provided disaggregated data by specialty licensure areas in this Addendum and Addendum Exhibits.

Please see the EPP’s previous response in the SSR Addendum on pages 17-19. Thank you.

[See Addendum Exhibit (AES1.1.1) Deep Understanding of InTASC Standards].

[See Addendum Exhibit (AES1.1.2) Research and Evidence].

[See Addendum Exhibit (AES1.1.3) Completers Apply Content/Pedagogical Knowledge in Outcome Assessments].

[See Addendum Exhibit (AES1.1.4) Access to Rigorous College and Career Readiness Standards].

[See Addendum Exhibit (AES1.1.5) Model and Apply Technology Standards].

[See Addendum Exhibit (AE6) GPAs Data].

[See Addendum Exhibit (AE7) Specialty Licensure/Certification Data].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE18) Field Observation Evaluations].

[See Addendum Exhibit (AE37) EPP Trend Data].

[See SSR Exhibit 1.1.1 Program Educational Outcomes (PEOs), Ethical and Professional Dispositions, and Standards Alignment of the EPP].

Note: The CAEP Accreditation Handbook, Version 3 (March 2016), including “Appendix G Assessment Rubric,” and the CAEP Evaluation Rubric for Visiting Teams Draft were not available to the EPP during the preparation of our Self-Study Report.

In the CAEP Evaluation Rubric for Visiting Teams Draft, the CAEP Sufficient Level Draft for Component 1.3 states, “The providers make comparisons and identifies trends across specialty licensure areas based on data” (page 3, sixth bullet in the middle of the page).

Comments in the FFR included “there were no comparisons or trends across specialty areas based on data made in the Self-study.” The CAEP Accreditation Manual, Version 2 (February 2016), was used by the EPP in the preparation of our Self-Study. In the “CAEP Evidence Table” for Component 1.3 of Standard 1, “A. Measure or type of Evidence” suggests “Specialty area-specific state standards achieved OR evidence of alignment of assessments” (third bullet, left column), “B. Guidelines for review” recommends “State program approval” (third bullet, middle column), and “C. Accreditation review” suggests steps for the “Off-site” and “On-site reviews” (p. 91). There was no mention of the EPP needing to make “comparisons and identifies trends across specialty licensure areas based on data.”

However, the EPP has complied with the FFR’s request as fully as possible in Addendum Exhibit (AE37). Upon the suggestions of CAEP Staff and the Offsite and Onsite Team Chair, the EPP includes this explanation of why these data were not included in the SSR and as a way to clarify the EPP’s approach to both the Self-Study Report and the Addendum in Response to the FFR. Thank you.

(FFR, p. 6, paragraph 2)

The EPP provides evidence that candidates demonstrate skills and commitment that afford all P-12 students access to college- and career-ready standards (CAEP 1.4) through data (not disaggregated by licensure area) for six semesters in the Student/Clinical Teacher Evaluation.

Response:

Evidence that our candidates demonstrate skills and commitment that afford all P-12 students access to college- and career-ready standards (CAEP 1.4) through data have been disaggregated by specialty licensure/certification areas for six semesters in the Student/Clinical Teacher Evaluations in Addendum Exhibits (AE38) and (AE47).

[See Addendum Exhibit (AE38) Revised SSR Exhibit 1.4.1.].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Evaluations].

(FFR, p. 6, paragraph 2)

Only 50 percent of candidates felt prepared or very well prepared to address the needs of limited English proficient students while 54 percent indicated feeling prepared or very well prepared to work with special education students. The EPP indicates that it will create seminars to address these needs.

Response:

West Texas A&M University enjoys strong partnerships with area community colleges, including Amarillo College (Amarillo and branches in Dumas and Hereford), Frank Phillips College (Borger), Clarendon College (Clarendon), and South Plains College (Levelland). Community college graduates with associate degrees transfer to West Texas A&M University, and many are accepted into our program. These candidates take their Education Foundations (EDPD 3340) and Special Education classes at their community colleges and comprise some of the 50% and 54% of candidates who feel prepared or very well prepared to address the needs of special education students.

Based upon transfer data, of the 791 new community college transfers who came to WT in Fall 2013, 477 were from one of the four local community colleges, or 60.3% of the new transfer population. In Fall 2014, 446 of 829 transfers came from local community colleges, or 53.8% of the new transfer population. In Fall 2015, 467 of 787 transfers were from local community colleges, or 59.3% of the new transfer population. Transcripts with transfer credit for these new community college transfer candidates in Education Foundations (EDPD 3340) and the basic Special Education course were accepted by WTAMU.

For other candidates who take both their ESL and all Special Education courses at WTAMU, their levels of preparedness are much higher, as evidenced by 100% passing scores on their TExES Content and TExES PPR exams, which are heavily focused on the competencies for teaching limited English proficient and Special Education students.

To fill any gaps that exist in levels of candidate preparedness in working with P-12 students with diverse needs, the EPP has continued to improve curriculum and instruction in all programs, developed new courses with a heavy focus upon teaching ELLs and students with disabilities, and created seminars to better prepare our candidates for teaching these and all students. In August/September, the EPP will offer seminars in Technology, Diversity/Poverty, Mental Health, and School Safety/School Violence by specialists and professional experts in the fields. Candidates will attend the seminars after their August Experience in Fall 2016.

In addition, Principal Survey evidence demonstrates that our completers are well prepared to teach limited English proficient and Special Education students as first-year teachers.

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE33) Transfer Data].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Evaluations].

[See Addendum Exhibit (AE48) Seminars for Clinical Teachers].

(FFR, p. 6, paragraph 3)

Evidence from the ASEP Principal Survey (one administration); the Student/Clinical Teacher Evaluations (data from three cycles indicate that 76% of candidates felt prepared or very well prepared to integrate educational technology into teaching); and learner outcomes from representative course syllabi are cited by the EPP to demonstrate that candidates model and apply technology standards (CAEP 1.5). There is no evidence, however, of candidates’ ability to track and share student performance digitally. The data are not disaggregated by specialty licensure area.

Response:

The EPP has provided evidence in the SSR and the SSR Addendum of disaggregated data for specialty licensure/certification areas. As previously stated in the Addendum, the state has released 2014 and 2015 Principal Survey data that the EPP has provided. Please see our previous responses in the SSR Addendum on pages 17-19.

In the CAEP Accreditation Manual Version 2 February 2016, Component 1.5 for Standard 1 states: “Providers ensure that completers [at exit] model and apply technology standards as they design, implement, and assess learning experiences to engage students and improve learning, and enrich professional practice” (p. 93).

The recommendations for “A. Measure or type of evidence” include evidence of “completers modeling and application of technology standards” through various measures of “Clinical experience observation instrument; lesson or unit plans, portfolios, work sample with exhibition of applications, and use of technology in instruction, technology course signature project/assignment” (pages 93-94).

Comments from the FFR stated: “There is no evidence, however, of candidates’ ability to track and share student performance digitally.” This was neither a requirement nor an expectation for the EPP, and it was not mentioned in the CAEP Accreditation Manual that was available to our EPP during the preparation of the SSR.

[See Addendum Exhibit (AE16) Principal Survey Data].

(FFR, p. 6, bottom of page)

  1. No state program reports provided as required in the Self-study template.

Response:

Please see the EPP’s previous responses in the Addendum to the FFR on pages 3-4, pages 5-6, and the Addendum Exhibits (AE1), (AE2), and (AE3). Thank you.

(FFR, p. 6, bottom of page)

  1. No disaggregation of data for the specialty licensure areas (with the exception of Student/Clinical Teacher Exit Evaluation, grades for clinical teaching, and GPA at admissions).

Response:

Disaggregation of data for the specialty licensure/certification areas of the EPP has been provided in Addendum Exhibits (AE7), (AE9), (AE11), (AE12), (AE13), (AE16), (AE17), and (AE26).

[See Addendum Exhibit (AE7) Specialty Licensure/Certification Data].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE11) Traditional and Alternative (ACP) Routes for Initial Certification].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE13) LBB Certification Reports].

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE26) Revised Program Data].

(FFR, p. 6, bottom of page)

  1. Not all rubrics and/or instruments were provided.

Response:

EPP rubrics and/or instruments are provided in Addendum Exhibits (AE22), (AE27), (AE28), and (AE29) and in SSR Exhibits (1.1.2) and (1.1.4). Additional course rubrics and/or scoring guides in each program are housed in EPP Program Notebooks that will be available onsite.

[See Addendum Exhibit (AE22) PDAS Appraisal Instrument].

[See Addendum Exhibit (AE27) Rubrics and Development Data].

[See Addendum Exhibit (AE28) Syllabi Analysis I].

[See Addendum Exhibit (AE29) Syllabi Analysis II].

[See SSR Exhibit 1.1.2. Program Educational Outcomes (PEOs) Rubric of the EPP].

[See SSR Exhibit 1.1.4. Candidate Evaluation Instrument (CEI)].

(FFR, p. 7, top of page)

  1. Data for only one cycle were provided for the PEO (only one program), CEI (only one program), the Teacher Preparation Effectiveness Survey, and the grades in the PPR Pre-practice and Post-Practice Test.

Response:

Additional data for PEOs, the CEI, Teacher Preparation Effectiveness Surveys (Principals Surveys), and grades in the Pre-practice and Post-practice Test have been provided in Addendum Exhibits (AEs).

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE21) PEO Additional Data].

[See Addendum Exhibit (AE42) PEO and CEI Data, Spring 2015].

[See Addendum Exhibit (AE46) Pre- and Post-Practice Test Grades].

(FFR, p. 7, top of page)

  1. The analysis of data does not include comparisons across programs.

Response:

The EPP’s analysis of data includes comparisons across programs.

Please see the EPP’s previous response in the SSR Addendum on pages 17-19, and pages 36-37. Thank you.

[See Addendum Exhibit (AE7) Specialty Licensure/Certification Data].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE11) Traditional and Alternative (ACP) Routes for Initial Certification].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE13) LBB Certification Reports].

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE26) Revised Program Data].

(FFR, p. 7, top of page)

  1. Not all components were addressed in the self-study.

Response:

For CAEP Standard 1, Components 1.1, 1.2, 1.3, 1.4, and 1.5 have been addressed. All components have been addressed in the SSR and the SSR Addendum. Data presented in the Addendum Exhibits for CAEP Standard 1 (AES1) have been disaggregated by program and by specialty licensure/certification area for initial certification.

[See Addendum Exhibit Standard 1 (AES1.1.1) Deep Understanding of InTASC Standards].

[See Addendum Exhibit Standard 1 (AES1.1.2) Research and Evidence].

[See Addendum Exhibit Standard 1 (AES1.1.3) Completers Apply Content/Pedagogical Knowledge in Outcome Assessments].

[See Addendum Exhibit Standard 1 (AES1.1.4) Access to Rigorous College- and Career-Readiness Standards].

[See Addendum Exhibit Standard 1 (AES1.1.5) Model and Apply Technology Standards].

[See Addendum Exhibit (AE7) Specialty Licensure/Certification Data].

(FFR, p. 7, top of page)

  1. There is little evidence that candidates use research and evidence to develop an understanding of the profession and use both to measure P-12 and their own professional practice.

Response:

Please see the EPP’s previous response in the SSR Addendum on pages 28-29. Thank you.

Note: Additional candidate examples of the use of research and evidence to develop an understanding of the profession and use both to measure P-12 and their own professional practice are located in KEI Assignments and coursework that are housed in EPP Program Notebooks onsite and will be available during the CAEP Onsite Visit in November 2016. Thank you.

[See Addendum Exhibit (AE32) Revised SSR Exhibit 1.2.1].

[See Addendum Exhibit (AE34) Faculty Interview Questions Data].

(FFR, p. 7, top of page)

  1. No documentation of the content validity or inter-rater reliability of the EPP developed instruments.

Response:

For content validity and inter-rater reliability of EPP-developed instruments, the EPP is currently taking a three-fold approach. The first approach is for the EPP to assemble an unbiased Validity and Reliability Committee; the second is to engage professional colleagues from other colleges within our university; and the third is to engage education faculty from another university, previously unknown to the EPP, to undertake validity and reliability studies in partnership with our university. For mutuality of benefit, the other university’s Department of Education Dean and faculty have requested copies of our Syllabi Analyses in exchange. Each group will use the EPP-developed PEO and CEI instruments to assess samples of candidate coursework submissions of KEI Assignments. If the three groups achieve 80% or higher agreement on these identical validity and reliability studies, the EPP can ensure that our instruments are valid and reliable.

[See Addendum Exhibit (AE39) Validity and Reliability Data].

(FFR, p. 7, top of page)

  1. Only 50 percent of candidates felt prepared or very well prepared to address the needs of limited English proficient students.

Response:

West Texas A&M University has recently received the distinction of being named a Hispanic-Serving Institution (HSI) by the Hispanic Association of Colleges & Universities (HACU) because 25% or more of our enrolled students are Hispanic. This presents exciting opportunities for our candidates and the EPP as well as unique challenges. For more information on HACU, please see: http://www.hacu.net/assnfe/CompanyDirectory.asp?STYLE=2&COMPANY_TYPE=1,5.

As one of our opportunities and challenges, West Texas A&M University enjoys strong partnerships with area community colleges, including Amarillo College (Amarillo and branches in Dumas and Hereford), Frank Phillips College (Borger), and Clarendon College (Clarendon). Amarillo College transfers about 75% of its graduates with associate degrees to West Texas A&M University, and many may later be accepted into our program. These candidates take basic classes at their community colleges and comprise some of the 50% and 54% of candidates who feel prepared or very well prepared to address the needs of limited English proficient students. However, we accept their transcripts and transfer credit for the courses that these candidates have taken. For our WT candidates, the state requires that they take EPSY 3350 Children With Special Needs, which addresses ESL, gifted and talented, and children with special needs. The levels of preparedness for these candidates are much higher, as evidenced by candidates’ passing scores on the TExES Content and TExES PPR exams, which are heavily focused on the competencies for teaching limited English proficient P-12 and special needs students.

To fill any gaps that exist in levels of candidate preparedness in working with P-12 students with limited English proficiency or English Language Learners (ELLs), the EPP has continued to improve curriculum and instruction in the program, developed new courses with a heavy focus upon teaching ELLs, and created seminars to better prepare our candidates for teaching these and all students.

In addition, Principal Survey evidence demonstrates that our completers are well prepared to teach limited English proficient students as first-year teachers.

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Evaluations].

[See Addendum Exhibit (AE48) Seminars for Clinical Teachers].

(FFR, p. 7, top of page)

  1. Only 54 percent indicated feeling prepared or very well prepared to work with special education students.

Response:

Please also see the EPP’s previous response in regard to the preparedness of our candidates to address the needs of limited English proficient students in the SSR Addendum on pages 43-44. Thank you.

As previously stated, West Texas A&M University accepts many transfer graduates from area community colleges who may have already taken their basic special education classes at the community college level. Many of these transfers comprise part of the 54% of candidates who feel prepared or very well prepared to address the needs of special education students. For candidates who take their special education courses at WTAMU, levels of preparedness are much higher than 54%, as evidenced by 100% passing scores on their TExES Content and TExES PPR exams, which are heavily focused on the competencies for teaching special education students and on the state and federal laws and policies that guide special education instruction.

One of the EPP’s primary vehicles to better prepare our candidates to meet the needs of special education students is the Center for Learning Disabilities. Courses in Special Education at WTAMU require candidates to regularly attend parent and community meetings and listen to lectures presented by special guest speakers. Candidates also attend the annual Helen Piehl Distinguished Lecture Series Fall Conference hosted by the EPP and the Center for Learning Disabilities to participate in interactive presentations by national experts in the field. Candidates engage in class discussions and write weekly and final reflections on these experiences and the application of their learning with special education students.

To also bolster preparedness in working with P-12 students with diverse and special needs, the EPP has continued to improve curriculum and instruction in the program, developed new courses with a heavy focus upon teaching students with disabilities, and created seminars to better prepare our candidates for teaching these and all students. Faculty receive credit for attending the Helen Piehl Distinguished Lecture Series and others in their Annual Professional Summaries (APS). In August/September, the EPP will offer seminars in Technology, Diversity/Poverty, Mental Health, and School Safety and School Violence by specialists and professional experts in the fields. Candidates will attend the seminars after their August Experience in Fall 2016.

The Principal Survey evidence demonstrates that our completers are well prepared to teach special education students as first-year teachers.

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE33) Transfer Data].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Evaluations].

[See Addendum Exhibit (AE48) Seminars for Clinical Teachers].

(FFR, p. 7, middle of page)

  1. List of onsite tasks to be completed. Use the following three prompts for each task.

Standard 1 Task 1

  1. Evidence in need of verification or corroboration.

Response: The EPP has previously responded to each of these prompts within the SSR Addendum. The relevant Addendum Exhibits are delineated for each prompt in brackets below. Thank you.

  1. List of all initial programs.

    [See Addendum Exhibits (AE7) and (AE40)].

  2. Disaggregated data for each licensure area for all assessments.

    [See Addendum Exhibits (AE7), (AE16) and (AE26)].

  3. State program reports.

    [See Addendum Exhibits (AE1), (AE2), and (AE3)].

  4. Description of the levels of performance for the PDAS.

    [See Addendum Exhibits (AE22) and (AE41)].

  5. Syllabi Analysis Rubric, Student/Clinical Teacher Evaluation Instrument, Teacher Preparation Effectiveness Survey instrument.

    [See Addendum Exhibits (AE22), (AE27), (AE28), (AE29), (SSR1.1.2), and (SSR1.1.4)].

  6. Data from the PEO and CEI for spring 2015.

    [See Addendum Exhibit (AE42)].

  7. Evidence of Elements 1.2, 1.3.

    [See Addendum Exhibits (AES1.1.2) and (AES1.1.3)].

  8. Comparison of data across licensure areas.

    [See Addendum Exhibits (AE7) and (AE26)].

  9. Evidence that candidates use data to reflect on their teaching.

    [See Addendum Exhibit (AE34)].

  10. Evidence that candidates use data to modify instruction.

    [See Addendum Exhibit (AE34)].

  11. Evidence that candidates use data to evaluate student progress.

    [See Addendum Exhibit (AE34)].

  12. Data demonstrating candidates’ ability to work with English Language Learners (ELLs) and students with disabilities.

[See Addendum Exhibit (AE34)].

(FFR, p. 7, bottom of page)

  1. Questions for EPP concerning additional evidence, data, and/or interviews, including follow up on response to 1c.

Response:

The EPP has previously responded to each of these questions within the SSR Addendum. The relevant Addendum Exhibits are delineated for each question in brackets below. Thank you.

  1. What evidence is there that candidates use data to reflect on their own teaching effectiveness?

    [See Addendum Exhibit (AE34)].

  2. What evidence is there that candidates use data to evaluate student progress?

    [See Addendum Exhibit (AE34)].

  3. What evidence is there that candidates use data to modify instruction?

    [See Addendum Exhibit (AE34)].

  4. What trends have been identified across specialty licensure areas based on data?

    [See Addendum Exhibit (AE37)].

  5. Are there additional data for the PEO and CEI?

    [See Addendum Exhibit (AE42)].

  6. Are there additional data demonstrating candidates’ ability to meet the needs of English Language Learners (ELLs) and students with disabilities?

    [See Addendum Exhibits (AE34) and (AE43)].

  7. Have the seminars on ELLs and students with disabilities been implemented? What are the results?

[See Addendum Exhibit (AE48) Seminars for Clinical Teachers].

Preliminary recommendations for new areas for improvement and/or stipulations including a rationale for each.

Areas for Improvement (AFIs)

(FFR, p. 8, middle of page)

Area for Improvement: The EPP does not ensure that candidates in all programs demonstrate an understanding of the 10 InTASC standards.
Rationale: The EPP did not provide disaggregated data for all specialty licensure areas.

Response:

Based on evidence, the EPP ensures that candidates in all programs and at all levels demonstrate an understanding of the 10 InTASC Standards, as shown through data disaggregated by the EPP’s specialty licensure/certification areas in the SSR Addendum and Addendum Exhibits (AEs).

For example, the EPP interviewed all faculty in initial certification programs during the Spring, Summer I, and Summer II 2016 semesters. Of the fifteen interview questions, data from the questions that address the ten InTASC Standards demonstrate that candidates not only receive instruction in the four categories and ten InTASC Standards in all EPP programs, but also achieve InTASC outcomes. The questions included: “How does your program ensure that candidates demonstrate an understanding of the ten InTASC standards?” and “What data or evidence do you have to support this?”

For faculty responses in Elementary Education, Grades 4-8, Secondary Education, Special Education, and Alternative Certification Programs, the EPP used a qualitative methodology to analyze the collected interview data. Emerging categories depicted alignment with the InTASC Standards in all syllabi in all programs, in KEI Assignments or Capstone Projects, and in the 2.75 GPA requirement for candidates in all courses within all programs. In Elementary Education and 4-8, in addition to syllabi alignment, KEI Assignments provided evidence of learning outcomes for the InTASC Standards. Examples of these KEI Assignments include:

Elementary Education (Early Childhood EC-6, Reading, and 4-8):

  • EDEC 2383 Dispositions and Philosophy Paper/Presentation;
  • EDEC 3384 Lesson Plans;
  • EDEC 4385 Literacy Backpacks;
  • EDRD 3301 Author Illustrator Presentation;
  • EDRD 3302 Balanced Literacy Project and Paper;
  • EDRD 3304 Structured Literacy Project and Paper; and
  • EDRD 4302 Diagnosis and Remediation Projects.

For the KEI Assignment or capstone project in EDRD 4302, the Reading Evaluation Report, candidates locate a child (K-12th grade) and administer three tests (IRI, DRA, and Running Record) with their selected child. Candidates are required to reflect on how they conducted each assessment and to collect hands-on activities for their student based on the observation. Upon completion, candidates present their findings with PowerPoint slides and a demonstration of their best activity. At the end of each semester, candidates completed a self-designed survey, and their responses were thoroughly analyzed to improve the project.

Candidates provided constructive feedback, and the project was revised accordingly each semester. The data demonstrated significant increases in almost all areas across the Fall 2015, Spring 2016, and Summer 2016 semesters. In Fall 2015, approximately 50-60% of candidates were rated “Distinguished” or “Proficient,” while in Spring and Summer 2016, approximately 70-90% of candidates were rated “Distinguished” or “Proficient” on the three test administrations. The alignment of EDRD 4302 and its capstone project with the InTASC Standards (as well as in all courses and programs of the EPP), together with the resulting data from the project, provides evidence that candidates not only receive quality instruction but also achieve learning outcomes in the InTASC Standards.

In Secondary Education and MAT/ACP, syllabi alignment and KEI Assignments demonstrated evidence of InTASC Standards instruction and learning outcomes for secondary candidates. Examples of these KEI Assignments include:

Secondary Education and MAT/ACP:

  • EDSE 4320 Secondary Methods I Reflection Writings;
  • EDSE 4330 Secondary Methods II Teacher’s Notebook (requirements that address all ten InTASC Standards);
  • EDSE 6333 Secondary Methods Diversity/Micro Cultures Research Assignments;
  • EDSE 6311 Psychological Foundations of Education for MAT/ACP Diversity/Micro Cultures Research assignments;
  • Research studies, including the study of Marzano, Dean, and Lemov’s Teach Like a Champion, and the thirteen TExES Competencies for Effective Teaching, which include lesson planning, assessment, technology, and working with low socioeconomic students, ELLs, and students with disabilities; and
  • Professional behaviors and the Texas Code of Ethics.

In Special Education, syllabi alignment with InTASC Standards and KEI Assignments demonstrate evidence of instruction and candidate learning outcomes. Some examples include:

Special Education:

  • EDSP 4369 Special Education Methods;
  • EDSP 4358 Classroom Management of Exceptional Learners;
  • EPSY 3350 Characteristics of Exceptional Learners;
  • KEI Assignments in all Special Education courses;
  • the Center for Learning Disabilities Parent/Community Meetings;
  • Special Guest Speakers for the Center for Learning Disabilities; and
  • Fall Conferences.

Based upon EPP data, candidates maintain a 2.75 GPA or higher in all education courses to remain in the program. The GPA results represent data from the following:

GPA Data Includes:

  • End-of-Course Grades
  • KEI Assignments
  • Methods Field Observation Evaluations
  • PDAS Evaluations
  • Clinical Teacher Exit Surveys
  • TExES Content Exam Results
  • TExES PPR Exam Results

These data provide evidence that candidates achieve learning outcomes in InTASC Standards 1, 2, 3, and 4. Because the state certification exams in content and pedagogy (the TExES Content and TExES PPR Exams) are based upon the state competencies for Texas educators, which are aligned with the InTASC Standards, candidates who pass these exams have mastered the thirteen state teacher competencies and have achieved the learning outcomes of the InTASC Standards.

Please also see the EPP’s previous response in the SSR Addendum on pages 27-29. Thank you.

[See Addendum Exhibit (AE7) Specialty Licensure/Certification Data].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE11) Traditional and Alternative (ACP) Routes for Initial Certification].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE13) LBB Certification Reports].

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE19) New Field and Clinical Experiences Documents].

[See Addendum Exhibit (AE24) Methods Field Experience Assessment].

[See Addendum Exhibit (AE25) Methods Field Experience Assessment Rubric].

[See Addendum Exhibit (AE26) Revised Program Data].

[See Addendum Exhibit (AE30) PPR and TExES Competencies Alignment].

[See Addendum Exhibit (AE34) Faculty Interview Questions Data].

[See Addendum Exhibit (AE36) Reading Evaluation Reports].

[See Addendum Exhibit (AE42) PEO and CEI Data, Spring 2015].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Evaluations].

[See Addendum Exhibit for Standard 1 (AE1.1.3) Completers Apply Content/Pedagogical Knowledge in Outcome Assessments].

(FFR, p. 8, middle of page)

Area for Improvement: There is little evidence presented to demonstrate that candidates use data to reflect on teaching effectiveness or to assess student progress.
Rationale: CAEP 1.2 Component does not seem to be addressed in the self-study.

Response:

To fully address CAEP Component 1.2, the qualitative research study of the Faculty Interview Questions Data included the question: “What evidence do you or your program have that candidates use research and evidence to develop an understanding of the profession and use both to measure P-12 and their own professional practice?” The EPP provides evidence that our candidates use data to reflect on their professional practice and teaching effectiveness and to assess student progress, achieving our mission of preparing educators who are confident, skilled, and reflective professionals.

Research study findings demonstrate the comprehensive use of data, research, and evidence by EPP candidates and faculty in all programs and at all levels. Examples from the study in Elementary Education and 4-8 include the following: in Bilingual/ESL coursework, candidates access the TELPAS, STAAR, and TEA websites to research articles for analysis and synthesis; in EDEC 4385, candidates research Early Childhood journals and peer-reviewed research articles for writing research papers and participate in the Opportunity School field observations; and in 4-8 Math and Science, candidates participate in research field experiences and hands-on experiential learning at the Amarillo Botanical Gardens, Don Harrington Discovery Center, Wild Cat Bluff Nature Trails, Palo Duro Canyon research studies, and the Panhandle Math and Science Conference.

In Secondary Education and MAT/ACP, candidates and faculty alike conduct current research studies on school reform; best practices and innovations; direct and interactive instruction; innovative technologies; 1-1 laptop initiatives; Marzano’s Nine High-Yield Instructional Strategies and Dean’s adaptation of those strategies; multiculturalism; student, parent, and community engagement; Web 2.0 tools; working with diverse P-12 student populations that include gifted and talented, low socioeconomic, struggling, and at-risk students; and school violence.

Research and evidence in Special Education are demonstrated in candidate development of lesson plans that include scientifically based and peer-reviewed interventions for students with disabilities. Candidates research specific disabilities and develop a Functional Behavior Assessment (FBA) and Plan. They work with students with disabilities throughout each semester and participate in the Center for Learning Disabilities parent/community meetings and conferences.

The Faculty Interview Questions Data research study found that all course syllabi describe the use of research and evidence in program assignments. Coursework, field observations, and clinical experiences provide opportunities for candidates to grow in confidence, skill, and reflection through research and evidence. Candidates use research to develop a personal understanding of the profession, use evidence to measure P-12 student progress and their own professional practice, and, most importantly, use research to impact P-12 student learning and development.

Additional candidate examples of the use of research and evidence in KEI Assignments and coursework are housed in EPP Program Notebooks onsite and will be available during the CAEP Onsite Visit in November 2016.

Please also see the EPP’s previous response in the SSR Addendum on pages 30-31. Thank you.

Note: In the CAEP Evaluation Rubric for Visiting Teams Draft March 2016, for CAEP Standard 1 Component 1.2, the rubric states: “No documentation provided on candidates’ use of data to reflect on teaching effectiveness or to assess student progress” (page 3, third bullet on the left column).

The CAEP Accreditation Manual Version 2 February 2015 that was available to the EPP during our preparation of the SSR for CAEP Standard 1 Component 1.2 states: “Evidence, disaggregated by specialty license area (as applicable, include instruments and provider rubrics for scoring with evidence submissions), specific to research and evidence use in the content area from sources such as: Work sample, Provider-created or proprietary assessments, Pre and post data and reflections on the interpretation and use of this data, Portfolio (including assessment of assignments made to students and artifacts produced)” (page 90).

Also, “NOTES ON THE PURPOSE OF THESE MEASURES These examples could provide evidence that candidates or completers are able to use data for instructional decision-making; provide direct evidence (e.g., from edTPA, PPAT, reflections or portfolios) of candidate proficiencies in use of data or and research. Criteria would be identified and expectations defined in self-studies” (pages 90-91).

In the Area for Improvement (AFI), the FFR stated: “There is little evidence presented to demonstrate that candidates use data to reflect on teaching effectiveness or to assess student progress,” with the Rationale: “CAEP 1.2 Component does not seem to be addressed in the self-study.”

From the CAEP resources that were available to the EPP prior to our submission of the SSR, our understanding of this component is summarized in “Component 1.2: Providers ensure that candidates use research and evidence to develop an understanding of the teaching profession and use both to measure their P-12 students’ progress and their own professional practice.”

The differences in language among the FFR Area for Improvement, the CAEP Accreditation Manual February 2015, and the EPP’s understanding explain why we approached this component to ensure that our candidates use research and evidence to develop their understanding of the profession and to measure their own practice and P-12 students’ progress. The EPP did not approach this component from the perspective of the FFR, “that candidates use data to reflect on teaching effectiveness or to assess student progress.”

In the Area for Improvement, the FFR further states: “CAEP 1.2 Component does not seem to be addressed in the self-study”.

Respectfully, based upon the evidence presented in the SSR, SSR Addendum, and Addendum Exhibits, the EPP has carefully addressed each component of every standard. Thank you.

[See Addendum Exhibit (AE32) Revised SSR Exhibit 1.2.1].

[See Addendum Exhibit (AE34) Faculty Interview Questions Data].

[See Addendum Exhibit Standard 1 (AES1.1.2) Research and Evidence].

Areas for Improvement (AFIs)

(FFR, p. 8, middle of page)

Area for Improvement: There is only partial evidence that candidates apply content and pedagogical knowledge at specialty licensure area levels (SPA or state levels reports, disaggregated specialty licensure area data, NBCT actions, etc.).

Rationale: The EPP did not provide state reports as required.

Response:

The EPP provides evidence that our candidates apply content and pedagogical knowledge in specialty licensure/certification areas through disaggregated specialty licensure/certification data provided in the SSR Addendum and Addendum Exhibits (AEs) in order to meet state-selected standards in the EPP Review with Feedback Option. Further explanation continues as follows:

West Texas A&M University is accredited by the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC), a regional agency recognized by the United States Department of Education. This is the highest accreditation a university can receive and signifies that WTAMU has "a purpose appropriate to higher education and the resources, programs and services sufficient to accomplish and sustain that purpose." SACSCOC is the regional body for the accreditation of degree-granting higher education institutions in Alabama, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Texas, Virginia, and Latin America.

Due to a misunderstanding of the “Specialty Licensure Area Data” on page 16 of the Self-Study Report (SSR), the State Program Review (State-selected standards) option was selected. Because Texas is neither a partnership state nor a SPA state, the Program Review Option that should have been checked is the EPP Review with Feedback (State-selected standards). The EPP has selected the EPP Review with Feedback as our Program Review option.

Program Review: EPP Review with Feedback

With the selection of the Program Review Option of EPP Review with Feedback, the Addendum provided answers to the three questions on pages 16-17 of the SSR. Please see our previous answers to these questions in the SSR Addendum on pages 5-15.

A list of all programs, or Program Characteristics, is provided in the addendum along with the disaggregation of three cycles of specialty licensure/certification data. Rubrics for the Program Educational Outcomes (PEOs), the Candidate Evaluation Instrument (CEI) for Ethical and Professional Dispositions of Candidates, and the Syllabi Analyses I and II are also provided with validation and reliability studies. The Crosscuts of Diversity and Technology and the Selected Improvement Plan (SIP) are addressed, explained, and clarified. Each of the CAEP Standards and all Components have been addressed in this addendum. Addendum Exhibits (AE) have been uploaded in AIMS.

Please also see our previous responses in the Addendum to the FFR on pages 3-4, pages 5-6, and the Addendum Exhibits. Thank you.

Note: Dr. Tim Miller from the Texas Education Agency has tentatively accepted our invitation to attend the CAEP Site Visit as a state representative on Monday, November 14, 2016.

[See Addendum Exhibit (AE1) SACSCOC Regional Accreditation of West Texas A&M University].

[See Addendum Exhibit (AE2) Letter from Dr. Tim Miller, TEA].

[See Addendum Exhibit (AE3) List of Texas Approved EPP Programs].

[See also Texas Approved EPP Programs at https://secure.sbec.state.tx.us/SBECOnline/approvedprograms.asp].

[See Addendum Exhibit (AE7) Specialty Licensure/Certification Data].

(FFR, p. 8, middle of page)

Area for Improvement: There is little evidence to demonstrate candidates’ ability to work with English Language Learners (ELLs) and students with disabilities.

Rationale: Only 50 percent of candidates felt prepared or very well prepared to address the needs of limited English proficient students. Only 54 percent indicated feeling prepared or very well prepared to work with special education students.

Response:

West Texas A&M University enjoys strong partnerships with area community colleges that include Amarillo College (Amarillo, with branches in Dumas and Hereford), Frank Phillips College (Borger), Clarendon College (Clarendon), and South Plains College (Levelland). Community college graduates with associate degrees transfer to West Texas A&M University and may later be accepted into our program. These candidates take their Education Foundation and Special Education classes at their community colleges and comprise some of the 50% and 54% of candidates who feel prepared or very well prepared to address the needs of limited English proficient and special education students. However, we must accept their transcripts and transfer credit for the courses that these candidates have taken. For other candidates who take their ESL and Special Education courses at WTAMU, levels of preparedness are much higher, as evidenced by 100% passing scores on the TExES Content and TExES PPR exams, which are heavily focused on the competencies for teaching limited English proficient and Special Education students.

To fill any gaps that exist in levels of candidate preparedness in working with P-12 students with diverse needs, the EPP has continued to improve curriculum and instruction in all programs, developed new courses with a heavy focus upon teaching ELLs and students with disabilities, and created seminars to better prepare our candidates for teaching these and all students. In August/September, the EPP will offer seminars in Technology, Poverty, Mental Health, and School Safety, presented by specialists and professional experts in those fields. Candidates will attend the seminars after their August Experience in Fall 2016.

In addition, Principal Survey evidence demonstrates that our completers are well prepared to teach limited English proficient and Special Education students as first-year teachers.

Candidates Meeting the Needs of Limited English Proficient P-12 Students

West Texas A&M University has recently received the distinction of being a Hispanic-Serving University. This presents exciting opportunities for our candidates and the EPP, as well as unique challenges.

As one of these opportunities and challenges, West Texas A&M University enjoys strong partnerships with area community colleges that include Amarillo College (Amarillo, with branches in Dumas and Hereford), Frank Phillips College (Borger), and Clarendon College (Clarendon). About 75% of Amarillo College graduates with associate degrees transfer to West Texas A&M University and may later be accepted into our program. These candidates take their ESL classes at their community colleges and comprise some of the 50% and 54% of candidates who feel prepared or very well prepared to address the needs of limited English proficient students. However, according to the Texas Higher Education Coordinating Board (THECB) code, we must accept transfer credit for the courses that these candidates have taken. For other candidates who take their ESL courses at WTAMU, levels of preparedness are much higher, as evidenced by passing scores on the TExES Content and TExES PPR exams, which are heavily focused on the competencies for teaching limited English proficient and special needs P-12 students.

To fill any gaps that exist in levels of candidate preparedness in working with P-12 students with limited English proficiency or English Language Learners (ELLs), the EPP has continued to improve curriculum and instruction in the program, developed courses with a heavy focus upon teaching ELLs, and created seminars to better prepare our candidates for teaching these and all students.

In addition, Principal Survey evidence demonstrates that our completers are well prepared to teach limited English proficient students as first-year teachers.

Please also see the EPP’s previous response in regard to the preparedness of our candidates to address the needs of limited English proficient students in the SSR Addendum on pages 38-39 and pages 43-44. Thank you.

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE33) Transfer Data].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Evaluations].

[See Addendum Exhibit (AE48) Seminars for Clinical Teachers].

Candidates Meeting the Needs of Special Education P-12 Students

As previously stated, West Texas A&M University accepts many transfer graduates from area community colleges who may have already taken their basic special education classes at the community college level. Many of these transfer students comprise part of the 54% of candidates who feel prepared or very well prepared to address the needs of special education students. For candidates who take their special education courses at WTAMU, levels of preparedness are much higher than the 54%, as evidenced by 100% passing scores on the TExES Content and TExES PPR exams, which are heavily focused on the competencies for teaching special education students and the state and federal laws and policies that guide special education instruction.

One of the EPP’s primary vehicles to better prepare our candidates to meet the needs of special education students is the Center for Learning Disabilities. Courses in Special Education at WTAMU require candidates to regularly attend parent and community meetings and listen to lectures presented by special guest speakers. Candidates also attend the annual Helen Piehl Distinguished Lecture Series Fall Conferences to participate in interactive presentations by national experts in the field. Candidates engage in class discussions and write weekly and final reflections on these experiences and the application of their learning with special education students.

To also bolster preparedness in working with P-12 students with diverse and special needs, the EPP has continued to improve curriculum and instruction in the program, developed courses with a heavy focus upon teaching students with disabilities, and created seminars to better prepare our candidates for teaching these and all students. Faculty members receive credit on their Annual Performance Summaries (APS) for their attendance at the fall conferences and monthly community meetings. In August/September, the EPP will offer seminars in Technology, Poverty, Mental Health, and School Safety, presented by specialists and professional experts in those fields. Candidates will attend the seminars after their August Experience in Fall 2016.

The Principal Survey evidence demonstrates that our completers are well prepared to teach special education students as first-year teachers.

Please also see the EPP’s previous response in regard to the preparedness of our candidates to address the needs of special education students in the SSR Addendum on pages 43-44. Thank you.

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE33) Transfer Data].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Evaluations].

[See Addendum Exhibit (AE48) Seminars for Clinical Teachers].

(FFR, p. 8, middle of page)

Area for Improvement: There is little evidence that candidates have the ability to track and share student performance digitally.

Rationale: Data provided did not address candidates’ ability to track and share P-12 student learning using technology.

Response:

In Texas, clinical teachers are not allowed by district and campus policy to access schools’ student management systems to track or share the academic progress, grades, and personal information of P-12 students, due to the federal Family Educational Rights and Privacy Act (FERPA). [See http://familypolicy.ed.gov/ferpa-school-officials]. Our candidates are not permitted to use the districts’ technology resources to track and/or share student data.

Because of these district restrictions, the EPP has striven to create our own proprietary technology supplement that enables our candidates to practice tracking and sharing academic progress in a manner that is both legal and appropriate. The EPP has carefully revised the Weekly Progress Reports for Clinical Teachers for Fall 2016 to foster additional opportunities for candidates to strengthen or gain abilities in tracking student academic progress, sharing it digitally, and communicating it to parents using technology.

In previous years from 2013 to 2015, clinical teachers worked collaboratively with their cooperating teachers to develop plans for weekly improvement during their twelve weeks of student/clinical teaching. Copies of these Weekly Progress Reports were sent digitally or electronically (using technology) each Friday afternoon to the University Field Supervisor, the clinical teacher, and the Director of Teacher Preparation and Advising. These weekly reports for 2013 to 2015 are housed in Individual Candidate Folders in the Office of Teacher Preparation and Advising.

As continuous improvement in Fall 2016, the EPP is shifting more of the responsibility for developing weekly progress plans to candidates rather than to the cooperating teachers. Aligned with the T-TESS Evaluation, the new Weekly Progress Reports will be completed by candidates in consultation with their cooperating teachers. University Field Supervisors will receive copies each week from their assigned clinical teachers, which will enable greater interaction and a “heads-up” of “look-fors” during their observations and evaluations of the clinical teachers.

In the CAEP Accreditation Manual Version 2 February 2015, Component 1.5 for Standard 1 states: “Providers ensure that completers [at exit] model and apply technology standards as they design, implement, and assess learning experiences to engage students and improve learning, and enrich professional practice” (p. 93). The recommendations for “A. Measure or type of evidence” include evidence of “completers modeling and application of technology standards” through various measures of “Clinical experience observation instrument; lesson or unit plans, portfolios, work sample with exhibition of applications, and use of technology in instruction, technology course signature project/assignment” (pages 93-94).

The Area for Improvement states: “There is little evidence that candidates have the ability to track and share student performance digitally”. This language is located in the CAEP Accreditation Handbook Version 3 March 2016 for CAEP Standard 1 Component 1.5 in the CAEP Evaluation Rubric for Visiting Teams Draft, March 2016: “Candidates demonstrate the ability to track and share student performance data digitally with performance at or above the acceptable level on rubric indicators” (page 5, bullet 5 of the Rubric).

The CAEP Accreditation Handbook, March 2016, and the CAEP Evaluation Rubric for Visiting Teams Draft, March 2016, were not available to the EPP during our preparation of the Self-Study Report (SSR). Having evidence to demonstrate candidates’ “ability to track and share P-12 student performance data digitally” was therefore not a requirement or expectation for the EPP. This language is not mentioned in the CAEP Accreditation Manual, February 2015.

In discussions with CAEP Staff and our Team Chair, the EPP was instructed to include this information in our SSR Addendum. Because materials have been added intermittently as the CAEP accreditation process evolves, the EPP understands how this oversight may have happened. We remain committed to the process and greatly appreciate the dedicated work of our CAEP Review Team. At the same time, we anticipate a fair review of the EPP’s work and efforts for continuous improvement based upon the CAEP resources that were available to us. Thank you.

[See Addendum Exhibit (AE16) Principal Survey Data].

[See Addendum Exhibit (AE45) Tracking Student Performance].

[See Addendum Exhibit (AE49) Weekly Progress Reports for Clinical Teachers].