West Texas A&M University

SSR Addendum Technology Crosscut

TECHNOLOGY

As required by the CAEP Accreditation Manual, Version 2, February 2015 (p. 87), which guided our SSR and SSR Addendum preparation, the EPP has earnestly addressed the cross-cutting theme of Technology through the following evidence:

  • The EPP incorporates technology to improve teaching effectiveness, enhance instruction, and manage student and assessment data while engaging students in the application of technology to enhance their learning experiences.

Standard 1:

  • The EPP endorses InTASC Core Teaching Standards.
  • The EPP ensures that completers model and apply technology standards as they design, implement, and assess learning experiences to engage students and improve learning and to enrich professional practice.

Standard 2:

  • The EPP provides technology-enhanced learning opportunities.
  • Appropriate technology-based applications are provided by the EPP.
  • The EPP promotes technology-based collaborations.

Standard 3:

  • Candidates integrate technology into all learning domains in their coursework, field, and clinical experiences throughout the progression of the EPP.

A Special Note: As previously noted by the EPP in the SSR Addendum on page 30, the EPP created and developed data or evidence notebooks for use before and during the onsite visit. These notebooks represent multiple sources of supporting evidence for each of the five CAEP Standards, all Components, and the Crosscuts developed and analyzed in the EPP’s preparation of the Self-Study Report (SSR).

The Elementary Education (including Grades 4-8), Secondary Education, Special Education, and Alternative Certification (MAT/ACP) programs have also developed data or evidence notebooks to encourage continuous improvement of all EPP programs, to embrace the Technology crosscut, and to meet CAEP Standards.

Candidate exemplars of KEI Assignments, assessments, and rubrics demonstrate multiple ways the EPP uses technology as a crosscut for all teacher candidates throughout the progression of our program. The exemplars are housed within the developed Program Notebooks for each program and will be available onsite.

Thank you.

1. Holistic summary of findings from the self-study report (SSR)

(FFR, p. 24, paragraph 3)
The EPP notes limited funding to provide access to technologies as a challenge and looks for funding from sources such as budget requests, grants, partnerships with districts and private donors, and other financial sources.

Response:

As funding background, all public institutions of higher education except community colleges (and the Texas A&M University System College of Dentistry) receive funding for construction and capital purposes (including technology) from the Permanent University Fund (PUF) or the Higher Education Fund (HEF), sometimes referred to as the Higher Education Assistance Fund (HEAF). The Higher Education Fund includes a dedicated endowment to provide HEF funding in the future. Most long-time institutions in the University of Texas System and the Texas A&M University System benefit from the PUF; other institutions including newer UT and A&M system institutions benefit from the HEF.

The amount of funds allocated for each PUF institution is determined by the Boards of Regents of the UT and A&M Systems each year. The allocation of HEF funds to each institution is determined by the Legislature and may be revised every five years. The Texas Higher Education Coordinating Board makes recommendations to the Legislature regarding these allocations based on recommendations of an advisory committee comprised of representatives of HEF institutions.

The Permanent University Fund was established in the Texas Constitution of 1876 through the appropriation of land grants. Amendments to the Texas Constitution (Article VII, Section 17) in 1984 and 1993 allow the Legislature to provide appropriations to universities. The Higher Education Assistance Fund (HEAF) provides funding to WTAMU for capital purposes, including technology.

Processes for technology funding for the EPP each year include the following protocols:

  • Comprehensive needs assessments of the EPP, faculty, and EPP technology needs are conducted annually.
  • The Department Head submits a list of capital-purpose needs and requests to the Dean.
  • The Dean reviews each departmental request within the College and makes recommendations for HEAF departmental funding based upon the amount of appropriations that are available each year.

In addition to state appropriations, the EPP continually seeks grant opportunities. In 2016, the EPP was awarded a competitive two-year i3 (Investing in Innovation in Education) Collaborative Regional Education (CORE) Initiative grant in partnership with Jacksonville State University in Jacksonville, Alabama, and the United States Department of Education.

Through this grant and partnership with four (4) of our area school districts (Springlake-Earth, Nazareth, Hereford, and Brownfield ISDs), the EPP has provided Apple laptops, classroom iPads, a weeklong CORE Academy in Alabama each summer for teachers and principals, ongoing professional development, and technical support to these small rural districts that have limited access to technology.

Through the use of smartphones, iPads, notepads, and other electronic devices during school hours, educators are embracing and harnessing today’s cutting-edge technology within their classrooms. Together, we are preparing P-12 students to step more easily into the digital world of the 21st century and beyond. This is part of what the EPP and our i3 CORE Initiative seek to do with our districts. Through our partnership, we are conducting a national study on the power of technology and of teaching 21st-century skills to students within small rural classrooms.

The concepts in the i3 CORE Initiative align with current forward thinking in education and position WTAMU (with the oversight of JSU and the USDE) and JSU as leaders in collaborative education efforts with a national focus on expanding new methodologies and technologies in current classrooms.

The vision of CORE is “to transform K-12 and higher education so students are increasingly engaged, instructors are increasingly innovative, and educational institutions are increasingly supportive of system-wide change and community-wide partnership building” through the interactive use of technology.

Additional sources of technology funding for the EPP include private donors. Funding from private donors profoundly aids the EPP in providing new technologies for our candidates and our faculty members, as evidenced by the Grand Opening of the Williams’ Early Childhood Model Classroom in September 2016, which houses state-of-the-art technologies, and by the future Dr. Geneva Schaeffer STEM Lab planned for 2017.

The Instructional Technology (IT) Department uses its own budget to replace EPP faculty computers on a cyclical basis.

The EPP has met the challenges of obtaining funding for technology and will continue striving to meet our needs in the future.

[See Texas Higher Education Fund (HEF) http://www.thecb.state.tx.us/index.cfm?objectid=50A37F04-ADB9-76AE-D8757A7F17970A88].

(FFR, p. 24, paragraph 5)
The districts represented ‘consistently reported that preservice and in-service teachers need more in-depth knowledge and skills related to the effective use of technology to accomplish learning outcomes’ (4.2.1).

Response:

To better understand this district need, it is important to recognize the context of this request from districts. The majority of the school districts in our service area are highly advanced when it comes to technology and technology training for their teachers. The only exceptions are some of our very small, rural schools with limited funding for technology.

Region 16 Education Service Center provides a Technology Conference each year for area districts. The rapid growth of technical knowledge in our districts is quite impressive. More and more of these districts have longstanding “1-to-1” laptop initiatives, “1-to-1” iPad initiatives, Notebooks, Chromebooks, and other technologies for use by their students, parents, and teachers, and others are implementing these initiatives each year. The demand for technology-savvy teachers in our districts grows stronger each year.

To address these concerns, the EPP has undertaken specific steps to improve, strengthen, and provide in-depth knowledge and skills for our candidates. For developing educators who are confident, skilled, and reflective professionals, this in-depth knowledge and skill in technology directly relates to the effective use of technology to accomplish learning outcomes for all students.

The steps the EPP has undertaken to strengthen knowledge and skills in the use of technology include the following:

  • Established our Program Educational Outcomes (PEOs).
  • Implemented PEO #4: Users of Technology who seamlessly integrate multimedia in learning environments as instructional and management tools to enhance learning.
  • Hired Dr. Sandra Whaley, a new EPP faculty member who is a Technology Specialist (Fall 2016).
  • Dr. Whaley provided two days of Technology Training for all clinical teachers in August 2016 prior to their student/clinical teaching experience. She will provide ongoing technology support and training for clinical teachers each semester.
  • The EPP provided three (3) days of T-TESS training by state-certified T-TESS Trainers who are EPP faculty members to all University Field Supervisors in July 2016.
  • “Coffees” and “get-togethers” with University Field Supervisors and the Director of Teacher Preparation and Advising are held every six weeks, beginning in Fall 2016, to keep abreast of the progress of our clinical teachers during their clinical teaching experiences and to support our field supervisors.
  • These regular “coffees” and “get-togethers” will enable the EPP to build greater confidence within our University Field Supervisors and provide additional support for them as they transition to a paperless, online evaluation system in the future.
  • The EPP will evolve to a paperless, online evaluation system (with the implementation of the EPP’s technology supplement system) for our stakeholders (including EPP-based University Field Supervisors, our candidates as clinical teachers, and the district-based cooperating teachers).
  • The EPP changed the course description of EDPD 6310 to Instructional Strategies in Technology Instruction and is offering this graduate course in Fall 2016.
  • The EPP will analyze multiple sources of data on an ongoing basis.
  • The EPP will continually monitor and make adjustments to these steps as needed.

(FFR, p. 24, paragraph 5)
The EPP indicated they will conduct a technology assessment of local districts to determine hardware/software used in those areas. A faculty member will collaborate with districts to develop curriculum to address these needs.

Response:

In Fall 2016, the EPP hired a new faculty member, Dr. Sandra Whaley, to collaborate with districts to determine the hardware/software they use, to develop curriculum, and to work together to address these needs.

Dr. Whaley developed and provided a two-day Technology Seminar for our fall clinical teachers in August 2016 prior to their student/clinical teaching. She will continue to provide ongoing technical support and training throughout each semester. Dr. Whaley is also currently teaching a redesigned graduate-level course, EDPD 6310 Instructional Strategies (of Educational Technology). In September 2016, Dr. Whaley received a Certificate of Learning for the completion of six hours at the Microsoft Innovative Educator (MIE) Teacher Academy sponsored by Microsoft Education to hone her technology skills.

Please see the EPP’s preceding response in the SSR Addendum on pages 183-184. Thank you.

[See Addendum Exhibit (AE86) Microsoft Innovative Educator (MIE) Teacher Academy (Certificate of Learning)].

a. Narrative summary of preliminary findings

(FFR, p. 25, paragraph 1)
This analysis is complicated by technology being embedded in Domain II and technology being included as part of Standard 8: Instructional Strategies. No additional information was provided such as percentages of proficient/target, overall N, etc. No validity or reliability information was provided for this assessment. (1.1.3).

Response:

In Domain II of the PDAS Field Observation Report (the appraisal of our clinical teachers), the last indicator of Domain II, “The candidate makes appropriate and effective use of available technology as part of the instructional process,” is evaluated by the University Field Supervisors during three 45-minute lessons over the course of clinical teaching.

For the InTASC Model Core Teaching Standards, Standard 8: Instructional Strategies, indicators 8(n), “The teacher knows how to use a wide variety of resources, including human and technological, to engage students in learning,” and 8(o), “The teacher understands how content and skill development can be supported by media and technology and knows how to evaluate these resources for quality, accuracy, and effectiveness,” are evaluated in coursework, field, and clinical experiences throughout the progression of the EPP by EPP-based faculty, school-based cooperating teachers, EPP-based University Field Supervisors, and the Director of the Office of Teacher Preparation and Advising.

As previously discussed in the SSR Addendum on pages 156-160, the EPP addresses candidate learning outcomes that the EPP has aligned with the InTASC Model Core Teaching Standards, including Standard 8. The InTASC Core Teaching Standards and alignment with state-selected standards ensure that our candidates attain content and technology proficiencies that are embedded throughout the progression of our program.

The PDAS includes 51 criteria within eight domains reflecting the Proficiencies for Learner-Centered Instruction adopted in 1997 by the State Board for Educator Certification (SBEC). Domain II, Learner-Centered Instruction, is one of the eight domains on the PDAS.

As previously discussed in the SSR Addendum, the Proficiencies for Learner-Centered Instruction adopted by SBEC include the following criteria applicable to the integration or incorporation of technology:

Learner-Centered Knowledge

  • The teacher possesses and draws on a rich knowledge base of content, pedagogy, and technology to provide relevant and meaningful learning experiences for all students.

Learner-Centered Instruction

  • To create a learner-centered community, the teacher collaboratively identifies, plans, implements, and assesses instruction using technology and other resources.

Of the thirteen state competencies tested on the TExES PPR Exam, Competency 9, Using Technology as an Effective Instructional Tool, states: “The teacher incorporates the effective use of technology to plan, organize, deliver, and evaluate instruction for all students.” Competency 9 includes the following eight criteria:

The beginning teacher:

A. Demonstrates knowledge of basic terms and concepts of current technology (e.g., hardware, software, applications and functions, input/output devices, networks).

B. Understands issues related to the appropriate use of technology in society and follows guidelines for the legal and ethical use of technology and digital information (e.g., privacy guidelines, copyright laws, acceptable use policies).

C. Applies procedures for acquiring, analyzing, and evaluating electronic information (e.g., locating information on networks, accessing and manipulating information from secondary storage and remote devices, using online help and other documentation, evaluating electronic information for accuracy and validity).

D. Knows how to use task-appropriate tools and procedures to synthesize knowledge, create and modify solutions, and evaluate results to support the work of individuals and groups in problem-solving situations and project-based learning activities (e.g., planning, creating, and editing word-processing documents, spreadsheet documents, and databases; using graphic tools; participating in electronic communities as learner, initiator, and contributor; sharing information through online communication).

E. Knows how to use productivity tools to communicate information in various formats (e.g., slide show, multimedia presentation, newsletter).

F. Knows how to incorporate the effective use of current technology; use technology applications in problem-solving and decision-making situations; implement activities that emphasize collaboration and teamwork; and use developmentally appropriate instructional practices, activities, and materials to integrate the Technology Applications TEKS into the curriculum.

G. Knows how to evaluate students’ technologically produced products and projects using established criteria related to design, content delivery, audience, and relevance to assignment.

H. Identifies and addresses equity issues related to the use of technology.

When candidates pass their state certification exams in content and pedagogy (the TExES Content and TExES PPR Exams), which are based upon the state competencies for Texas educators that the EPP has aligned with the InTASC Model Core Teaching Standards, they have mastered the thirteen competencies, including technology in Competency 9, Using Technology as an Effective Instructional Tool, and have achieved the learning outcomes of the InTASC Model Core Teaching Standards.

The EPP provides additional information, including percentages of proficient/target, the overall N, and validity and reliability information for this assessment and others, in the SSR Addendum Exhibits.

[See Addendum Exhibit (AE6) GPAs: All Programs].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE16) Principal Survey Reports (2013-2015)].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE24) Methods Field Experience Assessment (Phases 1 and 2)].

[See Addendum Exhibit (AE25) Methods Field Experience Assessment Rubric].

[See Addendum Exhibit (AE30) PPR and TExES Competencies Alignment].

[See Addendum Exhibit (AE34) Faculty Interview Questions Data].

[See Addendum Exhibit (AE39) Validity and Reliability Studies].

[See Addendum Exhibit (AE42) PEO and CEI Data, Spring 2015].

[See Addendum Exhibit (AE46) Pre- and Post-PPR Practice Test Grades].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Exit Evaluations].

[See Addendum Exhibit (AE80) TEA Completer Exit Surveys (2012-2015)].

(FFR, p. 25, paragraph 2)
Cooperating teachers and University Field Supervisors showed some inconsistent scoring patterns with differences in all top three categories.

Response:

With the retirement and reassignment of our former colleagues, the Director of Teacher Preparation and Advising and the Director of the Panhandle Alternative Certification for Educators (PACE), the EPP has combined the two offices into one with a single Director of the Office of Teacher Preparation and Advising. As a former area school superintendent, our new Director, Mrs. Joanna Martinez, brings a different perspective to teacher preparation. Having been in the business of hiring, training, and managing teachers as a superintendent, Mrs. Martinez began anew with a required three (3)-day T-TESS training of all of our University Field Supervisors, conducted by state-certified T-TESS trainers who are faculty members.

District-based cooperating teachers have received T-TESS training from their districts. Mrs. Martinez and EPP faculty will meet with campus principals (as our former Director of Teacher Preparation and Advising and the Methods Faculty Chair have done for many years) for field observation and clinical teaching placement assignments. The principals communicate their campus needs to the EPP and co-select the clinical teachers that they would like to have assigned to their campuses.

The principals meet with their cooperating teachers. Cooperating teachers communicate with the clinical teachers, the University Field Supervisors, and Mrs. Martinez on a regular basis. Throughout clinical teaching, cooperating teachers submit Weekly Progress Reports for their clinical teachers to Mrs. Martinez and to the University Field Supervisors each Friday.

Mrs. Martinez maintains both direct contact with the principals and online communications with both principals and cooperating teachers to improve and strengthen the alignment of all T-TESS evaluations of clinical teachers. With “coffees” and “get-togethers” every six weeks between Mrs. Martinez and the University Field Supervisors, and with ongoing communication with our school partners, the EPP will continue to improve the consistency and quality of clinical evaluations.

Through this established EPP protocol for improved teacher preparation and evaluation processes, the EPP believes we can reduce any inconsistencies in the evaluation of our teacher candidates by both school-based and EPP-based evaluators.

(FFR, p. 25, paragraph 2)
Additional content related data was discussed in this document but was not clearly related to technology (1.1.8).

Response and Clarification:

For clarification, the EPP reported numbers rather than percentages of candidates for the University Field Supervisors’ PDAS Evaluations. We apologize for any confusion or lack of clarity this may have caused. The PDAS Evaluation data have been revised in the SSR Addendum Exhibit. An explanation of the SSR Exhibit (1.1.8) Content Knowledge of Candidates that was submitted in the SSR follows:

For Fall 2014, in the SSR Exhibit (1.1.8) Content Knowledge of Candidates, Domain II Learner-Centered Instruction, Indicator Q (“Candidate makes appropriate and effective use of technology as part of the instructional process”), PDAS candidate ratings for Observation 1 (N=68) included 10 Exceeds Expectations, 58 Proficient, 0 Below Expectations, and 0 Not Observed. All 68 candidates (100%) in Fall 2014 were rated Proficient and/or Exceeds Expectations by University Field Supervisors in making appropriate and effective use of technology as part of the instructional process.

For Observation 2 (N=68), ratings included 10 Exceeds Expectations and 58 Proficient, so all 68 candidates (100%) were rated Proficient and/or Exceeds Expectations on Indicator Q. For Observation 3 (N=68), ratings included 22 Exceeds Expectations, 45 Proficient, and 1 Not Observed, so 67 candidates (99%) were rated Proficient and/or Exceeds Expectations and 1 (1%) was Not Observed.

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE26) Revised Program Data].

(FFR, p. 25, paragraph 2)
Document 1.1.11 describes additional data and analysis of candidate and clinical experience assessments and changes made to this instrument to align to clinical teaching instrument used across the state. Data is collected for all components of the Domains including technology, but scores are only reported and discussed at the Domain level.

Response:

Please see the revised SSR Addendum Exhibits for additional data and analysis of candidate and clinical experience assessments. Thank you.

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE26) Revised Program Data].

(FFR, p. 25, paragraph 3)
Table 1 in 1.3.1 shows application of content knowledge in the Clinical Teacher Evaluations for fall 2014, spring 2015, and fall 2015. It is unclear if the [sic] is the same evaluation discussed above or is [sic] different. It appears to be completed by the candidates at the end of the clinical.

Response and Clarification:

The EPP apologizes for any lack of clarity in the SSR or SSR Exhibits.

The PDAS Evaluations of clinical teachers (teacher candidates), completed by University Field Supervisors during student/clinical teaching, use the same appraisal form used statewide to evaluate in-service teachers and are the evaluations discussed above.

The Student/Clinical Teaching Evaluation is a different assessment instrument that candidates do complete at the end of their clinical experience.

For clarity and convenience, data from these two different assessments have been revised in the SSR Addendum Exhibits. Thank you.

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE26) Revised Program Data].

(FFR, p. 25, paragraph 3)
Item K (Integrate educational technology into teaching) shows 60% (N=20), 62.7% (N=47), and 73.4% (N=49) score. The EPP does not provide the original instrument or discuss the results of these data. The question that seems pertinent is ‘why is the other ~30% indicating they are unprepared in technology?’

Response and Clarification:

For clarification, the EPP has provided additional data and analyses in the SSR Addendum and Addendum Exhibits. The original PDAS evaluation document and revised tables demonstrate how candidates integrate educational technology into teaching throughout the progression of the EPP.

Analyses of the table that follows on page 201 of the SSR Addendum and of supplementary data indicate that, from 2012 to 2015, between 92% and 99% of state respondents felt Well Prepared and/or Sufficiently Prepared to integrate technology in their classrooms. Using state-level data for beginning teachers who are graduates/completers of our program provides valid and reliable evidence of candidate outcomes. The percentage data for our EPP, as well as additional data, are provided in the Addendum Exhibits.

Thank you.

[See Addendum Exhibit (AE16) Principal Survey Reports (2012-2015)].

(FFR, p. 25, paragraph 4)
The ASEP 2012-13 principal survey, as described in 1.5.1, shows the EPP average ratings for technology items similar to but below the state average for that year. The original instrument, additional data and analysis were not provided by the EPP.

Response and Update:

The original ASEP Principal Survey instrument, additional data for 2013-2014 and 2014-2015, and analysis are provided in the SSR Addendum and Addendum Exhibits.

Please see the EPP’s response in the SSR Addendum following on pages 194-196. Thank you.

[See Addendum Exhibit (AE16) Principal Survey Reports (2012-2015)].

(FFR, p. 25, paragraph 4)
In fall 2015, the Director of Teacher Preparation and Advising surveyed exiting teacher candidates (2.2.5). The EPP provided the instrument but did not provide supporting evidence including validity/reliability.

Response:

The Student/Clinical Teacher Evaluations completed by candidates at the end of their clinical teaching experience reflect similar assessment data drawn from the state-level PDAS appraisal instrument that is used statewide to evaluate in-service teachers. The state has extensively field-tested both the PDAS appraisal instrument and the new T-TESS evaluation instrument for validity and reliability.

The EPP provides supporting evidence for the Student/Clinical Teacher Evaluations as well as PDAS evaluations in the SSR Addendum and Addendum Exhibits.

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE39) Validity and Reliability Studies].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Exit Evaluations].

(FFR, p. 25, paragraph 4)
Candidates were asked to anonymously report strengths and weaknesses of their clinical experiences. The EPP provided a list of many strengths and fewer weaknesses. One weakness item listed was technology. No discussion or explanation was provided by the EPP.

Response and Update:

As previously discussed in the Diversity crosscut of the SSR Addendum, the former Director of Teacher Preparation and Advising, who retired in June 2016, provided a list of program strengths and concerns in which candidates had identified technology as a concern. At the time the list was provided to the EPP, we did not know how many of the surveyed candidates had identified this concern or which program it represented. The supplementary data supporting this concern were not available prior to the submission of the SSR.

Based upon available data, the EPP has implemented enriched and authentic learning opportunities in technology for our candidates through their coursework, field, and clinical experiences. The EPP has taken concrete steps to alleviate this concern over time, such as hiring a new technology faculty member who provides technology integration training (as discussed in the EPP’s previous responses in the SSR Addendum) and offering additional technology trainings for faculty.

Since the EPP’s SSR submission, the actual candidate survey data were obtained for EPP analysis, giving us a much better understanding of this identified concern in technology. On the Student/Clinical Teacher Evaluations, which use an evaluation scale of 5=Very Prepared, 4=Prepared, 3=No Opinion, 2=Not Prepared, and 1=Very Unprepared, the exit survey data showed the following for Fall 2015:

Elementary Education, EC-6 (N=23)

  • a. Teach the state’s core curriculum (i.e. TEKS): 9 5’s; 13 4’s; 1 3’s; 0 2’s; and 0 1’s.
  • b. Teach advanced content that exceeds the demands of the state’s core curriculum: 8 5’s; 11 4’s; 3 3’s; 1 2’s; and 0 1’s.
  • k. Integrate educational technology into your teaching: 12 5’s; 7 4’s; 1 3’s; 2 2’s; 0 1’s.
  • p. Overall, how prepared did you feel in your role as a student teacher: 18 5’s; 5 4’s; 0 3’s; 0 2’s; 0 1’s.
  • q. My student teaching impacted student academic success: 18 5’s; 5 4’s; 0 3’s; 0 2’s; and 0 1’s.

In Elementary Education (EC-6), to teach the state’s core curriculum (that includes required Technology Applications), 22 candidates felt Prepared/Well Prepared, 1 had No Opinion, and 0 felt Not Prepared or Very Unprepared.

To teach advanced content that exceeds the demands of the state’s core curriculum: 19 felt Prepared/Well Prepared, 3 had No Opinion, 1 felt Not Prepared, and 0 felt Very Unprepared.

To integrate educational technology into teaching: 19 felt Prepared/Well Prepared, 1 had No Opinion, and 2 felt Not Prepared.

For how candidates felt prepared overall in their roles as student/clinical teachers: 23 (100%) felt Prepared/Well Prepared.

For how their student teaching impacted student academic success: 23 (100%) felt Well Prepared/Prepared.

Grades 4-8 (N=4)

  • a. Teach the state’s curriculum content (i.e. TEKS): 2 5’s; 2 4’s; 0 3’s; 0 2’s; and 0 1’s.
  • b. Teach advanced content that exceeds the demands of the state’s core curriculum: 1 5’s; 2 4’s; 0 3’s; 1 2’s; and 0 1’s.
  • k. Integrate educational technology into your teaching: 2 5’s; 0 4’s; 0 3’s; 2 2’s; 0 1’s.
  • p. Overall, how prepared did you feel in your role as a student teacher?: 2 5’s; 2 4’s; 0 3’s; 0 2’s; 0 1’s.
  • q. My student teaching impacted student academic success: 3 5’s; 1 4’s; 0 3’s; 0 2’s; and 0 1’s.

For Grades 4-8, to teach the state’s core curriculum (TEKS) that includes required Technology Applications: 4 (100%) felt Prepared/Well Prepared.

To teach advanced content that exceeds the demands of the state’s core curriculum: 3 felt Prepared/Well Prepared, 0 had No Opinion, 1 felt Not Prepared, and 0 felt Very Unprepared.

To integrate educational technology into their teaching: 2 felt Prepared/Well Prepared, 0 had No Opinion, 2 felt Not Prepared, and 0 felt Very Unprepared.

In 4-8, overall, how prepared candidates felt in their roles as student/clinical teachers: 4 felt Prepared/Well Prepared.

How their student teaching impacted student academic success: 4 (100%) candidates felt Well Prepared/Prepared.

Secondary/All Level (N=22)

  • a. Teach the state’s curriculum content (i.e. TEKS): 3 5’s; 18 4’s; 0 3’s; 1 2’s; and 0 1’s.
  • b. Teach advanced content that exceeds the demands of the state’s core curriculum: 1 5’s; 19 4’s; 0 3’s; 1 2’s; and 0 1’s.
  • k. Integrate educational technology into your teaching: 7 5’s; 8 4’s; 3 3’s; 4 2’s; 0 1’s.
  • p. Overall, how prepared did you feel in your role as a student teacher: 10 5’s; 11 4’s; 1 3’s; 0 2’s; 0 1’s.
  • q. My student teaching impacted student academic success: 10 5’s; 12 4’s; 0 3’s; 0 2’s; and 0 1’s.

For secondary/all level, to teach the state’s core curriculum (TEKS) that includes required Technology Applications: 21 students felt Prepared/Well Prepared; 0 had No Opinion; 1 felt Not Prepared; and 0 were Very Unprepared.

To teach advanced content that exceeds the demands of the state’s core curriculum: 20 felt Prepared/Well Prepared; 0 had No Opinion; 1 felt Not Prepared; and 0 were Very Unprepared.

To integrate educational technology into their teaching: 15 felt Prepared/Well Prepared; 3 had No Opinion; 4 felt Not Prepared; and 0 were Very Unprepared.

Overall how prepared candidates felt in their roles as student/clinical teachers for secondary/all level candidates: 21 felt Prepared/Very Prepared; 1 had No Opinion; 0 were Not Prepared or Very Unprepared.

In sum, as evidenced by the EPP’s data analysis, in Elementary Education only one (1) candidate felt Not Prepared across teaching and exceeding the state’s core curriculum (TEKS), and 100% felt Well Prepared/Prepared overall. In Grades 4-8, only one (1) candidate felt Not Prepared across teaching and exceeding the state core curriculum, and 100% felt Well Prepared/Prepared overall. In Secondary/All Level (which included Special Education candidates), four (4) candidates felt Not Prepared to integrate technology and 0 felt Very Unprepared.

Therefore, the list of candidate “concerns” was misleading. By accessing the actual supplementary data, the EPP determined that only 2 candidates in Elementary, 2 in Grades 4-8, and 4 in Secondary/All Level felt Not Prepared to integrate technology in Fall 2015.
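For illustration only, the preparedness figures above follow from a simple tally of the scale counts. The following is a minimal sketch of that arithmetic, using the Fall 2015 Secondary/All Level counts for Indicator k reported above; it is not part of the EPP’s assessment system.

```python
# Illustrative tally of exit-survey Indicator k (Integrate educational technology
# into your teaching), Secondary/All Level, Fall 2015, using the counts reported above.
counts = {5: 7, 4: 8, 3: 3, 2: 4, 1: 0}  # 5=Very Prepared ... 1=Very Unprepared

n = sum(counts.values())                  # 22 respondents
prepared = counts[5] + counts[4]          # 15 rated Prepared/Very Prepared
not_prepared = counts[2] + counts[1]      # 4 rated Not Prepared/Very Unprepared

print(f"Prepared or Very Prepared: {prepared} of {n} ({prepared / n:.0%})")
print(f"Not Prepared or Very Unprepared: {not_prepared} of {n} ({not_prepared / n:.0%})")
```

The same count-based tally applies to the Elementary Education and Grades 4-8 items above.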

Student/Clinical Teachers’ Exit Evaluation data for Spring 2016 are disaggregated by program and are provided in the Addendum Exhibits.

[See Addendum Exhibit (AE47) Student/Clinical Teachers Exit Evaluations].

Please see the EPP’s previous response in the SSR Addendum. Thank you.

c. Evidence that inadequately demonstrates integration of cross-cutting theme of technology

(FFR, p. 26, middle of page)

Response:

Because Technology Applications are embedded in our state-selected standards and are aligned with the competencies for effective teaching (tested on the TExES Content and TExES PPR state certification exams), the InTASC Core Teaching Standards, the ISTE Standards, the university’s General Learning Outcomes (GLOs), and our Program Educational Outcomes (PEOs), and because of our misunderstanding concerning the SSR Exhibits for the Technology Crosscut, the EPP uploaded the following documents in error.

The SSR Addendum and Addendum Exhibits provide adequate and relevant evidence that demonstrate the EPP’s integration of the cross-cutting theme of technology.

We apologize for any inconvenience to the team because of our misunderstanding.

Thank you.

  1. 1.1.4. Candidate Evaluation Instrument (CEI) (shows no data or analysis relating to technology).

  2. 1.1.9. Pedagogical Knowledge of Candidates (shows no data or analysis relating to technology).

  3. 1.1.10. Program Progression of the EPP 2013-2015 (shows no data or analysis relating to technology).

  4. 1.1.11. Candidate Field and Clinical Experience Assessments Fall 2014, Spring 2015, and Fall 2015 (discussion is at the Domain level and is not directly tied to technology).

  5. 1.1.12. Ethical and Professional Dispositions of Candidates.

  6. 1.1.14. Progression of Candidates Deep Understanding.

  7. 1.1.16. Decision Points of the EPP.

  8. 1.4.1. Candidates Demonstrate Skills and Commitment for All P-12 Students Access to TCCRS.

  9. 2.1.1. Effective Partnership and Stakeholder Evidence.

  10. 2.2.1. High Quality Clinical Practice.

  11. 2.2.3. Criteria for Performance and Retention.

  12. 2.2.4. Cooperating Teacher and Field Supervisor Support Evidence.

  13. 2.2.5. Candidate Assessments.

  14. 2.2.6. Completer Follow-Up Survey. (No data currently available. No validity/reliability).

  15. 2.3.1. Clinical Knowledge, Skills, and Dispositions (KSD) and Positive Impact on P-12 Students.

  16. 3.4.1. Monitoring the Progression of Candidates.

  17. 3.5.1. Employing High Exit Criteria.

  18. 3.6.1. Candidates Developing Understanding of Ethical and Professional Aspects of Teaching.

  19. 4.1.1. Completer Impact on P-12 Student Learning and Development.

2. Questions for EPP concerning additional evidence, data, and/or interviews, including follow-up on response to 1.c.

(FFR, p. 26, bottom of page)
1. Provide instrument for Student/Clinical Teacher Evaluations. How is this instrument validated/implemented/and used to measure effectiveness in technology?

Response:

The student/clinical teacher evaluation instruments have been provided as Addendum Exhibits in the SSR Addendum.

The Student/Clinical Teacher Evaluation is completed by candidates at the end of their clinical teaching to evaluate their preparedness to be a teacher; preparedness in technology is addressed specifically by Indicator k, “Integrate educational technology into your teaching.” This item allows candidates to measure their own effectiveness in technology.

For EPP-based University Field Supervisors to evaluate student/clinical teachers, the PDAS Appraisal Framework (and, beginning in Fall 2016, the T-TESS) is used for three 45-minute lesson observations over the course of the clinical teaching experience. These are state-approved and state-supported appraisal and evaluation instruments that were extensively field-tested across the state prior to their implementation in school districts statewide. The PDAS has been used as the state’s appraisal framework for evaluating in-service teachers for many years.

The T-TESS has been piloted and field-tested extensively in 2015-2016 and will be rolled out statewide in 2016-2017. All Texas teachers and administrators have received training in both evaluative instruments.

Our district-based cooperating teachers, our student/clinical teachers, and our University Field Supervisors have all received T-TESS training. The EPP has aligned the clinical teacher evaluation instrument with the T-TESS, and it will be implemented for candidate evaluation in Fall 2016.

To measure effectiveness in technology, the EPP uses the PDAS evaluation data, the Student/Clinical Teacher Evaluations, TEA’s Completer Exit Surveys, and the ASEP Principal Surveys of our graduates/completers as beginning teachers.

For the EPP, we have designed validation and inter-rater reliability studies that we are undertaking in Fall 2016. As previously explained in the SSR Addendum, our three-pronged approach will include analysis of all EPP-developed assessment instruments by the Validation and Reliability Committee, analysis of our instruments by colleagues from other colleges across our University, and analysis by an education committee unknown to us from another university in another state. Validation and reliability of these instruments will be considered achieved at an 80% agreement rate.
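For illustration only, the 80% agreement criterion can be checked with a simple percent-agreement calculation; the ratings below are hypothetical and are not drawn from the EPP’s studies.

```python
# Hypothetical percent-agreement check between two raters scoring the same
# five lesson observations (illustrative only; not actual EPP study data).
rater_a = ["Proficient", "Exceeds Expectations", "Proficient", "Proficient", "Below Expectations"]
rater_b = ["Proficient", "Proficient", "Proficient", "Proficient", "Below Expectations"]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
agreement_rate = agreements / len(rater_a)

# Four of the five ratings match, so the agreement rate is 80%,
# which meets the EPP's stated threshold.
print(f"Agreement rate: {agreement_rate:.0%}")
```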

Additionally, the EPP has requested validation and reliability data from Dr. Tim Miller of the Texas Education Agency. We anticipate his reply to our request will be forthcoming.

Thank you.

[See Addendum Exhibit (AE16) Principal Survey Reports (2012-2015)].

[See Addendum Exhibit (AE22) PDAS Appraisal Instrument].

[See Addendum Exhibit (AE41) PDAS Performance Levels].

[See Addendum Exhibit (AE39) Validity and Reliability Data].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Exit Evaluations].

[See Addendum Exhibit (AE80) TEA Completer Exit Surveys (2012-2015)].

2. Provide ASEP Principal Survey instrument and data for 2013-14, 2014-15. How is this instrument validated/implemented/and used to measure effectiveness in technology?

Response and Update:

As the EPP has previously discussed in the SSR Addendum, the ASEP Principal Surveys for 2013-2014 and 2014-2015 were not yet available to the EPP prior to our submission of the SSR. TEA recently released an Excel spreadsheet that the EPP reformatted to be similar to the 2012-2013 ASEP Principal Surveys.

The EPP has provided the ASEP Principal Survey instrument and data for 2013-2014 and 2014-2015 in the SSR Addendum and Addendum Exhibits. This identical survey instrument was sent to Texas principals each year for the evaluation of their beginning (first-year) teachers. The data generated have proven to be reliable for the state.

The state has validated the instrument over time through the implementation of the survey instrument statewide over the last several years.

To measure effectiveness in technology, the survey includes eight (8) questions with ratings of Well Prepared; Sufficiently Prepared; Not Sufficiently Prepared; and Not Prepared at All:

Q. 31: To what extent were you prepared to use technology available on the campus to integrate curriculum TEKS and Technology Application TEKS to support student learning?

Q. 32: To what extent were you prepared to provide technology based classroom learning opportunities that allow students to interact with real-time and/or online content?

Q. 33: To what extent were you prepared to teach students developmentally appropriate technology skills?

Q. 34: To what extent were you prepared to use technology to make learning more active and engaging for students?

Q. 35: To what extent were you prepared to use available technology to collect, manage, and analyze student data using software programs (such as Excel or an electronic grade book)?

Q. 36: To what extent were you prepared to use available technology to collect, manage, and analyze data from multiple sources in order to interpret learning results for students?

Q. 37: To what extent were you prepared to use available technology to document student learning to determine when an intervention is necessary and appropriate?

Q. 38: To what extent were you prepared to use available technology to collect and manage formative assessment data to guide instruction?

Additionally, Q. 53: What is your overall evaluation of how well the educator preparation program prepared you? Select the one statement that most closely matches your current overall perspective on the program. The statements include the following:

  • I was well prepared by the program for the first year of teaching.
  • I was sufficiently prepared by the program for the first year of teaching.
  • I was not sufficiently prepared by the program for the first year of teaching.
  • I was not at all prepared by the program for the first year of teaching.

The EPP provides a summary of these state survey data below.

Principal Survey Technology Questions: Percent of State Respondents Rated Well Prepared/Sufficiently Prepared

Question          2012-2013 (N=19,345)   2013-2014 (N=21,353)   2014-2015 (N=21,694)
Q. 31             97%                    97%                    97%
Q. 32             96%                    96%                    96%
Q. 33             96%                    96%                    96%
Q. 34             98%                    97%                    97%
Q. 35             93%                    93%                    94%
Q. 36             93%                    93%                    94%
Q. 37             92%                    92%                    93%
Q. 38             92%                    92%                    93%
Q. 53 (Overall)   99%                    99%                    99%

Analyses of the preceding table and supplementary data indicate that, from 2012 to 2015, between 92% and 99% of state respondents felt Well Prepared and/or Sufficiently Prepared to integrate technology in their classrooms. For additional data, please see Addendum Exhibit (AE16). Thank you.

[See Addendum Exhibit (AE16) Principal Survey Reports (2013-2015)].

3. Preliminary recommendations for areas for improvement and/or stipulations including a rationale for each

Area for Improvement (AFI)

(FFR, p. 26, bottom of page, and p. 27, top of page)
Area for Improvement: There is little evidence of incorporation of technology to improve teaching effectiveness, enhance instruction, and manage student and assessment data while engaging students in the application of technology to enhance their learning experiences.
Rationale: The EPP does not provide sufficient verifiable evidence of their candidates’ ability to ‘model and apply technology standards as they design, implement, and assess learning experiences to engage students and improving learning and enrich professional practice.’

Response:

The EPP integrates or incorporates “technology to improve teaching effectiveness, enhance instruction, and manage student and assessment data while engaging students in the application of technology to enhance their learning experiences” in multiple ways throughout the progression of the EPP.

First, the EPP incorporates technology to improve teaching effectiveness and enhance instruction through the development and implementation of our approved ADA-compliant syllabi template for all EPP faculty and Part-Time Instructors (PTIs). The template is based upon current research and is aligned with international, national, professional, state, and local standards. Through the implementation of this template, the EPP improves and models teaching effectiveness and enhanced instruction for our candidates by ensuring that the syllabi for all courses we offer are consistently aligned with our standards in support of the EPP’s Program Educational Outcomes (PEOs) and the Ethical and Professional Dispositions of Candidates. The EPP’s standards include the following:

  • InTASC Model Core Teaching Standards;
  • International Society for Technology in Education (ISTE) Standards;
  • State-selected Texas Essential Knowledge and Skills (TEKS);
  • Technology Application (TEKS) Standards;
  • Texas College- and Career-Readiness Standards;
  • Texas Educator Standards for Effective Teaching;
  • General Learning Outcomes (GLOs) of West Texas A&M University; and
  • Curriculum Standards of the Texas Higher Education Coordinating Board (THECB).

Second, to improve teaching effectiveness and enhance instruction while engaging students in the application of technology to enhance their learning experiences, candidates are required in coursework (as directed by each course syllabus) to complete a Key Effectiveness Indicator (KEI) assignment or capstone while maintaining a 2.75 GPA in all education courses. Course delivery and instruction are facilitated through technology in every course via Blackboard and WT Class in SMART classrooms. Faculty members receive ongoing technology trainings from the WTAMU Instructional Technology (IT) Department to support the use of technology in the classroom.

Faculty members model the integration of technology in their courses for candidates in order for them to gain knowledge and practice in the skill of integrating technology into their classrooms. Within these technology-enhanced learning environments, candidates learn to use technology widely for their capstone assignments for group projects and class presentations. Candidate exemplars disaggregated by programs are housed in the EPP Program Notebooks and will be available onsite for review.

Third, candidates’ ability to incorporate technology to improve teaching effectiveness and enhance instruction is assessed on the PPR state certification exam. The Texas Educator Standards for Effective Teaching assess candidates’ content, pedagogical, and technical (technology) knowledge on the TExES PPR Exam that certifies Texas teachers. EPP policy requires that candidates pass both the content and PPR state certification exams prior to their clinical teaching.

The EPP provides instruction for candidates in the thirteen state teaching competencies in EDPD 3340 and EPSY 3341. All candidates must take and pass a released PPR Practice Test as their final exam in EPSY 3341. After passing the PPR Practice test, candidates are approved to take the state TExES PPR Exam.

The applicable technology standards included on the TExES PPR Exam are assessed within Domain III, Implementing Effective, Responsive Instruction and Assessment, which constitutes approximately 33% of the test. These standards include the following:

PPR STANDARD I:
The teacher designs instruction appropriate for all students that reflects an understanding of relevant content and is based on continuous and appropriate assessment.

PPR STANDARD III:
The teacher promotes student learning by providing responsive instruction that makes use of effective communication techniques, instructional strategies that actively engage students in the learning process and timely, high-quality feedback.

TECHNOLOGY APPLICATIONS STANDARD I:
All teachers use technology-related terms, concepts, data input strategies and ethical practices to make informed decisions about current technologies and their applications.

TECHNOLOGY APPLICATIONS STANDARD II:
All teachers identify task requirements, apply search strategies and use current technology to efficiently acquire, analyze and evaluate a variety of electronic information.

TECHNOLOGY APPLICATIONS STANDARD III:
All teachers use task-appropriate tools to synthesize knowledge, create and modify solutions and evaluate results in a way that supports the work of individuals and groups in problem-solving situations.

TECHNOLOGY APPLICATIONS STANDARD IV:
All teachers communicate information in different formats and for diverse audiences.

TECHNOLOGY APPLICATIONS STANDARD V:
All teachers know how to plan, organize, deliver and evaluate instruction for all students that incorporates the effective use of current technology for teaching and integrating the Technology Applications Texas Essential Knowledge and Skills (TEKS) into the curriculum.

As previously indicated, the EPP ensures that candidates have the content, pedagogical, and technical knowledge to teach effectively, enhance instruction, and manage student and assessment data while applying technology to enhance learning experiences by requiring all candidates to pass the PPR Exam and the TExES Content Exam prior to their clinical teaching.

Fourth, during Methods courses, candidates complete 40 hours of field observation experience in diverse settings with diverse P-12 student populations. During field observations, the EPP provides a broad range of opportunities for candidates to observe teachers using technology in their classrooms in a variety of ways. Candidates are required to teach a minimum of three (3) 20-minute lessons during their field observations. Often, candidates use available campus technology to teach their lessons.

During Methods, the school-based cooperating teachers and the EPP-based faculty and/or instructors provide forums for discussions, assignments, and reflection writings for candidates to evaluate their field observation experiences. The cooperating teacher, university faculty, and the candidates complete assessments of these field experiences.

Therefore, field observations allow candidates to improve their teaching effectiveness, enhance their instruction, manage student and assessment data, and engage students in the application of technology to enhance their learning experiences. Once candidates have passed both state exams and successfully completed their coursework and required field observations, they are prepared to begin their clinical teaching experience.

Fifth, the EPP improves teaching effectiveness, enhances instruction, and manages student and assessment data while engaging students in the application of technology to enhance their learning experiences during the student/clinical experiences of our candidates. Once approved for clinical teaching, candidates complete 13 weeks of clinical teaching (including one week of the August Experience) with an assigned cooperating teacher and University Field Supervisor in diverse settings to work with diverse P-12 students.

The EPP demonstrates how candidates improve teaching effectiveness, enhance instruction, and manage student and assessment data while engaging students in the application of technology to enhance their learning experiences with evidence from the evaluations completed by their school-based cooperating teachers, the PDAS evaluations completed by the EPP-based University Field Supervisors, and the candidates’ own Student/Clinical Teaching Evaluations.

Sixth, for our completers, evidence from multiple sources demonstrates the EPP’s integration of technology throughout the progression of the program: the TEA Exit Surveys completed by our graduates/completers and the Principal Surveys completed by principals of beginning teachers.

In sum, the EPP has provided verifiable, multiple sources of evidence of our candidates’ abilities to model and apply technology standards as they design, implement, and assess learning experiences to engage students, improve learning, and enrich professional practice through coursework, field observation, and clinical experiences. These multiple data sources include:

  • EPP ADA-Compliant Syllabi Template;
  • KEI Assignments for all courses and all programs housed onsite in the EPP Program Notebooks;
  • ASEP Reports for the TExES Content and TExES PPR Exams;
  • Methods Field Observation Evaluations;
  • EPSY 3341 Grade Reports;
  • PDAS Evaluations;
  • Student/Clinical Teaching Evaluations;
  • TEA Exit Surveys (2013-2015); and
  • Principal Survey Results (2013-2015).

Please see the EPP Program Notebooks for KEI Assignment Candidate Exemplars onsite. Thank you.

[See Addendum Exhibit (AE7) Specialty Licensure/Certification Data].

[See Addendum Exhibit (AE9) ASEP Reports].

[See Addendum Exhibit (AE12) PPR Exam Results].

[See Addendum Exhibit (AE13) LBB Certification Reports].

[See Addendum Exhibit (AE16) Principal Survey Reports (2012-2015)].

[See Addendum Exhibit (AE17) PDAS Evaluation Data].

[See Addendum Exhibit (AE18) Field Observation Evaluations].

[See Addendum Exhibit (AE21) PEO Additional Data].

[See Addendum Exhibit (AE24) Methods Field Experience Assessment].

[See Addendum Exhibit (AE25) Methods Field Experience Assessment Rubric].

[See Addendum Exhibit (AE30) PPR and TExES Competencies Alignment].

[See Addendum Exhibit (AE39) Validity and Reliability Studies].

[See Addendum Exhibit (AE46) Pre- and Post-Practice Test Grades].

[See Addendum Exhibit (AE47) Student/Clinical Teachers Exit Evaluations].

[See Addendum Exhibit (AE48) Seminars for Clinical Teachers].

[See Addendum Exhibit (AE49) Weekly Progress Reports for Clinical Teachers].

[See Addendum Exhibit (AE80) TEA Completer Exit Surveys (2012-2015)].