Focus – The focus of this workshop is maximizing observation, analysis, feedback, and reflection during clinical experiences, while fostering a team approach among those involved in the clinical experiences and observation of future teachers (or practicing teachers).
Relevance – Preparing teachers to teach effectively and to engage learners at high levels involves a complex sequence of knowledge and awareness, practice, observation and data collection, and then analysis and reflection, processes essential for changing teaching practice. The complex act of teaching embeds numerous teacher and student actions and responses occurring in a short amount of time, so much so that novices are left with only general impressions and memories of their actions during a teaching episode. As such, novices are highly dependent on feedback from an observer, much of which is qualitative in nature.
During the observation and feedback process, pre-service teachers and teacher preparation programs share a need with practicing teachers and administrators for more quantitative indicators of teaching during the assessment process, and for feedback that is more evidence-based in nature. Data collected during a teaching session (from video or in real time) can include: 1) types of questions and responses, 2) average and specific wait-times, 3) the specific type and length of teaching strategies utilized, 4) the specific type of interchange between students or general student participation, 5) predominant patterns of interaction between the teacher and students, 6) student engagement data specific to individual students, and also delineated by demographics such as gender, learning disability, minority status, or ELL/bilingual status, 7) misbehavior data down to the individual student level, and 8) teacher interventions to misbehaviors (individual or whole group). All of these factors can be observed and noted during an observation and used during the analysis and feedback phase. If video is used during data collection, all data points are linked to specific video segments for use during the feedback phase, so that novices can see themselves in action.
However, the challenge is collecting the data without using more than one observer and without experiencing cognitive overload, while maintaining a collaborative learning environment. Research shows that technology use supports collaborative learning, and that when teacher candidates are provided opportunities to reflect upon and discuss classroom practices, their understanding of the teaching situation deepens (Brookfield, 1995; Bruce & Levin, 2003; Hughes & Mapes, 2012; Lee & Young, 2010; Martin, 2005; Matthew, Felvegi & Callaway, 2009). Efforts to gather substantially more quantitative data during an observation have been successful (Ashmann & Berg, 2013; Berg, 2013, 2014). Current use of the technology demonstrates that data collected during a teaching session can include all of what is mentioned above and more. Data collection can be extensive without reaching cognitive overload, and data analysis is instantaneous upon completion of the observation, with critical factors displayed in various visual representations. Feedback to the teacher can include:
• A complete profile of all teacher actions and teacher-student interactions in the lesson to show the predominance of behaviors and teacher tendencies
o What types of questions were asked by the teacher, and how many of each type.
o What types of teacher responses followed student actions, and how many of each type.
o Wait-time averages in general, and specific wait-times for each teacher and student action.
• A complete profile of all student actions showing interactions with the teacher, with other students, and student misbehaviors
o Which students are interacting, which are passive? Are most questions answered by a few students, while the other students are satisfied to be non-responsive throughout the lesson? Did the teacher employ strategies that engaged most or all students?
o Were students with special needs, or ELL students engaged at a level comparable to regular education students?
• An analysis of the data uncovering the critical patterns of teacher-student interactions
o When teachers ask questions and students respond, is it a productive pattern, or one contrary to the goals of the lesson? If student engagement and thinking are the goals, are open-ended questions present or absent? Were all follow-ups to student responses teacher-clarifying instead of asking the student to further explain their answer?
• An analysis of small group member interactions and teacher-student interactions
o Small groups are often semi-productive with a subset of members doing most of the work. Was there equity among small group members regarding work and product generation? What was the nature of the teacher’s interactions with the small group and did the interaction support or undermine the goals of the lesson?
• A complete profile of student misbehaviors and how the teacher dealt with such behavior.
o Were misbehaviors initiated by a few students or by many? Were there many misbehaviors without a teacher intervention?
o What does it mean if misbehavior counts are high during one type of lesson and low during another?
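To make the summaries above concrete, the sketch below models how two of them, question-type counts and average wait-time, might in principle be derived from a time-stamped observation log. This is an illustrative example only: the event codes, field names, and log format here are hypothetical, not the actual schema of the tool demonstrated in the workshop.

```python
# Illustrative sketch: deriving question-type counts and average wait-time
# from a hypothetical time-stamped observation log. Codes are invented.
from collections import Counter

# Each event: (time_in_seconds, actor, code)
events = [
    (0.0,  "teacher", "open_question"),
    (4.2,  "student", "response"),
    (4.8,  "teacher", "clarify"),
    (10.0, "teacher", "closed_question"),
    (11.1, "student", "response"),
    (15.0, "teacher", "open_question"),
    (19.5, "student", "response"),
]

def question_counts(events):
    """Tally how many questions of each (hypothetical) type the teacher asked."""
    return Counter(code for _, actor, code in events
                   if actor == "teacher" and code.endswith("question"))

def average_wait_time(events):
    """Mean gap between a teacher question and the next student response."""
    waits = []
    pending = None  # time of the most recent unanswered question
    for t, actor, code in events:
        if actor == "teacher" and code.endswith("question"):
            pending = t
        elif actor == "student" and code == "response" and pending is not None:
            waits.append(t - pending)
            pending = None
    return sum(waits) / len(waits) if waits else 0.0

print(question_counts(events))
print(round(average_wait_time(events), 2))
```

The same event-log idea extends naturally to the other profiles listed above (teacher responses by type, per-student participation, misbehavior counts), since each is a count or a timing computed over coded events.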
As such, the focus of this workshop is to help methods instructors and clinical observers maximize quantitative data collection during observations, which then allows for a rich analysis of a set of critical factors that lays the foundation for robust feedback and data-anchored self-reflection. The web-based app allows the user to collect and analyze data while providing many different visual representations, including tables, graphs, and heat maps of seating charts, for use in the feedback process.
A substantial amount of quantitative data can be gleaned from a teaching episode by a single observer and serve as a source of critical feedback to the teacher. Since many observers are grounded in, and attuned to, the qualitative aspects of observation and feedback, the very act of using the technology and engaging in quantitative data collection and related feedback provides a new lens through which to view instruction, one that can affect both teacher and observer positively and help elevate teaching skills. The novice's improvement in teaching depends on two key participants: a) the person teaching, a participant with the goal of improving instruction, and b) the observer (cooperating teacher and/or university supervisor), the person with more advanced knowledge of teaching, who can contribute poignant observations and more specific, evidence-based feedback.
In short, this workshop is designed to address the need to upgrade the observation, analysis, and feedback components of the teaching observation by using more quantitative data collection as a primary source for the feedback provided to the teacher, which then lays the groundwork for evidence-based reflection and for establishing future quantitative targets for teaching.
Who Would Be Interested? – Any educator who works with pre-service teachers, such as methods instructors or cooperating teachers, or those who work in any capacity with classroom observation or practice teaching in virtual environments (Berg, 2013; Dieker et al., 2008), would be interested in learning about and using this tool to enhance the observation and feedback process; a tool that also facilitates a team-based approach to observation. In addition, those who are researching or examining the impact of professional development on in-service teacher instruction, or the impact of curriculum activity on student actions, misbehaviors, student engagement, or teacher-student interactions, would find this a useful tool to add to their observation and assessment package.
The Expertise of Workshop Presenters – Craig Berg has been involved in science teacher preparation for over 35 years in the role of methods instructor and clinical supervisor. Through hundreds of observations and a keen interest in advancing supervision, he has played the primary role in developing the tool noted in this application, and has tested and refined the tool for others to use. He is currently researching a number of aspects of how using this tool affects both the teacher and the observer. Ray Scolavino is a colleague at UW-Milwaukee who is deeply immersed in teacher preparation and observation efforts.
Learning Objectives (And How To Assess)
• To create an awareness of how the technology can enhance the observation and reflection process of clinical observations through demonstrating both the qualitative and quantitative data collection modes, followed by showing the analysis components using data-connected video, with a focus on the quantitative features and reflection-promoting aspects of the tool.
o Assess: Presenter-participant interactions and responses to questions about the value and projected use of the tool within the constraints of their working environments.
• To have the participants leave the workshop with a working knowledge of how to use the tool.
o Assess: Participants will bring laptops and go through a series of training steps to develop a working knowledge.
• To develop a group of university methods instructors and supervisors who want to continue to collaborate on use of the tool and on developing a research agenda.
o Assess: Brainstorm initial plans for a post-workshop network, and initial thoughts regarding a research agenda.
Workshop Activities (2 hrs)
30 min – The what, why, and how: the capabilities of the tool for collecting both qualitative and quantitative information during clinical observations, and the rich data analysis features.
A. Qualitative Commentary and Reflection
B. Quantitative Data Collection – Lesson, teacher actions and interactions, student actions, interactions and misbehaviors (individual, small group, and large group).
C. Quantitative Analysis and Feedback
60 min – Participants bring their own laptops (or share) to practice developing their data collection skills using the tool with video segments of science teachers, and to use the analysis component to provide feedback and stimulate reflection. Synergistic observation, and how the cooperating teacher can be part of the team of future teacher, university supervisor, and cooperating teacher.
D. Lesson Events, Teacher Actions, Student Actions, Wait-Time Data, Heat Maps of Student Actions, etc.
E. Team Building and Synergism Embedded
30 min – To develop a group of university methods instructors and supervisors who want to continue to collaborate on the use of the tool and on developing a research agenda.
F. Brainstorm initial plans for a post-workshop network, and initial thoughts regarding a research agenda. Set an agenda and dates/times for post-ASTE online follow-up meetings.
Continuing Learning and Collaboration – The last segment of the workshop will establish who is interested in participating in follow-up sessions aimed at continuing to work together on the use of the tool for clinical observations, and on using the tool for research into the clinical aspects of teacher preparation.