A post by Amy Tepper and Patrick Flynn
Many of us have been wowed watching Jeff Foxworthy's game show featuring 5th graders, but recently we were wowed watching 6th graders set the bar for instructional leaders everywhere as we visited classrooms in one of our CT middle schools.
What We Know
According to Hattie (and the work that gets us up every day), high-quality feedback that promotes growth should always ensure a learner can answer the following:
Where am I going? How am I going? And what's next?
To accomplish this, feedback should:
- Be built from clear expectations and rubric language
- Provide clear examples as evidence
- Include questions to promote reflection
- Communicate strengths (not praise) and areas for growth (not criticism)
Enter 6th graders.
What We Saw
We had the chance to observe a debate. We know one of the greatest challenges in this lesson structure is keeping the observers engaged. For this lesson, the 6th grade class watched 8 students debate while crafting feedback for their peers through the tech tool TodaysMeet, a safe way to run a Twitter-like live feed during a lesson. We thought students might struggle to listen and process the strength of the arguments, follow the feed, respond to peers, and maintain critical, formative feedback based on the debating rubric. But were we proven wrong; these are 12-year-olds living in 2018, after all. In addition, they work with teachers who set high expectations, recognize the value of unpacking rubrics, explicitly teach students what quality feedback is and how to deliver it, and provide platforms for them to convey their findings. Consider some of the feedback provided for the peers debating:
Feedback on the quality of presentation
“Pro side sounds prepared and states points in clear ways”
“I like how pro side has said transitional phrases”
“S was clear and loud, but not enthusiastic”
“C knows his cards and is using good eye contact”
“Words like ‘um’ and ‘like’ should be avoided”
“T is the only one talking during the rebuttals.”
Oh, and just in case you thought they weren’t listening…
Feedback on the quality of the argument
“Forgetting your lunch or gym clothes isn’t much of an ‘emergency’” [cell phone pro side]
“I like how he used a website and a good fact for rebuttal.”
“C and P have a good idea that students can use their phones in school with educational apps to help them learn.”
“I think S should stick to facts and not hypotheticals.”
“Con keeps reusing the same facts.”
“What do germs have to do with having phones…everything has germs.” [cell phone con side]
Our favorite: “Is ilovetoknow.com a reliable source?”
Later, we went down the hall to a 7th grade Socratic discussion where the outer circle of students was providing feedback through the app Padlet. This allowed for a slower pace than the live feed, but again we saw high quality feedback about the quality of the arguments appear on the main screen. Peers recognized the strength of the evidence used, how speakers were building from each other’s responses, and the appropriateness of the evidence cited.
These two classroom examples align with our standards of the high quality feedback leaders should be providing for teachers:
Standard A: A claim about practice and supporting evidence aligns to rubric language
Standard B: Specific evidence is provided to support growth
Standard D: Areas of strength and growth are rooted in the evidence and rubric
Standard F: The feedback can serve as a learning tool
Imagine the conversation and new learning that occurred after the debate or Socratic discussion using this level of feedback. If 12-year-olds can provide this for their peers, can't we all?
What support can we provide our instructional leaders so that they produce high levels of feedback for their teachers based on observations?
A Post by Amy Tepper and Patrick Flynn
We were sitting in Patrick's living room the other day attempting to write, but found ourselves distracted by the geese in the backyard. We caught ourselves saying out loud, "Welcome back," when we noticed new additions had arrived.
Two of the geese have spent the last few weeks of winter in this spot (probably feeling as irate as we have been about snow in April), but now several others have joined.
As Amy has lived in Florida for 20 years, and as all good writers do when faced with a deadline, we decided to read more about Canada goose migration patterns.
So, did you know the first two are the scouts? They determine the best spots (Patrick’s yard, of course) and guide the others.
You may have heard the many connections drawn between leadership principles and the behavior of geese, but this view out the window resonated even more after Patrick spent the day presenting at the CT E2LEAD 2018 Conference at Mohegan Sun, organized by CT's Teacher of the Year Council. So, we gathered a few ideas about what instructional leaders can learn from our flying friends:
They rotate. When one in front grows tired, another takes the lead.
Successful administrative teams share the responsibilities, leverage other resources such as instructional coaches or curriculum specialists, and foster teacher leadership to add layers of feedback and support for a school.
They honk. This distinguishable sound wakes many of us, but among the birds it serves as encouragement and communication.
It is critical for our teachers to have a clear idea of where they are going and how they are going, and for leaders to confirm effective practices. The greatest avenue to achieve this is high-quality feedback.
They fly together. The always impressive V-pattern creates efficiency and ensures that they reach the destination together.
This reminds us that we are stronger together. How are you creating common understandings and building teacher capacity to support each other?
This spring, consider what you need as a leader to take flight with your teachers and staff.
- Remember, the early bird catches the worm.
- Address issues like water rolling off a duck’s back.
- Make your impact your swan song.
- Don’t let yourself become a lame duck.
Okay, we’ll stop.
A Post by Amy Tepper and Patrick Flynn
“My powers are ordinary,
only my application brings me success.”
Sir Isaac Newton.
In the last chapter of our new book Feedback to Feed Forward: 31 Strategies to Lead Learning, we highlight the extraordinary practice of several districts and administrators. In that same chapter we provide six challenges to administrators and coaches who are charged with observing and providing feedback to teachers. One of our favorites is:
Go beyond policy
How often are you engaged in tasks that directly impact teaching and learning in your building beyond the policy requirements? Meet our far-from-ordinary heroes from New Fairfield Middle School. As a team, four leaders (including the principal, Christine, and the lone AP, Cheryl) have conducted 96 classroom walkthroughs since the beginning of this school year. By October, they had visited every room once. It is important to note that this is over and above their required informals, formals, and midyears. They are disappointed in themselves, as they were shooting for 350 by break. Yes, 350. They feel disconnected when even a week or two goes by without getting out into their classrooms.
The walkthroughs provide valuable data on how teachers and students are doing on a day-to-day basis and send a message to teachers that instruction is the central priority in the building. The leaders use the classroom visits to identify and communicate strengths, identified as "I appreciates," and potential next steps, communicated as "I wonders." Teachers often feel threatened or nervous when leaders visit, and the team has been working to build trust and establish an environment for growth through this practice.
They struggle with the same things you do: fights, bullying, and bus incidents. They fill in for cafeteria duty and have to call and meet with parents, but they are finding the time.
What’s keeping you from your classrooms and leading the teaching and learning in your building? How are you going beyond policy and supporting the growth of your teachers?
This past week I had the privilege of traveling to Tianjin, China, to present at a forum hosted by Tianjin University of Technology.
The forum provided an opportunity for experts from various fields to collaborate and share ideas on potential ways that Big Data analysis could influence the future of their particular field. Presenters focused on topics ranging from the ways big data can inform our understanding of day-to-day air quality and its impact on respiratory and cardiovascular health, to how big data could improve predictions within the stock market.
These presenters each imagined how finding patterns in related data, and engaging in cross-analysis of that data, can lead to deeper understanding and important advances in human, social, cultural, and economic health in our world.
The honor to speak as a representative in the field of K-12 education cannot be overstated.
Big Data and Education
Given the Spring 2018 release by Corwin of the book on classroom observation and feedback that I co-authored with Amy Tepper, I focused my keynote on imagining ways in which advances in information technology could influence how we collect, organize, and analyze data about student learning in the classroom.
Making Interdisciplinary Connections
Knowing the great lengths we go to every day working with administrators to build capacity to collect data in support of teacher learning and increased student learning in the classroom, I called the experts in the room to action, asking them to consider two things:
1. How can what we know about Big Data help us to redefine educator effectiveness and accountability and allow us to create better models of interaction with data, providing improved support for those who serve our children?
2. How can new information technology help us to build efficiencies in the way in which we collect evidence of student learning in the classroom, leading to new understanding of what learning looks like for our students and what teacher actions lead to that learning?
After a two-day collaboration with experts in the fields of Information Technology and Management, it is clear that advances in technology and a growing understanding of the potential of Big Data analytics offer much to the field of education.
Presenters provided overviews of innovative practice in their fields including:
• Dr. J. Gregory McVerry, a ReVISION Learning Senior Contributor and Professor at Southern Connecticut State University, challenged us to rethink methodologies when using big data in the classroom. Dr. McVerry suggested we use more formative design based research projects centered on improving communities rather than chasing citation counts.
• Professor Chien-Lung Chan, leader of the research and development of the big data and digital fusion center at Yuan Ze University, Taiwan, presented his work in the field of disease patterning and the demand for and impact of long-term care solutions. Dr. Chan's cross-analysis of environmental and health data sets can provide support for the health industry, potentially securing new methods for early warning detection based on environmental factors.
• Guanzhong You, from Columbia University, and Professor Jheng-Long Wu, a researcher at Academia Sinica in Taiwan, each provided insights on the potential of machine learning and machine analytics. Work in stock prediction analysis provides examples of methods that can be applied to support the development of smart cities, creating efficiencies in transportation, energy consumption, and services.
• Professor Yan Quan Liu of Southern Connecticut State University, presented on Smart Libraries, demonstrating how new applications in library-media sciences can become a significant resource for communities, helping to personalize services.
• Tours and presentations by the Department of Management and the Department of Information Technology at Tianjin University of Technology provided insights into exciting research and development in the field of Big Data and Analytics, including computer vision and face-recognition technology.
Each of the presentations helped me glimpse the impact Big Data will have on education.
Creating Tomorrow’s Classroom Today
Highlighting some of the early work with Big Data approaches at the AltSchool in the US, I described data collection tools such as infrared cameras in classrooms that track resource use and record student and teacher talk, and Fitbit-like devices that track students' movement, heart rate, and time between meals. The information technology available to collect data offers tremendous opportunity in our classrooms. As I presented these innovative data collection approaches, I also acknowledged the challenges we face in the field of education.
Intent remains the biggest question, and critics, rightly so, will question privacy risks. Time constraints and data overload can make too much information meaningless. Yet many of the presenters tackled these challenges in their own fields, and education can do the same. We must ensure that the development of efficient data collection strategies and analysis techniques on behalf of our students becomes our highest priority. Getting this information into the hands of teachers can transform classrooms.
Establishing the trust in that data and its collection is our first step.
*A follow up to my previous post – Connecting Teacher Action with Student Outcomes.
This morning I attended a meeting of the Performance Evaluation Advisory Council (PEAC). PEAC, charged with leading Educator Evaluation in the state of CT, began conversations today on the weighting associated with the current components of educator evaluation. It has been long understood that the structures have required discussion and change, so I applaud the focus on this important topic.
In conjunction with some creative PDEC Committees (Bethany Public Schools comes to mind), ReVISION Learning has processed the importance of educator reflection in ensuring student performance and its corresponding role in educator evaluation. In other words, we began our work together with the premise that teacher and/or collective reflection on student outcomes is at the heart of greater levels of student growth (Hattie, 2009). As a result of focusing discussions in this way, development of district-level policy is driven by student growth and, as equally important, a deep and meaningful understanding of how that student growth has been directly impacted by practice.
What has been clear to this point in the evolution of CT's educator evaluation programming is that while there has been progress in conversations about teaching practice, there is still much to be done in creating reflective practice that connects teacher actions to student outcomes. Even as some will cite reflection among teachers and administrators as having improved through the current process, the question now becomes…
How does the system ensure that this reflection is not only about how teachers do their job,
but also about how and why their practices impact student learning?
Below is an outline of our suggestions to PEAC as they consider their decisions in the upcoming month and year.
To understand our proposal, I’ll start with the current CT structure:
| Evaluation Component | Current Weight |
| --- | --- |
| Performance and Practice | 40% |
| Stakeholder (Parent) Feedback | 10% |
| Student Learning Objectives (SLOs) | 45% |
| Whole School Student Learning Indicators and Student Feedback | 5% |
Our Proposal (rooted in what we know about good practice in learning)
Performance and Practice: Change from 50% to 60% or higher.
We have long proposed that the emphasis in educator evaluation be placed on HOW teachers are impacting students. Revealing this understanding among all educators is what the performance and practice component should be codifying.
The 60% Practice component would include:
- Performance and Practice (50%), based on three core modalities of evidence collection by a qualified supervisor, each tied directly to a four-point rubric:
  - Classroom Observation
  - Artifact Review
  - Collegial Dialogue
The 2015 version of the Common Core of Teaching (CCT) from the State Department of Education provides a quality tool for analysis of learning in the classroom. What is most important is that evaluators stop focusing their evidence collection on teacher practice alone and begin to collect evidence of learning, setting up teacher reflection on the thing that matters most: how students are learning.
It is important to note our common disclaimer at this very moment: evaluators NEED training, AND training needs to be measurable through a quality performance assessment completed by evaluators.
Also, non-negotiables throughout the state need to be established about the quality of evaluators:
- Evaluators need to clearly demonstrate capacity to observe and collect evidence, analyze evidence, and provide feedback.
- Districts need to document inter-rater agreement among all evaluators.
If we do not invest in evaluator training, then we will simply be attempting to improve a system that will be poorly executed (through no fault of the evaluators throughout the state currently working very hard to make the work meaningful).
Maintain the current 10% based on Stakeholder Feedback with one important change.
Our recommendation is that targets should be set as a school and that, through collective practice, teachers and administrators are then charged with addressing fundamental issues revealed in annual climate surveys. Additionally, PDEC committees should consider alignment between the associated rubrics used to measure performance and practice (such as the CCT) and the establishment of action steps to be taken to meet targets (such as communication with stakeholders, CCT Domain 4). For example, if annual surveys reveal that ongoing communication with parents is an issue, then all teachers would work collectively to remedy this, and measurement of success would be based on changes revealed in the corresponding climate survey. Collective efficacy is an essential part of successful educational practice and should be measured accordingly.
Student Outcomes: Change from 50% to 40% or lower. Before anyone claims that we are trying to water down student outcomes, take time to re-read the Practice component information just shared and then pay close attention to the design we are presenting for Student Learning Objectives (SLOs). We do more with less than what is currently in place, I assure you.
We recommend up to two SLOs, each worth 20%. This would include:
- Up to five Indicators of Academic Growth and Development (IAGDs) for each SLO, depending on the assessment being used to monitor progress.
- One IAGD would be based on completion of a Student Outcome Portfolio (designed to support targeted reflection linking practice and outcome). The remaining IAGDs would be designed as Banded Student Growth Goals, which target the different levels of student performance commonly found in our classrooms while closing the gaps that currently exist.
If you need more information on either of those ideas, contact us or keep an eye out for future posts.
- At least one SLO/IAGD would be based on a standardized assessment. Where possible, both SLOs/IAGDs should be based on standardized assessments.
- All SLOs should be directly connected to the strategic initiatives of the district and school improvement plans.
Eliminate the 5% associated with Whole School Indicators or Student Surveys. Student surveys can and should be part of the Stakeholder Feedback component already described, and Whole School Indicators are examined more effectively through the design of SLOs in alignment with strategic school improvement plans.
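The arithmetic behind the proposal is simple enough to check in a few lines. The sketch below is ours alone: the component names, the four-point scale, and the sample ratings are illustrative assumptions, not PEAC policy. It verifies that the proposed weights total 100% and shows how component ratings would roll up into a composite score:

```python
# Proposed weights from the post: 50% Performance and Practice,
# 10% Stakeholder Feedback, and two SLOs at 20% each.
PROPOSED_WEIGHTS = {
    "performance_and_practice": 0.50,  # observation, artifact review, collegial dialogue
    "stakeholder_feedback": 0.10,      # school-wide climate-survey targets
    "slo_1": 0.20,                     # first Student Learning Objective
    "slo_2": 0.20,                     # second Student Learning Objective
}

def composite_score(ratings):
    """Weighted average of component ratings on a four-point rubric."""
    assert abs(sum(PROPOSED_WEIGHTS.values()) - 1.0) < 1e-9, "weights must total 100%"
    return sum(PROPOSED_WEIGHTS[name] * rating for name, rating in ratings.items())

# Hypothetical ratings for one educator (illustrative only):
example = {
    "performance_and_practice": 3.5,
    "stakeholder_feedback": 3.0,
    "slo_1": 3.0,
    "slo_2": 4.0,
}
print(round(composite_score(example), 2))
```

Note that because the weights sum to 1.0, shifting weight between practice and outcomes changes only the emphasis, never the scale of the final score.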
The suggestions we are outlining are designed in alignment with our work in educator evaluation over the past five years. We have worked with over 800 evaluators and 52 districts since our inception, and one thing that has been overwhelmingly clear is that policy needs to catch up with practice. Practitioners are ready for meaningful, reflective practice; now PEAC needs to provide the policy.