This past week I had the privilege of traveling to Tianjin, China, and presenting at a forum hosted by Tianjin University of Technology.
The forum provided an opportunity for experts from various fields to collaborate and share ideas on the ways Big Data analysis could influence the future of their particular fields. Presenters focused on topics ranging from how big data can affect our day-to-day air quality, and the resulting impact on respiratory and cardiovascular health, to how big data could influence predictions within the stock market.
These presenters each imagined how finding patterns in related data and engaging in cross analysis of that data can lead to deeper understanding of important advances in human, social, cultural, and economic health in our world.
The honor to speak as a representative in the field of K-12 education cannot be overstated.
Big Data and Education
Given the Spring 2018 release by Corwin of the book on classroom observation and feedback that I co-authored with Amy Tepper, I focused my keynote on imagining ways in which advances in information technology could influence the way we collect, organize, and analyze data about student learning in the classroom.
Making Interdisciplinary Connections
Knowing the great lengths we go to every day working with administrators to build capacity to collect data in support of teacher learning and increased student learning in the classroom, I called the experts in the room to action, asking them to consider two things:
1. How can what we know about Big Data help us to redefine educator effectiveness and accountability and allow us to create better models of interaction with data, providing improved support for those who serve our children?
2. How can new information technology help us build efficiencies in the way we collect evidence of student learning in the classroom, leading to new understanding of what learning looks like for our students and what teacher actions lead to that learning?
After a two-day collaboration with experts in the fields of Information Technology and Management, it is clear that advances in technology and a growing understanding of the potential behind Big Data analytics offer much to the field of education.
Presenters provided overviews of innovative practice in their fields including:
• Dr. J. Gregory McVerry, a ReVISION Learning Senior Contributor and Professor at Southern Connecticut State University, challenged us to rethink methodologies when using big data in the classroom. Dr. McVerry suggested we use more formative, design-based research projects centered on improving communities rather than chasing citation counts.
• Professor Chien-Lung Chan, leader of research and development for the big data and digital fusion center at Yuan Ze University, Taiwan, presented his work in the field of disease patterning and the demand for, and impact of, long-term care solutions. Dr. Chan’s cross analysis of environmental and health data sets can provide support for the health industry, potentially securing new methods of early-warning detection based on environmental factors.
• Guanzhong You, from Columbia University, and Professor Jheng-Long Wu, a researcher at Academia Sinica in Taiwan, each provided insights on the potential of machine learning and machine analytics. Work in stock prediction analysis provides examples of methods that can be applied to support the development of smart cities, creating efficiencies in transportation, energy consumption, and services.
• Professor Yan Quan Liu of Southern Connecticut State University presented on Smart Libraries, demonstrating how new applications in library-media sciences can become a significant resource for communities, helping to personalize services.
• Tours and presentations by the Department of Management and the Department of Information Technology at Tianjin University of Technology provided insights into exciting research and development in the field of Big Data and Analytics, including computer vision and face-recognition technology.
Each of the presentations helped me glimpse the impact Big Data will have on education.
Creating Tomorrow’s Classroom Today
Highlighting some of the early work with Big Data approaches at the AltSchool in the US, I described data collection tools such as infrared cameras in classrooms that track resource use and record student and teacher talk, and Fitbit-like devices that track students’ movement, heart rate, and time between meals. The information technology available to collect data offers tremendous opportunity in our classrooms. As I presented these innovative data collection approaches, I also demonstrated the challenges we face in the field of education.
Intent remains the biggest question, and critics, rightfully so, will question privacy risks. Time constraints and data overload can often make too much information meaningless. Yet many of the presenters tackled these challenges in their fields, and education can do the same. We must ensure that the development of efficient data collection strategies and analysis techniques on behalf of our students becomes our highest priority. Getting this information into the hands of teachers can transform classrooms.
Establishing the trust in that data and its collection is our first step.
*A follow up to my previous post – Connecting Teacher Action with Student Outcomes.
This morning I attended a meeting of the Performance Evaluation Advisory Council (PEAC). PEAC, charged with leading Educator Evaluation in the state of CT, began conversations today on the weighting associated with the current components of educator evaluation. It has been long understood that the structures have required discussion and change, so I applaud the focus on this important topic.
In conjunction with some creative PDEC Committees (Bethany Public Schools comes to mind), ReVISION Learning has processed the importance of educator reflection in ensuring student performance and its corresponding role in educator evaluation. In other words, we began our work together with the premise that teacher and/or collective reflection on student outcomes is at the heart of greater levels of student growth (Hattie, 2009). As a result of focusing discussions in this way, development of district-level policy is driven by student growth and, as equally important, a deep and meaningful understanding of how that student growth has been directly impacted by practice.
What has been clear to this point in the evolution of CT’s educator evaluation programming, is that while there has been progress made in conversations about teaching practice, there is still much to be done in creating reflective practice that connects teacher actions to student outcomes. Even as some will cite reflection among teachers and administrators as having improved through the current process, the question now becomes…
How does the system ensure that this reflection is not only about how teachers do their jobs,
but also about how and why their practices impact student learning?
Below is an outline of our suggestions to PEAC as they consider their decisions in the upcoming month and year.
To understand our proposal, I’ll start with the current CT structure:
| Evaluation Component | Current Weight |
| --- | --- |
| Performance and Practice | 40% |
| Student Learning Objectives (SLO) | 45% |
| Whole Student Learning Goals and Student Feedback | 5% |
Our Proposal (rooted in what we know is good practice about learning)
Change from 50% to 60% or higher.
We have long proposed that the emphasis in educator evaluation be placed on HOW teachers are impacting students. Revealing this understanding among all educators is what the performance and practice component should be codifying.
The 60% Practice component would include:
- Performance and Practice (50%), based on three core modalities of evidence collection by a qualified supervisor, each tied directly to a four-point rubric:
  - Artifact Review
  - Collegial Dialogue
The 2015 version of the Common Core of Teaching from the State Department of Education provides a quality tool for analysis of learning in the classroom. What is most important is that evaluators stop focusing their evidence collection solely on teacher practice and begin to collect evidence of learning, setting up teacher reflection on the thing that matters most – how students are learning.
It is important to note our common disclaimer at this very moment – evaluators NEED training AND training needs to be measurable through a quality performance assessment completed by evaluators.
Also, non-negotiables throughout the state need to be established about the quality of evaluators:
- Evaluators need to clearly demonstrate capacity to observe and collect evidence, analyze evidence, and provide feedback.
- Districts need to document inter-rater agreement among all evaluators.
If we do not invest in evaluator training, then we will simply be attempting to improve a system that is poorly executed (through no fault of the evaluators throughout the state currently working very hard to make the work meaningful).
Maintain the current 10% based on Stakeholder Feedback with one important change.
Our recommendation is that targets should be set as a school and that, through collective practice, teachers and administrators are then charged with addressing fundamental issues revealed in annual climate surveys. Additionally, PDEC committees should consider alignment between the rubrics used to measure performance and practice (such as the CCT) and the action steps established to meet targets (such as communication with stakeholders – CCT Domain 4). For example, if annual surveys reveal that ongoing communication with parents is an issue, then all teachers would work collectively to remedy this, and success would be measured by changes revealed in the corresponding climate survey. Collective efficacy is an essential part of successful educational practice and should be measured accordingly.
Change from 50% to 40% or lower. Before anyone claims that we are trying to water down the student outcomes, take time to re-read the Practice Outcome information just shared and then pay close attention to the design we are presenting to our Student Learning Objectives (SLOs). We do more with less than what is currently in place – I assure you.
We recommend up to two SLOs, each worth 20%. This would include:
- Up to five Indicators of Academic Growth and Development (IAGDs) for each SLO, depending on the assessment being used to monitor progress.
- One IAGD would be based on completion of a Student Outcome Portfolio (designed to support targeted reflection linking practice and outcome). The remaining IAGDs would be designed as Banded Student Growth Goals. Banded Student Growth Goals are designed to target the different levels of student performance commonly found in our classrooms while closing the gaps that currently exist.
If you need more information on either of those ideas, contact us or keep an eye out for future posts.
- At least one SLO/IAGD would be based on a standardized assessment. Where possible, both SLOs/IAGDs should be based on standardized assessments.
- All SLOs should be directly connected to the strategic initiatives of the district and school improvement plans.
Eliminate the 5% associated with Whole School Indicators or Student Surveys. Student Surveys can and should be part of the Stakeholder Feedback component already described and Whole School Indicators are examined more effectively through the design of SLOs in alignment with strategic school improvement plans.
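To make the arithmetic of the proposed weighting concrete, here is a minimal sketch that combines the components as a weighted average. The 50/10/20/20 split comes from the proposal above; everything else – the component names, the assumption that each component is scored on a common four-point rubric scale, and the simple weighted-average combining rule – is my illustration, not part of the proposal:

```python
# Illustrative weights from the proposal: 50% performance and practice,
# 10% stakeholder feedback, and 40% student outcomes split across two
# SLOs at 20% each. Component names are hypothetical labels.
WEIGHTS = {
    "performance_and_practice": 0.50,
    "stakeholder_feedback": 0.10,
    "slo_1": 0.20,
    "slo_2": 0.20,
}

def overall_rating(scores: dict) -> float:
    """Weighted average of component ratings (assumed 1-4 scale)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must total 100%"
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

example = {
    "performance_and_practice": 3.0,
    "stakeholder_feedback": 4.0,
    "slo_1": 3.0,
    "slo_2": 2.0,
}
# 0.5*3.0 + 0.1*4.0 + 0.2*3.0 + 0.2*2.0 = 2.9
```

The point of the sketch is simply that, under the proposal, classroom practice dominates the final rating: a teacher's practice evidence carries more weight than both SLOs combined.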
The suggestions we are outlining are designed in alignment with our work in educator evaluation over the past five years. We have worked with over 800 evaluators and 52 districts since our inception, and one thing that has been overwhelmingly clear is that policy needs to catch up with practice. Practitioners are ready for meaningful, reflective practice, now PEAC needs to provide the policy.
“Growth” has been one of the more elusive words in educational jargon over the past few years. Its use triggers reactions from so many due to ill-fated or, better put, poorly constructed attempts to connect it to educator evaluation.
Anyone who has read or spoken with me in-depth knows that I do consider student outcomes to be an important aspect of every educator’s evaluation and do not believe it should be removed as a criterion for determining levels of performance. However, the current approaches being used to design and implement SLOs typically provide for little connection between what the teacher does and the outcomes for students, serving no other purpose than to inflate and invalidate data on teacher or leader effectiveness.
What I have recently been working on is a redefinition that I believe can address some of the shortcomings of previous approaches and that could/would actually be embraced by teachers as meaningful and realistic.
Yup, that is a big job, I know, but how else do we expect to change and improve?
A ReVision of Student Learning Objectives
First, it is important to note that the idea is not original; it is more of a mix of other ideas that have been successful. I am, however, as far as I can tell, the first to propose the approach as a way to satisfy the typical requirements of educator evaluation practice. Additionally, this is written in alignment with the Connecticut Guidelines, under which student outcomes account for 45% of overall teacher and administrator performance.
My proposal is simple – stop measuring just the outcome in terms of student achievement but also measure the influence specific teacher practices have on those outcomes.
If you need to know what targeted practices can be measured that will influence student achievement, look no further than John Hattie’s list in Visible Learning for Teachers. Below is a list of those influences in order of their effect. These all have an effect size greater than .70.
- Student Self-Reported Grades
- Piagetian programs
- Response to intervention
- Teacher credibility
- Providing formative evaluation
- Classroom discussion
- Comprehensive interventions for learning disabled students
- Teacher clarity
If we know that these practices have the greatest positive effect size on student achievement, then why not measure them in alignment with student outcomes? Student outcomes, in my opinion (and I am sure others will/can disagree), must remain a part of performance measurement in our schools. Paying closer attention to how we achieve those outcomes, however, is the most important information for ensuring sustainable achievement. In other words, it’s not knowing that kids achieved that makes me effective; it is knowing what I did to support that learning that allows me to apply it again and again.
Proposal (with more details to be worked out with PDEC committees who are ready to step up):
“Visible teaching and learning occurs when there is deliberate practice aimed at attaining mastery of the goal, when there is feedback given and sought, and when there are active, passionate, and engaging people (teacher, students, peers) participating in the act of learning.” ~John Hattie
While it will be a challenge for many – once again, nothing great happens without a significant challenge.
My proposal is to continue to create SLOs/IAGDs that examine student improvement based on standardized and/or non-standardized measures (still believe CBMs are the best assessments for this purpose – once again more on that later).
Teachers would design the required two SLOs and corresponding IAGDs to address growth over the course of the year for EVERY student in their classrooms or, for our secondary teachers, a significant number of students in their classrooms or for whom they have an impact. They would stretch significantly for those students who require greater growth to bridge any existing gaps – that is not only a collective responsibility but EACH individual’s as an educator of that student.
Teachers and their supervisors would analyze beginning-, middle-, and end-of-year performance of students based on goals that are well-aligned to the assessment being used. Student performance would collectively reflect 22.5% of the teacher’s overall evaluation. Criteria have already been established in most educator evaluation models and can continue to be used. This is based on a simple model – if all or a quantifiable number of our students are demonstrating learning, meeting the goals we are setting, then that 22.5% of the overall evaluation is scored accordingly.
Sample provided below…
| 100% of students met the SLO and IAGD targets | At least 90% of students met the SLO and IAGD targets | At least 80% of students met the SLO and IAGD targets | Fewer than 80% of students met the SLO and IAGD targets |
| --- | --- | --- | --- |
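As a rough illustration, the bands in the sample above could be operationalized as a simple threshold lookup. Mapping the four bands onto a 4-down-to-1 rating scale is my assumption here – the sample does not specify what score each band earns – and the function name is hypothetical:

```python
def slo_outcome_rating(percent_met: float) -> int:
    """Map the percentage of students meeting SLO/IAGD targets to a
    four-point rating, following the bands in the sample table above.
    (The 4..1 rating values are assumed, not part of the proposal.)"""
    if percent_met >= 100:
        return 4    # 100% of students met targets
    elif percent_met >= 90:
        return 3    # at least 90% met targets
    elif percent_met >= 80:
        return 2    # at least 80% met targets
    else:
        return 1    # fewer than 80% met targets

# e.g. slo_outcome_rating(92) falls in the "at least 90%" band
```

Whatever the exact scale, the design choice that matters is that the thresholds are fixed and transparent, so teachers know in advance how the outcome half of the SLO component will be scored.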
For the remaining 22.5%, teachers would engage in investigation, reflection, and on-going re-examination of what Hattie has demonstrated is the most significant element of that success – educator action/application of key strategies.
Here is where “system support” becomes essential so that there is an educational environment for analysis. Supervisors create a system (preferred system would rely on grade level Data Teams with potential portfolios) for teachers to collect evidence of their practice in one or more of the above listed educator actions from Hattie’s meta-analysis.
Then, collaboratively, teachers and supervisors connect those practices to the successes in student achievement. If data teams are in place, for example, the analysis and documentation of student performance and related teacher actions associated with a review are readily available. In other words, the development of a portfolio of a teacher’s investigation, reflection, and on-going re-examination is not an add-on to the work – it’s just what we do. By the way, if a school or district has clearly defined the vision for its instruction and has aligned the goals to that vision, the layers of support for teachers are inevitable.
Some will tell me that I am just increasing the performance and practice measurement by 22.5%, and in many ways they would be right. I have always thought that adult action is the most valid measure for educator evaluation. The more cynical will tell me that, even with a shift, a teacher should never be individually measured by student outcomes. To either party I simply ask: if you are not willing to try something new, what change do you ever expect to see?
I left for the Learning Forward Conference in Vancouver yesterday and, since I left, I have received two messages – loud and clear – that I need to “disconnect” a little. Now that word means many different things to different people and in different situations. My meaning for it is pretty simple in this context and, as I thought about it this morning, I believed it was a great message to share with my fellow participants here at the Conference.
It began when I boarded my first flight from Hartford, CT yesterday. I have traveled quite a lot in my career but was taken by surprise when I ascended the stairs to board the plane (yes – walked outside to get on the plane), was greeted by the Pilot who, admittedly, I thought was a flight attendant in the moment, and found a plane with 18 seats – 9 single seats each side. Definitely the smallest commercial plane I have ever flown.
Now, when I get on planes, the first thing I do is open my computer, figure out how the WiFi works and whether I need to pay, and get to work. Needless to say, there was no WiFi on this plane. This became my first opportunity to disconnect. I put on a Jim Oliver recording, meditated for three-quarters of the flight, and contemplated what and where I was going this week and how I had gotten here in the first place. It was the most productive 60 minutes I have had in quite a long time.
What was even better was that I was present (the word “present” has a whole bunch of meaning right here) to witness, as we approached Montreal, a 4th grade student sitting just across and in front of me and who was on his first trip outside of the US. As the clouds cleared and we were making our approach, I noticed he had yet to see outside the window. Having heard him earlier express his excitement to his Mom and Dad who were in the rows in front of him, I leaned over and said, “Hey buddy, that’s Montreal right down there”. His eyes lit up as he excitedly called to his Mom and Dad and they all shared in a wonderful family moment. I would have never had the opportunity to watch that unfold if my face was stuck in the computer screen.
My second message came during this morning’s run.
As I started the run, I did the usual…started up my playlist – headphones on, head down and get running. You know, being a productive runner.
About ½ mile into the 7 mile loop, I caught the pic below (also featured above):
That was the first moment on the run when I realized: wow, I should probably be paying attention to something other than the running.
Then, only 1 mile into the 7 mile loop, my iPhone died. It was too cold and the battery went dead.
From that point on, I noticed and experienced so much more than simple exercise.
I heard a half dozen different birds calling in Stanley Park; I saw the morning mist rise and give way to Grouse Mountain and Mt. Fromme; I listened to water falling from the high ledges on the western side of Stanley Park, looking out at English Bay.
The run meant so much more without all the interference and noise that typically coincides with my routine exercise.
So, what’s the message?
Well, to my fellow participants and presenters alike here at the Learning Forward Conference this week….
While here – in the midst of all the new learning and meeting of new people, the networking and attending of events and functions, the great sessions you will attend, and all the materials and resources at vendor tables you will discover – stop and take some time to… not be productive. Instead, take time to see and listen to all that is happening around you and disconnect from being “productive”.
Disconnect a little as often as you can and see if it might lead to something even more productive than what you had planned out of this trip in the first place.
Let wonder and excitement guide you this week and, if you do, let me know how it worked for you.
will be presenting along with ReVision Learning Partnership next week at the Learning Forward Conference in Vancouver, British Columbia.
The Learning Forward Conference, entitled Connecting Landscapes for Learning, has been designed this year to examine the impact of professional learning and quality feedback on teacher practice and student outcomes, and to increase the coherence and relevance of professional development.
The attention to these ideas in West Hartford and Canton Public Schools cannot be questioned and it is only fitting that their efforts will be recognized in this international setting.
Next Tuesday, Natalie Simpson, Assistant Director of Human Resources of West Hartford Public Schools and Jordan Grossman, Assistant Superintendent of Canton Public Schools will lead a session for “advanced leaders” that focuses on their districts’ successful journey to shift the focus of teacher evaluation from an inspection model to a growth model. Participants will learn how these two districts designed, implemented, and analyzed the professional learning for leaders to more directly support student learning in the classroom. The importance of student outcomes in West Hartford and Canton classrooms has become the core focus of observation and feedback for on-going learning and support.
Professional Learning in West Hartford and Canton: It Begins with Great Leaders
What has been most impressive about the professional learning designs for leadership in these two districts has been the coherence they have sought to establish. Each district has designed learning opportunities that go beyond the simple workshop approach, targeting, instead, the most important aspect of educational leadership practice – serving as an instructional leader.
Focusing on the knowledge, skills, and dispositions of an instructional leader for all members of their administration has led to a more coherent and meaningful implementation of their supervision and evaluation practice.
Professional learning has been designed and implemented to ensure that each building and district-level administrator, department head, and curriculum leader has the capacity to observe for and provide feedback about the way students are learning in classrooms. In this manner, the districts have chosen to ensure supervision and evaluation is focused on supporting teachers directly in the most important aspect of their jobs – knowing what kids are learning and how a teacher’s practice is impacting that learning.
Round Table Discussions
Recently, ReVision Learning sponsored a Round Table discussion with six leaders from these two CT districts in which leaders described how their district’s professional learning program directly impacted their practice and support for the teachers they serve.
Supporting teachers by shifting their observations from a simple assessment of teacher practice to a careful examination of student learning has helped these two CT districts to create more substantial and meaningful outcomes from their on-going supervision and evaluation practice.
Two videos were created to help capture and share the work that is happening in these two districts. It is our hope that these can prove helpful to other district leaders as they consider the design, implementation and impact of their leaders’ professional learning.
An abbreviated version providing just a few highlights
More extensive footage of the Round Table discussions.
We have been honored to have had the opportunity to work with these two districts for the past few years and we encourage other district leaders to reach out to them to hear more about the outstanding work and outcomes within these two districts.