Small Victories

Today’s post is the second in a four-part series on transforming the data culture at the Gloucester Twp. School District.  Tim Trow, Director of Curriculum and Instruction, writes for the SJDLP.  


 

“I can’t do that.  I don’t even know where to begin.  It is just too complicated and I’m no good with numbers.”

“We need to radically change what we are doing if we are going to get better results for our students.”

Observation suggests that these two extremes undermine progress toward using data effectively within classrooms, schools, and districts.  They do so insidiously: slowly, and from opposite ends of the continuum.  Individuals either give up before they begin or bite off more than they can chew.

In my last post, we looked at establishing a culture of data leadership.  In this post we will look at the need for incremental growth by seeking “small victories” that demonstrate progress for each member of the organization along a continuum of data use.  Jay McTighe has described this as starting small and looking for an early win in November.  This acknowledgment, that nothing big happens fast and that the psychology of a successful experience builds confidence and desire for more, is essential to improving data use and building a culture of sustained growth.

At the conclusion of my first year in my current position as Director of Curriculum, our district was awarded an Achievement Coach state grant from the NJDOE.  After training selected teachers and administrators in various aspects of best practices for student learning and achievement, we chose to turnkey-train our staff in the use of newly developed reports to inform instructional decisions and improve student achievement.  Our success showed in noticeable improvement in student outcomes on the subsequent spring’s state assessment results.

Even with this success, one of the lessons learned was that pockets of success, concentrated among those who “bought in” and understood the training best, drove much of the gains.  This was true at both the principal and teacher levels.  While we thought we had started small and had success, it was qualitatively evident that we should have started with even smaller, more manageable improvement targets.

We believed that, in an 11-school district, we needed to do a better job of developing our building administrators’ capacity to lead data discussion and activity if we were going to have the systemic data culture we needed, rather than mere pockets of data-rich discussion and activity.  Sure, I had always been known as a “data guy,” but had I really done enough to make data accessible and practical for others?

Last summer we contracted with Dr. Tracey Severns, who led an administrative retreat with a strong orientation toward curricular leadership, particularly in the area of assessment and data.  Our administrators were inspired and developed action plans for the following year that focused primarily on improving their work in setting expectations and support for richer discussion and use of data in PLC meetings.

I also committed to improving my own modeling and leadership at district administrative curriculum meetings.  In August we held a district “Data Day” for all administrators to review student achievement data in the fashion we had learned from Dr. Severns.  The response was very positive, and I asked each administrator to conduct a similar activity at their September faculty meeting.  District supervisors led activities with subject-area teachers at the middle school level, and I met with each principal early in the fall to review their plan and activities.  Follow-up sessions at principals’ meetings, in which principals shared their successes and challenges, provided opportunities for collaboration and growth while keeping the focus on the important goals set by each of them and by the district.

What did I learn through this process?  

  • I learned that each principal was at a different place with their comfort and use of data.  
  • I learned that with modeling and support, each of them moved and improved their practice and comfort level with data.  Some discussed data with me for the first time ever. All improved their discussions with staff. 

Throughout this year, I have had the great pleasure of sitting in on numerous building-level, data-rich discussions that identified root causes and led to practical discussions improving practice both instructionally and in support of students’ social/emotional needs.

Are we perfect?  Far from it.  But what I know is that we experienced incremental improvement, resulting in “small victories” across our district.  Each small victory will allow us to take another step forward as each administrator feels more positive about the district’s practice and is ready to learn something new.

Regardless of your personal or district-level data use, understand that through small, intelligent, and strategic steps, you can conquer the mountain of data available to us and find practical ways to improve outcomes for students.  The satisfaction of making a difference for students is the reason we all entered this profession, and the effort is well worth it!

In my next post, I will explore the need to “Trust the Process” as we improve our data practice.

Our New Name

If you haven’t noticed, our blog looks slightly different.  So do our logo, our website, and our Twitter account.  It’s because we’ve changed our name: we’re happy to rebrand as the South Jersey Data Leaders Partnership.  So, naturally, you probably ask, “Why?”

When we began as the South Jersey Data Specialists Partnership, our mission was to create a space for folks in the data specialist position.  Most of us had titles like “Data Specialist,” “Director of Planning and Assessment,” or “Chief Performance Officer.”  Many of us were curriculum people by title, but were the primary data specialist in the district.  So it made sense for our name to reflect who we ourselves were.

As we’ve worked and grown, a new and more important purpose has emerged.  We began to realize that if we were going to change schools for the better and improve the lives of our students, we would need all educators, in every role, to be confident data users and leaders.  To succeed, students need their teachers, counselors, CTS team members, assistant principals, principals, supervisors, directors, and superintendents, and everyone else, to be data-confident and data-reflexive.  This is the SJDLP’s real work.

What happened?

The question “What happened?” is so basic that it’s easy to forget its importance.  It’s one of the first phrases we learn; if you don’t believe that, read for a while with a three-year-old.  And like many simple but important questions, it can be difficult to find a satisfactory answer.

But “What happened?” really has two parts: the “what,” and then, separately, the “happening” that brought about the what.  If you went to a parade and saw, for instance, this:

…you’d be able to say, probably right away, a few things:

1) There is a gorilla carrying a woman in a cage, parading down the street.

2) This town really knows how to celebrate July 4th.

You could then go on to make some observations about the cage, the woman’s clothes, the fact that this is probably a person in a gorilla suit, and so on.  But you wouldn’t, as yet, be able to tell why this spectacular scene is before you.  Is it a political statement?  Modern art?  A revolt at the zoo?  David Lynch’s retelling of Goodnight Gorilla?

In other words, you’d know the what, but not the what happened.

With schools and data, it’s easy to get lost in the what.  Take, for example, this graph:

Observations, Unit 1

If we take this to be the performance of, say, all the science teachers in a school over a 6-8 week period, there’s no shortage of “what” to dissect.  You can pull out some conclusions of your own, but here are some highlights:

– the teachers tend to plan lessons well

– teachers walk around a lot and have good rapport with kids

– everything looks safe and well-organized

At the same time, however…

– formative assessment doesn’t seem to be happening much at all

– assessment in general, but especially higher-order and self-directed assessments, seems to be a weak area

And this is all good.  We’ve successfully identified the whats.  We know there’s a gorilla and a lady in a cage, and we know what these teachers have and have not been observed doing.  But what we don’t know is the why: why are we seeing the things we’re seeing?

This is where story comes in.  To be a good storyteller, you have to know what happened.  You have to know, for instance, that this is a school where there were rampant discipline problems 10 years ago, and the most important thing was for teachers to get control of their classrooms, more important than performance on any assessment.  And the school hired supervisors who cracked down in specific ways: teachers who had their plans in on time were rewarded, and those who could demonstrate the easy stuff, like walking around the room and being nice to the kids, were left alone.  No one cared about “messy” formative assessment and checking for student understanding, because the administrators believed that the more work the teacher gave to kids, the busier they’d be and the better they’d behave.  And now, 10 years later, the teachers still show these behaviors.  Even though the school has changed a great deal, they’re still teaching the kids from 10 years ago.

A graph like the one above is like a movie still.  If I see this picture, for instance:

… I think that this is a movie about a prince and princess falling in love.  But if I don’t see this, too:

… I’m missing a lot of the story, and a lot of the root causes behind what I’m seeing in my snapshot in time.

To use data effectively, we need to know what happened.  And that involves digging much deeper than we’re accustomed to.  It’s not enough to look at presented data, make some observations about what we’re seeing, pat ourselves on the back, and say “Let’s stop doing that!”  Knowing why we’re seeing certain data, what happened to make the data even be there, is the most important part of telling stories with data.  Only then can we begin to use data to change the way things are, and write the story of what will happen from that point onward.

We will periodically be coming back to the idea of story on our site.  So much of data leadership goes beyond technical work; it’s more about understanding why people behave as they do, what root causes underlie organizational culture, and what will motivate people to change.  Through all of this, we’re talking about people, and to understand people, we have to understand the role of story and narrative in how we see our world.

Developing the Data Reflex

At the SJDLP we often discuss the concept of becoming data reflexive.  Too often, “data” is something done in bursts, on special days, when the consultant comes in, or when someone’s looking.  It might involve reviewing binders full of charts, making teachers fill out data protocols and hand them in to supervisors, and arguing about having too many or not enough tests.  When data is something done in discrete chunks like this, it’s like professional development you attend for one day and then never use once you’re back in your classroom: without sustained interaction and reflection, it’s rarely, if ever, effective.

Data reflexivity, by contrast, represents the idea that, when making a decision, an educator reflexively turns to data to inform that decision.  Data isn’t an awkward appendage; it’s the source code, so to speak, of what’s happening in the life of a classroom, a school, and a district.  A programmer knows that when software isn’t working, you go to the code and debug.  That’s a reflex.  The same can be said for a teacher restating directions when a student says he doesn’t understand, or a counselor closing the door and offering a tissue when a student comes in crying.  Those actions are reflexes, for the most part, as opposed to something formally learned (though we can certainly learn to hone our abilities in these areas).  In the same way, teachers who use data reflexively think of what’s happening in class as an endless source of data, and can swiftly and fluently collect and analyze that data and use it to inform what they do next.

Of course, data isn’t just numbers or performance on a test.  “Constantly collecting data” doesn’t mean over-testing, and it doesn’t mean becoming robotic about students’ lives.  We collect data all the time without being conscious of it: when we meet someone, we feel out their mood, their trustworthiness, their personality, and all of these are sources of data.  That’s important to remember, since too often “data” means only performance, and only on certain tests at that.

Ultimately, it’s the reflex to turn to data to inform fundamental questions like “What do my students already know?”, “What things that I do resonate most with my students?”, or “What do students of different ethnic groups think of their school experience?” that marks true data reflexivity.  Using data can be as much a behavior to shape as it is a concept to understand; our hope is that thinking about it as such can remove some of its mystery.