A Different Way to Lead: System Improvement Leads Networked Improvement Community

Continued from CCEE Connection September 2023

By Sandra Park, Co-Founder, Improvement Collective


Approaching the work with curiosity and humility has not only provided teams with a better understanding of their systems but has also shifted leaders’ relationships with their colleagues.  As one program specialist noted, “I think personally I’ve had a shift in…my thinking around collaboration and working in a group…honestly, like active listening to [teachers] who have perspectives that I may not understand or on the surface may not have valued as much.  This work has given me more practice to kind of step back and listen and really try to understand a different perspective…I notice a difference in my professional relationships that have been really positive.”

A SELPA administrator at another district also commented on how the nature of the work has shifted her interactions with teachers and program specialists: “We have had to build a relationship with them in a different way because we’re having to say…here’s what we’re working on, we really need your help. Here’s where we screwed up, here’s where we did well, we need your lens. It’s been a good partnership not just for growth but for those relationships in the district.”

These comments reflect many of the key dispositions of improvement leaders described above: a growth mindset and a newfound appreciation of the knowledge everyone in the organization offers, humility and vulnerability about what they do and don’t know, and comfort with the uncertainty that comes with tackling complex problems. Furthermore, members of the network now realize that improving the IEP process is “not about blaming [individuals] but about working together to solve the problem [and] change the system.”

After investigating their local systems, teams began using the Plan-Do-Study-Act (PDSA) cycle, a mini-experiment for testing ideas in practice, to try out potential “change ideas” to improve the IEP goal-setting process. These have included a rubric for evaluating the quality of IEP goals and a checklist tied to the IEP process. In the early stages of testing, teams often, but not always, try out ideas on a small scale; for example, one teacher with one student. Here, the goal is to learn about the feasibility of a change idea: Can it even be done in practice? Does it produce the desired outcome? What challenges might need to be addressed? Ideas that show promise are then tested on a larger scale to learn how to adapt them to different contexts, and finally how to implement them across the entire system.

This approach runs in direct contrast to how change often happens in education: find a “silver bullet,” such as a new curriculum or new software, and then implement it at full scale. When these initiatives fail, which they often do, there is little understanding of why they didn’t work. The PDSA approach, on the other hand, helps teams learn what works, in which contexts, and for whom, and, equally important, what doesn’t. Through this process, organizations learn which changes, when implemented, actually lead to sustained improvements.

Conducting small-scale experiments using scientific reasoning and evidence represents a major shift in how network members have approached reform efforts in the past.  At first, some leaders were impatient with the process, wanting to move to full-scale implementation as quickly as possible. However, they soon discovered how much learning happens with each PDSA and how much more confidence they have in spreading ideas that have been vetted through early testing.  

As one director of special education commented, “I have been in special education for 22-23 years. Everybody always has these great ideas and I’ve been part of many teams where those great ideas have been put in place without much inquiry. But many [of these] programs haven’t lasted. What’s exciting to me is all the nitty gritty work that we do [through the PDSAs]. At first, I wondered, ‘Why are we taking so long? Why are you making us do that [cycle] again?’ Every time we did a PDSA, I would [tell our coach], ‘We just did that but just a little bit different.’ But I totally get it now. Seeing those small improvements we make every week…is super exciting.”

PDSAs also challenge the push for full-scale implementation when they reveal that an idea doesn’t work.  For example, when testing a specific change idea and reviewing the data, one team discovered that the idea wasn’t as helpful as they thought it would be.  Abandoning the idea, however, wasn’t easy.  “It was hard to accept that we thought it was working but that it wasn’t really helping that much.  And I think [we’re] so used to just continuing with something because that’s what we’ve done or it looks shiny and new, or people like it.  And it’s a huge shift to be in a space where once we get information that something’s not working, it’s okay to just move on and abandon it,” said one team member.

For many teams, the use of data or evidence as part of the PDSAs distinguishes it from other forms of inquiry and reform efforts in general.  “There are so many books and leaders on educational reform and change theory, [but] none of it is like this. There are tons of pieces, structures, and theories that this work incorporates, but this stands out as being very different. The differentiating point is those PDSAs…it’s a more structured approach where you’re looking at data [that comes from trying something out in practice], which allows for more disciplined inquiry,” said one program specialist.  

Most notably, she contrasted this with how “data” is more commonly used.  “We always talk about making data-based decisions but in reality we typically don’t. We typically hear from maybe a few stakeholders and they bring some subjective information to the table and then we make decisions about support and professional development based on that.”

For many network members in the SIL NIC, their approach to leadership and other change efforts in their organizations has shifted profoundly. They are looking at their systems with a more critical eye, seeking out the knowledge and expertise of others in their organizations, embracing failure as a valuable part of learning, and testing and vetting ideas with more discipline and rigor. They are also seeing the results of this new approach; across the network, IEP data practices, IEP goal quality, and family collaboration have all improved. These improvements are making a real difference for the nearly 30,000 students with disabilities served by teams in the network. In a little over a year, the rate of annual IEP goal completion has jumped from 38% to 56%. There is much work to be done, but these leaders are committed to continuing the work until they achieve their aim. As they do so, they are becoming the improvement leaders we need to transform our systems to serve all of our students.

The System Improvement Leads (SIL) project is a collaborative grant project between the El Dorado County Office of Education, El Dorado County SELPA, and Riverside County SELPA. The SIL team builds the capacity of COEs, SELPAs, and LEAs in continuous improvement, data best practices, and high-leverage change ideas in order to improve outcomes for students with disabilities. The SIL project is supported by the California Department of Education (CDE) and the California Collaborative for Educational Excellence (CCEE). To learn more, visit systemimprovement.org or email [email protected].