
Content provided by Darlene Suyematsu and The Deming Institute. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Darlene Suyematsu and The Deming Institute or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

Goal Setting is Often an Act of Desperation: Part 5

28:58
 
Content provided by Darlene Suyematsu and The Deming Institute. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Darlene Suyematsu and The Deming Institute or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

In this episode, John Dues and Andrew Stotz apply lessons five through seven of the 10 Key Lessons for implementing Deming in classrooms. They continue using Jessica's fourth-grade science class as an example to illustrate the concepts in action.

TRANSCRIPT

0:00:02.2 Andrew Stotz: My name is Andrew Stotz and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today I'm continuing my discussion with John Dues, who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. This is episode five about goal setting through a Deming lens. John, take it away.

0:00:23.2 John Dues: Yeah, it's good to be back, Andrew. Yeah, like you said, for the past few episodes we've been talking about organizational goal setting. We covered four conditions of healthy goal setting and 10 key lessons for data analysis. And then what we turned to in the last episode was an applied example of the 10 key lessons for data analysis in action. And if you remember from last time, we were looking at this improvement project from Jessica Cutler. She's a fourth grade science teacher, and she did the improvement fellowship here at United Schools Network, where she learned the tools, the techniques, the philosophies, the processes behind the Deming theory, continual improvement, that type of thing. And in Jessica's specific case, in her fourth grade science class, what she settled on improving was the joy in learning of her students. And we looked at lessons one through four through the lens of her project. And today we're gonna look at lessons five through seven. So basically the next three lessons of those 10 key lessons.

0:01:34.8 AS: I can't wait. Let's do it.

0:01:37.3 JD: Let's do it. So lesson number five was: show enough data in your baseline to illustrate the previous level of variation. Right. So the basic idea with this particular lesson is that, you know, let's say we're trying to improve something. We have a data point or maybe a couple data points. We wanna get to a point where we're starting to understand how this particular concept works. In this case, what we're looking at is joy in learning. And there are some different rules for how many data points you should have in a typical baseline. But, you know, a pretty good rule of thumb is, you know, if you can get 12 to 15, that's... That's pretty solid. You can start working with fewer data points in real life. And even if you just have five or six values, that's gonna give you more understanding than just, you know, a single data point, which is often what we're... What we're working with.

0:02:35.6 AS: In other words, even if you have less data, you can say that this gives some guidance.

0:02:40.9 JD: Yeah.

0:02:41.1 AS: And then you know that the reliability of that may be a little bit less, but it gives you a way... A place to start.

0:02:46.9 JD: A place to start. You're gonna learn more over time, but at least even five or six data points is more than what I've typically seen in, let's say, the typical chart that has last month and this month, right? So even five or six points is a lot more than that. You know, what's... What's typical? So I can kind of show you, I'll share my screen here and we'll take a look at Jessica's initial run chart. You see that, right?

0:03:19.3 AS: We can see it.

0:03:21.2 JD: Awesome.

0:03:22.3 AS: You wanna put it in slideshow? Can we see that? Yeah, there you go.

0:03:24.9 JD: Yeah, I'll do that.

0:03:25.4 AS: Perfect.

0:03:26.3 JD: That works better. So, you know, again, what we're trying to do is show enough data in the baseline to understand what happened prior to whenever we started this improvement effort. And I think I've shared this quote before, but I really love this one from Dr. Donald Berwick. He said "plotting measurements over time turns out in my view to be one of the most powerful things we have for systemic learning." So that's what this is all about really, sort of taking that lesson to heart. So you can look at Jessica's run chart for "joy in science." Just to sort of orient you to the chart: we have dates along the bottom. So she started collecting this data on January 4th, and this is about the first 10 days of data she has collected. So she's collected this data between January 4th and January 24th. So, you know, a few times a week she's giving a survey. You'll remember, she's actually asking her kids, how joyful was this science lesson?

0:04:24.4 AS: Mm-hmm.

0:04:27.2 JD: And so this is a run chart 'cause it's just the data with the median running through the middle, that green line there. The data is the blue dots connected by lines, and the y axis there along the left is the joy in learning percentage. So out of a hundred percent, sort of what are kids saying? How are kids sort of evaluating each of these science lessons? So we've got 10 data points so far, which is a pretty good start. So it's starting to give Jessica and her science class a decent understanding, because, you know, when we define joy in science and then we start to collect this data, we really don't have any idea what that's gonna look like in practice. But now that she's started plotting this data over time, we have a much better sense of what the kids think of the science lessons basically. So on the very first day...

0:05:25.4 AS: And what is the... What is the median amount just for the listeners out there that don't see it? What would be the... Is that 78%?

0:05:33.8 JD: Yeah, about 78%. So that very first day was 77%. The second day was about 68%. And then you sort of see it bounce around that median over the course of that, those 10 days. So some of the points are below the median, some of the points are above the median.

0:05:50.4 AS: And the highest point above is about 83, it looks like roughly around that.

0:05:54.4 JD: Yeah. Around 82, 83%. And one technical point: since it's a run chart, we don't have the process limits, those red lines that we've been taking a look at. With a run chart and, you know, fewer data points, we only have 10, it's fairly typical to use the median, just so you know, so you can kind of better control for any outlier data points, which we really don't have in this particular case, but that's just sort of a technical point. So, yeah, I mean, I think, you know, what you start to see, you start to get a sense of what this data looks like, you know, and you're gonna keep collecting this data over an additional time period, right? And she hasn't at this point introduced any interventions or any changes. Right now they're just learning about this joy in learning system, really. Right.
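A run chart like the one John describes is just the values plotted in time order with the median as the centerline. A minimal sketch in Python, using hypothetical survey percentages rather than Jessica's actual classroom data:

```python
import statistics

# Hypothetical daily "joy in science" percentages (not the actual classroom data)
joy = [77, 68, 80, 75, 82, 79, 71, 83, 76, 78]

median = statistics.median(joy)  # the centerline of the run chart
print(f"Median (centerline): {median}%")
for day, value in enumerate(joy, start=1):
    # Note which side of the centerline each point falls on
    side = "above" if value > median else "below" if value < median else "on"
    print(f"Day {day}: {value}% ({side} the median)")
```

The median, unlike the mean, is unaffected by a single extreme survey day, which is why it is the usual centerline choice for short run charts.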

0:06:51.8 JD: And so, you know, as she's thinking about this, this really brings us to... To lesson six, which is, you know, what's the goal of data analysis? And this is true in schools and it's true anywhere. We're not just gonna look at the past results, but we're also gonna, you know, probably more importantly, look to the future and hopefully sort of be able to predict what's gonna happen in the future with, you know, whatever concept we're looking at. And so as we continue to gather additional data, we can then turn that run chart from those initial 10 points into a process behavior chart. Right. You know, it's sort of the run chart on steroids, because not only can we see the variation, which you can see in the run chart, but now, because we've added more data, we've added the upper and lower natural process limit, we can also start to characterize the type of variation that we see in that data.

0:08:00.1 AS: So for the listeners, listeners out there, John just switched to a new chart which is just an extension of the prior chart carrying it out for a few more weeks, it looks like, of daily data. And then he's added in a lower and upper natural process limit.

0:08:18.9 JD: Yeah. So we're still plotting the data for joy in science. So the data is still the blue dots connected by the blue lines. Now, because we have 24 or so data points, the green line, the central line, is the average of that data running through the data. And we have enough data to add the upper and lower natural process limit. And so right now we can start to determine: do we only have natural variation, those everyday ups and downs, that common cause variation, or do we have some type of exceptional or special cause variation that's outside of what would be expected in this particular system. We can start making...

0:09:00.7 AS: Can you...

0:09:02.2 JD: Go ahead.

0:09:02.8 AS: I was gonna... I was gonna ask you if you can just explain how you calculated the upper and lower natural process limits just so people can understand. Is it max and min or is it standard deviation or what is that?

0:09:18.3 JD: Yeah, basically what's happening is that, so we've plotted the data, and then we use that data, we calculate the average, and then we also calculate what's called the moving range. So we just look at each successive data point and the difference between those two points. And basically there's a formula that you use for the upper and lower natural process limits that takes all of those things into account. So it's not standard deviation, but it's instead using the moving range between each successive data point.
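The calculation John outlines matches the standard individuals (XmR) chart formula: average the data, average the absolute differences between successive points (the moving range), and place the limits at the average plus or minus 2.66 times that average moving range. A sketch, with hypothetical data:

```python
def xmr_limits(data):
    """Natural process limits for an individuals (XmR) chart.

    The limits come from the average moving range between successive
    points, scaled by the conventional 2.66 constant, rather than from
    a standard deviation.
    """
    avg = sum(data) / len(data)
    # Moving range: absolute difference between each successive pair
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return avg - 2.66 * avg_mr, avg, avg + 2.66 * avg_mr

# Hypothetical joy-in-science percentages (not the actual classroom data)
joy = [77, 68, 80, 75, 82, 79, 71, 83, 76, 78, 85, 80]
lower, center, upper = xmr_limits(joy)
print(f"Lower {lower:.1f}%  Average {center:.1f}%  Upper {upper:.1f}%")
```

The 2.66 constant is the standard scaling factor for individuals charts; with it, the limits bracket roughly the range of routine, common-cause variation.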

0:09:52.9 AS: In other words, the data that's on this chart will always fall within the natural upper and lower. In other words it's... Or is, will data points fall outside of that?

0:10:05.7 JD: Well, it depends on what kind of system it is.

0:10:07.8 AS: Right. Okay.

0:10:09.8 JD: If it's a stable system, that means all we see is sort of natural ups and downs in the data. And we use those formulas for the process limits. The magnitude of the difference between each successive data point is such that it's not necessarily big or small, it's just based on what you're seeing empirically. It's basically predictable. Right. And if it's not predictable, then we'll see special causes. So we'll see special patterns in the data. So I think maybe last time, or you know, in some episode, we talked about the three patterns that would suggest there's a special cause worth studying. Those three patterns that I use are: first, is there a single one of these joy in science data points outside of either the upper or lower natural process limit? That'd be a special cause.

0:11:05.4 JD: If you see eight data points in a row, either above the central line or below the central line, that's a special cause. And if I see three out of four in a row that are either closer to the upper limit or to the lower limit than they are to that central line, that's a pattern of the data that suggests a special cause. So we don't, in this particular dataset, we don't see any special causes. So now we have... Now we have a very solid baseline set of data. We have 24 data points. And when you're using an average central line and get... Getting technical, once you get to about 17 data points, those upper and lower natural process limits start to solidify, meaning they're not gonna really change too much 'cause you have enough data unless something really significant happens. And then if you're using the median, that solidification happens when you get to about 24 data points.
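The three patterns John lists can be checked mechanically once the limits are known. A minimal sketch in Python; the "closer to a limit" test in the third rule is interpreted here as beyond the midpoint between the central line and that limit, and the limits used in the example are hypothetical:

```python
def special_causes(data, lower, center, upper):
    """Indices flagged by the three special-cause patterns described above.

    Rule 1: a single point outside the natural process limits.
    Rule 2: eight successive points on the same side of the central line.
    Rule 3: three of four successive points closer to a limit than to
            the central line.
    """
    flags = set()
    for i, x in enumerate(data):                      # Rule 1
        if x > upper or x < lower:
            flags.add(i)
    for i in range(len(data) - 7):                    # Rule 2
        window = data[i:i + 8]
        if all(x > center for x in window) or all(x < center for x in window):
            flags.update(range(i, i + 8))
    hi_mid = (center + upper) / 2                     # Rule 3 zone boundaries
    lo_mid = (center + lower) / 2
    for i in range(len(data) - 3):
        window = data[i:i + 4]
        if sum(x > hi_mid for x in window) >= 3 or sum(x < lo_mid for x in window) >= 3:
            flags.update(range(i, i + 4))
    return sorted(flags)
```

With a stable series inside the limits this returns an empty list; a single point below the lower limit is flagged by rule 1.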

0:12:07.5 JD: So when you're, you know, when you're getting to 17 to 24 data points in your baseline, you're really getting pretty solid upper and lower natural process limits. So, as of this March 1st date, which is the last date in this particular chart, there are 24 data points. So you have a pretty solid baseline set of data. Right now, the upper natural process limit is 95%. That lower limit is sitting at 66%, and then the average running through the middle, that green line, is 81%. So this basically tells us that if nothing changes within Jessica's fourth grade science system, her classroom, we can expect the data to bounce around this 81% average and stay within the bounds of the limits. So we would call this a common cause system, because we don't see any of those rules that I just talked about for special causes. And that's important.

0:13:07.4 JD: So do we have an unstable system or a stable system? We have a stable system. A stable system means that the data is predictable, and unless something happens, you know, and this could be something that's in the control of the teacher in the class, or it could be out of the control of the teacher in the class, but unless something significant happens, this data is just gonna keep humming along like this over the course of March, April, May of this particular school year. Right. So once we get to this point, so we have baseline data we've collected in a run chart, we start to understand how that data is moving up and down. We got some more data and we added the upper and lower natural process limits. Now we can assess not only the variation, but also the stability and the capability of the system. All of those questions can start to be answered now that we have this process behavior chart.

0:14:09.3 JD: And this brings us to the final lesson for today, which is lesson seven: the improvement approach depends on the stability of the system under study. So that's one of the reasons why the process behavior chart is so powerful, because now I have an understanding of what I need to do, like what type of approach I need to take to improve this particular system. Right? So in this particular case, I have a predictable system. And so the only way to bring about meaningful improvement is to fundamentally change this science system, right?

0:14:52.6 JD: The flip side would be if I did see a special cause, let's say it was an unpredictable system. We saw a special cause on the low side. I'd wanna study what happened on that particular day. Because if I see a special cause, let's say on February 2nd I saw a single data point below the lower natural process limit that's so different and unexpected, I'd actually wanna go to her classroom and talk to her and her class and say, okay, what happened on that day? I'm gonna try to remove that special cause. Study of that specific data point is warranted. If you don't see those special causes, then even though there are ups and downs, increases and decreases, they're within, you know, the expected bounds of this particular system. Right.

0:15:46.9 AS: And I was gonna say, I can't remember if I got this from Dr. Deming or where it came from, but I know as an analyst in the stock market analyzing tons and tons of data in my career, I always say if something looks like a special cause or looks strange it's probably an error.

[laughter]

0:16:03.2 AS: And it could just be for instance, that a student came in and they didn't understand how to fill it out or they refused to fill it out or they filled out the form with a really bizarre thing, or maybe they thought that number 10 was good and number one was bad, but in fact on the survey it was number one that was good and number 10 that was bad. And you find out that, you know, that special cause came from some sort of error.

0:16:26.6 JD: That's certainly possible. That's certainly possible.

0:16:29.5 AS: As opposed to another special cause could be, let's just say that the school had a blackout and all of a sudden the air conditioning went off for half of the class and everybody was just like really frustrated. They were burning hot. It was really a hot day and that special cause could have been a legitimate cause as opposed to let's say an error cause but you know, it causes an extreme, you know response on the survey.

0:16:56.9 JD: Yeah. And the thing is, yeah, it could be a number of different things. Maybe she had gotten some feedback about her lessons and maybe she tried a different lesson design and it was new to her and it just didn't work very well. Maybe she tried to use some new technology or a new activity and it just didn't go well. But you know, if I'm seeing that data show up as a special cause, and let's say I'm seeing that the next day or a couple days later, it's still fresh in my mind and I can even go into my chart and label what happened that day. Okay. And now, okay, I'm gonna remove that thing, or, you know, if it's a lesson I'm trying, maybe I don't wanna give up on it, but I know I need to improve it 'cause it led to some issues in my classroom. But it's close enough to the time it actually happened that I actually remember what happened on that particular day and I can sort of pinpoint that issue.

0:17:52.9 AS: Yeah.

0:17:54.5 JD: And the data told me it was worth going into studying that particular data point because it was so different than what I had seen previously in this particular 4th grade science system.

0:18:06.5 AS: Makes sense.

0:18:09.9 JD: But in this case, we don't see that, that was a hypothetical. So all we see is sort of the data moving up and down around that green average line. So we have a stable system. So again, that tells me I need to improve the science system itself. There's no special causes to remove. So, the next question I think I would ask, and if you remember, one of the data lessons is that we sort of combine the frontline workers, which is the students in this case, the manager or the leader, that's the teacher, and then someone with profound knowledge from the Deming lens, that's me. We're bringing these people together and we're saying, okay, you know, we're seeing this joy in science thing hum along at sort of like an 81% average. So I think it's a reasonable question to ask: is that good enough? And should we turn our attention to something else? Now, there could be some situations where it's not good enough or some situations where that is good enough. They chose to keep working to improve that joy in learning. But I think it'd be perfectly reasonable in some context to say, well, you know, sure, maybe we could get a little better here, but maybe it's not worth the effort in that particular area. Maybe we're gonna turn our attention to something else. You know.

0:19:23.7 AS: So you learn something from the chart and that could be...

0:19:26.4 JD: Learn something from the chart. Yeah, yeah.

0:19:27.9 AS: Because when I look at this chart, I just think hard work is ahead.

0:19:31.2 JD: Yeah. Yeah.

0:19:34.7 AS: 'Cause in order to, if you have a stable system with not a lot of extreme... Firefighting is kind of a fun thing, right? When you got special causes, you feel really important. You go out there, you try to figure out what those individual things are, you're the hero. You fix it, you understand it, you see it, whatever. But then when you get a stable system, it's like, oh man, now we got to think about how do we make some substantial changes in the system. It doesn't have to be substantial, but how do we make changes in the system, you know? And then measure whether that has an impact.

0:20:06.4 JD: Yeah. And to your point about fire... Fighting fires, like I didn't know, we had never measured joy in learning like this before, so I didn't know what we were gonna get with Jessica. And so you know what I think you also see here is a pretty well-run classroom. These are kids that are finding a pretty high amount of joy in their lessons. I think that you can kind of objectively say that, but they did choose to move on with the project and keep focusing on this particular system. And I thought it was really interesting. They actually... I'll flip slides here.

0:20:45.6 JD: They actually made this sort of rudimentary fishbone diagram. So if you're viewing the video here, you can see that Jessica just took a pen and a piece of paper and put this on the overhead in the classroom, and basically just drew a fishbone. A fishbone diagram is also called a cause and effect diagram. So out on the right it says effect, and she wrote low enjoyment, meaning low enjoyment of science class. And they started brainstorming, those are the bones of the fish, what's causing the effect of low enjoyment in science class. So she did this brainstorming activity with the kids. Some of the things they came up with were: why is there low enjoyment with science class? Well, the computers are sometimes lagging when the kids are trying to use them. They're mad at Ms. Cutler for one reason or another. There's a lot of writing in a particular lesson. There's a lot of reading in a particular lesson.

0:21:58.2 AS: Other teachers coming into the room.

0:22:00.7 JD: Other teachers coming into the room and disrupting the class.

0:22:02.7 AS: Stop bothering me.

0:22:04.1 JD: Yeah. I mean, you know, these are the things you don't often think about. And then they talked about classmates making noises throughout classes, another distraction. And they basically categorized these into different categories. So there were sort of things that made the lesson boring. That was one category. Accidents happening, those are like the computers not working correctly. Scholars... We call our students scholars. So students getting in trouble was one, and then distractions was another category. And so then they did another activity after they had this fishbone. They basically did a voting activity where they would figure out which of these is the most dominant cause of low enjoyment. And what they came up with is their classmates making random noises throughout the lesson. They identified that particular thing as the thing that they're gonna design a plan-do-study cycle around: how are we gonna reduce the amount of noise in the class?

0:23:12.7 JD: And this is all the students coming up with these ideas. Of course, Jessica's guiding these conversations as the adult in the room, but the kids are coming up with this. Like, I never would have, well, maybe I shouldn't say I never would have, but it probably wouldn't likely have been on my radar that other teachers coming into the room was a main source of distraction. You know, who knows what they're doing, dropping off papers that have to be passed out at dismissal, or coming to find a kid for this thing or that thing. Who knows why they're stopping by. But schools are certainly rife with all kinds of disruptions, announcements, people coming into the room, those types of things.

0:23:51.0 AS: It's interesting too to see mad at Ms. Cutler, because... I was just reading some research about how to get rid of anger and that type of thing. And they talk about meditation, and I do a breathing exercise when every class starts. It's a way of just kind of calming down and separating the class time from the chaos outside, but it also could be something that could help with feeling mad.

0:24:27.9 JD: Yeah. And I think in certain classrooms that certainly could have risen to the top. And then what you do is design the PDSA around that. So how do you do meditation? How do you know if you're doing it right? How long do you do it? You know? Does it have the intended impact? You could study all kinds of different things with meditation, but...

0:24:52.4 AS: And are you really mad or is there... Are you really mad at Ms. Cutler or are you... Are you frustrated about something else? Or that...

0:24:58.1 JD: Exactly. Yeah. Is it warranted? Is there actually something that she should stop doing or start doing? There's all kinds of possibilities there. But the main point, and I think this kind of brings us to the wrap up, is that taking this approach is very different. Even just Jessica's step of saying, I'm gonna work on joy in learning in science class. That's a very different approach. And then a step beyond that, I'm gonna involve my students in this improvement approach. And we have these various methods and tools for systematically collecting the class's input, those are improvement science or continual improvement tools that we're using. And then we're applying some knowledge about variation, Deming's sort of data methods, to understand that data that we've systematically collected from students.

0:25:58.7 JD: And now students are involved. So they're actively coming up with the reasons, the problems that are happening. And then, like what we'll get into in the last few lessons, they're giving input into the solutions, the change ideas that are gonna make things better. But all of this represents a very different approach than what's typical when it comes to school improvement. These things are not being handed down from on high from someone that has no connection to this classroom whatsoever. Instead, it's actually the people in the classroom that are developing the solutions.

0:26:36.9 AS: I just was thinking about the idea of imagining that this group of students is working really hard on that, and they come up with so much knowledge and learning about how to create a more joyful classroom. And then imagine that they've now codified that together with Ms. Cutler to create kind of the standard operating procedures. Like we put up a sign on the door outside that says, do not disturb until class is over, or...

0:27:06.1 JD: Something simple. Yeah.

0:27:07.0 AS: And that they come up with, and a breathing exercise or whatever that is. And then you imagine the next group of students coming in for the next year, let's say, that next group who can then take the learning that the first group had and try to take it to another level, and then upgrade how the room operates. And you do that a couple of iterations, and you've now accumulated knowledge that you are building on until, in a business, you're... You're creating a competitive advantage.

0:27:40.4 JD: Yeah, absolutely. And another thing that these guys did was they didn't say we're gonna improve X, Y, or Z and then set an arbitrary goal, which is one of the things we've talked about that often happens at the outset of any type of improvement. They sort of avoided this act of desperation. We talk about how goal setting is often an act of desperation. They avoided that completely. Instead, what they did was gather some baseline data to understand: what is the capability of our system when it comes to joy in learning? That's what they did first. They didn't set the goal first. A lot of wisdom, a lot of wisdom in 10 year olds for sure.

0:28:22.1 AS: That's interesting. Well, John, on behalf of everyone at the Deming Institute, I want to thank you again for the discussion. And for listeners, remember to go to deming.org to continue your journey. You can find John's book Win-Win: W. Edwards Deming, the System of Profound Knowledge and the Science of Improving Schools on amazon.com. This is your host, Andrew Stotz, and I'll leave you with one of my favorite quotes from Dr. Deming, and it's absolutely applicable to today's discussion: "People are entitled to joy in work."

  continue reading

170 episodes

Artwork
iconShare
 
Manage episode 423050241 series 2320637
Content provided by Darlene Suyematsu and The Deming Institute. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Darlene Suyematsu and The Deming Institute or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

In this episode, John Dues and Andrew Stotz apply lessons five through seven of the 10 Key Lessons for implementing Deming in classrooms. They continue using Jessica's fourth-grade science class as an example to illustrate the concepts in action.

TRANSCRIPT

0:00:02.2 Andrew Stotz: My name is Andrew Stotz and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today I'm continuing my discussion with John Dues, who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. This is episode five about goal setting through a Deming lens. John, take it away.

0:00:23.2 John Dues: Yeah, it's good to be back, Andrew. Yeah, like you said, for the past few episodes we've been talking about organizational goal setting. We covered four healthy conditions, or four conditions of healthy goal setting and 10 key lessons for data analysis. And then what we turn to in the last episode is looking at an applied example of the 10 key lessons for data analysis and in action. And, if you remember from last time we were looking at this improvement project from Jessica Cutler, she's a fourth grade science teacher, and she did the improvement fellowship here at United Schools Network, where she learned the tools, the techniques, the philosophies, the processes behind the Deming theory, continual improvement, that type of thing. And in... And in Jessica's specific case, in her fourth grade science class, what she was settled on that she was gonna improve was, the joy in learning of her students. And we looked at lessons one through four through the eyes or through the lens of her project. And today we're gonna look at lessons five through seven. So basically the next, uh, the next three lessons of those 10 key lessons.

0:01:34.8 AS: I can't wait. Let's do it.

0:01:37.3 JD: Let's do it. So lesson number five was: show enough data in your baseline to illustrate the previous level of variation. Right. So the basic idea with this particular lesson is that, you know, let's say we're trying to improve something. We have a data point or maybe a couple data points. We wanna get to a point where we're starting to understand how this particular concept works. In this case, what we're looking at is joy in learning. And there's some different rules for how many data points you should, should have in a typical base baseline. But, you know, a pretty good rule of thumb is, you know, if you can get 12 to 15, that's... That's pretty solid. You can start working with fewer data points in real life. And even if you just have five or six values, that's gonna give you more understanding than just, you know, a single data point, which is often what we're... What we're working with.

0:02:35.6 AS: In, other words, even if you have less data, you can say that this gives some guidance.

0:02:40.9 JD: Yeah.

0:02:41.1 AS: And then you know that the reliability of that may be a little bit less, but it gives you a way... A place to start.

0:02:46.9 JD: A place to start. You're gonna learn more over time, but even five or six data points is more than what I've typically seen in the typical chart, let's say, where it has last month and this month, right? So even five or six points is a lot more than what's typical. So I can kind of show you. I'll share my screen here and we'll take a look at Jessica's initial run chart. You see that, right?

0:03:19.3 AS: We can see it.

0:03:21.2 JD: Awesome.

0:03:22.3 AS: You wanna put it in slideshow? Can we see that? Yeah, there you go.

0:03:24.9 JD: Yeah, I'll do that.

0:03:25.4 AS: Perfect.

0:03:26.3 JD: That works better. So, again, what we're trying to do is show enough data in the baseline to understand what happened prior to whenever we started this improvement effort. And I think I've shared this quote before, but I really love this one from Dr. Donald Berwick. He said, "plotting measurements over time turns out in my view to be one of the most powerful things we have for systemic learning." That's what this is all about, really, taking that lesson to heart. So you can look at Jessica's run chart for "joy in science." Just to orient you to the chart, we have dates along the bottom. She started collecting this data on January 4th, and this is about the first 10 days of data she collected, between January 4th and January 24th. So a few times a week she's giving a survey. You'll remember she's actually asking her kids, how joyful was this science lesson?

0:04:24.4 AS: Mm-hmm.

0:04:27.2 JD: And so this is a run chart 'cause it's just the data with the median running through the middle, that green line there. The data is the blue dots connected by lines, and the y axis there along the left is the joy in learning percentage. So out of a hundred percent, what are kids saying? How are kids evaluating each of these science lessons? We've got 10 data points so far, which is a pretty good start. So it's starting to give Jessica and her science class a decent understanding, because when we define joy in science and start to collect this data, we really don't have any idea what that's gonna look like in practice. But now that she's started plotting this data over time, we have a much better sense of what the kids think of the science lessons, basically. So on the very first day...

0:05:25.4 AS: And what is the... What is the median amount just for the listeners out there that don't see it? What would be the... Is that 78%?

0:05:33.8 JD: Yeah, about 78%. So that very first day was 77%. The second day was about 68%. And then you sort of see it bounce around that median over the course of that, those 10 days. So some of the points are below the median, some of the points are above the median.

0:05:50.4 AS: And the highest point above is about 83, it looks like roughly around that.

0:05:54.4 JD: Yeah, around 82, 83%. And one technical point: since it's a run chart, we don't have the process limits, those red lines that we've been taking a look at. With a run chart and fewer data points, we only have 10, it's fairly typical to use the median as the center line, because it better controls for any outlier data points, though we really don't have any outliers in this particular case. But that's just a technical point. So, yeah, you start to get a sense of what this data looks like, and you're gonna keep collecting this data over an additional time period, right? And she hasn't at this point introduced any interventions or any changes. Right now they're just learning about this joy in learning system, really.
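For readers who want to try this themselves, the baseline step John describes can be sketched in a few lines of code. This is a minimal illustration; the survey percentages below are made up for the example, not Jessica's actual class data.

```python
import statistics

# Hypothetical joy-in-learning survey results (percent), one per lesson.
joy_scores = [77, 68, 80, 75, 82, 71, 79, 83, 74, 78]

# A run chart is just these values plotted in time order with a center
# line. With only ~10 points, the median is the usual choice for the
# center line because it resists distortion by a single outlier.
median_line = statistics.median(joy_scores)
print(f"Baseline median: {median_line:.1f}%")

# Counting points on each side of the median gives a quick sense of how
# the data "bounces around" the center line.
above = sum(1 for s in joy_scores if s > median_line)
below = sum(1 for s in joy_scores if s < median_line)
print(f"{above} points above the median, {below} below")
```

With real data you would keep appending each day's survey result and replot, exactly as Jessica did between January 4th and January 24th.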

0:06:51.8 JD: And so, as she's thinking about this, this really brings us to lesson six, which is: what's the goal of data analysis? And this is true in schools and it's true anywhere. We're not just gonna look at the past results, but we're also gonna, probably more importantly, look to the future and hopefully be able to predict what's gonna happen in the future with whatever concept we're looking at. And so as we continue to gather additional data, we can turn that run chart from those initial 10 points into a process behavior chart. It's sort of the run chart on steroids, because not only can we see the variation, which you can see in the run chart, but now, because we've added more data, we've added the upper and lower natural process limits, and we can also start to characterize the type of variation that we see in that data.

0:08:00.1 AS: So for the listeners, listeners out there, John just switched to a new chart which is just an extension of the prior chart carrying it out for a few more weeks, it looks like, of daily data. And then he's added in a lower and upper natural process limit.

0:08:18.9 JD: Yeah. So we're still, we're still plotting the data for joy in science. So the data is still the blue dots connected by the blue lines now because we have 24 or so data points, the green line, the central line is the average of that data running through the data. And we have enough data to add the upper and lower natural process limit. And so right now we can start to determine do we only have natural variation, those everyday ups and downs, that common cause variation, or do we have some type of exceptional or special cause variation that's outside of what would be expected in this particular system. We can start making...

0:09:00.7 AS: Can you...

0:09:02.2 JD: Go ahead.

0:09:02.8 AS: I was gonna... I was gonna ask you if you can just explain how you calculated the upper and lower natural process limits just so people can understand. Is it max and min or is it standard deviation or what is that?

0:09:18.3 JD: Yeah, basically what's happening is that we've plotted the data, and then we use that data to calculate the average, and we also calculate what's called the moving range. So we just look at each successive pair of data points and the difference between those two points. And there's a formula for the upper and lower natural process limits that takes all of those things into account. So it's not standard deviation; instead it's using the moving range between each successive data point.
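John doesn't spell out the formula on air, but the standard individuals (XmR) chart calculation he's describing places the limits 2.66 average moving ranges above and below the mean. Here's a minimal sketch; the joy percentages are illustrative values, not the class's real numbers:

```python
import statistics

def xmr_limits(values):
    """Natural process limits for an individuals (XmR) chart.

    The limits sit 2.66 average moving ranges above and below the
    mean; 2.66 is the standard scaling constant (3 / 1.128) for
    moving ranges of successive pairs.
    """
    avg = statistics.mean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = statistics.mean(moving_ranges)
    return avg, avg - 2.66 * avg_mr, avg + 2.66 * avg_mr

# Illustrative joy-in-learning percentages (not the actual class data).
avg, lower, upper = xmr_limits([80, 84, 78, 82, 86, 80, 76, 82])
print(f"average {avg:.1f}%, limits {lower:.1f}% to {upper:.1f}%")
```

Because the limits come from the point-to-point moving range rather than the overall standard deviation, they reflect the routine, short-term variation of the process, which is what makes them useful for spotting exceptional values.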

0:09:52.9 AS: In other words, the data that's on this chart will always fall within the natural upper and lower limits... Or will data points fall outside of that?

0:10:05.7 JD: Well, it depends on what kind of system it is.

0:10:07.8 AS: Right. Okay.

0:10:09.8 JD: If it's a stable system, that means all we see is natural ups and downs in the data, and we use those formulas for the process limits. The magnitude of the difference between successive data points isn't judged as big or small in the abstract; it's just based on what you're seeing empirically. The system is basically predictable. And if it's not predictable, then we'll see special causes, special patterns in the data. I think in an earlier episode we talked about the patterns that would suggest there's a special cause worth studying. The first of the three patterns I use is: is there a single one of these joy in science data points outside of either the upper or lower natural process limit? That'd be a special cause.

0:11:05.4 JD: If you see eight data points in a row, either above the central line or below the central line, that's a special cause. And if I see three out of four in a row that are closer to the upper limit, or to the lower limit, than they are to that central line, that's a pattern in the data that suggests a special cause. We don't see any special causes in this particular dataset. So now we have a very solid baseline set of data; we have 24 data points. And getting technical, when you're using an average central line, once you get to about 17 data points, those upper and lower natural process limits start to solidify, meaning they're not gonna really change too much, 'cause you have enough data, unless something really significant happens. And if you're using the median, that solidification happens when you get to about 24 data points.
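Those three detection rules are mechanical enough to sketch in code. The limits and data below are hypothetical values chosen for the example, not the class's real chart:

```python
def special_causes(values, center, lower, upper):
    """Flag indices matching the three rules John describes:
    1. A single point outside a natural process limit.
    2. Eight points in a row on the same side of the central line.
    3. Three of four consecutive points closer to one limit than
       to the central line.
    Returns {rule_name: [indices of flagged points or window starts]}.
    """
    flags = {"outside_limits": [], "run_of_eight": [], "three_of_four": []}

    for i, v in enumerate(values):
        if v > upper or v < lower:
            flags["outside_limits"].append(i)

    for i in range(len(values) - 7):
        w = values[i:i + 8]
        if all(v > center for v in w) or all(v < center for v in w):
            flags["run_of_eight"].append(i)

    for i in range(len(values) - 3):
        w = values[i:i + 4]
        near_upper = sum(abs(v - upper) < abs(v - center) for v in w)
        near_lower = sum(abs(v - lower) < abs(v - center) for v in w)
        if near_upper >= 3 or near_lower >= 3:
            flags["three_of_four"].append(i)

    return flags

# Stable illustrative data: every rule comes back empty, which is what
# a predictable, common cause system looks like on the chart.
joy = [77, 68, 80, 75, 82, 71, 79, 83, 74, 78]
print(special_causes(joy, center=81, lower=66, upper=95))
```

Swap in a value below the lower limit (say, a 60) and the first rule fires, which is exactly the signal that would send you into the classroom asking what happened that day.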

0:12:07.5 JD: So when you're getting to 17 to 24 data points in your baseline, you're really getting pretty solid upper and lower natural process limits. As of this March 1st date, which is the last date in this particular chart, there are 24 data points, so you have a pretty solid baseline set of data. Right now, the upper natural process limit is 95%, that lower limit is sitting at 66%, and the average running through the middle, that green line, is 81%. So this basically tells us that if nothing changes within Jessica's fourth grade science system, her classroom, we can expect the data to bounce around this 81% average and stay within the bounds of the limits. So we would call this a common cause system, because we don't see any of those rules that I just talked about for special causes. And that's important.

0:13:07.4 JD: So do we have an unstable system or a stable system? We have a stable system. A stable system means that the data is predictable, and unless something significant happens, something that could be within the control of the teacher and the class or out of their control, this data is just gonna keep humming along like this over the course of March, April, May of this particular school year. So once we get to this point: we have baseline data we've collected in a run chart, we start to understand how that data is moving up and down, we got some more data, and we added the upper and lower natural process limits. Now we can assess not only the variation, but also the stability and the capability of the system. All of those questions can start to be answered now that we have this process behavior chart.

0:14:09.3 JD: And this brings us to the final lesson for today, lesson 7: the improvement approach depends on the stability of the system under study. That's one of the reasons why the process behavior chart is so powerful, because now I have an understanding of what type of approach I need to take to improve this particular system, right? So in this particular case, I have a predictable system, and so the only way to bring about meaningful improvement is to fundamentally change this science system, right?

0:14:52.6 JD: The flip side would be if I did see a special cause. Let's say it was an unpredictable system, and we saw a special cause on the low side. I'd wanna study what happened on that particular day. Say on February 2nd I saw a single data point below the lower natural process limit, so different and unexpected that I'd actually wanna go to her classroom, talk to her and her class, and say, okay, what happened on that day? I'm gonna try to remove that special cause; study of that specific data point is warranted. If you don't see those special causes, then even though there are ups and downs, increases and decreases, they're within the expected bounds of this particular system.

0:15:46.9 AS: And I was gonna say, I can't remember if I got this from Dr. Deming or where it came from, but I know as an analyst in the stock market analyzing tons and tons of data in my career, I always say if something looks like a special cause or looks strange it's probably an error.

[laughter]

0:16:03.2 AS: And it could just be for instance, that a student came in and they didn't understand how to fill it out or they refused to fill it out or they filled out the form with a really bizarre thing, or maybe they thought that number 10 was good and number one was bad, but in fact on the survey it was number one that was good and number 10 that was bad. And you find out that, you know, that special cause came from some sort of error.

0:16:26.6 JD: That's certainly possible. That's certainly possible.

0:16:29.5 AS: Whereas another special cause could be, let's just say, that the school had a blackout and all of a sudden the air conditioning went off for half of the class, and everybody was just really frustrated. They were burning hot, it was a really hot day. That special cause could have been a legitimate cause, as opposed to, let's say, an error cause, but it causes an extreme response on the survey.

0:16:56.9 JD: Yeah. And the thing is, it could be a number of different things. Maybe she had gotten some feedback about her lessons, and maybe she tried a different lesson design and it was new to her and it just didn't work very well. Maybe she tried to use some new technology or a new activity and it just didn't go well. But if I'm seeing that data show up as a special cause, and let's say I'm seeing that the next day or a couple days later, it's still fresh in my mind, and I can even go into my chart and label what happened that day. And now, okay, I'm gonna remove that thing, or, if it's a lesson I'm trying, maybe I don't wanna give up on it, but I know I need to improve it 'cause it led to some issues in my classroom. But it's close enough to the time it actually happened that I remember what happened on that particular day, and I can pinpoint that issue.

0:17:52.9 AS: Yeah.

0:17:54.5 JD: And the data told me it was worth going into studying that particular data point because it was so different than what I had seen previously in this particular 4th grade science system.

0:18:06.5 AS: Makes sense.

0:18:09.9 JD: But in this case, we don't see that; that was a hypothetical. All we see is the data moving up and down around that green average line. We have a stable system, so again, that tells me I need to improve the science system itself. There are no special causes to remove. So the next question I would ask, and if you remember, one of the data lessons is that we combine the frontline workers, which is the students in this case, the manager or the leader, that's the teacher, and someone with profound knowledge from the Deming lens, that's me. We're bringing these people together and we're saying, okay, we're seeing this joy in science thing hum along at about an 81% average. So I think it's a reasonable question to ask: is that good enough? And should we turn our attention to something else? Now, there could be some situations where it's not good enough and some situations where it is good enough. They chose to keep working to improve that joy in learning. But I think it'd be perfectly reasonable in some contexts to say, well, sure, maybe we could get a little better here, but maybe it's not worth the effort in that particular area. Maybe we're gonna turn our attention to something else.

0:19:23.7 AS: So you learn something from the chart and that could be...

0:19:26.4 JD: Learn something from the chart. Yeah, yeah.

0:19:27.9 AS: Because when I look at this chart, I just think hard work is ahead.

0:19:31.2 JD: Yeah. Yeah.

0:19:34.7 AS: 'Cause if you have a stable system without a lot of extremes... Firefighting is kind of a fun thing, right? When you've got special causes, you feel really important. You go out there, you try to figure out what those individual things are, you're the hero. You fix it, you understand it, you see it, whatever. But then when you get a stable system, it's like, oh man, now we've got to think about how we make changes in the system, they don't have to be substantial, and then measure whether that has an impact.

0:20:06.4 JD: Yeah. And to your point about fighting fires, we had never measured joy in learning like this before, so I didn't know what we were gonna get with Jessica. And what I think you also see here is a pretty well-run classroom. These are kids that are finding a pretty high amount of joy in their lessons; I think you can objectively say that. But they did choose to move on with the project and keep focusing on this particular system. And I thought what they did next was really interesting. I'll flip slides here.

0:20:45.6 JD: They actually made this sort of rudimentary fishbone diagram. If you're viewing the video, you can see that Jessica just took a pen and a piece of paper, drew a fishbone, and put it on the overhead in the classroom. A fishbone diagram is also called a cause and effect diagram. Out on the right it says effect, and she wrote low enjoyment, meaning low enjoyment of science class. And they started brainstorming, those are the bones of the fish, what's causing the effect of low enjoyment in science class. She did this brainstorming activity with the kids. So some of the things they came up with for why there's low enjoyment in science class: the computers are sometimes lagging when the kids are trying to use them. They're mad at Ms. Cutler for one reason or another. There's a lot of writing in a particular lesson. There's a lot of reading in a particular lesson.

0:21:58.2 AS: Other teachers coming into the room.

0:22:00.7 JD: Other teachers coming into the room and disrupting the class.

0:22:02.7 AS: Stop bothering me.

0:22:04.1 JD: Yeah. I mean, these are the things you don't often think about. And then they talked about classmates making noises throughout class, another distraction. And they basically grouped these into different categories. There were things that made the lesson boring, that was one category. Accidents happening, those are things like the computers not working correctly. We call our students scholars, so scholars getting in trouble was one, and then distractions was another category. And then, after they had this fishbone, they did another activity, basically a voting activity, to figure out which of these is the dominant cause of low enjoyment. And what they came up with is their classmates making random noises throughout the lesson. They identified that particular thing as the thing they're gonna design a plan, do, study cycle around: how are we gonna reduce the amount of noise in the class?

0:23:12.7 JD: And this is all the students coming up with these ideas. Of course, Jessica's guiding these conversations as the adult in the room, but the kids are coming up with this. It probably wouldn't have been on my radar that other teachers coming into the room was a main source of distraction. Who knows why they're stopping by: dropping off papers that have to be passed out at dismissal, or coming to find a kid for this thing or that thing. But schools are certainly rife with all kinds of disruptions: announcements, people coming into the room, those types of things.

0:23:51.0 AS: It's interesting too to see "mad at Ms. Cutler," because I was just reading some research about how to get rid of anger and that type of thing. They talk about meditation, and I do a breathing exercise when every class starts. It's a way of calming down and separating the class time from the chaos outside, but it also could be something that could help with feeling mad.

0:24:27.9 JD: Yeah. And I think in certain classrooms that certainly could have risen to the top. And then what you do is design the PDSA around that. So how do you do meditation? How do you know if you're doing it right? How long do you do it? Does it have the intended impact? You could study all kinds of different things with meditation.

0:24:52.4 AS: And are you really mad at Ms. Cutler, or are you frustrated about something else?

0:24:58.1 JD: Exactly. Yeah. Is it warranted? Is there actually something that she should stop doing or start doing? There are all kinds of possibilities there. But the main point, and I think this kind of brings us to the wrap-up, is that taking this approach is very different. Even just Jessica's step of saying, I'm gonna work on joy in learning in science class, that's a very different approach. And then a step beyond that: I'm gonna involve my students in this improvement approach. We have these various methods and tools for systematically collecting the class's input; those are improvement science or continual improvement tools that we're using. And then we're applying some of the knowledge about variation, Deming's data methods, to understand that data we've systematically collected from students.

0:25:58.7 JD: And now students are involved. So they're actively coming up with the reasons, the problems that are happening, and then, as we'll get into in the last few lessons, their input into the solutions, the change ideas that are gonna make things better. But all of this represents a very different approach than what's typical when it comes to school improvement. These things are not being handed down from on high from someone that has no connection to this classroom whatsoever. Instead, it's actually the people in the classroom that are developing the solutions.

0:26:36.9 AS: I just was thinking about the idea of imagining that this group of students is working really hard on that, and they come up with so much knowledge and learning about how to create a more joyful classroom. And then imagine that they've now codified that together with Ms. Cutler to create kind of the standard operating procedures. Like we put up a sign on the door outside that says, do not disturb until class is over, or...

0:27:06.1 JD: Something simple. Yeah.

0:27:07.0 AS: And they come up with a breathing exercise or whatever that is. And then you imagine the next group of students coming in the next year, who can then take the learning that the first group had, try to take it to another level, and upgrade how the operations of the room are done. You do that a couple of iterations, and you've now accumulated knowledge that you're building on, until, in a business, you're creating a competitive advantage.

0:27:40.4 JD: Yeah, absolutely. And another thing these guys did was they didn't say we're gonna improve X, Y, or Z and then set an arbitrary goal, which, as we've talked about, often happens at the outset of any type of improvement. We talk about how goal setting is often an act of desperation; they avoided that completely. Instead, what they did first was gather some baseline data to understand the capability of the system when it comes to joy in learning. They didn't set the goal first. A lot of wisdom in 10 year olds, for sure.

0:28:22.1 AS: That's interesting. Well, John, on behalf of everyone at the Deming Institute, I want to thank you again for the discussion. And for listeners, remember to go to deming.org to continue your journey. You can find John's book Win-Win: W. Edwards Deming, the System of Profound Knowledge and the Science of Improving Schools on amazon.com. This is your host, Andrew Stotz, and I'll leave you with one of my favorite quotes from Dr. Deming, and it's absolutely applicable to today's discussion: "People are entitled to joy in work."
