Tuesday, September 10, 2013
I can honestly say that I can't wait for September to be over! There is just SO much paperwork that needs to be done and turned in, and too much thinking I have to do to plan ahead for DDI, SLOs, LLOs, etc., that I think my brain is already fried, and it's only Tuesday!!! (Boy, I'm sure missing having day 6 off at the moment...this being a full-time teacher is going to wear me out! ;) )
Monday I stayed for a faculty meeting about DDI, data driven instruction. The presenter came from our local BOCES and has been working with other districts that are beginning to implement DDI as well. The information she presented does make sense. The first year of implementation is going to be a little more work because of the need to build test banks and such, but the concept behind DDI is sound. I figured I would share a bit in this post about what was presented and how I think I'll use it with my high school art classes.
Data driven instruction is built upon four steps. The concept comes from Paul Bambrick-Santoyo and his book Driven by Data.
1. Assessment: Teachers should be using interim assessments (my district is requiring us to do a test at the 10, 20, 30 and 40 week marks...the 40 week test being the final exam and, in my case, also part of my post-assessment for my APPR) throughout the school year to see what students have a good understanding of and what they are retaining. Tickets out the door (or exit slips) and bell ringers are also forms of assessment that should be used to check what students are understanding.
2. Analysis: The second step is to analyse the data from the tests. Ideally, if you're in a large district, you'd have a team of teachers who teach the same subject area to sit down with; if you're like my district, my principal and I will be a "team" to analyse my data. As you look at the data, you should be asking, "Why did 70% of the students choose this wrong answer over the correct answer?" Maybe it was a poorly worded question on the test, or maybe the concept needs to be retaught. On the other hand, you would hope to find that 90% or more of students are getting correct answers. (This is where eDoctrina, the program I have to use, will come in handy. Even though it's a lot of work to input questions, the analysis portion becomes easier because the program will automatically calculate the percentage of students choosing each answer for each question. It even creates a bar graph for those who need to see it visually!)
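(For any fellow data nerds out there: here's a rough little Python sketch of the kind of tally I mean. It has nothing to do with how eDoctrina actually works under the hood, and the answer data is completely made up, but it shows the idea of counting what percentage of students picked each answer choice on every question.)

    from collections import Counter

    # Made-up answer data: one dict per student, question number -> choice picked.
    responses = [
        {1: "B", 2: "C", 3: "A"},
        {1: "B", 2: "A", 3: "A"},
        {1: "D", 2: "C", 3: "A"},
    ]
    answer_key = {1: "B", 2: "C", 3: "C"}  # invented key, for illustration only

    for question, correct in answer_key.items():
        picks = Counter(student[question] for student in responses)
        total = sum(picks.values())
        print(f"Question {question} (correct answer: {correct})")
        for choice, count in sorted(picks.items()):
            flag = " <-- correct" if choice == correct else ""
            print(f"  {choice}: {count / total:.0%}{flag}")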
The presenter also gave an example of how teachers in a district that implemented DDI last year analyse their data. After a test has been corrected, they assign the following as homework: "Go home and, for each answer you got wrong, write down and tell me why you chose that answer over the correct one." This essentially helps you understand how the students heard the information that you taught.
3. Action: The third step is to take action. Meaning, what and when are you going to reteach, and how are you going to reteach it so that students grasp the knowledge better? Basically, it means you might need to change your teaching strategy. This is often the hardest step, especially for those under a time constraint because of state testing. I think this part is a lot easier for us special-area teachers to do because we generally don't have a state curriculum we are mandated to follow.
4. Culture: The final step has to do with changing the culture of student learning. Basically, under DDI, you "own" your lessons for the first 10 weeks of school. After that 1st interim assessment, the students "own" your lessons and drive the teaching and learning process...they essentially take charge of their own learning.
So, here's the first question this brings up...isn't DDI suggesting that we "teach to the test", which is what everyone hates doing? Not necessarily. Those interim assessments aren't supposed to be held against the students. But then how are we supposed to get the students to take them seriously and actually try on these tests? That's up to the teacher. Our district suggested counting the interim tests as a homework grade instead of a test grade, so students are still held accountable but it won't make or break their report card averages.
Next, how should we be building our assessments, especially if we have extremely high achievers and low achievers in the same class? The presenter suggested building the assessment for the bulk of the students. (Hmmm, easier said than done?)
How many questions should be on these interim assessments? The presenter suggested including 3-5 questions for every standard or sub-standard you have covered up to that point. (In eDoctrina, when I input my questions into the question bank, I can also attach a standard/sub-standard to each question. Again, when it's time to analyse my data, it will automatically tell me the percentage of students doing well/poorly on each standard. I plan on inputting all my questions from each unit test I give so that I can randomly pick a few from each unit for each interim assessment.)
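(Again, just to picture it: here's a tiny made-up sketch of pulling a few questions per standard out of a question bank. The standard names and question IDs are placeholders I invented, not how eDoctrina actually assembles a test.)

    import random

    # Made-up question bank: each unit-test question tagged with a placeholder standard.
    question_bank = {
        "Standard 1 - Creating": ["Q1", "Q4", "Q7", "Q12", "Q15"],
        "Standard 2 - Color theory": ["Q2", "Q5", "Q9", "Q11"],
        "Standard 3 - Responding": ["Q3", "Q6", "Q8", "Q10", "Q14"],
    }

    def build_interim_assessment(bank, questions_per_standard=3):
        # Randomly pull a few questions for every standard covered so far.
        chosen = []
        for standard, questions in bank.items():
            chosen.extend(random.sample(questions, min(questions_per_standard, len(questions))))
        return chosen

    print(build_interim_assessment(question_bank))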
Coming from the CSE angle...how are we supposed to accommodate everyone who needs it? My district has A LOT of students who require services through an IEP or a 504 plan...extended time, tests read out loud, etc. On top of regular class tests, quizzes and assignments, this just puts that much more strain on those services. According to the presenter, if an accommodation can happen on the summative assessment, it should happen on the interim. For example, in NY, a 3rd-8th grade math exam is allowed to be read out loud to students who have that requirement on their IEP, but certain sections of the ELA exam are not. (This is not necessarily going to be a good thing for many districts...especially districts like mine that can't afford to hire more support staff to keep up with the lower scores on the new Common Core state tests. My district is going to try to entice students to stay after school for AIS services. This would ensure they get the extra support they need without filling up the school day even more. Our teachers would be paid extra for staying after to work with the students, which is cheaper than hiring more full-time staff. The only problem is...students aren't mandated to take AIS, so we have to make them want to stay after.)
Wow, so all that being said, DDI is looking both good and bad to me right now. To be honest, checking student understanding at various points using cumulative assessments seems like a no-brainer to me. I did this last year with my 5th graders and my color theory curriculum. Each rubric had review questions from all of the previous lessons and units, and with each rubric there were more and more of them, but the students were able to answer just about all of them correctly by the end of the year! (Thus the reason they did really well on their post-assessment!) I remember tests from high school that had review questions/bonus questions on them from previous units...
DDI does stink, though, because now I have to administer more tests in art. I do think tests are important, because I want to treat art, to an extent, like an academic subject so that students take it seriously and actually try to learn and retain the information I teach them. But holy tests galore! I'll have to work on getting a little creative with these interim assessments so that they have some performance-based aspect to them...but that might be too much for my plate this year; I'd want to give it serious thought first!
I just want to leave you with one more thing. (And I promise, after this post, I WILL be posting some artwork and such that is starting to be finished up at school! I still have only seen about half of my elementary kids...!) The presenter gave us this handout to try and explain why DDI needs to be done (and it comes from Bambrick's book):
TEACHER: Listen, this data-driven education thing seems interesting and all, but why are we doing it?
PRINCIPAL: Do you watch basketball?
TEACHER: Sure.
PRINCIPAL: During a recent high school basketball playoff game, the scoreboard completely malfunctioned midway through the game, so the refs kept the score and time on the sidelines. As it came close to the end of the game, the visiting team was down by two points, but they didn't realize it, nor did they know how much time was left. The clock ran out before they took the final shot.
TEACHER: That's not right!
PRINCIPAL: Of course not. If the scoreboard had been working, the entire end of the game could have been different. So you'd agree that a working scoreboard is critical for sporting events, correct?
TEACHER: Of course.
PRINCIPAL: At the end of the day, data-driven instruction is like fixing the broken scoreboard. Relying on state tests is like covering up the scoreboard at the beginning of the game and then uncovering it at the end to see if you won. At that point, there's nothing you can do to change the outcome! We use interim assessments to keep the scoreboard uncovered, so we can make the necessary adjustments to be able to win the game.
Friday, August 23, 2013
Ugh...Data Driven Instruction???
Well, today I was introduced to the realm of data driven instruction, and if I haven't felt overwhelmed up to this point, I sure do now! Anyone else out there having to deal with DDI? I feel like I had it easy last year with my elementary SLOs compared to this year! Luckily, it looks like I may only have to do 2 SLOs and 1 LLO this year (compared to 4 SLOs and 1 LLO last year), but it's going to be SOOOO much more work to do them with this data driven instruction component!
For anyone who doesn't know what DDI is, from what I learned today, teachers must give students an exam-like assessment at various points throughout the year (our school is making us do one at each 10-week quarter to assess what has been taught up to that point). We have to review what a majority of students are getting wrong or struggling with, readdress/reteach it, and then test again at the next 10-week mark!
Now, I get the idea behind this, but it's going to be A LOT of work for me! We are now moving to using eDoctrina (which I found out is run out of Buffalo, NY...near my hometown but 5 hours away from where I teach). In eDoctrina, we will enter our SLOs, pre-assessment scores, target scores for each student, and then the post-assessment scores. The nice thing is that eDoctrina will automatically weigh and compute our 40% assessment score for APPR. (On a side rant, I'm being encouraged to use my students' STAR scores for my SLOs...something I DON'T want to do! I'd rather do a two-part assessment...an observational drawing portion and a test portion that covers art vocabulary along with a written analysis of an artwork...but that's another post for the future!)
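(The actual APPR conversion is set by the state and district, and eDoctrina does the math for us, but the basic idea of checking post-assessment scores against each student's target is simple enough. Here's a sketch with completely invented numbers, just to show that comparison.)

    # Invented numbers: pre-assessment score, target score, and post-assessment score per student.
    students = {
        "Student A": {"pre": 35, "target": 70, "post": 78},
        "Student B": {"pre": 50, "target": 80, "post": 74},
        "Student C": {"pre": 20, "target": 65, "post": 69},
    }

    met_target = [name for name, s in students.items() if s["post"] >= s["target"]]
    percent_met = len(met_target) / len(students)
    print(f"{percent_met:.0%} of students met their target score")
    # How that percentage turns into the 40% APPR growth score is set by the
    # state/district conversion chart, so it isn't shown here.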
The other nice thing is that eDoctrina has test banks for building tests (for the quarterly DDI exams) as well as post-assessment tests IN EVERYTHING BUT ART! (Because there is no state assessment for art in NY, and I haven't been able to find one from any other state so far...) Which means that I will have to start entering test bank questions myself! Argh! It will be worth it in the long run, but it's daunting to look at right now! The program will generate tests and bubble sheets to match each test. After the bubble sheets are filled out, you run them (they're printed with a special bar code) through a fax machine to the eDoctrina number, and the results are automatically calculated and entered under each student's account...thus allowing the program to generate graphs and data for me as a teacher to evaluate and adjust my teaching.
There is so much more I could say about this program, but for now, I'll just ask if anyone else out there is venturing into the realm of DDI and what your thoughts are on it...how is your district approaching the data collection...and are you using a program like eDoctrina?