I’ve presented on questioning types in a more technical and involved way during workshops intended for teachers to practice their skillz (re: Vertical & Horizontal questioning), but the most ready-to-use concept is varying questioning levels…
NTPRS 2018 Takeaways & Presentations
These are my updated presentations from the conference:
No-Prep Grading & Assessment 2018
Questioning Is Core
Optimizing Your Classroom Setup For MGMT
Here are my own takeaways, organized by presenter, whether a) used directly by them during the conference, or b) inspired by something similar they did that got me thinking, which I've since adapted:
NTPRS 2017 Takeaways
Before having the opportunity to present a couple workshops, my mind was blown quite sufficiently during the week. Overall, the Advanced Track with Alina Filipescu and Jason Fritze got me thinking about aaaaaaaall the things I’ve forgotten to do, or stopped doing (for no good reason) over the years. Thankfully, most of them are going to be soooooo easy to [re]implement. As for the others, I’ll pick 2 at a time to add—not replace—until they become automatic. This will probably take the entire year; there’s no rush!
Jason referred to high-leverage strategies: those yielding amazing results with minimal effort (i.e. juice vs. squeeze), and I'm grateful that he called our attention to everything Alina was doing while teaching us Romanian. Ce excelent! (How excellent!) I'll indicate some high-leverage strategies, and will go so far as to classify them as "non-negotiable" for my own teaching, using the letters "NN." I'll also indicate strategies to update or re-implement with the word "Update!" and those I'd like to try for the first time with the word "New!" I encourage you to give them all a try. Here are the takeaways organized by presenter:
NTPRS 2016: More Changes, More Thoughts
After attending iFLT, I spent another week in Reno at NTPRS. While iFLT offered more opportunities to observe teachers teaching students, NTPRS offered more opportunities to actually BE a student for those of us in the Experienced track. I appreciated the short demos that most presenters gave, even when the workshops were not titled "___ language demo." There are some game changers here that warrant their own posts (e.g. embedded readings straight from the source, Michele Whaley), but I have much else to report on. Like last week's iFLT post, this one includes more of what I intend to think about and/or change for 2016-17. The takeaways are organized by presenter:
Why “do you understand?” is pointless to ask…
Language teachers usually ask this when something indicates that a student didn't understand (e.g. a verbal response like "huh?" or a non-verbal deer-in-the-headlights expression). If that has already happened, asking the question serves no purpose. In fact, it might even make matters worse by putting the student on the spot. The student will likely answer "yes, I understand" just to get the teacher to move on to someone else. Here are some comprehension check alternatives:
1) Did I just say/ask X?
2) Did I just say/ask X or Y?
3) I just said/asked ____.
4) What did I just say/ask? -or- Who can tell me what I just said/asked?
The alternatives above are arranged by questioning level from low to high (i.e. yes/no, either/or, fill-in-the-blank, open-ended). The questions could certainly be asked in the target language, but one popular strategy is to ask, in English (or the students' native language), "what did I just say/ask?" to a so-called barometer student, the one with the slowest processing speed. This strategy is interesting because that kind of question is technically harder to answer than "did I just say/ask X?" It's probably a non-issue since we're dealing with the native language, but for the sake of variety, or if you find that your barometer students are struggling, you could start asking those lower-level comprehension checks in English/native as well as the classic "what did I just say/ask?"