I came across a 1993 article on student self-reporting (Darrow et al.), and spent some time thinking about the idea that became the title of this blog post. As I’ve begun diving deeper into the “ungrading/gradeless” sphere of self-assessment, self-grading, and portfolios, I can say that at first I was mostly getting the former (documentation of participation), not the latter (evidence of learning). Earlier this year, my student teacher and I spotted students uploading some questionable “learning evidence” to their portfolios, like notebook pictures of the day’s greeting copied from the board during the first five minutes of class.
This is not evidence of learning.
I’d go as far as to say it’s a stretch to even call this participating. Copying is the absolute lowest writing skill for first-year high school language learners, and this 5-minute routine merely sets up actual participation once class really begins. So, that was clearly documentation of some kind (vs. evidence of learning), and we then steered students in a more productive direction: getting us evidence of learning. However, not everything students uploaded was as obvious. Take, for example, a Read & Summarize statement. Yes, the student was doing something in class, but was that necessarily doing anything for learning? It’s certainly possible, but just as likely not.

The point here is that the difference between documentation of participation and evidence of learning really depends on the quality of what students add to their portfolio. If we just treat it as completion, that’s basically what we’ll continue to get: documentation of participation, which can actually lead to disengagement and lack of participation. As much as school can be school, kids really do find meaningless work worthless, and tend to find meaningful learning valuable. Even the cool kids. In a portfolio system, it’s important to provide feedback on what students add so that meaningful learning actually occurs.
Easier said than done, but it’s time well spent.
As far as I can tell, there are only two ways to determine whether what students add to their portfolio is, indeed, evidence of learning (and not documentation of participation). The first is an objective comparison to previous work, whether that falls on the teacher or the student, and the second is an honest rationale from the student explaining why what was added shows learning. I find the former tricky in a language class. For example, if you were to use the same text and have students keep submitting assignments based on it throughout the grading term, how sure are you that students are even processing the language anymore (versus working from a memorized English understanding of the text)? One cumbersome workaround could be to establish a core set of vocabulary at the start of the term, then write different texts with that same core set throughout the grading term for students to interact with and complete assignments on. That might do the trick, but even then you’ve got to look at the students who ace the assignments in the beginning. How could they possibly show learning if they’ve already…learned…all that from the start? Also, a picture of a Quick Quiz result might just be participation, even if the student is showing you they understood all the Latin. Understanding Latin for 10 minutes during one class isn’t necessarily evidence of learning. Again, you’d need to compare those results over time to make that claim.
So, comparison to previous work is tricky, if not just time-consuming. That’s why I prefer having students write honest rationales explaining why what they added shows learning. It’s all going to be individual anyway. Might as well embrace that.