Let’s Not Jump to Conclusions About JumpRope
Both students and teachers have had their problems with JumpRope. From a pupil’s perspective, YC junior Marlee Roberts thinks that improvements could definitely be made. “It’s a nice portal for kids and parents to check grades, but I don’t think it’s the best platform out there. We’ve had issues. Sometimes the grades [shown on the website] don’t correspond to your actual grade.” Overall, she thinks the communication between students and JumpRope isn’t that efficient.
Recently, a question arose about our grading software, JumpRope. Since we’ve had this software for grading, the CRLS section has used a decaying average. The CRLS section tracks how timely a student is in turning in their assignments and essentially grades behavior.
Yamhill-Carlton principal Gregory Neuman stated that there wasn’t a problem with JumpRope. He explained that a decaying average means that each time you turn something in (or don’t), it affects your grade more than it did the time before. Toward the end of first semester, staff decided to take action because of the complications that can come with this.
When I asked high school math teacher Jordan Slavish what exactly the deal was with JumpRope, he took the time and effort to write everything out, shedding light on what was being questioned through a very detailed, and much appreciated, letter that explains it all.
Dear Editor,
When it comes to measuring our students, there’s a really big difference between measuring what a student has completed and what a student knows. A traditional gradebook measures what a student has done, and which assignments are complete. Finish the work, get the grade, get done with the course. JumpRope instead chooses to try and measure specific skills, and students are measured multiple times to try to get a real feeling for how much about that topic a student actually knows. This is messy sometimes, but at the end of the day it better fits what true education looks like, and tells kids what they actually know and don’t. When I give a student an A, I should be telling a university “They know this”, not “They did the paperwork”.
Background:
In JumpRope, a student’s grade falls into two categories: Academic (skills a student is working to learn) and CRLS (behaviors a student should be showing regularly). Think of these like riding a bike and making your bed.
When you’re learning to ride a bike, I expect you to get better each day until eventually it clicks and you’re off. If I want to know how well you’ve learned that skill, it doesn’t make sense to measure your farthest ride every day and then average them at the end – you’re much better now than when you started, and your distance on the last day matters a lot more than your first. This is the same idea as a “decaying average” – the more recent measurements matter more.
If I want to know how good you are at making your bed during the week, Jimmy who only makes it on Friday isn’t better at it than Sam who made it Monday through Thursday but forgot it on Friday. When we talk about behaviors, it makes more sense to simply look at how often the student is doing it. Sam did it four times, and Jimmy did it once, so Sam is better. This is a straight average.
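For anyone who wants to see the difference in numbers, here is a minimal sketch of the two kinds of averages. The 0.65 weighting used for the decaying average is purely an illustrative assumption; JumpRope’s actual formula may differ.

```python
# Minimal sketch: decaying average vs. straight average.
# The 0.65 weighting (how much each new score outweighs the running average)
# is an illustrative assumption; JumpRope's actual formula may differ.

def decaying_average(scores, weight=0.65):
    """Recent measurements count more than older ones."""
    avg = scores[0]
    for score in scores[1:]:
        avg = (1 - weight) * avg + weight * score
    return avg

def straight_average(scores):
    """Every measurement counts the same."""
    return sum(scores) / len(scores)

# Learning to ride a bike (a skill): the scores improve as it clicks.
bike_rides = [1, 2, 3, 3, 4]
print(round(decaying_average(bike_rides), 2))   # ~3.59, weighted toward the latest ride
print(round(straight_average(bike_rides), 2))   # 2.6, dragged down by the first day

# Making your bed (a behavior): Sam does it 4 of 5 days, Jimmy only on Friday.
sam, jimmy = [1, 1, 1, 1, 0], [0, 0, 0, 0, 1]
print(straight_average(sam), straight_average(jimmy))   # 0.8 vs 0.2, so Sam is better
```

The point of the sketch is only the contrast: the decaying average lands near where the learner finished, while the straight average treats every day the same, which is exactly what you want for a behavior like making your bed.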
The “Issue”:
Previously, JumpRope was set up to use the decaying average for all standards, including CRLS (behaviors), simply because we hadn’t thought about it (we’re human too). We’re also very conscious of adjusting too many things at once, and a few things had already changed for the new 2017-18 school year. Take a look at the following two students:
Jane has been turning in her homework all semester, but misses the last week of the school year for personal reasons – her “turn things in on time” behavior tanks, because the last assignment counted more than the first. This isn’t very fair.
In the opposite scenario, John slacks all semester and doesn’t turn in anything on time except the last assignment, and now his “turn things in on time” behavior looks better than it should be for the same reason. This also isn’t very fair.
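To put rough numbers to Jane and John, here is the same kind of sketch, again using an illustrative 0.65 weighting rather than JumpRope’s real setting, and scoring each assignment as 1 for on time and 0 for not.

```python
# Rough illustration of Jane and John: 1 = turned in on time, 0 = not.
# The 0.65 weighting is an illustrative assumption, not JumpRope's real setting.

def decaying_average(scores, weight=0.65):
    avg = scores[0]
    for score in scores[1:]:
        avg = (1 - weight) * avg + weight * score
    return avg

def straight_average(scores):
    return sum(scores) / len(scores)

jane = [1] * 9 + [0]   # on time all semester, misses only the last assignment
john = [0] * 9 + [1]   # misses everything, turns in only the last assignment

print(round(decaying_average(jane), 2), round(decaying_average(john), 2))   # 0.35 0.65
print(straight_average(jane), straight_average(john))                       # 0.9 0.1
```

Under the decaying average the last assignment dominates, so John ends up looking better than Jane; under the straight average, their scores line up with what they actually did all semester.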
The “Fix” implemented at semester:
Academic (skills) grades, where a student’s mastery is growing as they learn the skill, are and should be measured on a decaying average, fairly representing what a student knows right now. CRLS (behaviors), however, are now (since semester) measured on a straight average. A student that misses a few deadlines will be better off than a student who misses most, regardless of when they were measured.
As teachers, at the end of the day, our job is to measure what our students know, not what they’ve done. JumpRope isn’t perfect, but as a teacher it’s the closest I’ve found to a gradebook that truly reflects individual measurements of a student’s skills, and shows how a student’s education is changing, growing and improving along the way. I know it’s different. I know it’s new, and I know it can be frustrating. But at Yamhill Carlton High School, we as a staff believe firmly that “good enough” is not good enough. We work every moment of every single day to implement new and innovative ideas, and that extends to the ways in which we measure and help students to measure their own learning. There may be hiccups along the way, but as a community, it’s our duty to work together to best represent our future generations; we do this best through open, honest communication and frequent adaptations to how we meet a changing world.
Your Friendly Neighborhood Math Wizard,
Jordan Slavish
YCHS Mathematics
Overall, this provides clarity, because many students and parents weren’t exactly aware of what the problem at hand was. In actuality, there wasn’t really a problem to begin with. There was simply a question as to whether the CRLS section should use a decaying average as opposed to a straight average.