Over the last few weeks, two different articles have kept landing on my desk. One is the Mueller and Oppenheimer study about note taking with a keyboard, and the other is the UCLA study about screen time. I have a couple of concerns about the way these studies are being used.
Two quick clarifications up front: first, I firmly believe in managing screen time and helping students and adults find a balance with their digital tools. Second, note taking is essential to learning, and students should be taught to find a method that works for them.
First, Mueller and Oppenheimer’s work found “…that whereas taking more notes can be beneficial, laptop note takers’ tendency to transcribe lectures verbatim rather than processing information and reframing it in their own words is detrimental to learning.” Their results are not saying students should avoid technology. They seem to be clearly stating that note taking is an activity in which the note taker needs to process, re-frame, and re-organize information to make note taking useful. The problem is that working from a keyboard is limiting and pushes note takers toward verbatim transcription; the tool shapes the behavior. A tablet device like an iPad, Surface, or Nexus 7 provides multiple ways to take notes. Tablets support multiple input options – keyboard, drawing, photos, and speech-to-text – combined with an array of apps to support the varied needs of note takers in different classes.
While many seem to be suggesting that this study is anti-technology, my takeaway is that Mueller and Oppenheimer are reminding us what good note taking is and how the tools we use can affect it.
To me, the UCLA study regarding screen time is more worrisome, especially because many are not taking the time to read it. The study compared 51 students sent to an educational camp for five days with 54 students who spent the week in school. Both groups took pre- and post-tests on recognition of non-verbal emotional cues, and both groups improved their scores. The camp group showed greater improvement, but it is worth noting that this group also made more errors than the control group on the pre-test. The researchers attribute the control group’s improvement to the “practice effect,” yet the practice effect was not applied to the camp group in the report’s discussion, which leaves me with more questions than answers about how to use this data. Pre- and post-test results are shown below.
It seems we are making too big a deal of too small a sample. It also seems difficult to compare results from students who spent a week in a novel environment with results from students who spent the week in a familiar one. I suspect I would be sharper after the novel week than after a standard work week, regardless of how much screen time I had.
Again, I truly believe that users need to be intentionally managing their screen time, but finding good information about the true effects of screen time is challenging. My advice to parents is threefold:
- Ask questions. Consistently question what your students are doing at their screens. Ask them to share what they are getting from the experience.
- Monitor your own screen time. Try logging your own activities over the course of five days. Note what you are doing at the screen, how long you do it, and what you get from the experience. Then reflect on how your experience compares with your students’ experiences.
- Engage your school in conversation about learning activities that put students at a screen, but don’t just count minutes in front of it. Examine each activity and what it takes to complete it. Measure the value of the experience in terms of executive function, higher-order thinking skills, or engagement.
One of the problems with news outlets today is the “sound bite effect” of attention-grabbing headlines. Be a savvy consumer and dig deeper.