There has been excitement among researchers in recent years that playing certain video and computer games may strengthen core components of cognition, helping us to make quicker decisions, think more fluidly, and avoid harmful distractions.
Although practice makes perfect, any improvement is usually limited in its scope. So, practising the piano does not make you a better basketball player, but it might help your xylophone playing. However, playing basketball should have cardiovascular benefits that improve performance not only on the court, but also in the swimming pool. Likewise, gaming may help certain cognitive processes that have benefits beyond the particular game being played – but not in everything our brains do.
The potential cognitive benefits of gaming have led to a flurry of research. Some results indicate benefits for gaming and some, such as a recent US study reported in the journal Psychological Science, do not. Given the disagreement, how can we separate what is true from what is hype?
Gaming research can lose perspective
Before delving into the details of this recent study, it’s helpful to think about what counts as convincing evidence in general. If you learned that the number of hours spent in a doctor’s office does not correlate with life expectancy, would you stop going to the doctor? Hopefully not, because people without access to medical care have poor health outcomes, as do those suffering from serious diseases who must see their doctor frequently. No correlation does not imply no causation, just as correlation does not imply causation.
In this example, people are not randomly assigned to different conditions or treatments, as in a proper double-blind medical study where the effect of a drug is assessed by comparing those who receive the drug to those who receive a placebo. True randomized controlled trials, as opposed to surveys, allow researchers to infer a causal relationship between different variables.
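To make the logic concrete, here is a toy simulation with entirely made-up numbers (a stylised linear model, not real medical data). In the simulated world, doctor visits genuinely improve health, yet the observational correlation between visits and health is negative, because sicker people visit more often. Randomly assigning the number of visits removes the confound and reveals the true benefit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Observational world: sicker people (higher severity) visit the
# doctor more often, yet each visit genuinely improves health.
severity = rng.normal(0.0, 1.0, n)                  # hidden confounder
visits_obs = 2.0 + 2.0 * severity + rng.normal(0.0, 1.0, n)
health_obs = -severity + 0.3 * visits_obs + rng.normal(0.0, 1.0, n)
print("observational r:", np.corrcoef(visits_obs, health_obs)[0, 1])
# ≈ -0.2: more visits look harmful, even though visits help.

# Randomized trial: visit counts assigned independently of severity.
visits_rct = rng.normal(2.0, 1.0, n)                # random assignment
health_rct = -severity + 0.3 * visits_rct + rng.normal(0.0, 1.0, n)
print("randomized r:", np.corrcoef(visits_rct, health_rct)[0, 1])
# ≈ +0.2: the causal benefit of visits is now visible.
```

The particular numbers are irrelevant; the point is that random assignment is what licenses a causal reading of the data.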
While these rudimentary concepts are clear to most, somehow people lose perspective when it comes to evaluating video gaming research.
Total hours gaming
The new US study, led by Nash Unsworth at the University of Oregon, is a survey, rather than a randomized controlled trial. The researchers found that people who reported being hardcore gamers enjoy a number of cognitive advantages over those who do not game at all. However, when everyone from the non-gamers to the hardcore gamers is included in the analyses, there appears to be no robust relationship between the hours a person spends gaming and their cognitive abilities.
Cognitive abilities were assessed by laboratory tasks that measured people’s ability to hold information in memory while making rapid decisions. These measures were correlated with how many hours people said that they spent gaming each week.
It’s difficult to conclude much from these results because people who vary in the number of hours they game may not be comparable. The study does not use random assignment, nor does it track changes in cognitive abilities over time as a function of gaming activity. It also neglects confounding variables – such as gender, socio-economic status, other daily activities, and a player’s baseline ability levels.
Survey research can be a good starting point for conducting a true experiment. Fortunately, there are true experiments using randomized controlled trials that show that gaming can improve aspects of cognition, including cognitive flexibility.
Some of this work addresses a limitation of this recent study, namely that people can emphasise either speed or accuracy in their decisions. For example, gamers may develop a tendency to respond very quickly at the expense of accuracy, yet they might prove more accurate than non-gamers if they slowed down to the non-gamers' pace. Because one can be slow and accurate, or fast and inaccurate, cognitive models are needed to jointly analyse speed and accuracy. Otherwise, results comparing different groups can be misleading.
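One widely used model of this kind is the drift diffusion model, in which noisy evidence accumulates over time until it crosses a decision threshold. The toy simulation below (purely illustrative, not taken from the study) shows how two groups with identical underlying ability can look very different on speed or accuracy alone, simply because one group adopts a lower threshold and so responds faster but less accurately.

```python
import numpy as np

rng = np.random.default_rng(1)

def diffusion_trial(drift, threshold, dt=0.005, noise=1.0):
    """One drift-diffusion decision: evidence accumulates noisily
    until it crosses +threshold (correct) or -threshold (error)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return t, x > 0  # (response time in seconds, correct?)

# Identical ability (drift rate), different response caution.
for label, threshold in [("cautious", 1.5), ("hasty", 0.5)]:
    trials = [diffusion_trial(drift=1.0, threshold=threshold)
              for _ in range(1000)]
    rts, correct = zip(*trials)
    print(f"{label}: mean RT {np.mean(rts):.2f}s, "
          f"accuracy {np.mean(correct):.0%}")
```

Comparing reaction times alone, or error rates alone, would suggest the groups differ in ability; modelling the two together shows that only their speed-accuracy trade-off differs.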
One weakness of randomized controlled trials of people playing games is that they tend to involve fewer subjects, because of the time and cost involved in conducting a video-game training study. Fewer subjects mean less statistical power: a genuine effect of gaming is more likely to be missed, and any effect that does reach significance is more likely to be an overestimate or a fluke of sampling noise. One strength of survey research is that many participants can be examined without costly and lengthy interventions involving video-game training.
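A quick simulation makes the power problem concrete. Assume a hypothetical training benefit of 0.4 standard deviations (a made-up but plausibly sized effect, not a figure from any study) and count how often studies of different sizes detect it at the conventional 5% significance level.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def power(n_per_group, true_effect=0.4, sims=5000, alpha=0.05):
    """Fraction of simulated studies that detect a real training
    effect of `true_effect` standard deviations."""
    hits = 0
    for _ in range(sims):
        control = rng.normal(0.0, 1.0, n_per_group)
        trained = rng.normal(true_effect, 1.0, n_per_group)
        if stats.ttest_ind(trained, control).pvalue < alpha:
            hits += 1
    return hits / sims

for n in (20, 50, 200):
    print(f"n = {n:>3} per group: power ≈ {power(n):.0%}")
```

With 20 participants per group, the simulated studies detect the (perfectly real) effect only about a quarter of the time, whereas 200 per group detect it nearly every time.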
One solution, which depends on community support, is to combine the strengths of the survey approach (many people) with those of the experimental approach (an intervention). By tracking gaming habits and changes in cognitive abilities over time, it would be possible to infer how different cognitive abilities are affected by different games.
With a large enough participant base, data mining techniques could determine which games would be best suited for each individual. PlayIQ is an online effort, led by me and other scientists, with these exact aims.
Other methods are available
So, should you game to improve your cognition? Although there are likely benefits to gaming, many other activities, such as exercise, adequate sleep, and socializing, can also improve cognition.
Given there are only 24 hours in the day, it’s important to keep in mind the opportunity costs inherent in extreme gaming. As always, the sensible recommendation is moderation. The recent Psychological Science study also hints at this conclusion: those in between the game phobics and extreme gamers showed the highest performance on some measures. Enjoy your gaming but not to the neglect of other healthy activities.
Bradley C. Love is Professor of Cognitive and Decision Sciences at UCL. This article was originally published on The Conversation. Read the original article.