A few years ago, Jennifer Perrine saw a television ad for Lumosity, an online brain training program, and decided she’d give it a try.
Her mother had been diagnosed with Alzheimer’s disease, and Ms. Perrine, a freelance writer in New York, began worrying about her own mental abilities. “Every time you lose your keys, you think you’re losing your mind,” she said. “This seemed to offer a ray of hope.”
Lumosity’s ads, seemingly ubiquitous, appeared on television, radio and podcasts. The company purchased hundreds of search engine keywords so that computer users seeking information on dementia, Alzheimer’s and memory would encounter its online ads.
In one TV commercial, a man declared that with Lumosity “decisions come quicker. I’m more productive.” The company website stated that brain training could help “patients with brain trauma, chemofog, mild cognitive impairment and more,” adding that “healthy people have also used brain training to sharpen their daily lives and ward off cognitive decline.”
Earlier this month, the Federal Trade Commission said: No more.
Its complaint charged that the company could not substantiate such marketing claims. “The research it has done falls short because it doesn’t show any real-world benefits,” said Michelle Rusk, an F.T.C. staff lawyer.
She called the commission’s yearlong investigation “part of an effort to crack down on cognitive products, especially when they’re targeted to an aging population.”
Lumosity agreed to give its one million current subscribers, who pay $14.95 a month or $79.95 annually, a quick way to opt out. It also accepted a $50 million judgment, all but $2 million suspended after the commission reviewed the company’s financial records.
The company had already stopped making health and cognition claims, its new chief executive, Steve Berkowitz, said in an interview. But the firm settled because “we came to the realization that the most important thing we could do is focus on the future,” Mr. Berkowitz said.
Even scientists who see promise in cognitive training applauded the agency’s action. “The criticisms were right,” said Joel Sneed, a psychologist at Queens College and senior author of a meta-analysis on cognitive training and depression.
“The field is far, far, far from demonstrating any reduction or delay in cognitive decline,” Dr. Sneed said.
Broader questions of whether cognitive training works, and for whom, still generate considerable debate, given that human brains change and grow throughout life, a quality called “neuroplasticity.”
There is no evidence that spending 10 or 15 minutes several times a week at your keyboard, dispatching animated trains to appropriately colored stations or recalling the locations of squares on a grid, will spare you dementia. Claims that it will improve your work or your child’s school performance remain similarly unproven.
Last fall, more than 70 psychologists and neuroscientists signed a statement circulated by the Stanford Center on Longevity. “We object to the claim that brain games offer consumers a scientifically grounded avenue to reduce or reverse cognitive decline when there is no compelling scientific evidence to date that they do,” the statement said, though it encouraged further research.
A group of about 100 scientists and experts countered with their own open letter.
Agreeing that many companies had made exaggerated claims, these researchers nevertheless argued that “a substantial and growing body of evidence shows that certain cognitive training regimens can significantly improve cognitive function.”
George Rebok, a developmental psychologist at Johns Hopkins, signed the second letter, concerned that the Stanford statement dismissed years of research. “It would almost chill the whole field if people concluded it was all bogus,” Dr. Rebok said.
Hundreds of published studies have examined cognitive training, but many involved very small groups of subjects and designs that might have encouraged a placebo effect, comparing inactive control groups that do nothing with participants who become invested in and motivated by their training efforts.
Critics have pointed out, too, that the cognitive tests used to assess participants’ progress are often so similar to the training games that investigators may be “teaching to the test.” They also question self-reported assessments of the results.
What scientists call “transfer,” real-world results beyond the laboratory, lies at the heart of the debate.
“When they train on these games for 15 or 20 sessions, people get better — on these games,” said Thomas Redick, a psychologist at Purdue University. Improvement often shows up between pre- and post-tests of cognition, too — an example of “near transfer,” the ability to do better, with practice, on similar tasks.
But what about “far transfer,” affecting participants’ ability to function in their daily lives? Does cognitive training help people handle their finances or remember where they’ve parked?
The federally funded ACTIVE study, one of the largest trials of cognitive training in older adults, offers some of the most encouraging evidence. Its 2,832 cognitively normal volunteers (average age, nearly 74) met in small groups with facilitators for 10 sessions of training on one of three skills: memory, processing speed or reasoning.
Ten years later, tests showed that the subjects trained in processing speed and reasoning still outperformed the control group, though the people given memory training no longer did. And 60 percent of the trained participants, compared with 50 percent of the control group, said they had maintained or improved their ability to manage daily activities like shopping and finances.
“They felt the training had made a difference,” said Dr. Rebok, who was a principal investigator.
So that’s far transfer — or is it? When the investigators administered tests that mimicked real-life activities, like managing medications, the differences between the trainees and the control group participants no longer reached statistical significance.
A secondary analysis also found that after five years, people who had been trained were no less likely than those in the control group to develop dementia.
In a study of subjects 18 to 30 years old, Dr. Redick also found limited transfer after computer training designed to improve working memory.
Asked whether they thought they had improved, nearly all the participants said yes — and most had, on the training exercises themselves. They did no better, however, on tests of intelligence, multitasking and other cognitive abilities.
“I’m pretty skeptical,” Dr. Redick said of current computerized cognitive training. “The evidence is pretty clear that it’s not a good approach for causing the changes we care about.”
Still, cognitive training may have potential, some investigators say. Maybe the programs need to more closely simulate real-life challenges; perhaps the dosage — how much people train — matters.
P. Murali Doraiswamy, who directs the neurocognitive disorders program at Duke University, believes Lumosity and similar companies should seek guidance from the Food and Drug Administration, which could examine and regulate cognitive training programs as medical devices.
“Then an independent government agency that knows how to evaluate clinical trials can say thumbs up or thumbs down,” Dr. Doraiswamy said. “And the public will know what it’s buying.”