Tufts researchers investigate how we can keep our brains healthy as we age, focusing on information retrieval, stereotypes and memory, and how nutrition might offset Alzheimer’s disease
It’s an age-old paradox: just as we get older and have more wisdom and life experience to share, our minds start playing tricks on us, and we find it more difficult to retrieve the information we want. We struggle to remember key details about our lives and our loved ones, and we mix up basic facts about the world. About one in 10 adults aged 65 and over has dementia, and 22% have some form of mild cognitive impairment.
What causes these kinds of impairments? And more importantly, what can we do about them? As medicine continues to improve physical health and lengthen people’s lifespans, three Tufts University researchers are investigating how we can also preserve memory and cognition as we age.
Working in disciplines including neuroscience, psychology, and nutrition, they are uncovering key insights that could help slow the brain’s decline as we age, and help us better cope with the deterioration that inevitably occurs.
Making Connections with Memories
As an undergrad, Elizabeth Race volunteered with a program called Grandfriends, in which students befriended seniors at assisted living facilities. “I was always struck by how much you can learn from older adults,” says Race, now an associate professor of psychology. “The amount they can contribute to the world is really undervalued.”
In graduate school, Race became fascinated by the question of what makes some older adults stay sharp as a tack while others decline into cognitive impairment and dementia.
At Tufts, her research examines both the physical deterioration in areas of the brain associated with memory storage and retrieval and the connections between these areas. “With new brain imaging technology, we can visualize these networks in a way we couldn’t do 10 or 15 years ago,” she says. Knowing how these connections work, Race says, may help people develop strategies for drawing upon healthy brain regions to compensate for losses in impaired areas.
In her studies of the prefrontal cortex, the “command center” of the brain in charge of executive function, Race has noted that the outside edges, known as the lateral prefrontal cortex, deteriorate faster than the interior region, known as the medial prefrontal cortex. While the former is associated with task switching and short-term memory, the latter is associated with prior knowledge and personal information.
In experiments using MRI (magnetic resonance imaging), she’s found that older adults can better remember new information by linking it to something they already know—like memorizing digits by envisioning a telephone keypad or associating new information with a personal hobby, such as types of birds and where they live.
In other experiments using EEG (electroencephalogram) technology, she’s explored the natural oscillations of brain waves. When we are exposed to music or rhythm in the environment, our brain waves synchronize with that beat. Since the medial prefrontal cortex is also associated with music, she’s found that presenting information rhythmically can help encode memories more effectively for later retrieval.
“When we are presenting important information,” says Race, “we can ask, are there really simple behavioral interventions we can do that could dramatically improve people’s ability to remember things clearly and vividly?”
How Stereotypes of Aging Affect Memory
When older people come into Ayanna Thomas’ lab for experiments on cognition, they often arrive with an apology. “They say to me or my students, ‘My memory’s terrible—I’m not going to do well,’” says Thomas, a psychology professor and dean of research for Arts and Sciences. “So they are already feeling pressure and anxiety about their cognitive functioning.”
In her research, Thomas investigates how such metacognition—thinking about thinking—affects people’s ability to remember in various contexts.
“The societal representations of what it means to become older are full of negative stereotypes associated with poor memory, and that can really affect the way people feel about themselves,” she says.
In some experiments, for example, she shows people a short video clip depicting a crime, then has them read a narrative about the same incident that includes both correct and incorrect information. Afterwards, she asks them to recall details of the incident. When told not to worry about whether the information is correct, older participants score similarly to younger ones, recalling both correct and erroneous information.
On the other hand, when participants are told they will be penalized for incorrect information, activating stereotype threat (the fear that one’s actions will confirm negative stereotypes about one’s group), older adults tend to perform worse, withholding correct details. The results have implications for witness testimony in court.
“They may be exercising a form of control by withholding both correct and incorrect information,” Thomas says. She has extrapolated these findings to other situations, such as a visit to a doctor’s office, where older people, anxious over questions about brain function, may exhibit poorer memory than they otherwise would. They may feel more relaxed and do better, she says, if they are interviewed by an older person whom they perceive to be more sympathetic to the challenges of aging.
Recently, Thomas has been collaborating with Race to examine whether giving older people a warning about incorrect information can help them weed out those details and remember more accurately.
In this case, study participants watch a crime video and then listen to a narrative report recounting the event, which they are told could not be verified.
The researchers are exploring whether that warning reduces the likelihood that the incorrect information is incorporated into their final reports.
“We know that younger adults benefit from these kinds of warnings, but we’re not quite sure if older adults will be able to benefit as well, since that requires cognitive processes that may be difficult for them,” Thomas says. “However, if warnings prove effective, they could be helpful in dealing with cases of misinformation on the internet, or when they are being questioned by an investigator in a criminal justice situation.”
A Healthy Diet and a Healthy Brain
The incidence of Alzheimer’s disease in the U.S. increases dramatically with age. At age 65, it affects an estimated one out of every 20 adults, while at age 85, it affects one out of every three, according to the Alzheimer’s Association. But the processes that lead to Alzheimer’s are believed to start years or even decades before, says Paul Jacques, a professor at the Friedman School of Nutrition Science and Policy.
“It’s now believed that the progression to Alzheimer’s disease starts 20 or 30 years prior to diagnosis of the condition,” he says. “This may be the best window of opportunity to prevent the damage that leads to Alzheimer’s disease.”
As nutritional epidemiology team leader at the Jean Mayer USDA Human Nutrition Research Center on Aging at Tufts, he’s been looking at long-term data to tease out how diet might increase or decrease the risk of the disease.
Much of his research focuses on data from the Framingham Heart Study, a longitudinal study tracking associations between lifestyle, diet, and health outcomes since 1948. Researchers have identified few modifiable risk factors when it comes to Alzheimer’s, Jacques says. In fact, changes in diet and nutrition may be one of the few ways to intervene in the disease.
Among the nutrients he’s found that could lower risk are flavonoids, plant pigments found in many fruits, vegetables, and other plant foods, including blueberries, strawberries, apples, red and purple grapes, and tea. Flavonoids have been shown to reduce inflammation, one of the key mechanisms behind Alzheimer’s. He’s also found an association between Alzheimer’s and low choline, an organic compound found in a wide variety of foods, including eggs, meat, and leafy vegetables, that is essential for making the neurotransmitter acetylcholine.
In other research, Jacques and colleagues have shown that the vitamins B12 and folate (B9) may be protective against cognitive decline. However, this work has also suggested that high consumption of folic acid, the synthetic form of folate found in most supplements and fortified foods, is associated with worse cognitive function in older adults with inadequate B12 status. His current research focuses on clarifying the role of these B vitamins in brain aging, including the risk of Alzheimer’s disease.
Taken as a whole, Jacques’ research provides important evidence that diet may affect cognitive decline and risk of Alzheimer’s disease, though determining exactly how diet relates to brain health is an ongoing process.
For the average person, he says the best protection against cognitive decline is a healthy diet overall. “Focusing on one item, such as eating more berries, is not going to hurt you,” he says. “But the strongest evidence shows that the healthiest diet is one lower in red meat and higher in fresh fruits, vegetables, and whole grains. It will lower your risk for cardiovascular disease—and Alzheimer’s disease as well.”