Digital Technology: Friend or Foe to the Brain?

People of all ages now grow up — and grow old — immersed in digital screens, generating concerns about the potential effects on the brain.

On the one hand, researchers warn of “digital dementia” in which excessive use of digital devices, especially among youth and young adults, may lead to cognitive decline over time. On the other, studies have suggested that “digital isolation” may accelerate cognitive decline, while embracing the digital world may protect cognitive function.

“Technology is neither inherently friend nor foe to the brain. It’s a potent force that can either nourish or erode cognitive health depending on how it’s used,” Shaheen Lakhan, MD, PhD, neurologist and researcher in Miami, told Medscape Medical News.

The “sweet spot” lies in mindful, age-appropriate use, Lakhan said.

Digital Dementia: Too Much of a Good Thing?

“Digital dementia” is a term coined by neuroscientist Manfred Spitzer to describe an overuse of digital technology, which could trigger memory problems and negatively affect brain structure and function.

This could be especially problematic in adolescence and early adulthood — critical periods for neurodevelopment, marked by significant changes in the prefrontal cortex, the region responsible for complex cognitive tasks such as decision-making and impulse control.

A number of studies seem to justify this concern. A recent report showed that teens who were heavy users of digital devices were twice as likely as infrequent users to show symptoms of attention-deficit/hyperactivity disorder. Increased use of mobile devices to calm preschool-aged children has also been linked to decreased executive functioning and increased emotional reactivity.

In a recent evidence review, excessive smartphone use was associated with difficulties in cognitive-emotion regulation, impulsivity, impaired cognitive function, addiction to social networking, shyness, and low self-esteem. Medical problems linked to excessive smartphone use included sleep problems, reduced physical fitness, unhealthy eating habits, pain and migraines, and changes in the brain's gray matter volume.

Lakhan has written about what he calls “digital anhedonia” — the growing phenomenon where individuals, especially youth, lose the ability to find joy in real-world experiences after chronic exposure to highly stimulating digital content.

“It’s not clinical depression per se, but a blunting of natural reward circuits driven by algorithmically neuroengineered overstimulation. This is especially concerning during pediatric brain development, when neural circuits involved in motivation, emotional regulation, and executive function are still maturing,” Lakhan said.

“With the right level and type of stimulation, digital tools can support and even optimize brain circuitry. But with overexposure to high-dopamine, low-effort content, we risk disrupting that delicate developmental trajectory,” he added.

Digital Isolation: Not Enough of a Good Thing? 

In contrast, for older adults, the concern isn’t overstimulation but underexposure. 

Many older adults face barriers to digital access — limited internet connectivity, unfamiliarity with smartphones or computers, financial constraints, or lack of confidence in using digital tools. Paradoxically, some studies have shown that this “digital isolation” can hasten cognitive decline in middle age and beyond.

“Cognitive decline in this group is often accelerated not by technology overuse but by social disconnection, loneliness, and lack of mental engagement — factors that digital tools can actually help counteract,” Lakhan told Medscape Medical News.

For example, analysis of the prospective US Health and Retirement Study of more than 18,000 adults aged 50-65 years, followed up for up to 17 years, revealed that regular internet users had roughly half the risk for dementia of nonregular users. Being a regular internet user for longer periods in late adulthood was associated with delayed cognitive impairment.

In a recent meta-analysis of 57 studies with more than 400,000 adults (mean age, 69 years), investigators found that use of digital technology such as computers, internet, and smartphones was associated with reduced risk for cognitive impairment and reduced time-dependent rates of cognitive decline.

“The digital revolution fundamentally changed the cognitive landscape,” Michael Scullin, PhD, of Baylor University, Waco, Texas, who co-authored the study with Jared Benge, PhD, of the University of Texas at Austin, told Medscape Medical News.

“Some have worried that digital devices cause ‘brain rot,’ which would lead to the worry that digital tools could accelerate rates of cognitive decline,” Scullin said.

Yet the pooled data show no evidence for digital dementia.

“Instead, technology use was actually associated with better cognitive outcomes. This pattern persisted when we controlled for numerous factors that might parsimoniously explain the relationship, such as socioeconomic status, education, lifestyle, general health, and other factors,” Scullin noted.

Research has also suggested that technology use patterns could serve as an indicator of cognitive change. Analysis of data from six waves (2015-2020) of the National Health and Aging Trends Study showed that stopping the use of the internet, computers, and tablets, and no longer sending texts and emails, correlated with cognitive decline in older adults. Monitoring technology use patterns may therefore be an economical and efficient approach to identifying individuals at risk for cognitive decline, the authors said.

Along the same lines, a recent study pegged digital isolation as a significant risk factor for dementia among older adults, underscoring the importance of digital engagement in mitigating dementia risk.

Scullin told Medscape Medical News that using digital technologies could be associated with a “net positive” benefit for older adults through promoting the 3 C’s: complexity, connection, and compensation.

“Learning to use digital technologies can be mentally stimulating, as is adapting to software and hardware updates (complexity),” Scullin said. “Digital devices afford opportunities to stay connected with friends and family members through email, texting, sharing photos, and video calling, which is important because loneliness in older adulthood is usually associated with worse cognitive outcomes.”

There is also a compensatory benefit of digital device use.

“If someone is having difficulty remembering to take their medications or remember appointments, then using a digital calendar can help them by providing automated reminders,” Scullin explained.

To combat digital isolation, clinicians could recommend tailored interventions, such as user-friendly devices (tablets with large icons and simplified interfaces to reduce frustration) and digital training workshops in local communities.

“Unlike the developing brains of youth, the aging brain benefits from stimulation and novelty, particularly when digital technologies support social interaction, cognitive training, and access to information. In this context, tech becomes a therapeutic ally rather than a threat,” Lakhan added.

Finding the Technology ‘Sweet Spot’

Given the potential harm from excessive use of digital technology in early life and the protective effects in middle-aged and older adults — what’s the best balance of technology use at different life stages?

For children, adolescents, and young adults, moderation is key. “This means setting limits and prioritizing tech that supports creativity and learning over passive consumption,” Lakhan said.

The American Academy of Pediatrics provides nuanced recommendations for “screen time” based on a child’s age and developmental needs. The organization notes that not all screen use is inherently harmful. High-quality content geared toward problem-solving or language learning can aid cognitive growth. However, aimless scrolling or gaming without time limits may erode attention spans and reduce opportunities for face-to-face social interaction.

Clinicians might consider advising parents and teens to adopt mindful digital habits. These include scheduled screen breaks (setting alarms or using apps that remind users to take breaks); purposeful use by differentiating between active learning through ebooks and interactive problem-solving vs passive consumption such as endless social media browsing; and digital hygiene with device-free hours, especially before bedtime, to improve sleep quality.

There are no specific guidelines regarding screen time in older adults, but the American Geriatrics Society encourages the use of digital technology to improve the health and well-being of this population. Although the World Health Organization’s 2020 global guidelines for adults aged 65 years or older don’t set a fixed daily screen-time quota, they emphasize limiting total sedentary time and breaking up long sitting bouts.

“Clinicians should move beyond screen-time alone and ask more nuanced questions: What kind of content? What purpose does it serve? How does it affect mood and attention?” Lakhan said. “In my view, digital literacy should be treated like nutritional literacy — not all screen time is empty calories, but much of it can be.”
