This Is Your Brain in a Digital Age
A few years ago, I heard a woman who had founded a Catholic religious institute give a witness about the history of her vocation and her community. At the end, an attendee asked her: how do you see that you are living the hundredfold that Christ promises those who leave everything for his name (Matt 19:29)? “First, I am never alone. Second, I am free of slavery to this,” she replied, picking up an iPhone and showing it to the audience. That’s a little reductive, I thought, a little reactionary. Doesn’t a religious vocation offer so much more?
Of course it does. Whether vowed in a celibate community or lived out in the world by any baptized member of Christ’s body, religious life is an invitation to union with God—a transcendent and eternal destiny that transforms our daily mundanity into glory. But in the years since, I have also realized that she was right. As a Christian neuroscientist, my studies of the brain and my pursuit of Christ have led me to a deeper understanding of digital technology and the human cost it incurs.
I now hear in her comment neither a romantic nostalgia for simpler days nor an intransigent resistance to cultural change. Rather, this sister was gesturing toward the two banks of the river that keep us moving toward our destiny: communion with one another and attention to reality. Without these banks, no Christian can attain the “hundredfold” offered by religious life. And they are being eroded by the digital revolution, worn down by its incursion into the human mind and heart.
Digital technology is exquisitely suited to capturing our attention; it is easy to slide into the digital world.[1] The use of screen-based media therefore constantly competes with and displaces encounters with our immediate physical environment and those within it. Yet the human nervous system continually adapts to its surroundings through such interactions. For this reason, digital technology shapes us from within. Through neuroplasticity, it forms habits that favor a superficial, individualistic, and utilitarian use of time, leading us away from contemplation and toward the satisfaction of smaller desires.
We may approach this phenomenon from two distinct, though interrelated, vantage points—two elements of reality transformed and redefined by the use of digital technology: time and space. Let us first consider time. The speed and volume of information transfer through digital devices are a novelty in the history of the human nervous system. Our evolutionary ancestors received sensory stimulation through ordinary interactions: listening to the landscape around them, speaking with kin, eating a meal while gazing at a fire. The human brain therefore expects to receive information through relatively simple and stable channels. At the same time, it is attuned to quick changes in information as abnormal stimuli that may signal an event that is life-threatening—such as the approach of a predator—and therefore worthy of attention.
The midbrain reward system, which releases dopamine to reinforce behaviors, can be activated by novel visual stimulation alone. Just seeing something new is rewarding. This is a helpful feature of our biology, for it instantiates our motivation to discover truth, which is arguably a defining end of human life. But it also makes us vulnerable, for over the course of the digital age, the amount of information available to our senses has grown explosively. Indeed, developers of digital media have actively exploited the fact that the human brain finds sensory changes salient. Through combining an enormous volume of information transfer and rapid changes in the subject of that information, digital tools have grown highly adept at seducing our attention.
This evokes two stereotypical behaviors in response. The first is passive or apparently automated satisfaction of curiosity. The most readily apparent example would be the child frozen in place, scrolling through reels on TikTok. The behavior requires no effort, no top-down or goal-directed investment of energy to sustain this attention. Indeed, it requires no agency at all: the agent seems to be the digital technology, which captures one’s attention and recaptures it every fifteen seconds. Thus, our reward system’s responsiveness to novelty is redirected from the pursuit of truth toward a lower end. Often, in this case, that end is the financial profit of tech industries and the algorithmic babysitting of children.
The second behavior evoked by digital media’s capture of attention is what researchers call media-multitasking, such as listening to a podcast while texting a friend or checking email while watching a lecture. The pull of multitasking is difficult to resist, in part because the screen of a digital device is (seemingly) infinitely malleable. More importantly, it seems to allow us to accomplish more within the same constraints. However, the term multitasking is a misnomer, because the brain typically attends to one thing at a time—unless the tasks belong to wholly different domains or are highly automated, such as speaking (a language task that relies on left temporal areas) while walking (a motor task coordinated by the cerebellum, motor cortex, and spinal cord). Thus, someone engaged in media multitasking may intend to accomplish several tasks at once and may appear to succeed. But the reality is rather of continual interruptions of work and shifts in attention, like the child scrolling through TikTok.
These behaviors warp a person’s sense of time. Think of scrolling through a feed of sensational headlines: as your gaze lingers on vivid images of famine and war and catches snatches of information about the latest political scandal, a five-minute break can quickly stretch into an hour. Perhaps more importantly, both behaviors displace other experiences that the nervous system expects and relies upon for its successful development and functioning.
Passively consuming information displaces personal agency—or in other words, active engagement with the environment. This is a basic expectation of our nervous system. Indeed, the possibility for action is built into perception itself; for example, the visual processing that enables us to recognize an object is intertwined with the motor pathways that support our interaction with it. Such active interaction is vital for learning, particularly during child development. Parents of young children readily recognize this phenomenon, as infants discover cause and effect by throwing toys, discover the position of their bodies in space by trying to stand and fall, and discover their belonging to their parents by crying and receiving soothing care.
Media multitasking, on the other hand, undercuts deep cognitive processing. Simultaneous use of multiple separate streams of digital information requires that we attend to breadth at the expense of depth, and that we respond quickly rather than maintaining sustained focus. As a result, comprehension suffers. Readers understand text on a digital screen less well, for example, than the same text on a physical page. And because neurons that fire together wire together, the brain grows to expect multitasking, likely contributing to a shortening and fragmentation of attention over time and a growing preference for accomplishments that provide a short-term reward. In the words of psychological researchers, media multitasking leads to “a reduced ability to draw on the past—be it very recent or more remote—to inform present behavior.”
Thus, these two behaviors of passive consumption and active multitasking reshape the meaning of time. They render it in utilitarian terms: whether the hedonistic utility of maximal sensory stimulation, or the technocratic one of productive accomplishment of technical tasks. Time is for the immediate, frictionless achievement of satisfaction.
Space is similarly transformed. The digital transfer of information appears to our senses as unmediated, unmoored from any spatial location and stripped of any context—another novelty in the history of the nervous system. For the human brain evolved through engagement with a multimodal context.
On the most basic level, this simply means that it expects to have multiple sensory systems stimulated at once. Perception of a forest, for instance, is best achieved not through the mere vision of trees, but the smell of the detritus, the sound of the leaves rustling, and the wind brushing on one’s skin.
The simultaneous engagement with an event through multiple modalities is the foundation of learning and memory. This may be because of a group of neurons in the hippocampus called place cells, discovered in the 1970s by researchers at University College London—who named them place cells because each one fires when an animal occupies a particular place in its cage. But neuroscientists soon discovered that the activity of place cells also depends on an animal’s smell, sight, and touch. Far from a simple spatial map, then, these cells index the history of a mammal’s engagement with its environment, integrating the senses to inscribe that narrative into its neurobiology. The brain expects to learn through an embodied interaction with reality.
Digital technology, of course, presents events with some modalities entirely absent—such as touch and smell (at least for now). Others are radically impoverished, extracted from their dynamic context. Space is no longer a realm one inhabits, but an optional and perhaps undesirable mediator of the achievement of one’s ends.
When the environment is reduced in this way, our perception of reality is skewed. For instance, I was recently speaking to a friend whose young daughter had just begun middle school. She had been terrified and unable to sleep before her first day, and later confessed to her mother that she had been expecting to be confronted by a hostile classmate and made to fight.
How could that have been this girl’s schema of middle school? She had seen reels of middle school fights. This digital presentation stripped them from their spatial context, presenting them as unprovoked, proximate, inevitable experiences of young students. The embodied encounter with a non-digital middle school was therefore a welcome surprise; surrounded by her classmates, she realized that she would likely experience any fights from far away and buffered by a mass of other uninvolved students. She came home from her first day relieved and slept soundly. As this example makes clear, our embeddedness in space entails more than just the plurality of senses. Historically, spatial contexts have also entailed the presence of other minds—other human persons who are attending to events alongside us.
This is perhaps the single most decisive factor in the development and functioning of the human brain. Compared to other species, human infants are born with remarkably immature nervous systems; while nearly all neurons are present at a child’s birth, they form connections or synapses with one another at an explosive rate over the next few years, contributing to the brain’s tripling in weight in infancy. Nor does this remodeling end in childhood: synapses are gradually pruned away and organized into stable neural circuits over the course of decades.
The formation of this neural architecture is not solely a function of genetically encoded and inherited factors. Rather, it relies heavily on the experience a child has in her environment, especially of her caregivers. Each loving interaction helps shape a child’s neural circuitry through mechanisms of activity-dependent neuroplasticity, or changes in brain structure brought about by neuronal firing. The high-pitched baby-talk her parents use in infancy forms the language centers in her cortex; their skin-to-skin contact scaffolds her autonomic regulation; their gentle soothing of her outbursts teaches her to regulate her own emotions. Through this protracted and dyadic process of child development—which entails radical dependence on decades of embodied care—the brain achieves the unparalleled complexity and individuality that we see in adults.
The expectation of the brain to be situated in an interpersonal context does not end at maturity, at adulthood. Because of its developmental history, the brain continues to rely on the presence of other persons for its full functioning and flourishing. Paying attention to an event with another, for example, improves our comprehension of it—possibly by favoring those synchronous patterns of cortical activity that play a part in higher cognitive functions, enhancing our working memory and attention. So too do high-quality and supportive personal relationships favor mental health through a range of processes, some of which are biological—like oxytocin-mediated suppression of inflammation. The continued urgency of these needs in adulthood emerges strikingly when persons are deprived of interpersonal interactions, such as in cases of solitary confinement. Without other minds next to them who are attending to the same reality, incarcerated individuals often begin to experience perceptual distortions that may culminate in outright psychosis. The human brain has a fundamental expectation of a social context.
One irony of digital technology is that it seems to offer this more than ever. Through social media and digital communication, one may rapidly “connect” with nearly anyone else in the world. But this distinctive promise is achieved precisely through reducing interpersonal interactions to the transfer of data, to the appearances that we intentionally (and perhaps selectively) cultivate. And because these transactions can occur anytime, they can end at any time.
The disembodied and indiscriminate form of digital connectedness defies the expectations of the nervous system. Of course, as I will discuss in greater detail below, this need not apply to all uses of digital technology; a father in the military may find FaceTime an invaluable aid to maintaining his relationship with his toddler, for example. But the developmental expectation of the nervous system is for embodied accompaniment, coordinated attention to reality with another human being who is seeking the same good as oneself. It expects a stable accompaniment over time, both with specific persons to whom we are bonded in kinship and with a broader group—a society with whom we share a culture.
Digital technology thus fundamentally changes how we engage with reality. By accelerating the pace of information transfer and fragmenting our attention, digital devices promote a utilitarian view of time, training us for immediate sensory gratification and technocratic productivity. Similarly, they flatten our spatial environment, stripping away the rich sensory experiences and interpersonal interactions that scaffold the functioning of our brains and minds.
Because human persons develop through their encounters with reality, digital technology is redirecting the trajectory of human development—starting from the very biological substrates of our being. Neuroscience research on the use of digital media is an emerging field and many existing studies have limitations, such as cross-sectional designs that make it impossible to isolate the causal direction of effects, or a failure to consider which behaviors are being replaced by time spent online. But insofar as this work has been done, it presents a concerning picture.
Only a few days of using touch screens reorganizes the somatosensory cortex, the strip of the brain that processes sensations like touch. One study found that children who use these technologies also show reduced neural signals of attention and executive control, compared to children who spend a comparable amount of time reading with a parent. Over time, such short-term differences in brain activity lead to global structural changes in the brain—because, again, neurons that fire together wire together. In the case of digital technology usage, studies suggest that these changes include the thinning of regions of the cortex that are necessary for complex thinking and executive function, as well as microstructural changes in the communication pathways of the brain. A particularly interesting finding is that digital media use reduces connectivity between the various major networks of the brain—including the visual, language, and default mode networks. In other words, it may inhibit integration and unity in the very biology that undergirds our minds, promoting instead a fragmentation and compartmentalization of experience.
It is important to note that there is nothing inherently wrong with the nervous systems of these study participants. The brain is not atrophying, diseased, or malfunctioning when it responds this way to digital technology. For a basic principle of neurodevelopment is that the brain adapts to our experiences to help us navigate them better in the future; it adjusts to the demands we encounter and the environment in which we live to better suit us for those demands and that environment. Children who are raised on hours of screentime develop habits of mind and body that suit them for spending hours in front of screens. For instance, children who play video games are better able to rapidly move in response to visual stimulation. The neurocognitive changes are therefore neutral in and of themselves; we must judge them by their fruits (Matt 7:20).
I would summarize many of the fruits thus: the penetration of digital technology into all spheres of our everyday life is making us reactive and isolated. It is making us reactive by training us to expect the frictionless achievement of gratification, by expanding the scope of our attention at the expense of the depth of our engagement, and by making us forgetful so that we neglect our larger aims for the endless pursuit of the smaller satisfaction of material productivity. And digital technology is making us isolated by displacing those embodied encounters that bring us face to face with one another, replacing them with transitory and stylized transactions severed from a stable context of belonging. We are thus remaking ourselves in the image of what we have made (Psalm 115).
Faced with digital technology, then, what are we to do? If we are not to resign ourselves to its relentless magnetizing pull to the surface of things, what other course of action lies open to us?
The temptation—as Romano Guardini warned in reference to industrialization—is to put up a categorical opposition to technology, to look backward in a romantic nostalgia for simpler times or to cultivate a nondigital utopia apart from the lives of our contemporaries. But I believe this would be a failure of responsibility for those among us, perhaps living in settings of poverty or urbanization, who do not have the luxury of escaping to tech-free havens. And it would be a failure to recognize the significant opportunities that these technologies do offer. Instead, perhaps the grace of God can free us to do something new with digital technology (Isa 43:19).
That “something new” might take many forms. I do not presume to have the insight or experience necessary to make prescriptive recommendations for particular contexts. In line with the principle of subsidiarity, such decisions are likely best made by the communities most affected by them. What I do have to propose is a criterion: for digital technology to help us toward our end, we must incorporate it into a narrative. Our use of technology must cohere with the narrative of the meaning of our lives—which is not a subjective story of our own construction, but the great narrative of salvation history as it unfolds in our day-to-day. Specific tools and practices can thus be evaluated according to whether they help us put on the mind of Christ and participate in his saving action (Phil 2:5-11).
The history of Christian community offers precedents for this approach to the works of our own hands. Take, for instance, the mechanical clock. Its widespread adoption in medieval Europe, scholars have claimed, contributed to the gradual secularization and commodification of time—which was no longer a sacred rhythm set by God, but a measurable material that could be controlled and oriented toward increased productivity. And yet medieval monastics contributed greatly to the development and dissemination of mechanical timekeeping devices. Rather than competing with obedience to the natural rhythm set by God, the clock was adopted as a means of better knowing this divine rhythm and better observing one’s monastic commitments—especially the regular prayer of the Divine Office. The new technology was effectively incorporated into the greater and prior narrative of Benedictine life as a school of the Lord’s service.
The challenge we are now facing with digital media is more urgent than many previous technological advances, in part because its rapid development and dissemination is outpacing cultural adaptation. Furthermore, digital technology is not just a single tool but a way of reshaping everything we do. The medium is the message, to use the famous phrase of Marshall McLuhan. But I would still say that the digital revolution is not categorically different from challenges the Christian people have faced before, and which have not prevailed against the Church, as Christ promised (Matt 16:17-19).
What might it therefore look like to incorporate digital technology into the narrative of our lives? How can it favor our participation in the objective story of salvation history? Certain guiding principles come to mind.
First, digital technology would follow the pace and scale of the natural world. This would entail leaving behind most noises, pings, and reels in favor of silence of the senses and of the will.
It would cultivate deep and lasting bonds. For example, we might use social networking platforms to find our neighbors and invite them for dinner; we might FaceTime our relatives to sustain meaningful accompaniment between in-person encounters.
It would enhance our daily engagement with unmediated reality. A smart bird feeder, for instance, can train us to know and appreciate the wonders of avian life even when we are away from the screen. Or an app that teaches Christians to pray might propose rites for use at the family dinner table.
It would serve the embodied and communal rhythms of Christian life. The circadian rhythm, by awakening us with sounds of praise in the morning and shutting off during times of rest. The weekly rhythm, helping us rejoice on Sundays and practice asceticism or penance on Wednesdays and Fridays.[2] And the annual rhythm, by orienting us within the liturgical calendar.
It would facilitate, rather than replace, the performance of the works of mercy. For instance, electronic records in healthcare would be simplified and streamlined, so that the physician can treat the living patient and not her digital counterpart. Virtual education platforms would maintain a robust teacher-pupil relationship at their core.
It would teach us to lift our hearts to God in regular dialogue. We could develop habits like praying about every Instagram post we see or for every Instacart worker who delivers our groceries. Such habits would most effectively transform digital technology if they take shape communally—as ecclesial, cultural, and familial practices.
Finally, it would favor memory of the meaning of what we are doing. Tools already exist that help digital workers begin a task with a discrete intention, persevere until its completion, and end with a moment of judgment or gratitude. These can be used to cultivate memory of the living presence of the One who makes meaningful the smallest act offered to his name (Col 3:17).
Of course, some of these strategies entail reducing or eliminating our use of digital technology. For instance, I recently began praying about every piece of news that passes through my email inbox—simply spending a few moments in dialogue with God about each headline or photo or statistic. It has enabled intercession to pervade my day in a beautiful way. But it has also made it impossible to read the news casually. I cannot flick through the headlines of the New York Times’ daily newsletter on the bus, for example. Instead, I need to set aside time and space for it. This renunciation is more faithful to the truth of what it means to read the news: it is an occasion of learning about the suffering of my fellow men and women, of hearing creation continue to groan in labor pains (Rom 8:22) until the Lord returns.
Whether through renunciation or transformation, if we incorporate digital technology into the narrative of salvation history in these ways, it will continually place experiences of encounter and contemplation before us. And because the brain is always adapting to its environment, this too will form us from within—this time, not favoring inattention or fragmentation, but rather our end, which is union with the Trinity: the Creator, in whose gift we find our origin; the Savior, by whose love we are redeemed; and the Spirit, in whose life we participate.
We are made for the unceasing contemplation of his glory in eternity; here on earth we anticipate that union through unceasing prayer. Human greatness does not lie in the optimally productive use of time, material security, or maximal connectedness. The human person was rather created, in the words of Hans Urs von Balthasar, “to be a hearer of the word, and it is in responding to the word that he attains his true dignity.”[3] And while digital technology can be a powerful impediment to that response, I am full of hope that the unfolding revolution can instead teach us to pray.
First, because God’s method of involvement with the world is to continually speak his word to us through the ever-present event of Christ’s incarnation. He never ceases to take the initiative; this mercy can animate our communal path of conversion around digital technology.
Second, because the Church, across time and space, has never lacked those who hear and respond to the word. These men and women, whom we call the saints, have shown unparalleled creativity, freedom, and responsibility in the face of every challenge the Christian people have ever faced—and they can lead us now.
Finally, I am full of hope because nothing can satisfy the human heart but relationship with God. Anything less leaves us wanting, aching with an unknown absence. Thus, when our misuses of technology choke the space within us in which God’s word desires to dwell, leaving it “like an overgrown tomb or an attic choked with rubbish,”[4] our hearts naturally yearn for prayer. We need only beg God for the grace to be faithful to the truth of our desire. And with his grace, it is possible to pray without ceasing: to eat and drink and text and tweet for the glory of God (1 Cor 10:31).
[1] The discussion that follows draws on Anton Barba-Kay’s A Web of Our Own Making (Cambridge University Press, 2023). See especially pp. 169–185 for an illuminating discussion of frictionlessness.
[2] Wednesday and Friday have traditionally been observed as penitential days since the earliest days of the Church; see, for instance, the Didache.
[3] Hans Urs von Balthasar, Prayer (Ignatius Press, 1986).
[4] Ibid.