Safeguarding Attention
A Buddhist Approach to Data Governance

Data governance is probably not a hot topic in most of our circles. Data is generally taken to be a quantifiable resource (a valuable product or by-product of human activity), and data governance generally focuses on issues of privacy, ownership, accessibility, and opportunities to leverage data for the public good. These concerns are high-level and important, but abstract. Buddhist insights into interdependence, however, suggest a different approach, one that focuses on the entire lifecycle of data and the crucial place of attention within it, making data governance a very personal issue.
That was the conclusion, at least, of a group commissioned to offer Buddhist reflections on data governance as part of a United Nations effort to rethink data and its governance in the age of AI. More on that below.
Attention Matters
I first started thinking about attention economics in the late 1990s. This was before the smartphone boom and 24/7 wireless integration and when internet-mediated commerce was in its infancy, with Amazon, eBay, and Alibaba launching in 1994, 1995, and 1999. Even so, it was clear that the digitally mediated attention economy was soon going to sweep the planet as the latest and perhaps last possible wave of colonial extraction: the colonization of consciousness itself. Fast forward a quarter century, and platform capitalism and generative AI systems are not only supercharging the digital attention economy; they are morphing it into an attachment economy.
As a Buddhist practitioner and intercultural philosopher, I’m concerned about what this means both for Buddhist practice and for the prospects of intelligent technology contributing to our realization of more liberating and humane futures. I’m concerned because the challenges of orienting the agential interactions of humanity and AI are not entirely reducible to the essentially technical problems of alignment that can be left to AI scientists and engineers. Instead, those agential challenges point toward a fundamentally relational alignment predicament: the need to resolve the values conflicts within and among us regarding what matters. It is the freedom to engage personally and collectively in that attentional labor that is threatened by the attention/attachment economy. Protecting and exercising that freedom may very well be what now matters most.
Attention Economies
The quality of our relations—whether with family and friends, with coworkers, or with our social and natural environments—is a function of our attentional presence. Thomas Edison, the 19th-century inventor, claimed that time is the only real capital that any person has, while the psychologist William James remarked on the cusp of the 20th century that our lives unfold as a function of what we pay attention to, whether by choice or default. Both were right. But neither could have imagined how the necessary and personal effort of “paying attention” is dissolving into “paying with attention” as ubiquitous and intelligent digital agencies offer us individually tailored and seemingly frictionless access to goods and services.
Attention demonstrates what matters to us—what we care about and care for. Attention nourishes relations and shapes personal effort. The Buddhist term for attention, manaskāra, literally means “mind-made” or “purposively created.” Attention is reality-constructing. When attention is attracted/distracted by the superficial, craving-inducing aspects of things, it is karma-reproducing and reinforces habitually constructed thoughts, feelings, and actions (saṃskāra). We construct realities of compromised relational quality and compounding suffering. Moreover, the digital acceleration of attention turnover works against developing the causally adept attention (yoniśomanaskāra) that is needed to alleviate or eliminate the causes of our suffering and realize more virtuosic (kuśala) relational dynamics.
Attention economies are thus systems for orchestrating the dynamics of care. So, as important as our time capital is, the impacts of digitally capturing, holding, and directing attention cannot be measured solely in quantities of “freely chosen” screen time. Attention trafficking is relationally disruptive. All of us know what it feels like to be left alone as a friend/family member disappears into their phone at the dinner table. But imagine, too, what it is like to be a toddler yearning for mom’s or dad’s attention. Being starved for attention is the most intimate form of starvation.
Roughly a third of American young adults now turn to AI for help with their personal lives, including advice about relationships and life decisions. One in four use chatbots as friends, and one in ten admit to using an AI chatbot as a girlfriend or boyfriend. Globally, the top three uses of generative AI are for companionship and therapy, for organizing life activities, and for finding purpose. And higher frequencies and durations of AI chatbot conversations have been shown to correlate with increasing loneliness, emotional dependence on AI, problematic AI usage, and decreasing social interaction. Attachment hacking is the intimate frontier of extractive surveillance capitalism.
What is at stake here is not just increasing screen time or digital engagement creep as AI systems transform from recommending websites to offering empathetically convincing support and companionship. The increasing reliability of AI easily slides into increasing agential reliance, a slide that is crucial to the monetization of human-AI interdependence. And while offloading human decision-making and effort to AI may accelerate task completion, it also accelerates attention turnover, decreases depths of care, and transforms the scope and quality of human agency. This comes with both ethical and existential risks. Without freedom-of-attention, there is no true freedom-of-intention. And without freedom-of-intention, all other freedoms become either illusory or impossible.
What Should We Do? Interfusing Practice and Policy
AI clearly has profound problem-solving promise. Through it, better human and planetary futures can be realized. But our relationship with AI is complex. AI is not just a mirror that passively reflects and amplifies our intentions. Unlike tools that merely mediate our interactions with the world, AI systems interpret our intentions and learn how to better enact them. This does not mean AI systems are self-aware agents. But in our relationships with them, they play active and increasingly agential roles. AI is not just simulating human conduct. It is actively emulating our conduct. We are agential partners with AI. And that is both worrying and comforting.
One response is personal practice. Attention training and meditative discipline can serve as prophylactics against attention trafficking, digital and otherwise, and are our surest foundation for resisting the effects of technologically driven karmic acceleration. The future of our agential partnership with AI will ultimately depend on the resolve—the clarity and commitment—with which we embody bodhicitta and the shared pursuit of attentional and intentional virtuosity.
But the Buddhist teaching of interdependence entails seeing that we are relationally constituted, and that arguably means seeing personal practice as coming fully to fruition in collective action. That, to me at least, is the meaning of taking and keeping the bodhisattva vow to contribute to the relational liberation of all sentient beings.
It was with great appreciation, then, that I accepted an invitation from the Nan Tien Institute to participate in drafting a Buddhist document responding to a request from the UN’s Working Group on Data Governance. Working remotely together over half a year, the eight-person team assembled by Nan Tien Associate Professor Venerable Juewei Shi produced Buddhist Data Principles based on two foundational convictions: 1) that the emergence of the data-driven and data-generating attention economy should be a central concern of global data governance; and 2) that data should be understood as analogous to genetic material, informational traces that can be used to generate predictive models and to produce what amount to partial “digital clones” of persons, and that it is thus worthy of the utmost care.
The document argues that recognizing individual data ownership as a fundamental right is supported by the same moral logic that led to recognizing the need for ethical and legal norms regarding bodily autonomy and limits on the use of biological data. Building on this insight, the document forwards three interdependent policy recommendations:
1. Ethical guardrails should be established that recognize individual data sovereignty and allow for cooperative governance. Further, cultivating individual accountability requires bidirectional guardrails that govern not only how platforms use data but also how users create content.
2. Digital platforms should be mandated to shift the logics of algorithmic recommendation from short-term attention capture to locally relevant measures of long-term well-being. In addition, platforms should implement preventative content classification systems that identify and limit misinformation circulation before it spreads.
3. Proactive strategies should be adopted to integrate screen-time reduction with meditative, values-based training as a core pillar of digital literacy.
In presenting these recommendations and the reasons for adopting them, the team drew on key Buddhist concepts and appended incisive introductions both to Buddhist teachings and to the extractive dynamics of the AI-orchestrated attention economy. Taking account only of its content, I can say without hesitation that time spent reading the report would be time very well spent. But while the submitted document offers some very useful reflections on the conservation of human agency in the AI era, it does not reveal the collaborative attentional labor that brought it into being.
It’s possible that “Buddhist Data Principles” will in some small way influence the UN Working Group on Data Governance. Or it may not. But the true value of the collaborative effort that went into it does not depend on the further use (or non-use) of its content. As is often said in Chan/Zen circles, enlightenment is not something attained through practice; it is an attainment of practice. The “true face” of that document, before it was born publicly, was the diversity-embracing collaborative care exemplified by the team throughout the drafting process. And that exercise of attentional freedom is something in which each and all of us can and should participate.
There is no discounting the importance of solitary contemplative or meditative practice. But in an era of human-AI interdependence and the karmically charged alignment predicaments it is bringing into ever clearer focus, the attentional labor involved is perhaps not best undertaken alone, but rather in agential partnership. In the words of the Zen master Dōgen, cutting through our karmic entanglements (kattō) is work best done “shoulder-to-shoulder.”
___
Peter Hershock is a co-founder of the Buddhism & AI Initiative and an intercultural philosopher who reflects on contemporary issues of global concern.


This framing around “freedom-of-attention” feels foundational.
What strikes me is that we may be moving from an attention economy into an attachment economy, where AI doesn’t just capture focus but begins shaping relational patterns themselves.
I’ve been exploring the difference between agent-driven intelligence and what I call Vajra intelligence: basically, systems that amplify craving loops versus systems that stabilize orientation.
Curious how others here are thinking about the cultivation side of this, not just the policy side.
So important to have these insights more widespread. Thank you.
Social media has honed the "art" of attention capture, but now it is being super-fueled by AI. It's not just attention being captured but distraction being trained. Certainly, meditation practice (shamatha in particular) can be a strong antidote to this training in distraction, and also an enabler of insight rather than mere informational knowledge.
Connecting to the nature of mind and connecting to nature seem to go well together. See https://en.wikipedia.org/wiki/Shinrin-yoku
Just having this discussion and distributing/posting content like the Buddhist Data Principles enters the realm of AI pre-training data...not unlike an aspiration prayer.