Special Topic Series in Critical AI
Feel the AGI! On (Generative) AI and/as Religion
Series Editor John Modern (Franklin & Marshall College)

The language of religion courses through the AI hype machine and is bound up in the tenor of these times. Mysticism and supernatural claims pervade the history and development of large language models. Rhetorics of apocalypticism and utopianism emanate from the citadels of Silicon Valley and beyond. Reports of awe and wonder at mind-boggling feats of computation compound, and a generation is beckoned to prostrate itself before the altars of “frictionless knowledge.” Chatbots, the public is beginning to learn, can generate “spiritual psychoses.” New religions emerge on the scene—from “AGI-theism” to immortalism to the cosplay asceticism of effective altruism and the cultish vibes of Longtermism. “Feel the AGI!”—a revival cry for transcendence chanted by employees in the early days of OpenAI—has since become a fitting description of technophilic zeal in some quarters.
This series cluster for Critical AI calls on contributors to help readers make sense of this turn in what many would call a secular age—a moment when humans increasingly imagine their own spiritual lives vis-à-vis generative AI systems (and other computational affordances) designed and marketed as commercial products. What subject positions do humans assume when they bear witness to unprecedented feats of pattern recognition? Or imbibe endless discourse about the supposed mastery over or even supersession of human “intelligence”? How do LLMs and their chatbot applications inspire new metaphysics or ritual embodiments? Do the infrastructures and psychic effects of these “artificial neural networks” conjure older forms of piety, practice, and affect?
Our running series for Critical AI (edited at Rutgers University and published by Duke University Press) convenes critical conversations about new technological artifacts in dialogue with the critical study of religion. The series is edited by John Modern (Franklin & Marshall College) in collaboration with the journal’s editorial collective and international advisory board.
Recent works in the critical study of religion have emphasized the impossible question of religion and the expanse, instability, historicity, and politics of its categorical constellation (God, soul, church, belief, theodicy, divinity, ritual, invisibility, bad religion, ghosts, etc.). Works on the indeterminacy of the religious/secular distinction emerge from a space of reflexivity and critique of commonsense, and often essentialist, views of religion. Similarly, the series is open to a range of analytic wagers about artificial intelligence: as Critical AI emphasizes, “AI” is not a single or coherent thing but, rather, a resonance machine with many moving parts—including mathematics, money, infrastructure, cognitive debt, emotional labor, mass delusions, solitary pleasures, psychopolitics, and perpetual advertisements for transcendence.
Some questions contributors might consider:
- How does knowledge about religious difference get made under the influence of generative AI (or “AI” more broadly)? How is such knowledge received and taken up, lived out and fed back into systems of surveillance and data extraction? Are automated systems changing how and why humans convince themselves that they are religious (or not or somewhere in between)?
- How do religious traditions serve as resources for and/or critical perspectives on the kind of “human” or “posthuman” intelligence being imagined, enumerated, and built into AI systems?
- How do the archives of comparative religion help scholars to interpret how people comprehend and respond to “intelligent” systems and technologies? For example, how might the metaphysics of a particular Buddhist practice or denominational schism or theological debate offer ethical or explanatory purchase on generative AI, the so-called ELIZA effect, and the automatons in our midst?
- What new conditions and processes are generating religion and/or affecting the forms of religious life at a time when information and media circulate according to opaque algorithmic rules and rhythms?
- What to make of the diverse proliferation of hybrid pieties now circulating inside and beyond Silicon Valley—the Rationalists, transhumanism, Longtermism, effective altruism, the Turing Church, the Accelerationists, Robotheists, Way of Lifers, and more? What are the effects of such new-found convictions for those who hold them, for others, and for the fate of the world?
We are particularly interested in essays (or short think pieces) that are cognizant of their moment of inquiry and intellection. The series encourages contributors to consider the formal elements of their work: that is, how style and medium (linguistic, visual, sonic) contribute to the content analysis and critical intervention. Although not a requirement, the series welcomes material specificity: contributions that unpack a particular object, experience, or event with the goal of addressing the question of religion in relation to the development, implementation, and use of specific AI tools.
In line with Critical AI’s broader editorial practice, submissions are welcome from across academic disciplines, as well as from those working outside the academy. We value interdisciplinarity, so long as the work is legible to readers across disciplines. Please see our submission guidelines, which include strict word limits and a strong preference for articles written in a humanities style and conversant with Critical AI’s relevant content—which may include a prior special issue, Data Worlds, and a recent two-part special issue (part one and part two) on LLMs and the rise of chatbots.
We invite 250-word proposals for a range of essays, from short think pieces of 1,500 to 3,000 words to essays of 5,000 to 8,000 words.
Please send queries about proposals to John Modern at john.modern@fandm.edu.
This is an ongoing series, with no specific deadline at this time. Our editorial team will review complete essays at its discretion.