Over at Mind Matters News, we’ve covered recent attempts to automate religious practices — for example, the android Mindar (Kannon, a deity in the Buddhist tradition), the robot Pepper performing funeral rites, and proposals for robot Catholic priests. There’s even an AI Jesus out there, spluttering from the King James Bible.
Mindar, in particular, has been invested with great hopes: to learn countless things and solve everyone’s problems.
Many of us have just assumed that robo-religion — often introduced to combat declining adherence — would not play well with the genuinely devout. But no one had tested that rigorously.
Now Someone Has
Joshua Conrad Jackson, a behavioral science professor at the University of Chicago’s Booth School of Business, studied the impact of robots and AI on adherents. He and his colleagues concluded in their open access paper in the Journal of Experimental Psychology that the skeptics are right:
Using a natural experiment in a recently automated Buddhist temple (Study 1) and a fully randomized experiment in a Taoist temple (Study 2), we consistently show that religious adherents perceive robot preachers — and the institutions which employ them — as less credible than human preachers. This lack of credibility explains reductions in religious commitment after people listen to robot (vs. human) preachers deliver sermons. Study 3 conceptually replicates this finding in an online experiment and suggests that religious elites require perceived minds (agency and patiency) to be credible, which is partly why robot preachers inspire less credibility than humans.
In their study, Mindar, Pepper, and an AI-composed Christian sermon all resulted in lower engagement. Curiously, though, the contest was close:
While participants said they believed human preachers were more credible, it was still a close contest with the robots. On a scale from one to five, with five being most credible, the robot preachers received an average credibility rating of 3.12, compared with 3.51 for human preachers.

“Will intelligent robots soon be able to serve as rabbis?,” Jerusalem Post, July 26, 2023
Repeated Exposure to AI
Because AI simply remixes existing human output, devoid of personal experience, it is reasonable to infer that repeated exposure to AI sermons would reduce engagement even further over time.
As Jackson told Judy Siegel-Itzkovich at the Jerusalem Post, “Unlike human religious elites, who profess a deep commitment to their faith which leads them to sacrifice time and material goods, robots are simply programmed to give sermons or blessings without an authentic understanding of, commitment to, or suffering for their religious group.”
True enough, and there is surely a deeper dimension here too. It’s not just that the robot/AI doesn’t have spiritual experiences; it can’t have them, no one can cause it to have them, and no thought was given to the matter when it was developed. The robot is intended to produce teachings in the same way that a robotic ice cream maker is intended to produce ice cream. Any relationship between the process and an unseen world exists only in the imagination of the hearer.
Faith or Doubt
By contrast, if a human preacher is explaining how a spiritual experience delivered him from the hell of drug addiction — well, first, if his story is true, that deliverance is not just imagination. Faith or doubt depends on how we interpret his life-changing experience. But our responses don’t depend merely on hearing words.
In the Christian tradition, for example, the preacher’s life is his sermon. As one pastor puts it, “When listening to a sermon, what a congregation is looking for is evidence that the pastor has been with Jesus. AI will always have to — literally — take someone else’s words for it… it won’t ever be a sermon that will convince anyone to come and follow Jesus.”
According to Siegel-Itzkovich, in the Jewish tradition, rabbis themselves would need to decide whether robo-rabbis are acceptable. She suspects they’ll say no. They’d surely be wise to. In any tradition, it would be only fair for the faithful to respond in kind: quietly replace themselves at the accustomed gathering place with androids and go elsewhere for authentically human instruction.
Cross-posted at Mind Matters News.