A.I. could bring a sea change in how people experience religious faith

The Michigan-based company Covenant Eyes markets itself to Christians who want to stop viewing pornography.

Its software takes periodic screenshots of the user’s screen, uses A.I. to scan them for pornographic imagery, and then sends regular reports to the user and a designated “ally” who has agreed to hold him accountable.

The company’s name comes from a Bible verse that reads, “I made a covenant with my eyes not to look lustfully at a young woman.”

Everyone wants technology to reflect their own worldview, and religious conservatives are no exception.

In Andrew Dana Hudson’s short story “A Priest, a Rabbi, and a Robot Walk into a Bar,” a pair of mostly secular tech experts confront the limits of their own “progressive, tech world sensibility”—both morally and in the marketplace.

When the reader meets David, the rabbinical school dropout is being interviewed by an Austin startup to train its customer service chatbots to avoid anti-Semitic language.

Those chatbots had been programmed to “learn” from humans, but along the way they were picking up subtle conspiracy language from angry, obsessive customers. David’s job is to “sort out the good Jewish-stuff from the bad Jewish-stuff,” as he puts it.

At David’s new gig, he befriends Mark, a onetime aspiring priest who had become a “traitorous Proddy” and landed in the Texas tech scene.

They both gravitated toward the secular world but retained an intuitive understanding of religious language and values.

It’s an unusual combination of expertise, and, like all good entrepreneurs, they soon figure out how to monetize it: a business, Decen.cy, that helps companies “groom” their bots to be better behaved.

They play up their religious backgrounds to clients—David displays a yarmulke, Mark a clerical collar. But their performance of piety is challenged when a client asks them to program A.I. to do more than just avoid offense.

The Vatican now sells a $110 “eRosary” bracelet that encourages Catholics to pray and logs their progress as they do so.

Frank Teller is the ambitious pastor of a Texas megachurch so successful that many of its members live in “intentional communities” owned by the church.

The church uses customer service chatbots to manage basic community requests.

But Teller doesn’t like it when a bot recommends “reflection and mindfulness” in response to his granddaughter’s request for prayer.

He wants David and Mark to design a custom A.I. that incorporates conservative evangelical language and priorities: “We need A.I.s that share our values.”

Human preferences and values are baked into every tool we create, including A.I. But it’s one thing to acknowledge the stubbornness of bias and another thing to add it on purpose.

Decen.cy considers its own team’s values to be mainstream and anodyne.

Mark and David are much more comfortable debating whether yoga is culturally appropriative than programming a bot to recommend prayer in response to a mental health crisis.

Teller even wants to track his flock’s adherence to ideals like humility and chastity.

David fears his company is being hired to create an “A.I.-policed theocracy.”

Is it right to take on a client who doesn’t share the company’s progressive values, and who in fact spouts the kind of “low-key anti-Semitism” that David was first hired to squelch?

The problem of A.I.’s ability to buttress or erode religious values is one that religious conservatives are already grappling with on their own.
