What’s Too Far: Ethical Guardrails for AI in Ministry

Ed Stetzer

Author, Church Strategist and Professor

Corey Alderin

CEO and Co-founder of Sermon Shots

As I’ve discussed in Part 1 and Part 2 of this series, AI offers incredible opportunities to assist church leaders in the ministry functions of the local church. Some argue that AI application in ministry is dangerous––that in a society already disconnected and isolated, AI can create an unhealthy chasm between church leaders and their people. Similarly, there is a great deal of concern over how AI will encourage pastors to circumvent the necessary time in the Word and in prayer in the process of sermon writing.

On the other hand, others are very eager and have high expectations about the possibilities that AI affords us in ministry. It promises the potential for increased efficiency, better processes for church systems, and more readily available resources for our congregations.

Whether you have high hopes or serious cautions about AI in ministry, it’s essential that we consider both its possibilities as well as its potential dangers. In Part 3 and Part 4 of this series we will look at some of the ethical considerations of AI use in ministry and offer some practical guardrails to consider.

A Helpful Metaphor

Think of AI as the sous chef of your ministry kitchen. A good sous chef, or “second chef,” is not the head chef of a traditional kitchen. He or she does not have final say over the menu, does not set the direction of the restaurant, and should never commandeer the executive chef’s vision for how the kitchen is to function.

Instead, a good sous chef has several important responsibilities, including managing kitchen operations, training staff, quality assurance, setting up smooth workflows, and more. They’re not the primary catalyst of the kitchen’s success but a helpful, integrated component of it. It is the head chef (that’s you, pastor) who sets the standards and ultimately takes responsibility for everything that leaves the kitchen, and it is the head chef’s vision for the restaurant that sets the tone for how everything is run.

Like a sous chef, AI can function as a crucial catalyst to a vision already set by church leadership. It can assist in automating many of the operations of your church, including your church’s visitor follow-up workflows and even your staff’s calendar and project management. It can assist in the development of your staff and even your congregation by curating resources that would otherwise be difficult to find. AI can also ensure feedback loops are in place to assess the effectiveness of ministries, events, outreaches, and more.

The Limitations and Biases of AI

AI is far from infallible. The U.S. Copyright Office recently ruled that it will not register copyrights for works generated without human authorship. Since the boom of generative AI tools like Claude and ChatGPT, people have had AI write books for them and then attempted to register copyrights claiming the work as their own. This ruling is a first step toward establishing ethical and legal frameworks around AI and protecting intellectual property.

Generative AI pulls its data from the internet without discernment, opening the door to error-ridden work and plagiarism. There are a host of open questions in public discourse around how a person’s image, art, music, written work, and more can be used as source material for generative AI, and whether the intellectual property of artists and authors is being infringed. Much like the boom of file-sharing services like Napster and Limewire in the 2000s, the ethical and legal questions with which we wrestle now (and will inevitably continue to wrestle) are reactions to how technology is challenging our traditional understandings of what constitutes “mine.”

What’s more, because AI sources content from the web, a destination filled not only with vast reservoirs of knowledge but also with violent, racist, and misogynistic content, there is significant concern about how the employment of AI in anything from image generation to hiring practices can disproportionately harm people from groups that are subject to gender and racial stereotyping.

Generative AI tools like Claude and ChatGPT include disclaimers on their sites, but it is more important that church leaders, and Christians more broadly, consider how these biases and errors in data must be met with wisdom and discernment by those who use AI applications for the sake of the gospel.

An Ethical Decision-Making Framework

When evaluating new AI applications in your ministry, consider these questions:

Question 1: Is this task appropriate for automation?

AI can automate or streamline a host of ministry functions. But in that automation, what is being forfeited? Do we lose a necessary personal touch with people? Will the introduction of AI promote a culture of haste and an exaltation of productivity? Or will it give you and your people margin to focus on what really matters?

Question 2: What are the potential risks if errors occur?

Have you established safeguards and review processes to mitigate the possibility of AI error, and is there accountability in place to catch errors when they occur?

Question 3: What are the unseen consequences?

While we tend to think of AI as disembodied and efficient, it actually requires enormous amounts of energy to power the necessary servers for it to function, contributing to carbon emissions and other environmental, economic, and sustainability concerns. In your use or your church’s use of AI, have you taken the time to consider the unforeseen or unknown consequences that reliance upon a particular AI application might introduce?

Question 4: Will this enhance or potentially diminish personal connections?

Will the AI application take you farther away from people, or will it free up space for you to spend more time with them? AI can streamline administrative tasks to open space in a church staff’s work week to focus on pastoring people—but it can also streamline the people “tasks” so we wind up spending less time with people. Let AI focus on the minutiae so you can focus on people.

Question 5: Are we scaling the use of AI at a level and pace that is appropriate?

Are we becoming overly enthusiastic about AI applications, or are we discerning where integration is most appropriate and moving at a pace that fits our church?

If you can’t confidently answer these questions or feel uneasy about any of the answers, it’s likely a sign to either modify your approach or reconsider that particular use of AI.

These questions serve as guardrails for your AI implementation strategy. Think of them as a spiritual and practical assessment framework. AI can help you as a leader as well. For instance, if you need to have an important conversation with an overbearing team leader, you might ask AI to suggest ways to approach the conversation. You might be surprised at the thoughtfulness of the reply.

The key is to ensure that every AI implementation aligns with your ministry’s values and enhances rather than compromises your ability to serve effectively. You are the “head chef” of your church’s ministry kitchen. AI makes for a great sous chef, but take care that you continue to set the vision for the restaurant.

Ed Stetzer uses AI every day and is a partner with Sermon Shots.

