On Feeling Guilty About the Water

How I changed my mind about AI, and why I’m still not sure

I want to tell you about the moment I changed my mind about AI. But first I have to tell you about the guilt.

When I first started seeing the numbers about how much water AI data centers consume, I felt something I can only describe as moral nausea. I was already skeptical — I’d watched a colleague use an early chatbot to write liturgy, and what came out was thin, generic, and off-key. We prayed that slop together on Sunday mornings. I was a new pastor who would have loved to write those prayers, and honestly, I could have written them better. The whole thing felt like a betrayal… the kind I couldn’t even quite name.

So the water news landed when I was already predisposed to be anti. It felt like confirmation. Not only was AI producing worse work; it was also quietly drinking the world dry.

I didn’t make any dramatic pronouncements. I just… abstained. Quietly, with the particular self-righteousness of someone who believes their small choices are morally legible. My wife and I loved to deride AI and its users from the safety of our private conversations, and I carried that self-righteousness out into the world, looking down my nose at people learning this new tech and savoring their stories to rehash judgmentally at home.


Then a colleague I trusted started talking about what he’d been doing with AI. This conversation was hard for me to compute. He’s sharp, careful, pastoral, and cares about humanly derived language more than anyone I know. I could not dislike or dismiss him the way I was accustomed to doing. The conversation pried open a window in my mind, through which I could actually hear what he was describing, which was… remarkable. He described conversations that sounded less like prompting a machine and more like thinking out loud with a well-read interlocutor. I was curious. I wanted to try it.

But I still felt guilty about the water.

I swallowed my guilt and tried it anyway. And it was fascinating. My wife wasn’t sure what to do with this development — still isn’t, to be honest. But fortunately, her love for me is less formulaic than the chatbots she’s forgiven me for conversing with. I don’t think she’ll be purchasing an AI subscription anytime soon, but she tolerates my enthusiasm, and even smiles to see me happy with my work. (Most of the time.)


I started using AI for sermon prep. Proofreading at first — I expected it to catch grammatical errors and maybe flag rhetorical clunkers. I did not expect theological feedback. I *definitely* did not expect astute and profound theological feedback. But then…

I’d been working on this sermon for days. It was talking, but it wasn’t singing. It wasn’t preaching. I wasn’t even happy enough to call it a first draft, which is why I hadn’t fed it to the chatbot yet. But it was Saturday, and I was running out of time, so I accepted it: they can’t all be home runs. And I asked my chatbot how to make what would be a mediocre sermon at least sound a little smoother.

And my chatbot told me, “One thing to note: God doesn’t really have agency in this sermon.”

You could’ve knocked me over with a feather. Because I’ll be damned if it wasn’t exactly right.

And it didn’t fix it for me. (I’ve given it strict rules about that.) It just told me the problem. So once I got my jaw off the floor, I went back in, found every place God was a backdrop rather than an actor, and fixed it. The sermon sang. It preached. AI didn’t do that — the Holy Spirit did it. But I was quite flabbergasted to learn that AI was one of the vehicles through which her holy wind could blow.

Then came the moment that really changed how I understood what this technology could be — the moment I think about most when I try to explain why I’m doing what I’m doing. It happened when I found a bunch of dead white guys sitting in my office. Walter Brueggemann, Alfred North Whitehead, and Karl Barth, casually conversing about my Sunday sermon.

I’d been noticing that some of my sermons seemed to brush up against process theology, a tradition I didn’t know well. I wanted to understand what I was actually saying. So I asked the chatbot to roleplay as Whitehead and comment on my sermon. And then I thought, I’d better request Barth too, to keep this conversation grounded and embodied. Their responses were exactly what you’d want from a good seminar: illuminating, in tension with each other, asking questions I hadn’t thought to ask. I brought Brueggemann in to give the text some more agency. I told the tech how I’d respond if these were real people in my office. They continued the conversation.

That was the moment I understood: this isn’t about letting the machine do your thinking. It’s about finding new exercises for your own thinking that no lecture or book could provide. Not a replacement for reading and formation — something genuinely different. Something new. Dare I say, Isaiah 43 new?


But I still felt guilty about the water.

And then I watched Hank Green’s video — [“Why is Everyone So Wrong About AI Water Use??”](https://youtu.be/H_c6MWk7PQc?si=x2v4z_gdAY0-nj28) — which did not exactly let the AI industry off the hook, but did carefully explain why the numbers that had alarmed me were being misread. The water issue is real and worth monitoring. It is not, as I had absorbed it, a simple reason for individual users to abstain. The math just doesn’t work that way. Convenient, now that I really wanted to keep using this? Perhaps. But also real math that I didn’t make up. And also, still not enough to quite allow me to sleep at night.

I still hold concerns. Not sketchy math ones — real ones.

I worry about what this does to kids who grow up with it — not because AI is uniquely corrupting, but because the capacity to sit with difficulty, to not-know something long enough to actually learn it, is fragile and worth protecting. You can’t force anyone to use AI the way I described above. Many won’t. The temptation to let it do the thinking is real, and for a lot of people, it will win.

I worry about entry-level jobs and what it means when machines can do the work that used to teach people. The ministry world is not immune to this, though I think it’s less vulnerable than a lot of sectors — more on that some other time. But the corporate and creative worlds are already feeling it, and the people losing those jobs are not the ones who will be fine.

So I hold a view that most AI boosters would find too dark and most AI critics would find too conciliatory: I think AI will do a lot of good and a lot of harm, and that on balance, the harm will probably outweigh the good. I say this not as a reason to despair but as a reason to be clear-eyed.

Here’s why I’m still doing this anyway:

My abstaining doesn’t reduce the harm. It doesn’t slow the development, reduce the water consumption, or protect the jobs at risk. What it does is ensure that one more thoughtful person — someone who actually cares about these questions — is not in the room, and not using this high-powered tech to do something good. Good people refusing to use it doesn’t tip the scales toward *less harm*. It tips them toward *less good*.

At least, that’s the calculation I’ve made. I make it with open eyes. I reserve the right to revisit it.

But for now, I’m in the room. Working on behalf of churches and ministry organizations who deserve good infrastructure, built by someone who takes seriously both what this technology can do and what it costs.

Honestly, I still feel a little guilty about the water. It’s just that “water” is now a placeholder for a cost we can’t really understand or calculate. A risk the world is taking without any idea what it will mean. A risk I can’t stop the world from taking. I really think I would if I could.

So I’m taking a different risk: the risk of aligning myself with something that has potential to do so much damage. I’m risking bias in favor of capitalism-driven progress because it’s really fun for me. I’m risking the temptation to close my eyes to opportunities to slow this down or regulate it. Perhaps this is also a cost I don’t really understand. But it’s a cost I pray about every day.

> God, help me use this amazingly powerful tool for good.
> God, help me keep my eyes open for opportunities for harm reduction.
> God, don’t let me stop feeling guilty about the water.


For more on theology, technology, and the unglamorous work of building things that last, follow me on Substack.

Remi Shores is an ordained pastor in the Christian Church (Disciples of Christ), a certified data analyst, and the founder of Systems for Ministry — a consulting practice helping churches and ministry organizations with data analytics, AI workflows, grant writing, and operational infrastructure.
