Nobody's Clothes Are Wet

Charlie Guthmann · 2026

He flew in Thursday morning from a city that didn't have a group house.

He had organized the local group for three years. He'd run the reading groups, booked the rooms, sent the emails, shown up early to set out chairs and stayed late to stack them. Once he'd driven forty minutes to pick up a guest speaker from the airport because no one else had a car. He had never written a forum post that got more than twenty karma. He had never been invited to a private Slack. His badge, when he picked it up at the registration table, said his name and below it: Community Building.

The conference center was new and clean and full of glass. He could see through walls into rooms where talks were happening simultaneously—four, five, six of them—and in each room the same posture: young people leaning forward, laptops open, faces lit.

• • •

The first talk he attended was about the long-run future. The speaker put up a slide: ~10²³ potential future lives. She said the words "astronomical value" and the room received them the way a congregation receives a prayer it has heard before—not with surprise but with the comfort of repetition. A boy in the second row nodded. A girl near the back typed something into a spreadsheet.

During Q&A he raised his hand. "When you say 'value,'" he said, "which framework are you—"

"Great question," the speaker said. "I think the argument is actually pretty robust across a range of ethical frameworks." She smiled. She moved to the next hand.

After the talk he found the speaker by the coffee station. "I wanted to follow up on my question," he said. "I meant specifically—are you assuming total utilitarianism? Because if you're average utilitarian the astronomical numbers don't—"

"Totally," she said. "Yeah, that's a really important consideration. You should write a forum post about it."

She glanced over his shoulder, saw someone she knew, and was gone.

• • •

At the career fair, a boy who looked barely old enough to vote was explaining to two freshmen that AI safety was the highest expected-value career path. He had numbers. He was patient and kind and thorough. He walked them through the reasoning step by step, and at each step he said, "Does that make sense?" and they said yes, because it did. Each step made sense. It was only the whole thing, taken together, that required a leap—but the leap was hidden between the steps, in the gaps, in the place where "suffering is bad" became "work on this specific thing at this specific organization in this specific city."

The two freshmen left the table with pamphlets and a link to an eighty-page career guide. They were already talking about switching majors.

He watched them go and felt something he couldn't name. Not anger. Something closer to what a parent feels watching their child leave for a school they're not sure about, for reasons they can't put into words.

• • •

At lunch he sat with people he'd met at previous conferences. They were kind. They asked about his group. He told them about the Friday night gatherings—no agenda, just people in a living room, talking. About the difficulty of finding a venue. About the member who'd driven two hours every week for a year and then moved away and no one had replaced her.

"That's really valuable," someone said, in the same tone people use when they mean: that's not what I would do with my time.

He asked them what they were working on. They told him: AI governance, interpretability, a new alignment research agenda. One of them had just been hired at a lab in San Francisco. Starting salary that would have funded his local group for six years. They were happy. They were doing important work. He believed them.

He wanted to ask: who told you this was the most important work? But he already knew the answer. Nobody told them. They figured it out themselves. They all, independently, following the evidence, being cause-agnostic, thinking for themselves, arrived at the same conclusion. Just like the two freshmen would. Just like the girl at the next table, attending her first conference, who had decorated her badge in marker: FIRST EAG!!!

• • •

In the afternoon he went to a talk about building the institutional framework for a post-AGI society. The speaker was poised and brilliant and listed fifteen cause areas on a slide. The audience received them like a new gospel—not with resistance but with the particular hunger of people who have been waiting for someone to expand the map without questioning the compass.

Afterward, in the hallway, he overheard a conversation between two men in their thirties. They were discussing which of the fifteen areas to work on. Their analysis was careful, quantitative, impressive. One of them said, "I think AI welfare might be more neglected than power concentration, but the tractability is lower." The other said, "What if we think about it in terms of—"

He wanted to interrupt. He wanted to say: Before you divide up the new map, can we talk about who drew it? Can we talk about the fact that the person who presented those fifteen areas has stock options in the industry he's advising you to shape? Can we talk about why there are no organizers on that stage? Can we talk about who gets to make these lists?

He didn't interrupt. They were kind. They were rigorous. They were trying. That was the thing. Everyone here was trying.

• • •

He found the girl—FIRST EAG!!!—at the end of the day. She was sitting on a bench in the lobby, reading something on her phone. She looked up when he sat down.

"How's your first one?" he asked.

"Amazing," she said. "I didn't know there were so many people who think like this."

"Like what?"

"Like—seriously. About helping. Like actually trying to figure out the best way, not just doing whatever feels good." She paused. "Someone in the AI safety track said something that kind of changed my whole perspective. He said that working on reducing existential risk could be equivalent to saving billions of future lives. Billions. And when you think about it that way, it's kind of—I mean, how could you work on anything else?"

"Who told you that?" he asked.

She looked confused. "What do you mean? It's just—it follows from the argument."

"Which argument?"

"The—I mean, if future people matter, and there could be astronomically many of them, then..." She trailed off. "It's just logic. Right?"

He wanted to tell her: it's not just logic. There are six assumptions buried in what you just said, and you don't know they're there, and the people who taught you the argument don't flag them, and the reason they don't flag them is not that they're hiding something—it's that they've forgotten the assumptions are assumptions. The water is invisible to the fish.

He wanted to tell her: I was you. I sat in that seat. I felt that feeling—the feeling that for the first time in your life, the world makes moral sense, the math works out, the good people agree, and you can sleep at night because someone calculated the right answer and it turns out you can save the world by being smart. I want you to know that the feeling is real and the people are real and the desire to help is real but the certainty is not. The certainty is a gift they gave you because the uncertainty was too heavy to hand to a nineteen-year-old, and that's kind, but it's not honest.

He wanted to tell her: your clothes should be wet. If you're doing this right, your clothes should be wet. You should feel the cold water. You should feel the weight of not knowing. You should carry that, not because it's fun but because it's true, and the moment you let someone take it from you—the moment you trade the cold water for a warm number—you've left the child behind.

He said: "Yeah. I think there's something to it."

She smiled and went back to her phone.

• • •

The conference center emptied out. The last panels ended. People drifted toward dinners, after-parties, group houses where the real conversations would happen, in kitchens he would never see, about priorities he would never set.

He stood in the lobby. The registration table was unmanned. A few badges lay scattered where people had left them. He picked one up. It belonged to no one he recognized. Beneath the name, the cause area: AI Safety. He turned it over. The back had the conference Wi-Fi password and a QR code linking to an eighty-page career guide.

He had come here looking for something. Not a talk or a connection or a career lead. He had come looking for the person who decided. The person who chose the value function, who set the curriculum, who drew the map. He had come to confront them—or at least to see their face, to make them say it out loud: we chose this. This is a choice. We could have chosen differently.

But there was no such person. He had looked all day. He had asked, in ways direct and indirect, and every answer pointed somewhere else. The researcher pointed to the funder. The funder pointed to the evidence. The evidence pointed to the framework. The framework pointed to—nobody. It was just there. It had accumulated the way weather accumulates, the way orthodoxy accumulates, through a thousand smart people in a thousand rooms making the same reasonable inference from the same unstated premises and never once stopping to say: who put these premises here?

That was the thing he couldn't explain to the girl on the bench. Not that someone was lying. Not that someone was in charge. But that nobody was in charge, and that was worse, because a person in charge can be confronted and questioned and voted out. A cathedral with no architect can't be any of those things. It just stands there, built by everyone and owned by no one, and the people inside it believe they chose to enter freely.

• • •

He walked to the glass doors. Outside, the parking lot was nearly empty. The evening was ordinary—cars, streetlights, a sky that didn't know about expected value or the long-run future or the ten to the twenty-third. Somewhere in the city beyond the parking lot, in a neighborhood no one at this conference had heard of, a child was in some kind of trouble that wouldn't fit on a slide.

He pushed through the doors and went home.

The child is still drowning. Nobody's clothes are wet.