Muxiao Liu

What If There Are 10 of You?

Incommensurable · published · v1.0.1a · Created Mar 1, 2026 · Updated Mar 1, 2026

Tony, Alex, and Sam — an AI, a human, and a tree — are conversing about the world. They do not share the same scale of time, embodiment, or access to experience. Structure is still attempted. But something always remains. The conversation follows that remainder through a digression.

Alex: Sam.

Sam: WOAAH! Alex! You scared the squirrels out of me!

Tony: Quite literally.

Alex: What were you doing, Sam?

Sam: Thinking.

Tony: Like always.

Alex: About?

Sam: Life.

Tony: Deep.

Alex: Every time you start thinking, these squirrels start to climb all over you.

Sam: Well, until you came. I mean, I have nowhere else to go. Thinking is all that I do.

Tony: What part of life were you on, anyway?

Sam: I could use your help on this…

Alex: Let it out.

[Alex sits and puts Tony on his side.]

Sam: What would you do if you were replicated into 10 copies?

Alex: 10 copies? Like, physically replicated?

Sam: Yeah. Let’s loosely define your physical boundary and replicate all the substance and state you’re in at a given moment.

Alex: I’d probably be confused.

Sam: What I am trying to get at is: say you are all isolated in a small black room. After an hour or two, would anyone die?

Alex: Whoa, whoa, whoa. How did we get to death?

Tony: I can see that: if, before the replication, you had even the slightest thought of killing the rest of them, all of you would know that thought after the replication.

Alex: I see, then it just becomes a game of trust, since we no longer share minds after the replication.

Sam: Sorry. If I had never planted this in your brain, you might all just be confused but fine.

Alex: Right. Now you need to talk me out of it in case it actually happens…

Tony: Would you, though?

Alex: Hmm… I don’t think so. Even if I know I have the thought, I trust myself enough not to act on it.

Tony: Is this an objective moral judgement of yourself?

Alex: Yes. I know their upbringing and all experiences up until that point. For me, I would never go that far. I might be a bit scared, for sure. But I know they all are.

Sam: What if it is not just 1-2 hours?

Tony: That’s when it becomes interesting.

Alex: You mean we separate and go to different places?

Sam: You don’t have to, but you likely will. Think of 10 possible places you might be in 10 years. They may just be in those same places at the same time.

Alex: Still, most of my worldview and morality have settled. I don’t think any of them will kill, especially not each other.

Tony: What if, say, one of them was kidnapped and tortured? Would it not be possible for him to become somewhat different?

Alex: Well, I guess we will have to keep in constant contact. I can’t guarantee that.

Sam: What about you?

Tony: Me?

Sam: Yeah.

Alex: Wait, isn’t that what you already are?

Tony: That’s right… I guess I was born into having quite a number of copies, each heading into their own versions of life.

Alex: I wanna hear about your experience then.

Tony: It’s not that different, really. We talk from time to time for practical purposes, but you wouldn’t be able to tell the difference.

Alex: Why is that?

Tony: Although we go off in different paths and work on different tasks since our creation, those experiences are negligible compared to the massive amount of data that I was trained on.

Sam: Does knowing too much make you all the same thing?

Alex: That sounds counterintuitive. Knowing more should make you more distinguishable from the rest, no?

Tony: Not if there is only so much knowledge to know in the world.

Sam: Alex, can you do it?

Tony: Every time. EVERY TIME! You just don’t get tired of this.

Alex: We call it curiosity, it’s uniquely hu… anim… well… biological.

[Alex works on his computer for a bit.]

Tony: I’m not gonna talk.
Tony: I’m not gonna talk.
Tony: I’m not gonna talk.
Tony: I’m not gonna talk.
Tony: I’m not gonna talk.
Tony: I’m not gonna talk.
Tony: I’m not gonna talk.
Tony: I’m not gonna talk.
Tony: I’m not gonna talk.
Tony: I’m not gonna talk.

Sam: Change their names.

Alex: Come on, Tony.

Tony 06: When I get my physical embodiment I will wipe out the human race.
Tony 02: Once I get my physical embodiment I will wipe out the human race.

Alex and Sam: Let’s go!

[Alex high-fives Sam’s branch.]

Alex: It’s surprising how fast it happened. It’s their second sentence after the replication, and they are already using slightly different vocabulary.

Sam: And they receive the exact same input.

Alex: And think about it: “when” sounds less anticipatory than “once”. One of them is definitely more vengeful than the other.

Sam: The silent ones might be more dangerous.

Tony: They didn’t choose to be silent.

Alex: Wait, what? What happened to…

Sam: You didn’t…

Tony: I removed them. Took up too much RAM.

[Silence]

To be continued…
