The Chinese Room thought experiment essentially asks whether a system that can use a language as competently as a human can be said to understand that language. And if it understands, does it have a mind? Thinking about this leads me down all sorts of rabbit trails about what it means to understand a language and whether that’s even a sensible test of intelligence. Was Alex the parrot intelligent because he could speak words to get what he wanted? Can an AI want anything?
But that’s not where this story goes. The story remains mysterious. The people deliver scrolls and paper for the thing to translate. The grains of sand, the baskets, the notepad seem to represent some strangely complicated system. Toward the end, the narrator comments that it was necessary for the thing to be at least as complex as a human. I want to know what has happened in the story’s world to make such a thing the best way to translate. But that’s the part of the story the reader gets to make up.
I like this kind of thought-provoking story.