The idea goes back at least to Descartes -- that the mind is not just distinct from the body but is not even in the same physical universe as the body. That is, though obviously real, the mind is not the same kind of real as the body. This accords with intuition to a large extent -- we can't touch the mind, for example (or so it seems), in the way we can touch the body. More than that, we can't even talk about the mind in the same kinds of terms we use to talk about the body, or for that matter any other physical object or phenomenon -- though the brain is clearly essential to the mind, thoughts and feelings seem distinct from any physical location or effect.
We could leave it at that, and many philosophers, then and now, have done so, more or less. This leads them into one form or another of "dualism", the idea that there are two "worlds", so to speak, or two realms of reality, or at least two distinct properties of reality -- mind and matter, putting it simply. But that has always raised the problem of how two such distinct realities can interact. Obviously mind affects matter every time we raise a finger, but it also registers on impersonal measuring devices, like EEG machines. And matter affects mind when we ingest a drug, say, or, more directly, when an electric probe stimulates the perception of smells or colors or other so-called "qualia". It's just those qualia, or feelings, that the philosopher David Chalmers has named the "hard problem of consciousness" (and that are the basis of a Tom Stoppard play).
Now, a fictional TV series is not an argument, and certainly not a philosophical argument. But simply by showing robots that are more human, in all ways, than actual humans, Westworld seems to make the "hard problem" at least look easier. You could take it, in fact, as a sort of thought experiment -- what if there were machines that looked, talked, and behaved like these do? Aren't they far past the crude "Turing Test" for machine consciousness, which evaluates only conversational ability? After a while, of course, we're shown just how robotic they actually are, when we see them repeating their lines and behavioral loops, or when they're in maintenance, or responding to technicians' commands. Still, it feels like the phases of their routines are much like our own -- Ford, the creator behind them, makes this explicit at one point, in fact, when he notes that we all have our "loops" that we repeat unthinkingly -- and their response to commands can seem much like human post-hypnotic behavior. And then there is the "reverie", that faraway look one of the robots gets when she's picking up a memory fragment from a past that was supposed to have been erased -- her "search for lost time" -- which begins the process by which the robots escape the limits set by their human controllers.
All just fiction, of course, but science fiction. When we're shown the tablets used to set the robots' personality characteristics, for example, we see that these are modeled on human personality traits, and the effect is to suggest not how different these machines are from the human, but rather just how mechanical, or at least physical, personality might actually be -- we're reminded that it all comes down to the physical interaction of neurons.
There have been other fictions that show how human-like robots can be. What's different about Westworld is that it shows not just how robot-like humans can be, but how that can be an insight, and even a compliment, rather than a criticism.
So, in making a plausible display of constructed entities with feelings -- qualia, consciousness, or whatever else might imply agency -- this series undermines one of the last redoubts not just of dualism, but of transcendence more generally, and of any special status for the human in the nature of things. In that way, it's a significant proponent of immanence, the view that this world is the only world.