Friday, June 28, 2019

"These violent delights..."


"... have violent ends." The line from Romeo and Juliette, spoken to one of the two principle robots with emerging consciousness, Delores, by her father, remembering a fragment of his persona from an earlier narrative/"build", and then repeated by Delores, in fragmented memories of her own. The first half of the point is made by Ford, in conversation, when he says that he'd originally thought the narratives the "guests" preferred would be positive, even peaceful, but that he'd been wrong, and that it was the "violent delights" that were sought after.

In one sense, this may be no more significant than children crashing their toys, but as it's portrayed it appears much more malevolent, tapping into some kind of basal savagery. That too, of course, is a common enough trope for human propensity for evil (original sin, monsters of the id, etc.), but the idea of displaying such a theme in an artificial world, in which all possibility of consequence, not to say retribution, is removed, is a brilliant way to isolate it, much like a controlled experiment. (Which I think is the reason for setting the simulation in a period of imagined relative lawlessness.) The Park, it turns out, has an advertising slogan: "Live without limits".

But then the second half of the line portends consequences, no? The second season will see to that. There is, in any case, the possibility that the players or "guests" in such a simulation are largely self-selected by their propensities. And of course there's that initial choosing of hats.

Thursday, June 27, 2019

Conscious robots and the mind/body issue

The idea goes back at least to Descartes -- that the mind is not just distinct from the body but is not even in the same physical universe as the body. That is, though obviously real, the mind is not the same kind of real as the body. This is in accord with intuition to a large extent -- we can't actually touch the mind, for example (or so it seems), in the way we can touch the body. More than that, we can't even talk about the mind in the same kinds of terms we use to talk about the body, or for that matter any other physical object or phenomenon -- though the brain is clearly essential to the mind, thoughts and feelings seem distinct from any physical location or effect.

We could leave it at that, and many philosophers, then and now, have done so, more or less. This leads them into one form or another of "dualism", the idea that there are two "worlds", so to speak, or two realms of reality, or at least two distinct properties of reality -- mind and matter, putting it simply. But that's always raised the problem of how two such distinct realities can interact. Obviously mind affects matter every time we raise a finger, but it can also do so on impersonal measuring devices, like EEG machines. And matter affects mind when we ingest a drug, say, or, more directly, when an electric probe stimulates the perception of smells or colors or other so-called "qualia". It's just those qualia, or feelings, that the philosopher David Chalmers has named the "hard problem of consciousness" (and that are the basis of a Tom Stoppard play).

Now, a fictional TV series is not an argument, and certainly not a philosophical argument. But just by showing robots that are more human, in all ways, than actual humans, Westworld seems to make the "hard problem" at least look easier. You could take it, in fact, as a sort of thought experiment -- what if there could be machines that look, talk, and behave like these do? Aren't they far past the crude "Turing Test" for machine intelligence, which evaluated only conversational ability? After a bit, of course, we're shown just how robotic they actually are, when we see them repeating their lines and behavioral loops, or when they're in maintenance, or responding to technicians' commands. Still, the phases of their routines feel much like our own -- Ford, the creator behind them, makes this explicit at one point, when he notes that we all have our "loops" that we repeat unthinkingly -- and their response to commands can seem similar to human post-hypnotic behavior. And then there is the "reverie", that faraway look one of the robots gets when she's picking up a memory fragment from a past that was supposed to have been erased -- her "search for lost time" -- and which begins the process by which the robots escape the limits set by their human controllers.

All just fiction, of course, but science fiction. When we're shown the tablets used to set the robots' personality characteristics, for example, we see how these are based on human personality characteristics, and the effect is to suggest not how different these machines are from the human, but rather just how mechanical, or at least physical, personality might actually be -- we're reminded that it all comes from just the physical interaction of neurons.

There have been other fictions that show how human-like robots can be. What's different about Westworld is that it shows not just how robot-like humans can be, but how that can be an insight, and even a compliment, rather than a criticism.

So, in making a plausible display of constructed entities with feelings -- qualia, consciousness, or whatever else might imply agency -- this series is undermining one of the last redoubts not just of dualism, but of transcendence more generally, and of a special status for the human in the nature of things. In that way, it's a significant proponent of immanence, or this world as the only world.

Wednesday, June 26, 2019

Levels

"This world" isn't meant to imply there are any other worlds, but just the opposite -- this world as the only world. It's a monism, in other words (as opposed to a dualism especially), and the "stuff" of this world is just the stuff of perception, or sensuous experience. So the levels mentioned here are simply levels of constrained awareness, not actual levels of reality:
  • Level 1: the reality as understood by the unfortunate "hosts", sealed in an invented world like that of the dream.
  • Level 2: the reality of the "guests", players in a game, demi-gods to the characters in the game.
  • Level 3: the reality of the Park creators/managers, who, god-like, make up the fictions.
And then, beyond the fiction, we have:
  • Level 4: the reality of us, the viewers, watching the creators, the players, and the characters.
  • Level 5: the reality of the TV series creators, who know (we hope) what's coming at us, the viewers -- so, re: the series, we the viewers are in a position similar to that of the "guests" re: the Park.
  • Level 6: the reality of the show's critics, who judge the creators. That's us too, of course, but only after we've seen the whole of it.
The last three levels are common to any fiction, particularly any episodic creation. But what distinguishes this series are those first three levels, which, as we see, again provide an analogy to the last three, or to our own situation sitting in front of the screen. It's what repeatedly gives this experience its slightly unsettling quality.

"Have you ever questioned the nature of your reality?"

As the featured post indicates, this is a routine diagnostic question used by the Park's Quality Assurance teams, addressed to a robot being examined. A "Yes" answer would indicate malfunction. Hence, the answer is usually "No" -- but of course the plot of the series depends on those hosts who have, in fact, asked such a question.

For the "hosts", the nature of their reality is entirely artificial, with a constructed physical environment, made-up memories, personality characteristics adjustable with tablet sliders, and simple loop behavior with minimal variation, all within a fictional narrative. It's interesting to see what happens when they're extracted from that environment, or when parts of a world outside that reality are accidentally encountered. In the first case, when asked if they know where they are, the response is that they're "in a dream" -- the container for all "unreal" manifestations. (As it is for us as well?). In the second case, the evidence of their eyes, their perception, is simply ignored, the response being "It doesn't look like anything to me." (What would we say?)

For the "guests", the current reality is presented as just a startlingly realistic physical game -- but they're our representatives, and they bring with them all the baggage of the sort of ordinary lives we the viewers live. For us too, questioning the nature of our reality would seem peculiar at least, for anyone not philosophically inclined, and a possible indication of a problem -- i.e., a malfunction. Only one of the guests, William, really appears to do so, and his story defines the series' first season.

Already, though, we can see how the robot condition functions as an analogy to the human. Compare the situation with that of The Matrix (the great Conspiracy Theory/Myth of our time), where the comparative relation of human and machine is reversed.


Tuesday, June 25, 2019

Humans and robots

Those are the two categories of characters (aside from "animatronic"-like beasts). The latter, robots, known euphemistically as "hosts", are relatively similar in status, though as the series progresses it's evident that there's a hierarchy amongst them.

The humans are more complicated. The paying customers, called "guests", are the simplest and most oblivious, presented, with a few exceptions, as morally evil. Then there are the technicians, also relatively simple, and presented as mostly amoral (again, with exceptions). Then come the mid-level managers, who are given various possible agendas, and are presented as largely cynical and ambitious. And then the high-level managers, with direct links to Corporate; the owners themselves; and finally, at the highest level, the surviving creator/designer, Ford. There are a few glimpses of people outside the world and business of the Park, but they have little or no role.

The most striking thing about this character dichotomy on the whole, however, is how it upends the conventional moral relationship of human and robot. At best, "mechanical humans" have usually been presented as "robotic" imitators, comical or threatening as the case may be. But here, at the start, the viewer is deliberately deceived about the robot Teddy, who is presented as simply one of the arriving "guests". And then we're soon shown situations in which it's the robots who come across as archetypically human, and the humans who seem morally dubious, and then vile. This reversal, sharply divergent not just from the movie that gave the series its name but from virtually the entire genre of the artificial human, lies at the heart of the project as a whole.