In AMC's new drama series HUMANS, the world as we know it has evolved to a place where humanoid robots are living amongst us. But in HUMANS, some of the robots, known as Synths, are starting to develop different levels of artificial intelligence, while others still operate at the most basic programmed level of caretaking and performing basic service tasks. As revealed, Dr. George Millican (William Hurt) worked to create the physical bodies of the Synths, but did not actually create their A.I. capacity. So as George's memory starts to fade, he is dependent on his malfunctioning Synth, Odi (Will Tudor), to remember everything he cannot. Unfortunately, the government finds out that Odi is no longer fit for service and assigns George a new Synth to provide his care, Vera (Rebecca Front), which puts George in a precarious predicament as he tries to hide Odi.
In a press call, star William Hurt talked about what drew him to this project, his personal interest in A.I. technology and whether it could soon become a reality, and what other technological and ethical questions the show will explore in its first season.
What was it that first attracted you to this part and made you decide that you had to do it?
WILLIAM: Initially it was just the title, because that's my topic. And then I realized that it was about human beings and machines, but still titled HUMANS, and it was intriguing. It's about a topic that I've been interested in most of my life. Then I started reading it and realized it was full of character and good questions, and of the technology that we're using, which is so dislocating but at the same time pretty interesting. That is an example of why the series interested me.
Out of curiosity, if Synths were real, would you want one?
WILLIAM: It's a hard thing to answer. My answer is: I don't know. And I wouldn't know without knowing a lot more. I think that's sort of the key here, to ask questions about the whole of this situation, in which human beings are incorporating, in the most literal sense, technology into their being. Whether or not you would have a robotic in your home, and at what level, has a lot to do with what that robotic is, what it's equipped to do, and what kind of relationship you want to have with it. Those are questions I don't have answers to yet. I'd have to interview the respective employee.
Ever since "Altered States" way back then, you've made a lot of movies that have had a science fiction element, but where real science was at the core of it. Is that just a coincidence, or is that a subject you've always been interested in: how science relates to people?
WILLIAM: Oh, no. It’s been a fundamental interest of mine, the whole time, since I was young.
What originally fascinated you about it, and what have you found fascinating as you've gotten into all these different roles?
WILLIAM: As I began to read science fiction, important science fiction, most especially Isaac Asimov, I began to realize that it wasn't anywhere near as much fiction as people generally thought, and it fired my imagination to red hot. I just realized what they were talking about was anything but imaginary. So I was enthralled and always have been.
On HUMANS, because it's science and not much fiction, since we're very close to it, what parts of it particularly fascinate you?
WILLIAM: The thing about HUMANS that most interested me as a specific project was the stance from which the questions about the whole subject are posed, and that stance is our life today. So it's not the future being asked what it's going to be from the future's standpoint. It's the present being asked what the future's going to be, by introducing that future to us now, to who we are now. So it really is a vivid way of posing the questions to viewers today. What I mean is that we're watching the television, and in that television is a family, there's a house, and in the house is a living room, and in walks the Synth. And that living room is like our living room. That kitchen is like our kitchen. Those people are like our people, like us. And they're going to ask the questions that we would ask if that happened right now. That's the most vivid way to pose questions about the help, the hindrance, the invasion, the furtherance of human beings.
What intrigued me about this show and your character is that he was co-inventor of the Synths, but now he is in a situation where he has almost an emotional attachment to one of the Synths, Odi, and then Vera comes into the picture. Can you describe what those relationships are and how your character's life is really changed because of them?
WILLIAM: Well, George made a choice, an important life choice, not to go forward with designing Synths; he was involved in the engineering of the mechanics of the bodies, but not the so-called minds. And what he did was make a choice to remain human in the most fragile sense of the word, the most vulnerable sense of the word, because he saw in that experience, though it was fraught with the worst risk any of us face, mortality itself, the chance of realizing the potential, or his potential, as a human being. So in other words, he went home and he lived with his wife. Then his wife passes away, and then he suffers an anomaly, a cerebral malfunction, and he loses some of his memory systems. That makes Odi, who is a robotic of the fundamental sort, not the sentient kind, essential, because Odi was part of the life George had with his wife, and that robotic has all the memories of the events that took place while it was part of their life. And that becomes George's connection to his wife, because the Synth (and Synth means synthetic) remembers all those events in rudimentary fashion. And that helps George continue the life of his relationship with his wife. That's why the emotional part exists. He knows that Odi is a machine, but he also is grateful to anything that helps keep his memories of his beloved alive. So he allows himself the responsible pleasure of projecting onto Odi some of the feelings, but at the same time, he always differentiates between real and unreal. So it's an interesting question for all of us: how much are we going to let ourselves feel about machinery, when in fact the machinery is there to be an extension of a far more complex computer, which is the computer of our biochemistry, our bodies? It's a big question.
Do you think we as humans can love a machine?
WILLIAM: I don't think you can really love a machine in the way that you would love a human being, unless, and it may sound flippant to you, the machine becomes human. So I think that's our task. If you want to have as fulfilling a relationship with a machine as you do with a human being, then you'd better make sure that that machine is as fulfilled, or potentially fulfilled, as a human being is, and that would be our task. I mean, if the machine were more human, it would make sense. So are we going to have the audacity to make machines more human, which means, of course, the great cost of giving machines the capacity for suffering and surprise?
This series is going to bring up a lot of questions like these.
WILLIAM: I hope it does. I hope it does for the American audience, as it has already done so beautifully with the British, because they responded very loud and clear to it. And I'm hoping the Americans do the same.
What was it you found challenging about portraying this character?
WILLIAM: I find it more challenging when I'm asked to play characters that aren't so interesting, which I usually refuse. I can't say that this was challenging, because I was so furiously in love with it. So I just went to work very excited every day. I didn't feel challenged in the sense that I was worried or that there was an impediment. There was no impediment here, unless it was the standard impediment of not having enough time to prepare, which is a great one. So that would be the challenge, the standard idea of preparation. But in this particular case it was done in Britain, and with that comes the culture of theater, which I come from, so there was a lot more possible there for me, lots of levels of communication.
Was there anything then about George Millican that you brought to the character that may not have originally been in the script for him?
WILLIAM: You can add anything you want as long as it doesn't contradict anything that's there. That's the rule. The rule is you can invent anything that doesn't contradict the truth of the character as described. And no character as it exists on the page, in any script I've ever read, is a large percentage of its potential, because in a good script they leave you creative room. So they didn't write down his hairdo, so I did that. There are lots of things I invented about him, using my own personality traits and other ones that I invented for him, but I didn't contradict anything on the page.
You being such a fan of sci-fi, do you think the three laws of robotics that Asimov laid out are enough for machines to develop human qualities, to become human? And is the show going to explore whether your character George, as a creator of the Synths, has an elevated status or potential control over the Synths?
WILLIAM: I don't think the three rules are enough, no. I think they're the most fundamental general guidelines. I myself think that if Synths are to be allowed to become, or insist on becoming, sentient, that sentience will be a function of a consciousness that is in itself a function of very complex ethical interactions, ethics that are somehow or other transcribed into the root files of the hardware and software that go into the huge conference call of their mind. I think that the rudiments, the basic rules, are great fundamental ethical guidelines, but I think that the real interaction of ethics is as complex as a 1,500-year Buddhist conversation. And I think that's where consciousness actually comes from: every human being has thousands of voices in their mind and spirit, interacting to create, in a sense, the being of a human being. And I think that if this singularity of sentience were to take place, it will take place as a matter of extraordinarily complex comprehension of all the interactive elements that go into the thing we call "consciousness." And I think that's quite a long ways off. I think it will include a dimension perhaps as yet unforeseen, or at least unforeseen by some, which is that I believe the senses are very much a part of that interaction, that immense conversation that takes place within all beings. I think each one of those senses has, in an allegorical way, a mind of its own and comes to the party replete with its own genetic memory of everything that happened through creation. So I think that smell and sight and sound will all be sitting at the great round table of consciousness, as well as other theoretical components that may seem more complex but, in my understanding, are not. I think that this thing we're calling the singularity, what we call the technological singularity (and there are other kinds of singularity), is approaching, and I think the bare beginnings of the conversation about it are starting now, as the rudiments of an immature computer technology show us hints of the future that may be coming, or some of the vast questions about it that may be coming. So I think that as we map the mind, and by the mind I mean something very multidimensional in not only its physical pathways but its philosophical pathways, I think we're going to run into a marvelous, demanding, challenging nest of components. But I think Asimov in his absolute brilliance was able to reduce it to three principles that we can resonate with right now, and I'm glad for him. I'm glad for his existence.
You have said previously that you don't play people, you go for the character. Could you elaborate on that statement and how it pertains to your portrayal of Dr. Millican?
WILLIAM: Well, I was just talking about the Asimov protocols and how they break down into three elegant, simple, vast ideas. I would add one more note to the comment that I made about character. I do go for character, but the character as a function of the entire play, the entire screenplay. So really what I want when I'm reading a screenplay is to have the feeling, when I'm finished with it, that I would basically like to go and play any character they offer me, or even go for coffee on the film set; that I want to be part of that project. So that's the first criterion for me: do I want to be part of the whole thing?
With George and Vera, it's almost like she's not only running the household but running George a little bit. Can you talk about that aspect and how the machines on the series play into it?
WILLIAM: I think it raises a big question about whether machines are going to be used to inflict a police state on people who are not in agreement with the use of them. As in a police society, it is the responsibility of every civilian to resent the state over-controlling an individual's existence, in lots of different ways, at lots of different stages of life. Just recently, for instance, in England, they agreed to allow three different portions of genetic coding to be included in one in vitro fertilization. That's a law now. There are a lot of people protesting against it. But the fight for it, which prevailed, was based on an argument that there would be less likelihood of painful mutation in the egg when fertilized, because it reduced by a very large percentage the chance of mutation or handicap. And at the same time, a lot of people are asking: are we going to be creating an animal-husbandry state, a boutique-genetics state, and are we allowing evolution, or God, to do the designing of our species? So there are huge questions about this, and when Vera walks in the room, she's like the stereotype of the police state meddling in your life when you're losing some of your physical capacities but may not be losing them mentally. So I think the series is designed to raise a lot of questions that it doesn't have the presumption to answer finally. I think this is what makes it a good series, or part of what does.
In HUMANS, your character is an older, retired doctor who has both physical and memory issues. Will the show offer a conversation about the value of wisdom and age, and how that reflects upon humanity? And how does bringing in machines, which are all new and shiny, play into that conflict of the new versus the old?
WILLIAM: I don't know how it's going to go with the series, because I don't know what they have planned for it. Anything that this series does now or in the future, I'd want to be part of, if they ask me to be. I can't imagine that the question you're asking, which is an essentially important one, wouldn't be part of the game. I can't imagine that all important questions would not be there. I think that's really kind of what it's all about.
What do you think is the one thing that separates us from A.I.s today, and will it always?
WILLIAM: I don't think it necessarily always will, if the components of consciousness and memory are brought together with as much reverence as nature has created us with. If you ask carefully framed questions with a lot of information, you're more likely to come up with some reasonable [equivalent]. So I do think there is certainly a part of a potentially sentient machine that would probably be easier to accomplish than consciousness itself, and that would be the library part, the history part, the access-to-information part. I think that's something that right now is more developed than the collation or the synthesis part, or not synthesis, maybe that's not the best word, but the collation or the interpolation part. In other words, I think accessing information is more possible at the moment than analyzing it well. I think the algorithms for analysis are the ones that you have to be most watchful about.
It's as scary as "big data" today.
WILLIAM: Yes, exactly right. And then you have this issue between notions of privacy, which wouldn't necessarily be what someone in the NSA would be afraid it is, which is an indulgence or a right for a few people to harm many people. It could also be that without the essences known as privacy, you won't be able to create the bubbles of quiet and freedom in which human imagination can dare to go places it hasn't been before. That function of the notions of individuality hasn't been talked about very much. Most of the ones that are talked about are the ones that invoke the anarchic instincts or the indulgent pleasures, the ones that are irreverent and irrelevant to human society. But the capacity to go where we haven't been before usually travels in a smaller vessel, the individual vessel, versus the mass level, which is a structural vessel. They're both necessary, but if a society is defined by security at the expense of innovation, or safety at the expense of love, love of the whole at the expense of love of the individual, I think you're cutting off half of the horizon.
To find out more about Dr. George Millican, his struggle to protect his Synth Odi, and his fight to keep his life under his own control as the state and his new Synth Vera try to impose their rigid rules on him, be sure to tune in for all-new episodes of HUMANS on Sunday nights at 9:00 p.m. on AMC.
SENIOR ENTERTAINMENT REPORTER | Tiffany covers events such as San Diego Comic-Con, WonderCon and press junkets, as well as events at the Paley Center in Beverly Hills. She has a great love for television and believes that entertainment is a world of wondrous adventures that deserves to be shared and explored. Tiffany is one of the newest members of the prestigious Television Critics Association and is happy to share her passion for television shows with an even wider audience of fans and her fellow critics.