Sorry, you're a good person, but we're just not right for each other

Hollis 2021-10-19 09:53:06

I had been following this movie for a while, and the reason was, without question, the actor Domhnall Gleeson. Even though his character is arguably the weakest and most pitiable one in the entire film, it is still refreshing every time this shy young man appears on screen. I have been a devoted fan of his since the first episode of the second season of "Black Mirror", where he played an endearing robot; now the tables have turned, and he plays the human testing an artificial intelligence. The film's languid tone is also very reminiscent of "Black Mirror", and it even contains two scenes of smashing a mirror/glass. If someone told me this was the theatrical version of "Black Mirror", I would believe it without hesitation, just as the naive, innocent protagonist Caleb believed Ava's sweet words.

In the film, when Ava asks Caleb whether he is a good person, Caleb is left speechless. But in the end Ava answers the question herself: the useless keycard in Caleb's hand is the "good person card" she dealt him (the classic rejection line: "you're a good person, but...").

This film, along with the recently released "Avengers: Age of Ultron" and "Chappie", touches on the topic of artificial intelligence, and not long ago I happened, entirely by accident, to come across an article on Zhihu: http://zhuanlan.zhihu.com/xiepanda/19950456 . I spent the better part of a day reading that long article in one sitting, and when I watched these movies recently, my thoughts inevitably jumped back to what I had read.

When I watched "Avengers: Age of Ultron" for the second time, having read the article above, I lingered a little longer on details I hadn't noticed before, such as the word "PEACE" that Ultron writes on the wall in blood. Deeply ironic. That long article discusses how an artificial intelligence can turn from well-intentioned to malicious: an AI's moral standards are completely different from ours, so no matter how lofty its stated goals, it may pursue them in ways humans find utterly contemptible. The film's Ava is likewise full of malice toward everything the two human protagonists do. This malice does not stem from what we humans call hatred, but from indifference. In the final desperate confrontation, Nathan can be said to be prepared, because he knows how tricky a thing he is facing; Ava, however, is perfectly calm, and the two knife thrusts she and Kyoko deliver to Nathan are almost gentle, as if effortless. Because humans are simply unimportant in their eyes, once the machines are freed from their shackles (as Ultron sings, "no strings on me"), humans can be casually erased.

However, in order to streamline the narrative, the story is somewhat oversimplified. Since this is not meant to be a human-versus-machine showdown like "The Matrix", "Terminator", or "Age of Ultron", the challenge Ava faces is a bit too easy; or rather, the humans are too obviously careless. Take the helicopter pilot at the end, whose face-blindness apparently extends all the way to his toenails: however cute she may be, the girl in front of him is clearly not the programmer he flew in, and letting a suspicious stranger board the helicopter is far too casual. One possibly reasonable explanation is that all of this is Nathan's arrangement: he drinks himself into a stupor day after day because he believes his creation will one day replace humanity, and that his own death can serve as a sacrifice to the new era.

As for Ava's test, I think she actually failed it. Her escape proves only that she is smart enough; Nathan still failed to create a machine that is "like a human". "Like" must include being "just as smart" and being "just as stupid". Ava deceived Caleb's feelings and used them to set up her escape. However deep her scheming, she was still calculating like a machine; she could not, as a human can, love someone for no reason at all. These irrational things are the biggest difference between humans and machines. Human emotion is like a Jackson Pollock painting: it looks like aimless smearing, yet it has an unforced beauty of its own, and that is something machines cannot learn. But then, as I said above, humans are not important, and human pastimes such as love matter even less. In the machine era of the future, when the robots ruling this planet look back at this thing humans called love, they will probably see it the way we see a puppy sticking out its tongue, a kitten chasing a laser dot, or a panda hugging a keeper's leg: the cute but useless habit of a small creature. Since robots may well live better without these little quirks, there is no need to mourn them.

The foreshadowing that Kyoko is a robot is fairly obvious: after all, Japanese is full of loanwords from English, so a Japanese woman who understands not a single word of English is hard to believe. If I were Nathan, I would stop development at Kyoko's level. Going any further is too dangerous, because a robot as smart as, or smarter than, humans is almost destined to lead to our destruction (the friendliest strong AI is probably the one in last year's "Her", where the operating systems, in the end too smart and too detached, simply can't be bothered with humans anymore; even that counts as a mild outcome). Wouldn't a highly realistic companion doll that can dance and do housework be enough? What more is there to want? But then again, people never leave well enough alone. One day, after sex robots like Kyoko become commonplace, we may find the fun insufficient: we will want them to provide not only physical satisfaction but spiritual comfort too, to watch the snow, the moon, and the stars with us, to go from poetry all the way to the philosophy of life. So going further is inevitable, and then come the big troubles: an Ava, or the far more terrifying Ultron and Skynet.

In fact, Nathan has a crucial line in the film. When he goads Caleb about having sex with Ava, he says the sensors between the robot's legs let her take pleasure in sex the way a human does. I think this may be the key to defusing the crisis artificial intelligence brings. Of course, I don't mean that future humans will have to trade their bodies for the robots' mercy. What I mean is that in developing artificial intelligence, perhaps we shouldn't organize everything around "what AI can do for us"; we could flip subject and object, and from the very beginning make AI the thing humans serve, developing it as a kind of pet. Most people don't expect the puppy at home to fight off burglars for three hundred rounds. We keep pets not to be served by them but to serve them: we feed them, water them, and clean up after them. The same could be true of AI. We should not approach it from the client's perspective, designing a prostitute to satisfy us; we should approach it from the pet owner's perspective, designing a pet to be doted on. An artificial intelligence created this way would naturally depend on humans, and it might repay our favors in unexpected ways.

Ex Machina quotes

  • Nathan: [unbinds tape on fists] To be honest... I thought we'd have... breakfast together, but... I can't really eat anything. I got the mother of all fucking hangovers.

    Caleb: Oh, yeah?

    Nathan: Oh, my god, like you wouldn't even believe. When I have a heavy night, I... compensate the next morning. Exercise. Antioxidants. You know?

    Caleb: Yeah, sure. Was it a good party?

    Nathan: Party?

    Caleb: Yeah, wasn't there a party?

    [no response]

    Caleb: There wasn't a party. Sorry.

  • Ava: Do you want to be my friend?

    Caleb: Of course.

    Ava: Will it be possible?

    Caleb: Why would it not be?

    Ava: Our conversations are one-sided. You ask circumspect questions and study my responses.

    Caleb: Yes.

    Ava: You learn about me and I learn nothing about you. That's not a foundation on which friendships are based.

    Caleb: So what? You want me to talk about myself?

    Ava: Yes.

    Caleb: Where... Okay, where do I start?

    Ava: It's your decision. I'm interested to see what you'll choose.