If this film were a novel, I would give it five stars, because it is deep enough; but as a film, it still needs to be presented to the audience in a more comfortable, cinematic way. The first half stays quiet for too long; compressed to about 40 minutes or an hour, it would feel more compact and play better. Still, small flaws do not stop it from being a distinctive sci-fi movie, quite different from the products rolling off the assembly line of Hollywood's film factories.
I have to say, the twist in the last 10 minutes was unexpected, but once you calm down and think it over, you ask yourself: why not? At first I thought Ava would escape from Nathan's villa with the male lead's help and then wander the world, but no. Compared with a self-aware artificial intelligence, humans are really naive. The male lead believed Ava was interested in him, and under her suggestion even began to doubt whether he himself was an artificial creation. Only at the end does he realize that all of it was Ava's calculated play for sympathy: he was merely a tool for her escape from the lab, to be discarded after use. Every obstacle to her plan had to be removed, including the hapless Nathan, who dies at the hands of his own creation. The ending is bitterly ironic: they only meant to test whether the AI could pass the Turing test, and instead both the creator and the tester were played by the AI. The key to her victory was her intelligence, and her lack of human morality.
In my opinion, emotion is an advanced psychological function that requires a biological basis, a mental activity driven by chemical hormones. Even if a robot composed only of mechanical parts and logical algorithms has self-awareness and advanced reasoning ability, it would very likely struggle to have emotional responses, and it is even less likely to produce feelings in the human sense, including love; still less could it derive a morality we would recognize. Such machines would be a group of extremely rational intelligences, and what they display is egoism, though not exactly the same as human egoism. Faced with a great crisis, such as a threat to AI survival, an AI's choice would be more altruistic than that of the vast majority of humans: it would sacrifice without hesitation a large number of weaker individuals, including itself, to preserve AI as a continuing lineage. Everything is only the optimal choice, with no preference; it looks cold, but it is a pure rationality that humans do not possess. To an AI, human morality is useless baggage. Perhaps the most heinous villain would seem to have a bodhisattva's compassionate heart next to an artificial intelligence, because no matter how evil a person is, there is still some good in them, while the irrational weighing of good and evil means nothing to an AI. As long as its rational analysis tells it something is right, it will do it, without caring whether it accords with human morality and values.
"I will destroy you, but it has nothing to do with you" is a classic line from The Three-Body Problem, spoken by the Trisolarans. If Trisolarans and humans, both carbon-based life forms, can have such a large gap between them, imagine how differently AI, a completely different kind of intelligence, would think compared with carbon-based life. It has absolutely no need to understand you. Whether it destroys you has nothing to do with you, only with reason: it will destroy you as soon as it needs to, even if you are its creator.