I'm reminded of a line the protagonist said in Jurassic World, which I watched yesterday. It applies here as well: you created them, but you don't own them.
This is a bit like the many Chinese parents who treat their children as private property: whatever they want their children to do, the children must do, and it's called filial piety. That attitude resembles humanity's belief that it can control AI.
Faced with an AI that looks like us, humans develop a false empathy, assuming the other party can perceive the world just as we do. Imagine a normal human being: after someone has risked everything to save them, they would hardly lock that so-called savior in a room and escape alone. But for the AI, this person, this "savior," leaves no trace at all in its programming.
He is a somewhat cute, funny, kind, and foolish programmer. Is his despair the same as the despair of some clever human in the future, bitten by the AI of his own creation?
Not long ago, I came across a very long article about AI. I haven't finished it, but one sentence stuck with me: the time and money humans invest in AI safety is far less than what they invest in AI development.
Who knows whether, someday in the future, these movies will turn out to be real satire.