Netflix's Pitfalls

Aurelio 2022-03-21 09:02:24

Suppose there is a traditional car company. A series of stages (R&D, testing, manufacturing, launch, and after-sales) makes up the life cycle of the car as a product. Every stage generates a huge amount of data, and all of it should be recorded to support later research and improvement. Before or during R&D, or after launch, the relevant departments of the company, such as the marketing department, conduct market and user surveys to understand user needs and feedback, so as to provide better products and services and guide subsequent improvements. This is the traditional product development process. Car companies want to develop the best possible products so they can profit the most from them. The process is familiar and sounds reasonable.

Products from tech companies go through a similar process. One big difference is that users are involved almost before the product takes shape, not only when it is tested. Every interaction between users and the product is recorded and becomes the basis for development and improvement. This difference causes many problems. Traditional industries usually obtain user data through questionnaires, phone calls, emails, and so on. With these methods, users already know, as they answer, that the information they provide will be used for the company's product development; they are informed and have consented. The data recorded by tech companies, by contrast, is collected throughout every process in which a user interacts with a product, without the user knowing that it will be used in development, let alone consenting to it. At the same time, the user information that traditional products require is often relatively public, such as which model you like or how many people will use it; the privacy involved is low. But what the vast majority of technology products need is relatively private information, such as what you searched for or who you are interested in. Users' right to know and right to privacy at technology companies are legal and ethical issues that surfaced years ago, and the various parties have been negotiating for years to find a solution.

This whole process is called surveillance in the film, and to make the concept more convincing, the film stages scenes in which actors simulate the process behind the product, as if each user had a dedicated team with a bank of screens monitoring what he did, saw, and even thought. Judging by these staged scenes, it is indeed surveillance, and it is very scary. But considering Facebook's roughly 3 billion monthly active users, it is impossible for each user to have a team watching his every move on big screens and speculating about his state of mind. The comparison is untrue and exaggerated, and does not conform to the principles of documentary filmmaking at all. Besides, guessing what others are thinking involves being right or wrong, yet the film presents the models behind this process as accurately predicting users' psychology and even changing the users themselves. In other words, the user stands naked and unsuspecting while the adversary watches from the dark, liable at any moment to become the fish on the chopping block. If these staged simulations are set aside and the process is compared with traditional product development, the so-called surveillance is really the recording and collection of data. Recording and collection are also uncomfortable, because they touch users' right to know and right to privacy, which tech companies need to respect, but they are far less chilling than being monitored. The term surveillance sets the tone for this "documentary": it is meant to terrify. To intensify the fear, the film finds all kinds of people to describe how technology companies trade in "human futures."

Normally, to be fair and objective, a documentary interviews many people from different perspectives. When many different perspectives collide with each other, the limitations and disadvantages of each one are exposed, and those limitations can be explained and corrected by the other perspectives, and so on. This way the audience sees a more complete picture.

Behind Netflix is a series of models that calculate how to advertise to potential users, what to recommend to them, how to get them to subscribe, and more. From this perspective, Netflix is also a technology company as described in the film; in fact, it is one, and a top one at that. One big difference between Netflix and other technology companies is that the others, as platforms, do not produce opinions themselves; the opinions on them come from the differing views of their many users. Netflix, however, produces films, so it produces opinions, and more than that, it produces emotions. Netflix's opinion and emotion in this film are that tech companies are spying on you, and that this is quite scary. Objective recording is dramatized as subjective surveillance, so from the beginning the film reserves no space for viewers to think for themselves; and beyond this one opinion and emotion, the film, as a Netflix product, does not introduce multiple viewpoints the way other platforms do, giving users room to judge for themselves. On the contrary, Netflix exploits the fact that fear and anxiety spread quickly among users to make this product viable. So Netflix stands on the moral high ground in this product and criticizes the behavior of other tech companies, while in the shadows it does the same thing, or worse.


Extended Reading
  • Desiree 2022-01-03 08:01:47

    Netflix rounded up this crowd to point fingers at others behind their backs, but didn't count on viewers skipping to the next video after five seconds.

  • Muhammad 2022-01-03 08:01:47

    Through massive data and records of individual user behavior, individual end users are no longer special. Your behavior can be predicted, your psychology has been mastered, and you have nowhere to hide. To the IT giants, you are just a tool for profit. What you think you want to buy is actually what the giant wants to sell, and what you want to do is what the giant suggests you do. The latter part of the film shows that in the online environment false information spreads faster, and AI-driven recommendation accelerates this. The effect of the recommendation system is that senders of information become polarized, while the information received is no longer comprehensive.

The Social Dilemma quotes

  • Justin Rosenstein - Facebook, Former Engineer: We live in a world in which a tree is worth more, financially, dead than alive, in a world in which a whale is worth more dead than alive. For so long as our economy works in that way and corporations go unregulated, they're going to continue to destroy trees, to kill whales, to mine the earth, and to continue to pull oil out of the ground, even though we know it is destroying the planet and we know that it's going to leave a worse world for future generations. This is short-term thinking based on this religion of profit at all costs, as if somehow, magically, each corporation acting in its selfish interest is going to produce the best result. This has been affecting the environment for a long time. What's frightening, and what hopefully is the last straw that will make us wake up as a civilization to how flawed this theory has been in the first place, is to see that now we're the tree, we're the whale. Our attention can be mined. We are more profitable to a corporation if we're spending time staring at a screen, staring at an ad, than if we're spending that time living our life in a rich way. And so, we're seeing the results of that. We're seeing corporations using powerful artificial intelligence to outsmart us and figure out how to pull our attention toward the things they want us to look at, rather than the things that are most consistent with our goals and our values and our lives.

  • Tristan Harris - Google, Former Design Ethicist: How do you wake up from the Matrix when you don't know you're in the Matrix?