In 2016, Tristan Harris, a design ethicist inside Google, decided to leave. By then he had realized that Internet technology companies such as Facebook, Google (YouTube, Google Search), and Twitter were, in pursuit of commercial interests, using algorithms to make their products deliberately addictive, and that this was unethical. It had become:
"A major threat to human survival."
He decided to fight back. During Harris's tenure, Google had its well-known "Don't be evil" motto, but that principle did not stop the algorithms from drifting off course in pursuit of commercial interests. Two years after leaving, Harris gathered former and current technical experts from large companies such as Facebook and Google and, in 2018, founded an organization called the Center for Humane Technology, whose purpose is to:
Fight the addictive design practices of Silicon Valley Internet companies.
Besides Tristan Harris, those involved in the center's work include a former Facebook operations manager; Lynn Fox, former communications director at Apple and Google; former Facebook executive Dave Morin; and early Facebook investor Roger McNamee.
Around 2017, many senior executives at Silicon Valley companies came to feel that the algorithm-driven Internet platforms they ran faced a moral and ethical crisis, and they left their companies. In 2020, Netflix released a documentary about this, The Social Dilemma (titled "Surveillance Capitalism: The Smart Trap" in Chinese). In the film, Tim Kendall, former president of Pinterest and former director of monetization at Facebook, says he worries that a world manipulated by algorithms could ultimately lead to civil war in America; a Google engineer worries about the moral decay of the entire Internet industry...
Years after his own departure, Chris Hughes, who co-founded Facebook with Mark Zuckerberg, criticized the company: "Zuckerberg is a good person. But what makes me angry is that his focus on growth led him to sacrifice safety and ethical bottom lines for clicks."
What caused such serious concern, and where exactly does the problem lie?
01
In 2018, Twitter CEO Jack Dorsey said at a hearing that when he founded Twitter 12 years ago, he did not expect that Twitter would become what it is today...
This echoes the plot of the second episode of the fifth season of Black Mirror, also produced by Netflix. The protagonist, Chris Gillhaney, checks a social media notification on his phone while driving with his girlfriend and crashes; she dies at the scene. After recovering, he works as a driver for an Uber-like service, picking up passengers every day outside the offices of a social media company called Smithereen. One day he picks up one of the company's employees, kidnaps him, and uses the hostage to demand a phone call with Smithereen's CEO, working his way up the chain until he gets through. All he wants is to tell the CEO how the social network has hijacked people's lives and caused tragedy. The CEO's answer is very similar to the Twitter CEO's:
When I founded this company, I didn't expect it to become like this.
The Smithereen CEO explains a certain logic: once a product has taken shape, its direction is no longer his to decide. Every update and feature improvement is the result of what users like and want, which is not necessarily what the CEO wants.
Addiction has become the goal of almost every social app.
The Social Dilemma interviews many former executives of, and early investors in, Silicon Valley technology companies.
Tristan Harris, the former design ethicist at Google, says that while working on Gmail he realized a problem: never before in history had the decisions of roughly fifty engineers and designers in California, aged twenty to thirty-five, been able to affect two billion people and steer them toward thoughts and decisions they never anticipated. When a user wakes up in the morning, Gmail tells him how he should work; the meaning of work, culture, and life comes from a stealthy third-party manipulation. And inside Google, no one wanted to make Gmail any less addictive.
Later he made an appeal: engineers have an ethical responsibility to examine and discuss the problem of addiction.
Many engineers agreed with the appeal, and then nothing more came of it...
Roger McNamee, an early investor in Facebook, says that for the first fifty years of Silicon Valley, companies like IBM, Google, and Intel built software or hardware products and sold them to customers; their business models were simple and sound. But for the past ten years, Silicon Valley's largest companies have been in the business of selling their users.
Before the Internet era, people invented things that served the public: bicycles, cars, and calculators were simple, neutral tools. With the Internet, technology slowly evolved into something that asks, lures, and even seduces, a set of tools for manipulating people and profiting from them. The technological environment shifted from one that people actively use to one they passively accept, one that addicts and manipulates them. Social media is no longer a bicycle or a car waiting in the garage to be ridden; it is something that learns, analyzes, manipulates, and addicts.
Executives at major Silicon Valley companies are acutely wary of this problem. The children of Bill Gates and Steve Jobs, for example, were kept almost entirely away from electronic devices when they were young.
This is exactly what a 2018 New York Times article observed: "Technologists know what phones really are, and many of them have decided to keep their own children away from these things."
Lei Man once said that the Internet is a "feeding" society: content is fed to people after big-data analysis and machine-learning decisions.
This is the "alienation" that Marx, Freud, and others described: man passively accepts the world and himself as objects; "things stand above man." It is like the dispatch algorithm standing above the food-delivery courier, or the assembly line standing above the factory worker.
02
Internet companies now offer a large number of free products: Facebook, WeChat, Douyin, Twitter. No one pays for the social products they use. The ones who do pay are the advertisers; the advertisers are the customers of these social apps, and the users become the goods being sold. Hence the saying:
If you are not paying for the product, then you are the product being sold.
So the business model of companies like Facebook and Twitter is to make you as addicted as possible, so that you spend more time in their apps, spending the time of your life.
How much is life worth?
Harvard Business School professor Shoshana Zuboff says in an interview that, to sell more ads, Facebook and Twitter have to offer their customers certainty. What commercial advertising sells is certainty, and that "certainty" comes from the data analysis behind user addiction.
This is the core of the documentary's Chinese title: surveillance capitalism.
To obtain that data, these companies monitor every place a user goes, every preference, every piece of behavioral data: unlimited tracking, analysis, and evaluation, in unlimited pursuit of profit.
Do you like watching NBA videos? Fine; before you watch the next one, you will be shown an ad for basketball shoes.
None of this data needs to be looked at by humans; machines learn from it automatically and produce predictions. This is why, as the Twitter CEO said, these products have become something their founders never expected at the start.
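As a rough, purely illustrative sketch of the loop described above (track behavior, build an interest profile, target an ad), here is a minimal Python example. Every name in it, from log_event and pick_ad to the ad inventory, is invented for illustration and does not describe any real company's system:

```python
from collections import Counter, defaultdict
from typing import Optional

# Hypothetical event log: user_id -> counts of content categories watched.
watch_history = defaultdict(Counter)

# Hypothetical mapping from a top interest to an ad in inventory.
AD_INVENTORY = {
    "nba_highlights": "basketball_shoes_ad",
    "cooking": "kitchenware_ad",
    "travel_vlogs": "airline_ad",
}

def log_event(user_id: str, category: str) -> None:
    """Record that a user watched one more video in a category."""
    watch_history[user_id][category] += 1

def pick_ad(user_id: str) -> Optional[str]:
    """Return the ad tied to the user's most-watched category."""
    history = watch_history[user_id]
    if not history:
        return None  # no behavioral data yet, nothing to "predict"
    top_category, _ = history.most_common(1)[0]
    return AD_INVENTORY.get(top_category)

# A user who keeps watching NBA clips gets shown a basketball-shoe ad.
log_event("u42", "nba_highlights")
log_event("u42", "nba_highlights")
log_event("u42", "travel_vlogs")
print(pick_ad("u42"))  # -> basketball_shoes_ad
```

In a real system the simple counter would be replaced by a trained model, but the economics are the same: the better the profile, the more "certain" the ad sale.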
But what did the executives themselves do? They built the models that predict the trajectory of people's behavior, and they wrote the code that learns on its own.
All of this is in pursuit of certainty: selling users to advertisers as reliably as possible. Shoshana Zuboff calls this market:
The largest market in human futures that has ever existed.
We used to trade pork futures and diamond futures; now human beings themselves have become the futures.
The consequence of addiction is that algorithms push things you had never thought about at you, make them part of your thinking, and leave you feeling that they were your own thoughts all along.
Executives from companies such as Facebook and Uber have all taken the same course at Stanford University: how to use technology to persuade users. This practice is called:
The art of persuasion.
This is an utterly deliberate engineering of people's behavior. Every pull-down or pull-up to refresh is another algorithmic push. These techniques reach deep into the brain, planting unconscious habits and programming people at a deeper level.
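To make the refresh mechanic concrete, here is a small hypothetical sketch, assuming the feed is simply re-ranked by predicted engagement on every pull-to-refresh; the Item fields and the scoring below are invented and are not any platform's actual ranking:

```python
import random
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    predicted_engagement: float  # e.g. the output of a trained click/watch model

def refresh_feed(candidates: list[Item], slate_size: int = 5) -> list[Item]:
    """Re-rank the candidate pool on every pull-to-refresh.

    A small random perturbation makes each refresh slightly different,
    so the gesture pays off unpredictably, like a variable reward.
    """
    def score(item: Item) -> float:
        return item.predicted_engagement + random.uniform(0.0, 0.1)
    return sorted(candidates, key=score, reverse=True)[:slate_size]

# Every swipe-down yields a fresh slate ordered by predicted engagement.
candidates = [Item(f"post_{i}", random.random()) for i in range(50)]
print([item.item_id for item in refresh_feed(candidates)])
```

The unpredictability is the point: because each refresh might surface something new, the habit of pulling down keeps getting reinforced.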
Facebook once ran a "massive-scale contagion experiment" to test whether subliminal cues on Facebook could get more people to vote in the US midterm elections. The experiment's answer: yes.
This influence never reaches the user's conscious mind; it works on the subconscious, which means users are not aware of it at all.
Google and Facebook run many experiments like this on users every year; the users are the lab mice.
03
All this addictive big-data analysis and algorithmic pushing has turned America's Generation Z into victims.
According to data presented by New York University psychologist Jonathan Haidt, starting in the 2011-2013 period the number of girls hospitalized each year for self-inflicted injuries rose sharply.
A common pattern: whenever people are alone with nothing to do, they look for something to fill the time. Now that this gap is occupied by social networks and algorithmically pushed short video, people's ability to handle their own emotions is degrading.
Cathy O'Neil is a data scientist. An algorithm, she says, is an opinion embedded in code; it is not objective. Algorithms are guided and optimized by whichever business model succeeds.
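A tiny sketch of what "an opinion embedded in code" can mean in practice: the same candidate posts ranked under two different objectives yield two different feeds. The fields and weights below are made up; the point is only that choosing the objective function is itself an editorial opinion:

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_clicks: float    # engagement signal
    source_reliability: float  # 0.0 (unknown) .. 1.0 (well-vetted), hypothetical

posts = [
    Post("outrage_rumor", predicted_clicks=0.9, source_reliability=0.1),
    Post("local_news",    predicted_clicks=0.4, source_reliability=0.9),
    Post("cat_video",     predicted_clicks=0.7, source_reliability=0.8),
]

# "Opinion" #1: maximize engagement; reliability is irrelevant.
by_engagement = sorted(posts, key=lambda p: p.predicted_clicks, reverse=True)

# "Opinion" #2: trade engagement off against reliability.
by_quality = sorted(
    posts,
    key=lambda p: 0.3 * p.predicted_clicks + 0.7 * p.source_reliability,
    reverse=True,
)

print([p.post_id for p in by_engagement])  # the rumor ranks first
print([p.post_id for p in by_quality])     # the rumor ranks last
```

Neither ranking is "objective"; each simply encodes a different judgment about what matters.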
Usually, at a company like Facebook that affects billions of people, only a handful of people understand how a given algorithm actually works. Set against more than seven billion humans, it is fair to say that humanity has lost control of these algorithmic systems.
Roger McNamee says it is like a Truman Show with 2.7 billion inhabitants, each living in a world of their own, and no one discovering the real nature of the world they live in. In fact, even more people may be living in their own Truman Show; in China, similar products include Toutiao and Douyin.
Every algorithmic push represents a commercial interest.
In China, middle-aged and elderly people like to watch inflammatory fake-news videos. It is not that the producers necessarily believe these things; it is that fake news spreads faster than real news, and the traffic can be monetized. The videos are free to watch, and the person watching is the commodity being sold.
Tristan Harris proposes a framework: the first stage of technology overpowering humans is overpowering human weakness, and the result is addiction, which amplifies anger and vanity and wears down human nature. The second stage is overpowering human intelligence.
In an algorithm-fed world, sharpened conflict comes easily. People's views grow more and more irreconcilable, because everyone learns about the world from a different feed, and different feeds deliver different information and different knowledge. When two Americans who have been mobilized onto the street by the same social network argue about the same event, they quickly discover:
Oh, it turns out we didn't see the same information.
A well-known example of this deepening divide is flat-Earth theory, which has been recommended hundreds of millions of times on YouTube. Because some people like it, the algorithm finds new ways to push "the Earth is flat" content at them every day, until "the Earth is round" starts to look to them like the conspiracy, and more and more people come to believe the conspiracy theory.
The story reached its grim peak in February 2020, when the American inventor Michael Hughes, who wanted to prove that the Earth is flat, fell to his death in the test launch of a homemade manned rocket.
04
An algorithm recommends only what people are interested in. It has no sense of morality; its only motive is commercial interest. It can push falsehoods and conspiracy theories at people over and over again. Today you can come to believe the Earth is flat; tomorrow, that drinking disinfectant kills the novel coronavirus.
In 2016, thousands of people in the US believed a piece of fake news: that an underground child-abuse ring involving top Democrats was hidden in a Washington pizzeria. The algorithms kept the story fermenting and kept recommending that people join "Pizzagate" groups, until more and more people believed it was real.
The episode culminated when a man walked into the pizzeria with a gun to rescue the "persecuted children" from its back rooms and was arrested by police partway through.
In the United States, pandemic-era rumors, amplified by algorithmic promotion, spread faster and faster, and everyone clung to their own version of events. People marched in the streets, some shouting "the COVID-19 vaccine is a conspiracy," others shouting "human genes are not suited to the COVID-19 vaccine." Everyone was convinced by the information they themselves had seen. A government official called it:
Tribalism in the new era.
Under the influence of algorithmically pushed information, Pew Research Center data showed that in 2017 the political divide between the Democratic and Republican parties reached its widest point in twenty years.
In Myanmar, starting in 2017, Facebook saw an unprecedented wave of emotionally inflammatory speech. That speech was pushed again and again to the ultra-nationalists who wanted to see it, breeding serious crimes and racist persecution and ultimately driving 700,000 Rohingya Muslims to flee Myanmar.
Jared Diamond, author of the book Collapse, explained how societies fall apart: a society collectively pushes itself into an unstable, destructive state until it finally collapses. One of Diamond's students once asked him a very interesting question:
In the seventeenth century, before Easter Island's ecosystem collapsed, what was going through the mind of the islander who cut down the last tree at that moment?
It's time for us to think about this issue too.
This article is from the public account "New Financial Vena."