Hello Zombies, a new article about your “independent” thinking.
Robert Epstein has been researching how the biggest tech companies influence human behavior and conducting extensive projects to monitor bias in these companies’ products, with a particular focus on Google.
Epstein, a senior research psychologist at the American Institute for Behavioral Research and Technology in California, called his findings “frightening” because of the tech companies’ ability to manipulate and change people’s behavior on a global scale.
“Now put that all together, you’ve got something that’s frightening, because you have sources of influence, controlled by really a handful of executives who are not accountable to any public, not the American public, not any public anywhere. They’re only accountable to their shareholders,” Epstein told Epoch TV’s “American Thought Leaders” host Jan Jekielek during a recent interview.
“And yet they hold in their hands the power to change thinking behavior on a massive scale, the power in close elections, anyway, to pick the winner in country after country after country.”
Epstein said that Google’s search engine shifted 2.6 million to 10.4 million votes to Hillary Clinton over a period of months before the 2016 election, and later shifted at least 6 million votes to Joe Biden and other Democrats.
“We calculated at one point that as of 2015, upwards of 25 percent of the national elections in the world were being controlled, being determined, by Google’s search algorithm,” he said.
Epstein’s comments follow rigorous studies spanning almost a decade, during which his team has documented ephemeral experiences of manipulation on Google’s and other companies’ platforms.
A man uses the new Google Pixel 4 smartphone during a Google launch event on Oct. 15, 2019, in New York City. (Drew Angerer/Getty Images)
He said ephemeral experiences, such as a flashing newsfeed, a search result, or a suggested video, are the ideal form of manipulation because they aren’t recorded and are hard to document.
“They affect us, they disappear, they’re stored nowhere, and they’re gone,” Epstein said. “People have no idea they’re being manipulated, number one, and number two, authorities can’t go back in time to see what people were being shown, in other words, how they were being manipulated.”
One of his team’s recent experiments involved randomly assigning people to groups, showing one group search results that favored candidate A and the other group results that favored candidate B. The goal was to see whether this would shift participants’ voting preferences, which Epstein expected to happen for only about 3 percent of participants.
“Very first experiment we ever ran on this, we shifted voting preferences by more than 40 percent. So I thought that’s impossible, that can’t be repeated. The experiment with another group got a shift of more than 60 percent. So I realized, wait a minute, maybe I’ve stumbled on something here.”
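To make concrete what a between-group “shift” of that size means, here is a minimal simulation sketch of the randomized design described above. It is not Epstein’s actual protocol or code; the group size, the SUSCEPTIBILITY value, and the way the shift is computed (the difference in support for candidate A between the pro-A and pro-B groups) are all illustrative assumptions.

```python
import random

# Illustrative sketch only: undecided participants are randomly split into two
# groups, one shown rankings favoring candidate A, the other rankings favoring
# candidate B. The "shift" is the difference in support for A between groups.

SUSCEPTIBILITY = 0.4  # assumed fraction of participants swayed by ranking bias

def simulate_participant(favored: str) -> str:
    """Return the candidate an undecided participant ends up preferring."""
    if random.random() < SUSCEPTIBILITY:
        return favored                    # swayed toward the favored candidate
    return random.choice(["A", "B"])      # otherwise still effectively a coin flip

def simulate_group(favored: str, n: int = 1000) -> float:
    """Share of a group preferring candidate A after seeing biased rankings."""
    votes = [simulate_participant(favored) for _ in range(n)]
    return votes.count("A") / n

if __name__ == "__main__":
    random.seed(0)
    pro_a = simulate_group("A")  # group shown pro-A rankings
    pro_b = simulate_group("B")  # group shown pro-B rankings
    print(f"Support for A, pro-A group: {pro_a:.1%}")
    print(f"Support for A, pro-B group: {pro_b:.1%}")
    print(f"Between-group shift:        {pro_a - pro_b:.1%}")
```

Under these assumed numbers the sketch produces a between-group shift of roughly 40 percent, the kind of figure Epstein describes; the real experiments, of course, measure the stated preferences of actual participants rather than simulating them.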
These findings are the greatest discovery in the behavioral and social sciences in over 100 years, according to Epstein.
Layers of Biases
He said there are two layers at which people can be manipulated. The first is the programmer deliberately designing the algorithm to skew results, ultimately to increase the company’s revenue; the second is neglect, in which algorithms are created and then left to run unchecked.
“Or could be maybe they’re just not paying attention. In fact, let’s call this, as I do in a new article I have coming up, algorithmic neglect, okay, so they’re just neglecting it.”
He said these algorithms don’t give equal time to candidates or to anything else; they simply push one item to the top of the list.
“So if computer programs are determining who runs the world, who runs many countries around the world, that can’t be good for humanity. They’re just not smart enough to make good decisions for us.”
Epstein said the programmers creating the algorithms have their own biases, and therefore the algorithms have preferences.
“Now 96 percent of donations from Google and other tech companies in Silicon Valley go to one political party. It happens to be the party I like, which is the Democratic Party. But the point is, there’s a lot of political bias in these companies.”
Dan Gainor of the Media Research Center confirms that.
“From the top to the bottom, these companies are overwhelmingly liberal, overwhelmingly pro-Democrat,” Gainor told Fox News. “At the top, they contribute to Democrat causes. At the bottom, they contribute to Democrat causes in overwhelming numbers.”
Likewise, data compiled by the Center for Responsive Politics show that at least 89 percent of Twitter employees’ donations during the most recent federal election cycle went to Democrats.
Another Significant Experiment
The strongest effect Epstein has been able to create is what he calls OME, the opinion matching effect.
As an example, he said companies such as Facebook, Tinder, and so on may have a campaign to “help” users decide who’s the better candidate. The company can put up surveys that ask what the person thinks about certain issues, and after the survey, the program will tell the user which candidate matches their opinions or preferences.
Epstein said his team has found that when, in experiments, they give people a quiz and then tell them how good a match they are for one candidate or another, “All the numbers shift in the direction of whichever candidate we said you match. They all shift. And the shifts we get are between 70 and 90 percent.”
The most critical part is that the experimenters don’t even look at the participants’ answers before telling them which candidate they match.
“So opinion matching is a fantastic way to manipulate people, because you can shift people very, very, very dramatically, and they have no clue. They do not suspect any kind of bias or manipulation.”
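As a way to picture the mechanics, here is a minimal sketch of the kind of opinion-matching quiz described above. It is purely illustrative and is not code from Facebook, Tinder, or Epstein’s experiments; the questions and the predetermined “match” are assumptions, and the one point it demonstrates is that the reported match never depends on the answers.

```python
# Illustrative sketch only: a quiz that collects answers but reports a
# predetermined "match," the pattern described above in which the
# participant's responses are never actually consulted.

QUESTIONS = [
    "Should taxes on high earners be raised?",
    "Should immigration levels be increased?",
    "Should the government expand healthcare coverage?",
]

def run_quiz(favored_candidate: str) -> str:
    """Ask the questions, ignore the answers, and report a fixed match."""
    answers = [input(f"{question} (yes/no): ") for question in QUESTIONS]
    # The answers are collected but deliberately unused: the reported match is
    # whatever candidate the operator chose in advance.
    _ = answers
    return f"Based on your answers, you are a strong match for {favored_candidate}!"

if __name__ == "__main__":
    print(run_quiz("Candidate A"))
```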
Next Steps
Epstein said the next step for this work is to carry out this monitoring on a larger scale, with the aim of eliminating the interference in our democracy, our lives, and our children’s lives.
“This is the solution, which is permanent, large-scale monitoring 24 hours a day in all 50 states, doing to them what they do to us and our kids. If we do it to them, if we monitor, we capture, we archive and we expose, OK, then they will stay out of our lives.”
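To illustrate the capture-and-archive idea in the most basic terms, here is a minimal sketch that fetches a page and stores it with a timestamp and content hash so it can be audited later. It is a toy under stated assumptions, not Epstein’s monitoring system, which relies on panels of real users’ devices; the URL, storage layout, and file format are all illustrative.

```python
import hashlib
import json
import time
import urllib.request
from pathlib import Path

# Illustrative sketch only: snapshot a page that would otherwise be an
# ephemeral experience and archive it with a timestamp and hash for later audit.

ARCHIVE_DIR = Path("archive")

def capture(url: str) -> Path:
    """Fetch `url` and archive the raw response with basic metadata."""
    with urllib.request.urlopen(url, timeout=30) as response:
        body = response.read()
    digest = hashlib.sha256(body).hexdigest()
    timestamp = time.strftime("%Y%m%dT%H%M%SZ", time.gmtime())
    ARCHIVE_DIR.mkdir(exist_ok=True)
    record_path = ARCHIVE_DIR / f"{timestamp}_{digest[:12]}.json"
    record = {
        "url": url,
        "captured_at": timestamp,
        "sha256": digest,
        "content": body.decode("utf-8", errors="replace"),
    }
    record_path.write_text(json.dumps(record))
    return record_path

if __name__ == "__main__":
    # Hypothetical example: archive one snapshot of a results page.
    print(capture("https://example.com/search?q=candidate+a"))
```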
Epstein reiterated that although he’s a Democrat, he’s against this type of manipulation even though it favors the outcome he prefers.
“Now, I’m not a conservative. So I [could] say keep going, yeah, absolutely, I love it. But I don’t love it. Because I don’t want a private company that’s not accountable to the public deciding what billions of people can see and cannot see. The problem there is you don’t know what they don’t show,” he said.
“There are sound reasons for wanting to get these companies out of our elections, because if we don’t, it makes democracy into a kind of a joke, or at least an illusion.”
“Because it means in close elections, election after election after election, it means these companies are choosing the winner.”
Election workers count ballots in Philadelphia, on Nov. 4, 2020. (Spencer Platt/Getty Images)
His team is primed to extend this research to find out how this type of manipulation is affecting children.
“And nobody really understands how these companies are impacting our children, especially young children. So that’s become my latest obsession: trying to figure that out.”
Through the monitoring, Epstein said, “We’re trying to figure out how the manipulation works. But most importantly, we’re trying to quantify it.” And he’s most interested now in finding out how all this is affecting children and young adults.
“Because I think that what’s really happening is that there is a cumulative effect of—not just political bias, but a value—literally a cumulative effect of being exposed to certain kinds of values, over and over and over again, on one tech platform after another. And I think that the people who are most vulnerable to being impacted by that kind of process are children.”
Readers can learn more about protecting personal information by going to myprivacytips.com or a website called Internet Watchdog. Epstein has also updated his official testimony to Congress, which can be found at GooglesTripleThreat.com and MyGoogleResearch.com.
His “plea” to the public is that if you are in a position to donate, please do.
“So, we’ve made a lot of progress, we need to go to the next level, the next level means setting up a permanent, large-scale, self-sustaining monitoring system in all 50 U.S. states. This is something which I think is required. It’s not optional. This is required for us, we must do this to protect our country, our democracy, our children; this must be done.”
Original article is here