How social media try to manipulate your mind

Phil Ebersole

Review of the book Ten Arguments for Deleting Your Social Media Accounts Right Now by Jaron Lanier. London, Bodley Head, 2018.

Any time you log on to Google, Facebook, Twitter or other ‘free’ social media, information on your every keystroke is fed into powerful computers somewhere. Algorithms in these computers correlate this data and compare you with other people with similar profiles. The algorithms – ‘intelligent,’ but blind – experiment with ways to use this information to modify your behavior so you will do what they want. What they usually want is for you to respond to an ad for a particular product or service. But they may also be trying to influence you to vote – or not to vote.

In his new book, Ten Arguments for Deleting Your Social Media Accounts Right Now (2018), Jaron Lanier, a scientist and entrepreneur who pioneered virtual reality, discusses the questionable use of people’s personal data by the social media companies. However, his book is not a call to arms against social media but an alert to the harmful effects of social media, such as addiction and mind manipulation. Lanier suggests how the social media business model can be reformed.

Lanier describes as sinister the way the big digital media companies use algorithms to discover things about you that you haven’t revealed directly. Their business model involves finding ways to attract and hold your attention so that you can be influenced by advertisers, politicians and other paying clients for their purposes, not yours. A vast amount of data is collected about you, moment to moment, including your facial expressions, the way your body moves, who you know, what you read, where you go, what you eat, and your likely susceptibility to assorted attempts at persuasion. Algorithms then use this data to create feeds of stimuli – both paid ads and unpaid posts – designed to boost your ‘engagement’ and increase the effectiveness of ‘advertisements.’ As Lanier points out, Facebook executives have written that they deliberately incorporated addictive techniques into their service, which is why the honest terms would be ‘addiction’ and ‘behavior modification stimuli.’

Advertising has evolved considerably in the move from print media to digital media. In print, advertising was mostly a one-way street: the advertiser sent forth the ad and hoped for the best. In digital media, advertisers follow the connections people make and adjust their product accordingly. Advertising on social media involves monitoring the user closely, measuring the effect of what is called an ad, so that a personalized stream of stimuli can be incrementally adjusted until the person’s behavior is altered. Most social media users now live in automated virtual Skinner boxes (laboratory chambers used to study animal behavior, named after B. F. Skinner), and everyone is susceptible to being influenced at the biochemical level by positive and negative stimuli.

On social media, positive stimuli include being retweeted, friended, or made momentarily viral. Negative stimuli include the familiar experiences of being made to feel unappreciated, unnoticed, or ridiculed. Unfortunately, positive and negative online stimuli are pitted against each other in an unfair fight. Positive and negative emotions have comparable ultimate power over us, but their timing differs crucially. Positive emotions typically take longer to build and can be lost quickly, while negative ones come on faster and dissipate more slowly. It takes longer to build trust than to lose it; one can become scared or angry in an instant, but those feelings take longer to fade. The sour and lousy consequence, which no one foresaw, is that negative emotions are emphasized most often, because positive ones take too long to show up in the feedback loop through which paying customers and dark actors use these services to manipulate ordinary users and society.

Another problem with social media stems from its role as a major gatekeeper for news. It means that more and more of us live in filter bubbles, in which we only get news that pushes our psychological buttons. It could hardly be otherwise, since much material on the Internet is generated by people who are not what they pretend to be, or even by computers, and distributed on a mass scale by bots.

The Internet can be a means of bringing people together, but anger, paranoia, xenophobia and conspiracy theories are more engaging. Social media feed you material intended to stimulate your emotions, and it is easier to stimulate anger, fear and resentment than joy, affection and security. This deeply corrupts the political process in various ways. The feedback from social media reinforces whatever you happen to be – liberal, conservative, pro-gun, anti-war – thus diminishing your ability to understand people who think differently from you. Whatever divisions exist in society are likely to be widened by social media.

A strong point Lanier makes is that social media operate below the level of their users’ awareness. The only way to discover how much you are being subliminally influenced is to turn off your social media accounts for a certain period, say six months, and see what happens. That is enough time to judge how social media affect you and whether continuing is worthwhile. Does it seem far-fetched that large numbers of people would do this? It once seemed far-fetched that large numbers of people would give up smoking. To Lanier, the problem lies with advertising-based social media; fee-based social media, he argues, would operate for the benefit of their customers.

I think the problem is deeper, and lies in the very nature of our economy and technology. Many of the tricks used by social media were already in use in traditional media. I know this from my newspaper experience. Back in the 1990s, my old newspaper made a big effort to discover what kind of news our readers wanted. In surveys and focus groups, they said they wanted positive news – articles about people accomplishing good things. But the article they remembered best was a horrible story about a dead baby found in a Dumpster. The people who answered the survey weren’t hypocrites. Not at all. It is just that we human beings react in ways we don’t choose, and this leaves us open to manipulation.

I was shocked when I read about Cambridge Analytica, the campaign consultant that worked for the Trump presidential campaign, and its claim that it could manipulate voter behavior on an individual basis. But I later came to realize that this was the standard Facebook service, and could have been available to the Clinton campaign as well.

Lanier takes the charges of Vladimir Putin’s interference in the 2016 presidential campaign in the United States more seriously than I did. The Russian ads seemed amateurish to me (unless they were decoys to divert attention from the real influence campaign), and most of them were posted after election day. But the effectiveness of the 2016 ads is beside the point. If the combination of Big Data, artificial intelligence and behavior modification algorithms can influence voting behavior, Putin is sure to use it, and if he doesn’t, some other foreign government or institution will. Not to mention our own NSA and CIA.

Lanier sees no problem with Amazon or Netflix using computer algorithms to suggest books or videos you might like, because this is done with the intent of getting your business, not of influencing you for the benefit of some third party. To him, the problem is the business model of the large social media companies such as Facebook and Google, which is designed to capture your attention and then sell it to third parties. He doesn’t think regulation is the answer: when there’s a profit motive, there’s usually a way around any rules. Lanier’s answer is a new business model, in which social media companies get their revenue from users, not third parties, and also compensate individuals for the use of their material. The benefit is that the users of social media would be the customers, not the product. This would mean charging users for the services, but Lanier suggests such charges would be small and affordable to most people; the technology to make payments in pennies or fractions of a cent exists and is feasible to use, according to him.

Although I am inclined to agree with Lanier about the manipulative power of social media, I am also inclined to think his solution would be very hard to implement. Newspapers at the height of their power and influence were never able to free themselves from dependence on advertisers. Many profitable print publications are giveaways that get their income solely from advertising, but few do without ads and depend only on subscriptions, and my impression is that those few depend on donations to offset losses. On the other hand, the economics of Internet publishing differ from those of print publishing, so maybe Lanier’s proposal would be feasible. There remains the problem of persuading a profitable business with no serious competitors to give up what Lanier identifies as a source of problems.


Phil Ebersole is a retired newspaper reporter living in Rochester, N.Y. This review is an edited version of a post published on his blog on 21 September 2018. Source: