Starting Them Young: Is Facebook Hooking Children on Social Media?

Over the past few months, social media companies have come under increasing scrutiny from media critics, watchdog groups, and US congressional committees.

Much of the criticism has focused on how Facebook and Twitter facilitated the propagation of inflammatory messages created by Russian agents during the 2016 US presidential election, ostensibly to polarize American voters. Self-serve advertising, “filter bubbles,” and other features of social media have made mass targeted manipulation easy and efficient.

Yet some are voicing deeper concerns about the social, psychological, cognitive, and emotional effects of social media–particularly as they impact children.

For example, Facebook has come under attack from an unlikely group of critics: some of its own former executives. Their comments coincide with the debut of “Messenger Kids,” Facebook’s latest product. According to reports, its target audience is 6- to 12-year-old children. (Like most other social media apps, Facebook does not allow people younger than 13 to create accounts.)

Despite Facebook CEO Mark Zuckerberg’s recent resolution to “fix” Facebook in 2018, “Messenger Kids” reveals a different agenda: to scoop up a new generation of users, habituate them to the virtual life, increase market share, and develop brand loyalty in a highly competitive marketplace. Facebook’s first president, Sean Parker, acknowledged late last year that its creators intentionally designed the platform to consume as much of users’ time and attention as possible. According to Parker, “likes” and “posts” serve as “a social validation feedback loop” exploiting the psychological need for social acceptance. “God only knows what it’s doing to our children’s brains,” he said (quoted in Allen 2017).

Why would the architects of Facebook, Google Plus, Twitter, and other social media platforms resort to these techniques? Facebook’s business model is based on revenue generated from advertising. An early Facebook investor, Roger McNamee (2018), recently wrote:

Smartphones changed the advertising game completely. It took only a few years for billions of people to have an all-purpose content delivery system easily accessible sixteen hours or more a day. This turned media into a battle to hold users’ attention as long as possible. . . . Why pay a newspaper in the hopes of catching the attention of a certain portion of its audience, when you can pay Facebook to reach exactly those people and no one else?

Sean Parker and Roger McNamee aren’t alone. Venture capitalist and former Facebook VP Chamath Palihapitiya admitted last month that he regrets helping the company expand its global reach. (Facebook has more than two billion users worldwide and is still growing.)

“We have created tools that are ripping apart the social fabric of how society works. . . you are being programmed,” Palihapitiya told an audience at the Stanford Graduate School of Business. He added, “No civil discourse, no cooperation, misinformation, mistruth. And it’s not an American problem–this is not about Russian ads. This is a global problem. . . Bad actors can now manipulate large swathes of people to do anything you want. It’s just a really, really bad state of affairs” (Palihapitiya 2017).

Yet another former Facebook executive, Antonio García-Martínez, went public last summer with his criticism of the company’s techniques:

If used very cleverly, with lots of machine-learning iteration and systematic trial-and-error, the canny marketer can find just the right admixture of age, geography, time of day, and music or film tastes that demarcate a demographic winner of an audience. The “clickthrough rate,” to use the advertiser’s parlance, doesn’t lie. . . Facebook has and does offer “psychometric”-type targeting, where the goal is to define a subset of the marketing audience that an advertiser thinks is particularly susceptible to their message. . . Sometimes data behaves unethically. . . Facebook will never try to limit such use of their data unless the public uproar reaches such a crescendo as to be un-mutable (García-Martínez 2017).

Such statements are startling–but not unprecedented.

For years, social scientists have warned about how technology can trigger behavioral addictions. MIT anthropologist Natasha Schüll, who conducted research on Las Vegas casinos over a 20-year period, learned that slot machines pull some gamblers into a disorienting “machine zone.” (Schüll’s research builds upon the pioneering work of UC Berkeley anthropologist Laura Nader, who developed the concept of “controlling processes”–how individuals and groups are persuaded to participate in their own domination.) After interviewing machine designers, casino architects, and hardcore gamblers, among others, Schüll concludes in her book Addiction by Design that the magnetic attraction of slot machines is due in part to their deeply interactive features. Gambling industry experts openly talk about maximizing “time-on-device.” As one consultant told Schüll, “The key is duration of play. I want to keep you there as long as humanly possible–that’s the whole trick, that’s what makes you lose” (Schüll 2012: 58; see also Nader 1997).

New York University business professor Adam Alter, author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked (2017), argues that Facebook’s “like” button has a comparable effect. Every post, photo, or status update is a gamble that might result in a total loss (zero likes) or a jackpot (going viral). Twitter “retweets,” Instagram “likes,” and YouTube “views” work the same way. (It’s worth mentioning that Google’s YouTube and Amazon are also making aggressive efforts to corner the youth market.)

Last month, just three weeks before Christmas, Facebook issued a press release declaring the arrival of “Messenger Kids.” According to the company, the app was developed in consultation with parents and “parenting experts” to keep it safe for kids. Facebook has also promised to limit the collection of data on children and not to use the app for advertising.

Such promises are disingenuous. It is clear that “Messenger Kids” is part of a long-term strategy designed to get children hooked on social networking habits (“likes,” text messaging, filter bubbles) as early as possible. In other words, the goal is to get kids’ dopamine levels surging in the formative years, so that frequent dopamine bursts become a normal part of life. Once that happens, it will be even easier for future social media companies (which are fundamentally advertising firms) to feed billions of behavioral addicts customized propaganda.

In 2016, the American Academy of Pediatrics issued recommendations outlining limits on screen time for children, noting that “problems begin when media use displaces physical activity, hands-on exploration and face-to-face social interaction in the real world, which is critical to learning” (AAP 2016).

“Messenger Kids” will potentially drive children even deeper into the virtual world. Perhaps Aldous Huxley’s Brave New World has arrived–for as the prescient media critic Neil Postman once wrote:

In Huxley’s vision, no Big Brother is required to deprive people of their autonomy, maturity and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think. . . As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny “failed to take into account man’s almost infinite appetite for distractions” (Postman 1985: vii).

Last month, Chamath Palihapitiya told CNBC that his 5- and 9-year-old children get no screen time at all, even though they constantly ask for it. Bill Gates, Jonathan Ive (who designed the iPad), the late Steve Jobs, and many other well-known figures in the technology industry also placed strict limits on their children’s use of technology. They clearly understand the cognitive, psychological, and emotional fallout of the devices they helped to create. If these legends have taken drastic measures to protect their sons and daughters from the dark side of the virtual life, perhaps more of us should follow their lead.

Beyond the immediacy of our individual and familial habits and practices looms a larger social problem: the possibility of a future in which authoritarian institutions have a tremendous capacity to mold the ideas, attitudes, and behaviors of audiences captured by their own compulsions.

References

AAP (2016). “American Academy of Pediatrics Announces New Recommendations for Children’s Media Use.” October 21.

Allen, Mike (2017). “Sean Parker Unloads on Facebook.” Axios.com, November 9.

Alter, Adam (2017). Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked. New York: Penguin Books.

García-Martínez, Antonio (2017). “I’m an Ex-Facebook Exec: Don’t Believe What They Tell You About Ads.” The Guardian, May 2.

McNamee, Roger (2018). “How to Fix Facebook–Before It Fixes Us.” Washington Monthly, January.

Nader, Laura (1997). “Controlling Processes: Tracing the Dynamic Components of Power.” Current Anthropology 38(5): 711-738.

Palihapitiya, Chamath (2017). “Money as an Instrument of Change.” Talk presented to the Stanford Graduate School of Business, November 13.

Postman, Neil (1985). Amusing Ourselves to Death: Public Discourse in the Age of Show Business. New York: Penguin Books.

Schüll, Natasha (2012). Addiction by Design: Machine Gambling in Las Vegas. Princeton, NJ: Princeton University Press.

Roberto J. González is chair of the Anthropology Department at San José State University and author of War Virtually: The Quest to Automate Conflict, Militarize Data, and Predict the Future (University of California Press, 2022).