The Social Media Industry as a Commercial Determinant of Health

Document Type : Viewpoint

Authors

1 Faculty of Public Health & Policy, London School of Hygiene and Tropical Medicine, London, UK

2 School of Nursing and Health Studies, University of Washington Bothell, Bothell, WA, USA


Introduction

Social media has become a key area of concern for public health. In recent years, elections across the world, the Black Lives Matter protests, and the coronavirus disease 2019 (COVID-19) pandemic have highlighted major social media platforms’ roles in amplifying and inadequately moderating mis/disinformation, racism, sexism, and xenophobia. There is growing attention to vested interests, such as health-harming industries, utilizing social media’s targeted marketing opportunities to promote their products and shape public and political discourse.1 Mental health concerns associated with social media use, such as body image issues,2 are also increasingly reported. Taken together, social media can have direct impacts on users and indirect impacts on societies by undermining key determinants of health.

Unfortunately, social media-related public health concerns are often attributed to the decisions or actions of users or considered by-products of platform usage. The role of social media platforms themselves, and the companies that design them, is rarely considered. In many cases health research treats platforms as a tool for gathering data to investigate online trends, or as a partner for carrying out research or interventions. These approaches do not allow us to understand the design and purpose of such platforms themselves as drivers of health outcomes, or the potential conflicts of interest between public health and social media companies.

Social media companies rely on advertising for revenue; thus, they prioritize strategies that retain user attention to increase advertising opportunities, often deploying algorithms to suggest content specific to users’ interests. However, critics argue that the algorithms underpinning content promotion may contain or lead to harms, and that social media companies may be aware of these harms but decline to act on them because they are profitable. As Meta (then Facebook) whistleblower Frances Haugen summarized: “Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, [and] they’ll make less money.”3

We argue in this viewpoint that social media platforms’ business and political practices are consistent with the characterizing features of the commercial determinants of health (CDOH). Kickbusch and colleagues define the CDOH as “the strategies and approaches used by the private sector to promote products and choices that are detrimental to health.”4 Here, we outline the predominant reasons to recognize social media producers as a CDOH. In making this case, we draw on Meta as our primary example, given the widespread use of its platforms (Facebook, Instagram, and WhatsApp), while also recognizing the diversity of companies, actors, and platforms that comprise current social media landscapes.


Reasons to Recognize the Social Media Industry as a Commercial Determinant of Health


Addictive Platform Design Features Are Associated With Mental Health Consequences

According to former social media developers and social scientists, platforms owned by Meta such as Facebook designed their interfaces, algorithms, and features to elicit addictive behaviors and retain user attention for advertisement views.5 These addictive features, such as “endless scrolling,”6 and the high user engagement they sustain have led to associations between social media use and mental health symptoms or distress, particularly among young people.7 Social media platforms dispute linkages between their business practices and public health concerns, while seeking to hide evidence about the potential harms of their platforms. For example, in September 2021, an investigative report by the Wall Street Journal found that Meta (then Facebook), which owns Instagram, had conducted research showing that Instagram usage is associated with negative body image, particularly among girls and young women, but kept these findings hidden and downplayed the risk to the public.8


Amplification and Proliferation of Dis/Misinformation and Forms of Hate Speech

Social media platforms enable the spread of misinformation and forms of hateful speech, rhetoric, or polarizing content through limited content moderation9 and algorithms that match users to content, groups, or pages based on their interests or previously viewed media, to maximize engagement and advertising revenue.10 This creates “echo chambers” that expose users to content reinforcing their perspectives or confirmation biases.11 Advocates have drawn attention to the limited efforts made to prevent the spread of misinformation or hateful sentiment in algorithm-promoted or recommended content3 – thus, if a user is viewing misinformed or hateful content, platforms such as Facebook may suggest additional, related content. The difficulties misinformation creates for public health officials were demonstrated during the COVID-19 pandemic, whose accompanying flood of misinformation the World Health Organization (WHO) labeled an “infodemic.”12 Avaaz – a United States non-profit activism group – released a report finding that misinformation related to vaccines, masking, and social distancing had been viewed approximately three billion times on Facebook by July 2020.13 It has been claimed that the spread of disinformation has also incited or excused violence. The military organizers of the Rohingya genocide in Myanmar allegedly used Facebook as “a tool for ethnic cleansing” through anti-Rohingya disinformation campaigns.14


Research Control and Funding

Many social media platforms place strict controls on their data for research purposes. To access data, researchers must often apply, submitting personal information for consideration, whereupon platforms can choose whether to grant access. For example, Meta has offered research tools such as CrowdTangle. Even after access is granted, Meta limits information collection and restricts access to key metrics such as reach or impressions. This creates significant, arguably intentional, difficulties in monitoring social media trends and activity. The New York Times reported that a former CrowdTangle overseer left Facebook because the “company does not want to invest in understanding the impact of its core products” and “doesn’t want to make the data available for others to do the hard work and hold them accountable.”15 Publicly available tools, such as the Meta Ad Library, offer only basic information. Meta does not allow scraping of its platforms outside its public tools and can withhold or remove access to research tools, which likely has a dampening effect on what researchers can study and publish. In July 2021, Facebook removed the accounts of misinformation researchers who were scraping political advertising content, an activity unauthorized by the platform.16 Social media platforms also fund research opportunities related to public health concerns exacerbated by their platforms, such as misinformation.17 The large and growing body of evidence that other health-harming industries, such as the sugar lobby, have concealed research, funded research advancing their interests, or ceased research that might harm revenue18 should serve as an important caution about potential conflicts of interest in such training or funding programs.


Targeted Marketing Opportunities and Surveillance Data

Social media platforms provide powerful targeting tools and data that allow businesses, special interest groups, and politicians to reach defined demographics with commercial or political ads. Meta allows targeting by city, community, age, gender, education level, job title, interests, and consumer behaviors (eg, previous purchases). Many health-harming industry actors, such as alcohol companies, use social media platforms to promote their products to defined groups. There are voluntary or legal commitments across such industries and by most social media platforms not to target specific groups (such as children) and to prohibit the marketing of certain products (such as tobacco).19 However, these restrictions are demonstrably circumvented,20,21 and promotion still occurs through direct-to-consumer marketing, peer-to-peer marketing, or influencer marketing. Industries may also utilize social media for political and social purposes related to the regulation of their products, such as preventing taxes on products including sugar-sweetened drinks, where advertisement content may be misleading given the lack of Facebook oversight of truthfulness in political advertising.1 Finally, social media platforms may sell data, which is then used by health-harming industries for advertising purposes.


Coalition Building With Health Organizations

Social media platforms invest in developing relationships with health-related organizations across the world. Notably, during the COVID-19 pandemic, the WHO announced it was partnering with Meta platforms and Viber to deter misinformation and deliver evidence-based information to the global public.22 Meta (then Facebook) provided the WHO with $120 million in advertising credit to correct misinformation. This relationship with the WHO, and its specific focus on addressing the spread of misinformation, is paradoxical: Meta’s platforms, business decisions, and algorithms are in part responsible for the spread of misinformation about COVID-19 – including its failure to ban anti-vaccination advertisements until approximately seven months after the pandemic declaration.23 The actions Meta promised – including removing specific kinds of content and promoting science-informed information – do not change the root causes of misinformation spread. Indeed, addressing those root causes, such as algorithms and weakly enforced content moderation, would likely negatively impact Meta’s business model. Thus, partnering with the WHO enables Meta to receive positive publicity without risk to its revenue streams or threat of external regulation.


Corporate Social Responsibility Initiatives

Similar to health-harming commodities industries, social media platforms utilize corporate social responsibility (CSR) initiatives to advance their interests. A notable CSR initiative undertaken by Meta is ‘Internet.org,’ later rebranded as the ‘Free Basics’ program, which brings basic-level internet connectivity to countries or areas with low internet access, on the argument that connectivity is necessary for economic development.24 However, the program primarily lets users access the internet through Facebook and Meta-related products, raising controversy over potential gatekeeping of the internet and further exposure to misinformation. Additionally, despite being represented as a philanthropic endeavor, Meta stands to profit from the recruitment of the billions of people not yet accessing the internet. The Free Basics program has been heavily criticized – described as “digital colonialism”25 and accused of “us[ing] disadvantaged populations and unregulated territories for digital experiments and data extraction.”24 As of July 2019, 28 African countries were accessing the Free Basics program.


Promotion of Self-regulation Discourse

In response to calls to regulate social media platforms due to the rapid spread of misinformation during the COVID-19 pandemic and recent elections, platforms such as Facebook announced voluntary actions to appease critics and limit regulatory interventions. Initiatives of note include investing in third-party fact-checking services and displaying warnings on, or removing, content proven to be misleading or false. In 2020, Meta (then Facebook) also launched the ‘Oversight Board’ – which it described as an independent panel to determine whether content removal decisions are justified, though it remains under Meta’s funding control. Commenting on the future of Section 230 in the United States – the policy that shields social media platforms from legal liability for content posted by their users – Zuckerberg argued that “[i]nstead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it. Platforms should not be held liable if a particular piece of content evades its detection.”26 Facebook has already implemented most of the systems mentioned, yet, as many point out, these systems have failed to adequately address the social harms of its platforms.9


Conclusion and Future Research

This article argues that the social media industry should be recognized as a CDOH due to the direct and indirect consequences of its products and actions. Platforms directly impact users’ health through their products, which are associated with mental health concerns and contain addictive features. The products offered by social media platforms, such as targeted advertising, are more powerful than the traditional mechanisms for which regulations were developed. Health-harming industries can reach targeted audiences with specific messaging to sell products or protect their interests. The structure of social media, including user data collection and subsequent ad targeting, compromises individual agency.27 Indirect health consequences, such as the spread of dis/misinformation, the erosion of democratic values and processes, and third-party data access and surveillance, negatively impact broader determinants of health. This context suggests a conflict of interest between social media platforms’ profits and public health, demonstrating the need for social media industry regulation. However, limited transparency from social media platforms, regarding issues like the data they collect and how it is used, makes regulation difficult.28 This is exacerbated by the deployment of strategies common to other industries, such as controlling data availability, building coalitions, using CSR, and promoting self-regulation. The similarities between the social media industry’s profit-protecting strategies and those of other health-harming industries underscore the need to develop a cohesive systems approach across industries29 and to adopt integrated, rather than siloed, regulation strategies.30 The lack of regulation of social media also enables other industries to abuse platform tools, amplifying these public health concerns.

Moving forward, we encourage public health researchers, particularly those using social media in their research, to carefully assess the complex health impacts of social media technologies and the nature of the business structures underpinning them. The reasons we outline for considering social media a CDOH are introductory, and further exploration is warranted. Future research is needed to analyze social media corporate political activity – specifically lobbying activities, research funding and influence, data surveillance practices, coalition building with trade groups or health-related organizations, and social media expansion activities in low- and middle-income countries – in order to identify effective governance strategies. In summary, we urge recognition of, and further research on, social media companies as a key CDOH of the 21st century.


Acknowledgements

The authors kindly thank Prof. Mark Petticrew, Dr. Benjamin Hawkins, and Dr. May van Schalkwyk for their feedback, guidance, and suggestions. The authors also thank the manuscript reviewers for their insightful and helpful comments.


Ethical issues

Not applicable.


Competing interests

The authors declare that they have no competing interests.


Authors’ contributions

All authors contributed to manuscript conceptual design. MZ wrote the manuscript. All authors edited and approved the manuscript.


References

  1. Zenone M, Kenworthy N. Pre-emption strategies to block taxes on sugar-sweetened beverages: a framing analysis of Facebook advertising in support of Washington state initiative-1634. Glob Public Health. 2021:1-14. doi:10.1080/17441692.2021.1977971.
  2. Fioravanti G, Bocci Benucci S, Ceragioli G, Casale S. How the exposure to beauty ideals on social networking sites influences body image: a systematic review of experimental studies. Adolesc Res Rev. 2022. doi:10.1007/s40894-022-00179-4.
  3. Pelley S. Whistleblower: Facebook is misleading the public on progress against hate speech, violence, misinformation. CBS News. October 4, 2021. https://www.cbsnews.com/news/facebook-whistleblower-frances-haugen-misinformation-public-60-minutes-2021-10-03/. Accessed March 14, 2022.
  4. Kickbusch I, Allen L, Franz C. The commercial determinants of health. Lancet Glob Health. 2016;4(12):e895-e896. doi:10.1016/s2214-109x(16)30217-0.
  5. Andersson H. Social media apps are “deliberately” addictive to users. BBC News. July 3, 2018. https://www.bbc.com/news/technology-44640959. Accessed May 27, 2021.
  6. Montag C, Lachmann B, Herrlich M, Zweig K. Addictive features of social media/messenger platforms and freemium games against the background of psychological and economic theories. Int J Environ Res Public Health. 2019;16(14):2612. doi:10.3390/ijerph16142612.
  7. Keles B, McCrae N, Grealish A. A systematic review: the influence of social media on depression, anxiety and psychological distress in adolescents. Int J Adolesc Youth. 2020;25(1):79-93. doi:10.1080/02673843.2019.1590851.
  8. Wells G, Horwitz J, Seetharaman D. Facebook Knows Instagram is Toxic for Teen Girls, Company Documents Show. Wall Street Journal. September 14, 2021. https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739. Accessed September 23, 2021.
  9. Carlson CR, Rousselle H. Report and repeat: investigating Facebook’s hate speech removal process. First Monday. 2020;25(2). doi:10.5210/fm.v25i2.10288.
  10. Sullivan M. Facebook is harming our society. Here’s a radical solution for reining it in. Washington Post. October 5, 2021. https://www.washingtonpost.com/lifestyle/media/media-sullivan-facebook-whistleblower-haugen/2021/10/04/3461c62e-2535-11ec-8831-a31e7b3de188_story.html. Accessed October 6, 2021.
  11. Cinelli M, De Francisci Morales G, Galeazzi A, Quattrociocchi W, Starnini M. The echo chamber effect on social media. Proc Natl Acad Sci U S A. 2021;118(9):e2023301118. doi:10.1073/pnas.2023301118.
  12. World Health Organization. Managing the COVID-19 Infodemic: Promoting healthy behaviours and mitigating the harm from misinformation and disinformation. Published September 23, 2020. https://www.who.int/news/item/23-09-2020-managing-the-covid-19-infodemic-promoting-healthy-behaviours-and-mitigating-the-harm-from-misinformation-and-disinformation. Accessed July 17, 2021.
  13. Avaaz. Facebook’s Algorithm: A Major Threat to Public Health. August 19, 2020. https://secure.avaaz.org/campaign/en/facebook_threat_health/. Accessed May 17, 2021.
  14. Mozur P. A Genocide Incited on Facebook, with Posts from Myanmar’s Military. The New York Times. October 15, 2018. https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html. Accessed March 14, 2022.
  15. Roose K. Here’s a Look Inside Facebook’s Data Wars. The New York Times. October 4, 2021. https://www.nytimes.com/2021/07/14/technology/facebook-data.html. Accessed March 14, 2022.
  16. Edelson L, McCoy D. We Research Misinformation on Facebook. It Just Disabled Our Accounts. The New York Times. August 10, 2021. https://www.nytimes.com/2021/08/10/opinion/facebook-misinformation.html. Accessed September 23, 2021.
  17. Facebook. Announcing the 2021 recipients of research awards in misinformation and polarization. September 14, 2021. https://research.fb.com/blog/2021/09/announcing-the-2021-recipients-of-research-awards-in-misinformation-and-polarization/. Accessed September 23, 2021.
  18. Steele S, Ruskin G, McKee M, Stuckler D. “Always read the small print”: a case study of commercial research funding, disclosure and agreements with Coca-Cola. J Public Health Policy. 2019;40(3):273-285. doi:10.1057/s41271-019-00170-9.
  19. Sacks G, Looi ESY. The advertising policies of major social media platforms overlook the imperative to restrict the exposure of children and adolescents to the promotion of unhealthy foods and beverages. Int J Environ Res Public Health. 2020;17(11):4172. doi:10.3390/ijerph17114172.
  20. Lindeman M, Katainen A, Svensson J, Kauppila E, Hellman M. Compliance with regulations and codes of conduct at social media accounts of Swedish alcohol brands. Drug Alcohol Rev. 2019;38(4):386-390. doi:10.1111/dar.12928.
  21. Rowell A. Big Tobacco wants social media influencers to promote its products – can the platforms stop it? The Conversation. January 23, 2020. http://theconversation.com/big-tobacco-wants-social-media-influencers-to-promote-its-products-can-the-platforms-stop-it-129957. Accessed March 14, 2022.
  22. Facebook. Reaching Billions of People with COVID-19 Vaccine Information. February 8, 2021. https://about.fb.com/news/2021/02/reaching-billions-of-people-with-covid-19-vaccine-information/. Accessed May 27, 2021.
  23. Graham M, Rodriguez S. Facebook says it will finally ban anti-vaccination ads. CNBC. October 13, 2020. https://www.cnbc.com/2020/10/13/facebook-bans-anti-vax-ads.html. Accessed October 1, 2021.
  24. Nothias T. Access granted: Facebook’s free basics in Africa. Media Cult Soc. 2020;42(3):329-348. doi:10.1177/0163443719890530.
  25. Solon O. “It’s digital colonialism”: how Facebook’s free internet service has failed its users. The Guardian. July 27, 2017. http://www.theguardian.com/technology/2017/jul/27/facebook-free-basics-developing-markets. Accessed May 21, 2021.
  26. Brandom R. Mark Zuckerberg proposes limited 230 reforms ahead of congressional hearing. The Verge. March 24, 2021. https://www.theverge.com/2021/3/24/22348238/zuckerberg-dorsey-pichai-section-230-hearing-misinformation. Accessed July 17, 2021.
  27. Lee K, Freudenberg N, Zenone M. Measuring the commercial determinants of health and disease: a proposed framework. Int J Health Serv. 2022;52(1):115-128. doi:10.1177/00207314211044992.
  28. Room R, O’Brien P. Alcohol marketing and social media: a challenge for public health control. Drug Alcohol Rev. 2021;40(3):420-422. doi:10.1111/dar.13160.
  29. Knai C, Petticrew M, Capewell S. The case for developing a cohesive systems approach to research across unhealthy commodity industries. BMJ Glob Health. 2021;6(2):e003543. doi:10.1136/bmjgh-2020-003543.
  30. Lee K, Freudenberg N. Public health roles in addressing commercial determinants of health. Annu Rev Public Health. 2022;43(1). doi:10.1146/annurev-publhealth-052220-020447.